Backend Engineer II – Data Platform
Remote - India
Full Time
Mid Level | Engineering | Worldwide
$40K - $80K USD per year

Job Description

Backend Engineer II – Data Platform

Location: Remote - India
Department: Product Engineering – Product Development
Employment Type: Full-time
Remote Work: Yes

About the Company:

Netomi is the leading agentic AI platform for enterprise customer experience. We work with the largest global brands, including Delta Airlines, MetLife, MGM, and United, to enable agentic automation at scale across the entire customer journey. Our no-code platform delivers the fastest time to market, the lowest total cost of ownership, and simple, scalable management of AI agents for any CX use case. Backed by WndrCo, Y Combinator, and Index Ventures, we help enterprises drive efficiency, lower costs, and deliver higher-quality customer experiences. Want to be part of the AI revolution and transform how the world's largest global brands do business? Join us!

About the Role:

Netomi is seeking a highly analytical and detail-oriented candidate to join the Analytics team in Gurugram. As part of the team, you will work with data science, product, engineering, and customer success teams to drive complex data and trend analyses and propose improvements that enhance the customer experience. You will also benchmark and measure the performance of product operations projects, build and publish detailed scorecards and reports, and identify and drive new opportunities based on customer and business data.

We are looking for an engineer with a passion for using data to discover and solve real-world problems. You will enjoy working with rich data sets and modern business intelligence technology, and seeing your insights drive features for our customers. You will also have the opportunity to help develop policies, processes, and tools that address product quality challenges, in collaboration with other teams.

Responsibilities:

  • Architect and implement clean, modular, and scalable backend services using Java Spring Boot and modern microservice principles.
  • Design efficient database schemas and write optimized queries for RDS (MySQL/PostgreSQL) and, optionally, NoSQL databases like Elasticsearch, MongoDB, or DynamoDB.
  • Integrate Kafka or RabbitMQ to build robust and loosely-coupled event-driven architectures.
  • Architect and implement scalable, secure, reliable data pipelines using modern data platforms (e.g., Spark, Databricks, Airflow, Snowflake).
  • Develop ETL/ELT processes to ingest data from various structured/unstructured sources.
  • Perform Exploratory Data Analysis (EDA) to uncover trends, validate data integrity, and derive insights that inform data product development and business decisions.
  • Collaborate closely with data scientists, analysts, and software engineers to design data models supporting high-quality analytics and real-time insights.
  • Profile and tune backend performance across databases, APIs, and infrastructure.
  • Write clean, maintainable code with comprehensive unit and integration tests, ensuring reliability and stability.
  • Thrive in an agile, collaborative environment, taking ownership of end-to-end feature delivery.

Requirements:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • Experience with Java, Spring Boot, and microservices architecture.
  • Knowledge of RDS (MySQL/PostgreSQL) and NoSQL databases such as Elasticsearch, MongoDB, and DynamoDB.
  • Experience integrating Kafka or RabbitMQ in event-driven systems.
  • Skilled in building scalable, secure, and reliable data pipelines using Spark, Databricks, Airflow, Snowflake, etc.
  • Proficient in ETL/ELT processes handling structured and unstructured data sources.
  • Strong Exploratory Data Analysis (EDA) capabilities.
  • Ability to collaborate on designing data models that support high-quality analytics and real-time insights.
  • Experience profiling and tuning backend performance across databases, APIs, and infrastructure.
  • Ability to write clean, maintainable code with unit and integration tests.
  • Experience with agile methodology; a collaborative team player with an ownership mindset.

Skills:

Java, Spring Boot, microservice principles, RDS (MySQL, PostgreSQL), NoSQL (Elasticsearch, MongoDB, DynamoDB), Kafka, RabbitMQ, Spark, Databricks, Airflow, Snowflake, ETL/ELT, Exploratory Data Analysis (EDA), data modeling, data warehousing, distributed systems, Python, SQL, Apache Airflow, Luigi, Prefect, AWS Redshift, GCP BigQuery, DevOps, CI/CD, Infrastructure as Code, Docker, Kubernetes, AI/ML-integrated solutions, data security, GDPR, HIPAA, prompt engineering, LLMs, analytical, detail-oriented, collaborative, ownership

How to Apply