At Netomi AI, we are on a mission to create artificial intelligence that builds customer love for the world’s largest global brands.
Some of the largest brands are already using Netomi AI’s platform to solve mission-critical problems, giving you the opportunity to work with top-tier clients at a senior level and build your network.
Backed by the world’s leading investors such as Y-Combinator, Index Ventures, Jeffrey Katzenberg (co-founder of DreamWorks) and Greg Brockman (co-founder & President of OpenAI/ChatGPT), you will become a part of an elite group of visionaries who are defining the future of AI for customer experience. We are building a dynamic, fast-growing team that values innovation, creativity, and hard work. You will have the chance to significantly impact the company’s success while developing your skills and career in AI.
Want to become a key part of the Generative AI revolution? We should talk.
Netomi is seeking a highly analytical and detail-oriented candidate to join the Analytics team in Gurugram. As part of the team, you will work with product, engineering, and customer success teams to drive complex data and trend analyses, propose improvements, and thereby help improve the customer experience. You will also be responsible for benchmarking and measuring the performance of various product operations projects, building and publishing detailed scorecards and reports, and identifying and driving new opportunities based on customer and business data.
We are looking for a Data Engineer with a passion for using data to discover and solve real-world problems. You will enjoy working with rich data sets, modern business intelligence technology, and the ability to see your insights drive the features for our customers. You will also have the opportunity to contribute to the development of policies, processes, and tools to address product quality challenges in collaboration with teams.
- You will partner with teammates to create complex data processing pipelines in order to solve our clients’ most ambitious challenges.
- You will collaborate with Data Scientists in order to design scalable implementations of their models.
- You will pair to write clean and iterative code based on TDD.
- Leverage various continuous delivery practices to deploy, support, and operate data pipelines.
- Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available.
- Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions.
- Create data models and speak to the tradeoffs of different modeling approaches.
- Seamlessly incorporate data quality into your day-to-day work as well as into the delivery process.
- 3+ years of experience
- Expertise in SQL, PL/SQL, and general software engineering (coding proficiency in at least one of Python or Java)
- Experience with MySQL 8.0+ and AWS Aurora required
- Expert-level SQL query optimization skills
- Good understanding of data modeling and experience with data engineering tools
- You are comfortable taking a data-driven approach and applying data security strategy to solve business problems
- You’re genuinely excited about data infrastructure and operations with familiarity working in cloud environments
- Working with data excites you: you can build and operate data pipelines, and maintain data storage, all within distributed systems
- Assure effective collaboration between Netomi and the client’s teams, encouraging open communication and advocating for shared outcomes
- Experience writing data-quality unit tests and functional tests
- Strong experience with a relational database (preferably Aurora MySQL)
- Experience with AWS Lambda, Kinesis, RDS, EC2, and QuickSight
- Experience with streaming platforms such as Kafka and Kinesis
Netomi is an equal opportunity employer committed to diversity in the workplace. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, disability, veteran status, and other protected characteristics.