Hadoop Developer
“The limit of our growth is going to be our imagination and product ideas – not technology”
– Mohit Kapoor (CIO)
What we’ll bring:
- A work environment that encourages collaboration and innovation. We consistently explore new technologies and tools to be agile.
- Flexible time off, workplace flexibility, and an environment that supports continued professional growth through tuition reimbursement, conferences, and seminars.
- Our culture encourages our people to hone current skills and build new capabilities, while discovering their genius.
What you’ll bring:
- At least eight years of database/application development experience in a complex enterprise environment
- Experience writing SQL and stored procedures and tuning query performance, preferably on SQL Server
- Strong familiarity with Linux and Windows environments, including shell and PowerShell scripting
- At least two years of hands-on experience designing and implementing production data pipelines using tools from the Hadoop ecosystem such as MapReduce, Hive, HBase, Spark, Sqoop, Oozie, and Pig
- Broad knowledge of software development, including software architecture, functional and non-functional requirements, and CI/CD principles and tools
What we’d prefer to see:
- Experience within the Healthcare industry
- Bachelor’s and/or Master’s degree in computer science or a related discipline
Impact you’ll make:
- We have built a Big Data platform that brings together data from disparate sources and allows us to design and run complex algorithms that provide insights into healthcare business operations. You’ll help us design the next generation of data lake solutions to support this healthcare product line.
- You’ll partner with internal business, product, and technical teams to analyze requirements and deliver solutions to complex issues
- You’ll evaluate and identify ways to automate, streamline, and improve maintenance and administrative practices
- You’ll analyze complex business processes and design modularized solutions for better scalability and maintainability
- You’ll build and test highly performant and scalable enterprise-grade ETL pipelines on a Hadoop platform
- You’ll actively participate in and contribute to Agile Scrum ceremonies