Data Engineer
Overview
Good people, working with good people, for our common good.
Sound good?
KeHE, a natural, organic, specialty and fresh food distributor, is all about "good" and is growing, so there's never been a more exciting time to join our team. If you're enthusiastic about working in an environment with a people-first culture and an organization committed to good living, good food and good service, we'd love to talk to you!
Primary Responsibilities
The Data Engineer will assist in the design and implementation of a modern cloud data architecture that will enable KeHE to continue pushing the limits in the advanced analytics space. This role will work in close conjunction with the Data Science and Enterprise Data teams while also collaborating closely with other departments across the organization to construct scalable solutions that leverage both internal and external data sources. This person will possess a wide range of skills, such as creating reliable pipelines, sourcing and integrating data, and designing and building optimized data delivery solutions. We are looking for an experienced data professional who will integrate traditional and emerging technologies to unlock greater efficiency and scalability of the data.
Essential Functions
- Develop, construct, test and maintain optimal data pipeline/ETL architectures
- Work closely within the team to prepare data for predictive and prescriptive modeling
- Optimize AWS data delivery infrastructure for greater scalability
- Utilize SQL as well as big data tools and frameworks to optimize data acquisition and preparation from the enterprise data lake and data warehouse
- Work with Enterprise Cloud Architecture teams to strive for greater functionality in our data systems
- Develop the architecture required to return data to the data warehouse for front-end product utilization
- Curate data models in the data warehouse to be used by front-end advanced analytics designers
- Provide production-level code reviews for the team
- Help design, implement and maintain quality assurance and testing approaches
- Deploy scripts and architectures to production via Jenkins
Minimum Requirements, Qualifications, Additional Skills, Aptitude
- Bachelor’s Degree in Computer Science, Mathematics, Engineering, Management Information Systems or related field
- 1-3 years of experience building data pipelines within the AWS ecosystem
- 1-3 years of experience designing and implementing data warehouse solutions
- Advanced knowledge of SQL and data design concepts
- Proficient programming experience using Python, R or a similar language, with experience building production-level code
- Proficiency working with Jenkins and deploying to production via Jenkins jobs
- Desire to stay up to date with current technologies and best practices for data management and data science
- Drive to innovate and improve efficiency through new approaches
- Ability to work in a team environment that promotes collaboration
Preferred Experience and Abilities:
- Experience implementing AWS architecture using Serverless Framework
- Understanding of the C programming language
- Experience utilizing big data tools and languages such as PySpark, Scala or others