Data Engineer (Chicago, IL)
Mastery Logistics Systems
Parse, standardize, and analyze large volumes of unstructured and semi-structured text data.
Design, implement, and maintain data processing pipelines in Python, Scala, or Ruby.
Build the cloud infrastructure needed to process data from a wide variety of sources, using Docker, Kubernetes, Terraform, Azure DevOps, and orchestration tools such as Argo and Airflow.
Collaborate with product managers, designers, and engineers to define requirements, formalize problems, collect data, and implement solutions.
Actively contribute to code reviews and participate in team discussions of architecture and data strategy.