Send resume to [email protected]
Apply now

We’re looking for a Data Engineer to be the primary developer for our data platform, designed to support analytical tasks for our innovative monitoring application, an initiative led by our Chief Analytics Officer. You will be a key voice as the company builds out this exciting application.

The ideal candidate has diverse database experience (including database/schema design and implementation), an advanced Python programming skillset, and an understanding of how to coordinate data preparation tasks efficiently and effectively.

This person's main focus will be designing data transformation processes, developing and modifying data models, and automating common analysis workflows with Apache Airflow. This includes developing ETL/ELT data pipelines in SQL and Python, developing automated processes to consume new data and augment or increment existing data, identifying opportunities for efficiency, keeping accurate and complete project documentation, and performing quality testing and data assurance.
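As a rough illustration of the extract-transform-load work described above, here is a minimal sketch using only Python's standard library; the table, column names, and sample data are hypothetical, and production pipelines of this kind would typically be orchestrated with Airflow against Postgres rather than run against in-memory SQLite:

```python
import sqlite3

# Hypothetical raw feed: (sensor, day, reading) tuples as strings,
# as they might arrive from a customer data drop.
raw_rows = [
    ("sensor-1", "2024-01-01", "72.5"),
    ("sensor-2", "2024-01-01", ""),      # missing reading, dropped in transform
    ("sensor-1", "2024-01-02", "73.1"),
]

# Transform: drop rows with missing readings and cast values to float.
clean_rows = [
    (sensor, day, float(value))
    for sensor, day, value in raw_rows
    if value
]

# Load: write the cleaned rows into an analytics table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, day TEXT, value REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", clean_rows)

row_count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
```

Each stage (extract, transform, load) would map naturally onto a separate Airflow task, which is what makes pipelines like this automatable and testable in isolation.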

While your interactions will primarily be internal (engineering, quality, project/product managers), you will also interact with customers to advance data acquisition and pipeline needs or to troubleshoot issues.

Key Responsibilities:

  • Quickly bridge the gap between source data and the models needed to analytically process the data.

  • Ensure all efforts account for a multi-client approach and the potential for automation.

  • Master the ingestion-to-analysis process and realize it in the data platform.

  • Ensure the integrity and security of data.

  • Enhance, standardize, and join different data sets to achieve data science and analytics objectives.

  • Research and build efficient and scalable data storage and retrieval systems that enable interactive reporting on high dimensional data.

  • Educate peers and management on technical tools, processes and best practices.

  • Effectively translate technology terms and functionality into a business vocabulary understandable by non-technical staff.

  • Maintain accurate, complete, and current documentation.

Qualifications:

  • 2–4+ years of experience in ETL development, specifically designing automated ETL/ELT data transformation processes.

  • Experience and proficiency in SQL (on a variety of database platforms).

  • Expertise designing and implementing database schemas, ideally with Postgres.

  • Expertise developing with Python, specifically working with packages like Pandas, SQLAlchemy, boto3, and NumPy.

  • Experience with RESTful APIs.

  • Experience supporting self-service reports using tools such as Tableau, Looker, Periscope, Power BI, Spotfire, etc.

  • Experience with the following are desired: Apache Airflow, Tableau, and AWS services like RDS, S3, ECR/ECS, and SageMaker.

  • Proficiency acquiring, organizing, cleansing and manipulating large amounts of data.

  • Demonstrated partnering and communication skills; able to shift readily between technical and non-technical terminology depending on the audience.

  • Experience and proficiency in requirements elicitation and documentation for data processing.

  • Experience and proficiency in developing automated data validation test scenarios and scripts.

  • Familiarity with financial systems integrations (SAP, Concur, Oracle, etc.) a plus.

  • Experience working with financial transactions, PII, or in a regulated industry a plus.

  • An entrepreneurial spirit and experience with young companies are ideal.

  • Influential communicator, fluent in remote: you’ve worked closely with distributed teams that emphasize online communication (Slack, GitHub, Zoom, Jira, Google Docs).

  • Excitement for the code you write, and willingness to work hard at making it maintainable for your future self as well as your colleagues on the data team.

  • Solid understanding and love of test-driven development (TDD), building unit and integration tests.

  • Enjoyment of working in an agile environment and a strong understanding of agile practices.

  • Ability to function effectively in a fast-paced environment and manage multiple projects simultaneously.


Location

1 North Dearborn, Chicago, IL 60602