
Senior Data Engineer | Chicago

Summary:

We are fast-paced, forward-thinking, and driven by data.
We are accelerating the used car industry.
We are looking for creative, talented, hard-working individuals to join us.
Buckle up. It's going to be a great ride.

Based in Chicago, DRIVIN is accelerating the used car industry by bringing data and technology together in a spectacular, first-of-its-kind fashion to help dealers acquire the right cars, at the right price, right now. We are committed to delivering data-driven solutions with a high-touch, personalized level of service to each of our clients. We are looking for people who will help us create a culture that is exciting, empowering, motivating and challenging. Are you interested in learning more?


Job Description:

DRIVIN is looking to expand our data team as we continue to grow our data platform. The candidate should have a strong background in Python and SQL. As a member of the data team, the main responsibilities are implementing and maintaining ETL jobs, using Python to ingest external data sources into the Data Warehouse, and working closely with the Product and Data Science teams to deliver data in usable formats to the appropriate data stores.
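To give a flavor of the work, here is a minimal sketch of the kind of ETL job described above: pull an external feed, normalize it, and stage rows for the warehouse. The feed shape, field names, and `staging.listings` table are hypothetical illustrations, not DRIVIN's actual schema; a production job would read from HTTP or S3 and load through a Postgres driver rather than emitting SQL strings.

```python
# Hypothetical ETL sketch: ingest an external feed of vehicle listings,
# coerce types, and stage rows for the Data Warehouse.
import json
from datetime import date

# Stand-in for an external feed (in production: an HTTP pull or S3 read).
RAW_FEED = json.dumps([
    {"vin": "1HGCM82633A004352", "price": "8995", "listed": "2016-03-01"},
    {"vin": "2FMDK3GC4ABA00001", "price": "12500", "listed": "2016-03-02"},
])

def extract(feed: str) -> list:
    """Parse the raw JSON feed into records."""
    return json.loads(feed)

def transform(records: list) -> list:
    """Coerce types and order columns to match the warehouse staging table."""
    return [
        (r["vin"], int(r["price"]), date.fromisoformat(r["listed"]))
        for r in records
    ]

def load(rows: list) -> list:
    """Render INSERT statements for a hypothetical staging table; a real
    job would execute parameterized queries via a Postgres driver instead."""
    return [
        f"INSERT INTO staging.listings (vin, price, listed) "
        f"VALUES ('{vin}', {price}, '{listed}');"
        for vin, price, listed in rows
    ]

rows = transform(extract(RAW_FEED))
statements = load(rows)
```

The daily jobs mentioned in the responsibilities below would wrap steps like these in scheduling and monitoring.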


DRIVIN has a polyglot data model built on several cutting-edge data platforms. We currently use MPP Postgres (Greenplum, Netezza, DBX) as our Data Warehouse, Elasticsearch for location-based search, and Postgres for transactional data.


The candidate should be a self-starter who is interested in learning new systems and environments and building new solutions.

The candidate will also work closely with the Data Science team to identify interesting data points for use in modeling.


DRIVIN's tech stack is cutting edge. MPP Postgres drives the Data Warehouse, Elasticsearch enables our location-based search and metrics, and Apache Spark is used to train our models. All environments run on AWS EC2/RDS/S3, and the data processing framework is written in Python.


RESPONSIBILITIES:

  • Implement ETL jobs for various functions
  • Support and maintain daily ETL jobs
  • Support the development teams by optimizing data access
  • Work with data science teams to deliver metrics to consumers



Location

600 W. Chicago, Chicago, IL 60654

KAR Perks + Benefits

Health Insurance & Wellness Benefits
Flexible Spending Account (FSA)
Dental Benefits
Vision Benefits
Health Insurance Benefits
Wellness Programs
Retirement & Stock Options Benefits
401(K)
401(K) Matching
Employee Stock Purchase Plan
Vacation & Time Off Benefits
Generous PTO
Paid Volunteer Time
Paid Holidays
Perks & Discounts
Casual Dress
Company Outings
Stocked Kitchen
Happy Hours
Professional Development Benefits
Tuition Reimbursement