Lower

Senior Data Engineer

Posted 24 Days Ago
Hybrid
2 Locations
Senior level

Here at Lower, we believe homeownership is the key to building wealth, and we’re making it easier and more accessible than ever. As a mission-driven fintech, we simplify the home-buying process through cutting-edge technology and a seamless customer experience.

With tens of billions in funded home loans and top ratings on Trustpilot (4.8), Google (4.9), and Zillow (4.9), we’re a leader in the industry. But what truly sets us apart? Our people. Join us and be part of something bigger.

Job Description:

We are seeking a Senior Data Engineer to play a key role in building and optimizing our data infrastructure to support business insights and decision-making. In this role, you will design and enhance denormalized analytics tables in Snowflake, build scalable ETL pipelines, and ensure data from diverse sources is transformed into accurate, reliable, and accessible formats. You will collaborate with business and sales stakeholders to gather requirements, partner with developers to ensure critical data is captured at the application level, and optimize existing frameworks for performance and integrity. This role also includes creating robust testing frameworks and documentation to ensure quality and consistency across data pipelines.

What you'll do:

  • Data Pipeline Engineering:

      • Design, develop, and optimize high-performance ETL/ELT pipelines using Python, dbt, and Snowflake.

      • Build and manage real-time ingestion pipelines leveraging AWS Lambda and CDC systems.

  • Cloud & Infrastructure:

      • Develop scalable serverless solutions with AWS, adopting event-driven architecture patterns.

      • Manage containerized applications using Docker and infrastructure as code via GitHub Actions.

  • Advanced Data Management:

      • Create sophisticated, multi-layered Snowflake data models optimized for scalability, flexibility, and performance.

      • Integrate and manage APIs for Salesforce, Braze, and various financial systems, emphasizing robust error handling and reliability.

  • Quality Assurance & Operations:

      • Implement robust testing frameworks, data lineage tracking, monitoring, and alerting.

      • Enhance and manage CI/CD pipelines, drive migration to modern orchestration tools (e.g., Dagster, Airflow), and manage multi-environment deployments.

Who you are:

  • 5+ years of data engineering experience, ideally with cloud-native architectures. 

  • Expert-level Python skills, particularly with pandas, SQLAlchemy, and asynchronous processing. 

  • Advanced SQL and Snowflake expertise, including stored procedures, external stages, performance tuning, and complex query optimization. 

  • Strong proficiency with dbt, including macro development, testing, and automated deployments. 

  • Production-grade pipeline experience, specifically with Lambda, S3, API Gateway, and IAM. 

  • Proven experience with REST APIs, authentication patterns, and handling complex data integrations. 

Preferred Experience 

  • Background in financial services or fintech, particularly loan processing, customer onboarding, or compliance. 

  • Experience with real-time streaming platforms like Kafka or Kinesis. 

  • Familiarity with Infrastructure as Code tools (Terraform, CloudFormation). 

  • Knowledge of BI and data visualization tools (Tableau, Looker, Domo). 

  • Container orchestration experience (ECS, Kubernetes). 

  • Understanding of data lake architectures and Delta Lake. 

Technical Skills 

  • Programming: Python (expert), SQL (expert), Bash scripting. 

  • Cloud: AWS (Lambda, S3, API Gateway, CloudWatch, IAM). 

  • Data Warehouse: Snowflake, dimensional modeling, query optimization. 

  • ETL/ELT: dbt, pandas, custom Python workflows. 

  • DevOps: GitHub Actions, Docker, automated testing. 

  • APIs: REST integration, authentication, error handling. 

  • Data Formats: JSON, CSV, Parquet, Avro. 

  • Version Control: Git, GitHub workflows. 

What Sets You Apart 

  • Systems Thinking: You see the big picture, designing data flows that scale and adapt with the business. 

  • Problem Solver: You quickly diagnose and resolve complex data issues across diverse systems and APIs. 

  • Quality Advocate: You write comprehensive tests, enforce data quality standards, and proactively prevent data issues. 

  • Collaborative: You thrive working alongside analysts, developers, and product teams, ensuring seamless integration and teamwork. 

  • Continuous Learner: You actively seek emerging data technologies and best practices to drive innovation. 

  • Business Impact: You understand how your data engineering decisions directly influence and drive business outcomes. 

Benefits & Perks 

  • Competitive salary and comprehensive benefits (healthcare, dental, vision, 401k match) 

  • Hybrid work environment (primarily remote, with two days a week in downtown Columbus, Ohio) 

  • Professional growth opportunities and internal promotion pathways 

  • Collaborative, mission-driven culture recognized as a local and national "best place to work" 

If you don't think you meet all of the criteria above but are still interested in the job, please apply. Nobody checks every box, and we're looking for someone excited to join the team. 

Lower provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.

Top Skills

AWS
dbt
Docker
Domo
ETL
GitHub Actions
Kafka
Kinesis
Looker
Python
Snowflake
SQL
Tableau
