Analytics Engineer (Remote) at Fetch Rewards
Who We Are:
We reward shoppers for digitizing their shopping experience.
Our mission is to delight the world’s shoppers with a free smartphone app that is easy, smart and fun.
Why Join the Fetch Family?
We make it better for users even when that's difficult for us
We empower people with information and trust
We challenge ideas, not people
We think bigger and keep building
We find ways to bring the fun to Fetch!
We're committed to building an empowered and inclusive community of innovative and passionate people. As a growing organization, we need team players who can go above and beyond their individual responsibilities to help our company build towards its vision. If you are a creative, hard-working, and fun-seeking person interested in working with a close-knit group of highly talented people, this is the right place for you.
Fetch Rewards is an equal employment opportunity employer.
In this role, you can expect to:
- Model and analyze data using SQL best practices for OLAP/OLTP query and database performance
- Leverage Data Build Tool (DBT), Snowflake, Airflow, AWS infrastructure, CI/CD, testing, and engineering best practices to accomplish your work
- Generate innovative approaches to datasets produced by millions of daily active users and spanning terabytes of data
- Translate business requirements for near-real-time actionable insights into data models and artifacts
- Communicate findings clearly both verbally and in writing to a broad range of stakeholders
- Perform administrative duties for Snowflake, Tableau, and DBT/Airflow infrastructure
- Test, monitor, and report on data health and data quality
- Lead the charge on data documentation and data discovery initiatives
You are a good fit if you:
- Are proficient in SQL and understand the difference between SQL that works and SQL that performs
- Have worked with data modeling and orchestration tools
- Have experience with relational (SQL), non-relational (NoSQL), and/or object data stores (e.g., Snowflake, MongoDB, S3, HDFS, Postgres, Redis, DynamoDB)
- Have a solid understanding of ETL vs. ELT processes, data warehouses, and business intelligence tools
- Have prior experience clearly communicating about data with internal and external customers
- Are highly motivated to work autonomously, with the ability to manage multiple work streams
- Are interested in building and experimenting with different tools and tech, and in sharing your learnings with the broader organization
- Love dogs! ...Or at least tolerate them. We're a very canine-friendly workplace!
You have an edge if you:
- Have developed and maintained DBT or Airflow in production environments
- Have experience programmatically deploying cloud resources on AWS, Azure, or GCP
- Have successfully implemented data quality, data governance, or disaster recovery initiatives
- Are proficient in at least one imperative programming language (e.g., Python)