OneMagnify

Senior Data Engineer

Posted Yesterday
Remote
Hiring Remotely in USA
Senior level
Design, build, and operationalize large-scale data platforms and pipelines on GCP. Mentor engineers, ensure data quality and system performance, and deploy production data solutions using BigQuery, Dataflow, Airflow, Terraform, and related tools.

OneMagnify is a global performance marketing organization working at the intersection of brand marketing, technology, and analytics. The company’s core offerings accelerate business, amplify real-time results, and help set its clients apart from their competitors. OneMagnify partners with clients to design, implement, and manage marketing and brand strategies using analytical and predictive data models that provide valuable customer insights to drive higher levels of sales conversion.

OneMagnify’s commitment to employee growth and development extends far beyond typical approaches. We take great pride in fostering an environment where each of our 700+ colleagues can thrive and achieve their personal best. OneMagnify has been recognized as a Top Workplace, Best Workplace, and Cool Workplace in the United States for 10 consecutive years, and was recently recognized as a Top Workplace in India.

About You

The Senior Data Engineer will be a key collaborator, working cross-functionally with Product Managers, Product Owners, Software Engineers, Data Engineers, Data Architects, business stakeholders, and analytics users. The ideal candidate has extensive experience designing and operationalizing both small- and large-scale data solutions, encompassing data lakes, data warehouses, data marts, and analytics platforms on GCP. We are looking for professionals with broad technical acumen and a proven ability to strategically combine GCP and third-party technologies to craft the right solutions for cloud deployment.

What you’ll do:

  • Work as part of a GCP implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment of data platform automation.
  • Provide technical guidance, mentorship, and code-level support to the development team.
  • Drive effective and efficient delivery, focusing on speed, identifying risks, implementing mitigation/contingency plans, and testing/comparing competing solutions.
  • Design and build production data engineering solutions that deliver our pipeline patterns using GCP services.
  • Ensure data quality, maintain profiles of data products, and monitor system performance.

What you’ll need:

  • Experience with data engineering pipelines, data warehouse systems, ETL principles, and complex SQL queries
  • GCP experience working on Big Data deployments leveraging BigQuery, Google Cloud Storage, Dataflow, Dataproc, and Cloud Run
  • Proficiency with Python, VS Code, GitHub, Tekton, and Terraform to deploy data solutions and products via DAGs with Astronomer and Apache Airflow
  • GCP certification preferred (Professional Data Engineer or Associate Cloud Engineer)
  • Experience with Qlik Sense, Looker, and Power BI preferred but not required
  • Understanding of data architecture and design, independent of the underlying technology
  • Experience working with Agile and Lean methodologies
  • Experience with Test-Driven Development
  • Exposure to AI/LLM technologies
  • Strong problem-solving and communication skills, and the ability to manage multiple stakeholders
  • Ability to provide analytical and creative solutions to business problems through deep dives into data

Benefits

We offer a comprehensive benefits package including medical, dental, 401(k), paid holidays, vacations, and more.

About us

Whether it’s awareness, advocacy, engagement, or efficacy, we move brands forward with work that connects with audiences and delivers results. Through meaningful analytics, engaging communications, and innovative technology solutions, we help clients tackle their most ambitious projects and overcome their biggest challenges.

We are an equal opportunity employer

We believe that innovative ideas and solutions start with unique perspectives. That’s why we’re committed to providing every employee a workplace that’s free of discrimination and intolerance. We’re proud to be an equal opportunity employer and actively search for like-minded people to join our team.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform job functions, and to receive benefits and privileges of employment. Please contact us to request accommodation.

Top Skills

AI/LLM
Apache Airflow
Astronomer
BigQuery
Cloud Run
Dataflow
Dataproc
ETL
GCP
Git
Google Cloud Storage
Looker
Power BI
Python
Qlik Sense
SQL
Tekton
Terraform
VS Code


