Data Engineer

As a Data Engineer you will be joining a motivated team of professionals working to design and build solutions for the entire breadth of the firm, including Portfolio Managers, Analysts, Quants, Marketing, Compliance and Operations. You will gain exposure to a variety of partners at all levels, from senior technology leaders to senior business partners. In addition, you will gain experience with the latest technologies: we are adopting a wide array of technologies and approaches, from some of the most proven open source projects to tried-and-true enterprise platforms.

Responsibilities:

The team delivers change across a range of business areas: front, middle and back office. This role is an experienced, hands-on contributor within the data engineering domain. The candidate is someone with a passion for data, data pipeline tools and data engineering patterns who understands that being data-driven gives an organization a competitive advantage.

This role will be responsible for re-engineering and modernizing the data stores and data pipelines already in place, as well as implementing net-new solutions that target customer-focused value and drive the next generation of products and services. It will involve both the greenfield build-out of strategic distributed systems and the improvement and migration of legacy systems.

The candidate must be able to build strong relationships with product teams, business stakeholders and end-users to ensure that the data product meets their requirements and expectations.

Our evolving data engineering tech-stack:

AWS S3, AWS Glue, AWS Lambda, AWS Athena, AWS EMR, Python, PySpark, Scala, Confluent Kafka and AWS RDS (Aurora and Postgres).
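
As a hedged illustration of how these pieces fit together (bucket names, paths and column names below are hypothetical), a minimal PySpark batch job might read raw JSON from S3, lightly normalize it and write partitioned Parquet back to S3 for querying with Athena:

# Minimal sketch; bucket names, paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Read raw JSON landed in S3.
raw = spark.read.json("s3://example-raw-bucket/trades/2020/06/")

# Normalize a column name and derive a partition column.
curated = (
    raw.withColumnRenamed("tradeDate", "trade_date")
       .withColumn("trade_year", F.year(F.to_date("trade_date")))
)

# Write columnar Parquet, partitioned for efficient Athena scans.
(curated.write
    .mode("overwrite")
    .partitionBy("trade_year")
    .parquet("s3://example-curated-bucket/trades/"))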

Required Skills

Proven Technology Development Experience with Minimum Years of Experience:
• Python: 4+ years.
• PySpark: 3+ years.
• AWS Boto3, AWS CLI: 3+ years (illustrated in the sketch after this list).
• AWS S3: 4+ years.
• AWS EMR and/or AWS Glue: 2+ years.
• AWS Lambda: 2+ years.
• AWS Athena: 1+ year.
• Procedural SQL (including set-oriented processing, table folding via normalization/de-normalization, exception handling and transaction control): 4+ years.
• Experience with Parquet or ORC file formats: 4+ years.
• Experience with JSON file format: 2+ years.
• Git source code control: 2+ years.
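
A small hedged sketch of the Boto3 usage called out above (bucket and prefix are hypothetical): paginating over the objects under an S3 prefix and totalling their size.

# Hedged sketch; bucket and prefix are hypothetical.
import boto3

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

# Paginate so prefixes with more than 1,000 objects are handled correctly.
total_bytes = 0
for page in paginator.paginate(Bucket="example-raw-bucket", Prefix="trades/"):
    for obj in page.get("Contents", []):
        total_bytes += obj["Size"]

print(f"Objects under prefix total {total_bytes} bytes")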

Proven Data Engineering Patterns and Concepts Experience with Minimum Years of Experience:
• Implementation of canonical, object-oriented/parameterized data pipelines (ETL/ELT) to transform custom complex file formats into common standard storage formats for Big Data processing (see the sketch after this list): 4+ years.
• Experience building streaming and event-driven data pipelines: 2+ years.
• Experience building batch data pipelines: 4+ years.
• Experience implementing change data capture (CDC) methods: 4+ years.
• Implementation of data pipelines to load Dimensional (Star) data models: 4+ years.
• Relational database management system (RDBMS) concepts, including scenario-based performance tuning (transaction control and exception handling, data integrity constraints, indexing, triggers, data types, transaction logging) and an understanding of set-oriented vs. sequential processing: 4+ years.
• Ability to read, understand and performance-tune data models in the third normal form (3NF) or higher: 4+ years.
• Object-oriented patterns including encapsulation/parameterization and loose coupling: 4+ years.
• CAP theorem concepts and understanding of capabilities and limitations of distributed systems: 2+ years.
• Data lineage/flow documentation and code documentation for full requirements traceability: 4+ years.
• Data profiling with ability to effectively document via data model reverse engineering: 4+ years.
• Agile and DevOps: 4+ years.
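
As a hedged sketch of the parameterized, object-oriented pipeline pattern named in the first item above (all class, path and column names are hypothetical), a base class can encapsulate the extract/transform/load sequence while subclasses parameterize the custom source format, keeping the stages loosely coupled:

# Hedged sketch of a parameterized ETL pipeline; names are hypothetical.
from abc import ABC, abstractmethod
from pyspark.sql import DataFrame, SparkSession

class ParquetPipeline(ABC):
    """Template-method ETL: subclasses supply extract and transform."""

    def __init__(self, spark: SparkSession, target_path: str):
        self.spark = spark
        self.target_path = target_path  # injected, not hard-coded

    @abstractmethod
    def extract(self) -> DataFrame:
        ...

    @abstractmethod
    def transform(self, df: DataFrame) -> DataFrame:
        ...

    def load(self, df: DataFrame) -> None:
        # Common standard storage format for downstream Big Data processing.
        df.write.mode("overwrite").parquet(self.target_path)

    def run(self) -> None:
        self.load(self.transform(self.extract()))

class PipeDelimitedFeedPipeline(ParquetPipeline):
    """Converts a custom pipe-delimited feed into standard Parquet."""

    def __init__(self, spark: SparkSession, source_path: str, target_path: str):
        super().__init__(spark, target_path)
        self.source_path = source_path

    def extract(self) -> DataFrame:
        return (self.spark.read
                .option("sep", "|")
                .option("header", True)
                .csv(self.source_path))

    def transform(self, df: DataFrame) -> DataFrame:
        return df.dropDuplicates()

A caller instantiates the subclass with source and target paths and invokes run(); supporting a new custom feed then means adding a subclass rather than copying a pipeline script.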

Nice-to-Have (Not Required) Technology Skills and Patterns
• Scala
• AWS CloudFormation, AWS CloudFront, AWS Step Functions.
• Confluent Kafka.
• Apache NiFi, Apache Airflow, Apache Beam, Presto.
• Cloud-native columnar databases like AWS Redshift and Snowflake.
• Microservices, REST API development for data provisioning and exchange.
• Development with in-memory cache data stores like Redis and Memcached.
• NoSQL databases like MongoDB, Cassandra.
• Data visualization in Tableau, Python/Jupyter, D3.js or Highcharts.js.
• CI/CD, DevOps.
• Open source data lineage and metadata discovery tools like Marquez and Amundsen.
• Bash Shell/PowerShell.
• AWS Associate or Professional Certification.

Behavioural Competencies
• Excellent verbal and written communication skills.
• Experience consistently learning, experimenting with and applying new design patterns and technologies.
• Preference and desire to work with open source applications (e.g., from the Apache Software Foundation).
• Strong team player, collaborative and self-motivated.
• Strong desire to dissect production data issues and improve data quality.
• Ability to handle stress and pressure gracefully while achieving continuous progress.
• Ability to effectively work through ambiguity, road-blocks and contribute under limited supervision.
• Proactive, with a consistent desire to document and present improvement recommendations.
• Ability to commit and manage expectations with full ownership and follow-up.

Other Competencies and Characteristics
• Learning doesn’t stop. You are a perpetual student of all aspects of data engineering and its methodologies; it is a passion of yours.
• You understand the importance and impact of modern data architectures (e.g. event-driven architectures, data democratization, platform approaches to support ML/AI, stream processing and integrating real-time analytics into business applications).
• You’ll draw on all of your passion for technology, hands-on experience and knowledge of the latest Big Data and engineering best practices to gain the respect and credibility of those around you.
• Possess a passion for data modelling and design from conceptualization to optimization.
• Communication is critical to our success. You must be able to demonstrate excellent verbal and written communication skills and the ability to interact professionally with a diverse group of partners, managers and subject matter experts.
• You are constantly looking for a new and exciting challenge, and you can demonstrate the ability to contribute within the change team, tackling challenging requirements and tight timescales.

Minimum Academic Qualifications
• Bachelor’s Degree in one or more of the following domains: Computer Science, Software Engineering, Information Systems or similar.

Location

71 S Wacker, Chicago, IL 60606
