Associate Principal, Data Analytics Engineering
What You'll Do
The candidate will be responsible for designing and delivering scalable, resilient hybrid and cloud-based applications and data solutions that support critical financial market clearing and risk activities; helping to drive the strategy of transforming the enterprise into a data-driven organization; and leading through innovative strategic thinking in building data solutions.
Primary Duties and Responsibilities:
To perform this job successfully, an individual must be able to perform each primary duty satisfactorily.
You will be part of the Data team, a diverse group of dedicated engineers who are passionate about data. As a Data team member, you will be responsible for crafting and building cloud-based applications and data systems that will serve as the backbone for the enterprise's data management and analytics capabilities. You will join the core team responsible for the design, development, and implementation of these systems, working closely with internal and external business and technology partners. Together we will define the system architecture, the technology stack, and its tactical implementation. You will help us take on the unique technical challenges of handling large datasets and managing streaming data in public cloud and hybrid environments: building large, complex data pipelines; integrating data from diverse sources in different formats; implementing continuous integration/continuous delivery (CI/CD) pipelines; automating everything we can get our hands on; and putting the API-first design principle into practice. You will have a rare and challenging opportunity to apply your technical skills, knowledge, and experience, acquire new skills, and grow with us.
Supervisory Responsibilities:
None
Qualifications:
The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the primary functions.
Work across the full stack, building highly scalable distributed solutions
Understanding of the software development life cycle (SDLC) in Waterfall, Lean, and Agile work environments
Strong analytical and problem-solving skills
Desire and ability to learn other programming languages
Excellent oral and written communication skills
Work experience in capital markets preferred
Technical Skills:
Experience working with Cloud ecosystems (AWS, Azure, GCP)
5+ years of hands-on experience with Big Data and distributed data processing frameworks such as Hadoop, Kafka, Hive, and Presto
3+ years of hands-on experience with stream processing engines such as Apache Storm, Spark, Flink, or Beam
Experience with multiple programming languages such as Python, Java, and Scala
Strong knowledge of SQL
Experience with data storage formats such as Apache Parquet, Avro, or ORC
Knowledge and understanding of DevOps tools and technologies such as Terraform, Git, and Jenkins
Familiarity with Kubernetes and container orchestration technologies such as Rancher, EKS, or GKE
Experience with table formats such as Iceberg, Delta Lake, or Hudi is a plus
Good understanding of data integration patterns, technologies, and tools
Education and/or Experience:
BS degree in Computer Science, similar technical field, or equivalent practical experience
5+ years of software development experience
Certificates or Licenses:
AWS Certified Solutions Architect Associate is a plus