You will be responsible for designing and delivering scalable, resilient hybrid and cloud-based data solutions that support critical financial market clearing and risk activities; helping to drive the strategy of transforming the enterprise into a data-driven organization; and leading through creative thinking in building data solutions.
Primary Duties and Responsibilities:
To perform this job successfully, an individual must be able to perform each primary duty satisfactorily.
You will be part of the Data team, a diverse group of dedicated engineers who are passionate about data. As a member of the Data team, you will be responsible for designing and building large cloud-based data systems that will serve as the backbone of the enterprise's data management and analytics capabilities. You will join the core team responsible for design, development, and implementation, working closely with business and technical partners, both internal and external. Together we will define the system architecture, the technology stack, and its tactical implementation. You will help us take on the unique technical challenges of handling large datasets and streaming data in public cloud and hybrid environments: building large, complex data pipelines; integrating data from different sources in different formats; implementing continuous integration/continuous delivery (CI/CD) pipelines; automating everything we can get our hands on; and applying the API-first design principle. You will have a rare and challenging opportunity to apply your technical skills, knowledge, and experience, acquire new skills, and grow with us.

Supervisory Responsibilities: None
Qualifications:
- BS degree in Computer Science or a similar technical field, or equivalent experience.
- Excellent oral and written communication skills.
- 2+ years of software development experience using Python, Java, Scala, or another programming language.
- 2+ years of software development experience with SQL.
- Experience or familiarity with a cloud ecosystem (AWS, Azure, or GCP).
- Familiarity with Big Data technologies such as Hadoop, MapReduce, and Spark.
- Familiarity with DevOps tools and technologies such as Git, Jenkins, Docker, Nexus/Artifactory, and CI/CD pipelines.
- Strong analytical and problem-solving skills.
- Desire and ability to learn other programming languages and software development tools.
- Understanding of designing and implementing RESTful APIs.
- Knowledge of BI tools such as Tableau, Microsoft Power BI, or Qlik.
- Knowledge of data warehousing design concepts, various data management systems (structured and semi-structured), and integration with various database technologies (relational, NoSQL).
- Knowledge of financial markets, including the trade and settlement lifecycle, is a plus.
- AWS certification is a plus.