The Options Clearing Corporation (OCC) is the world's largest equity derivatives clearing organization. Founded in 1973, OCC is dedicated to promoting stability and market integrity by delivering clearing and settlement services for options, futures and securities lending transactions. As a Systemically Important Financial Market Utility (SIFMU), OCC operates under the jurisdiction of the U.S. Securities and Exchange Commission (SEC), the U.S. Commodity Futures Trading Commission (CFTC), and the Board of Governors of the Federal Reserve System. OCC has more than 100 clearing members and provides central counterparty (CCP) clearing and settlement services to 19 exchanges and trading platforms. More information about OCC is available at www.theocc.com.
What We Offer
A highly collaborative and supportive environment designed to promote work-life balance and employee wellness. Benefits include:
- A hybrid work environment, with up to 3 days per week of remote work
- Tuition Reimbursement to support your continued education
- Student Loan Repayment Assistance
- A Technology Stipend allowing you to use the device of your choice to connect to our network while working remotely
- Generous PTO and Parental Leave
- Competitive health benefits including medical, dental, and vision
What you will do:
The incumbent will be responsible for designing and delivering scalable, resilient hybrid and cloud-based applications and data solutions that support critical financial market clearing and risk activities; helping to drive the strategy of transforming the enterprise into a data-driven organization; and leading through innovative strategic thinking in building data solutions.
You will be part of the Data team, a diverse group of dedicated engineers who are passionate about data. As a member of the Data team, you will be responsible for crafting and building cloud-based applications and data systems that will serve as the backbone for the enterprise's data management and analytics capabilities. You will join the core team responsible for design, development, and implementation, working closely with internal and external business and technology partners. Together we will define the system architecture, the technology stack, and its tactical implementation.

You will help us take on the unique technical challenges of handling large datasets and managing streaming data in public cloud and hybrid environments: building large and complex data pipelines, integrating data from diverse sources and formats, implementing continuous integration/continuous delivery (CI/CD) pipelines, automating everything we can get our hands on, and putting the API-first design principle into practice. You will have a rare and challenging opportunity to apply your technical skills, knowledge, and experience, acquire new skills, and grow with us.
- Work across the full stack, building highly scalable distributed solutions
What you will bring:
- Understanding of the software development life cycle (SDLC) across Waterfall, Lean, and Agile work environments
- Strong analytical and problem-solving skills
- Desire and ability to learn other programming languages
- Excellent oral and written communication skills
- Work experience in capital markets (preferred)
- 7+ years of technical experience building data-centric solutions within regulated industries
- 5+ years of solutions design and architecture experience
- Hands-on development experience with multiple programming languages such as Python, Java, and Scala
- Hands-on experience designing and implementing RESTful APIs
- Knowledge and understanding of DevOps tools and technologies such as Terraform, Git, Jenkins, Docker, Harness, Nexus/Artifactory, and CI/CD pipelines
- Knowledge of SQL, data warehousing design concepts, and various data management systems (structured and semi-structured), plus experience integrating with various database technologies (relational, NoSQL)
- Experience working with Cloud ecosystems (AWS, Azure, GCP)
- Familiarity with Big Data processing technologies and frameworks such as Presto, Hadoop, MapReduce, and Spark
- Familiarity with stream processing technologies and frameworks such as Kafka, Spark Streaming, Flink
- Familiarity with in-memory aggregation and calculation frameworks and tools used for analytical processing, visualization, and BI & reporting
- Familiarity with monitoring-related tools and frameworks such as Splunk, Elasticsearch, SignalFx, and AppDynamics
- Good understanding of data integration patterns, technologies, and tools
- BS degree in Computer Science or a similar technical field, or equivalent practical experience
- 7+ years of software development experience
- AWS Certified Solutions Architect - Associate certification is a plus