Who We Are
The Options Clearing Corporation (OCC) is the world's largest equity derivatives clearing organization. Founded in 1973, OCC is dedicated to promoting stability and market integrity by delivering clearing and settlement services for options, futures and securities lending transactions. As a Systemically Important Financial Market Utility (SIFMU), OCC operates under the jurisdiction of the U.S. Securities and Exchange Commission (SEC), the U.S. Commodity Futures Trading Commission (CFTC), and the Board of Governors of the Federal Reserve System. OCC has more than 100 clearing members and provides central counterparty (CCP) clearing and settlement services to 19 exchanges and trading platforms. More information about OCC is available at www.theocc.com.
What We Offer
A highly collaborative and supportive environment developed to encourage work-life balance and employee wellness. Some of these components include:
A hybrid work environment, with up to 3 days per week of remote work
Tuition Reimbursement to support your continued education
Student Loan Repayment Assistance
Technology Stipend allowing you to use the device of your choice to connect to our network while working remotely
Generous PTO and Parental leave
Competitive health benefits including medical, dental and vision
What You'll Do
The candidate will be responsible for designing and delivering scalable, resilient hybrid and cloud-based applications and data solutions that support critical financial market clearing and risk activities; helping to drive the strategy of transforming the enterprise into a data-driven organization; and leading through innovative strategic thinking in building data solutions.
Primary Duties and Responsibilities:
To perform this job successfully, an individual must be able to perform each primary duty satisfactorily.
You will be part of the Data team, a diverse group of dedicated engineers who are passionate about data. As a member of the Data team, you will be responsible for designing and building cloud-based applications and data systems that serve as the backbone of the enterprise's data management and analytics capabilities. You will join the core team responsible for design, development, and implementation, and you will work closely with internal and external business and technology partners. Together we will define the system architecture and technology stack and drive their tactical implementation. You will help us take on the unique technical challenges of handling large datasets and streaming data in public cloud and hybrid environments: building large, complex data pipelines; integrating data from diverse sources in different formats; implementing continuous integration/continuous delivery pipelines; automating everything we can get our hands on; and applying the API-first design principle. You will have a rare and challenging opportunity to apply your technical skills, knowledge, and experience, acquire new skills, and grow with us.
Supervisory Responsibilities: N/A
The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the primary functions.
• Backend development experience, building highly scalable distributed data solutions
• Ability to handle moderate frontend development tasks when required, such as debugging data application workflows
• Understanding of the software development life cycle (SDLC) in Waterfall, Lean, and Agile work environments
• Strong analytical and problem-solving skills
• Desire and ability to learn other programming languages
• Excellent oral and written communication skills
• Work experience in the capital markets, preferred
• 7+ years of hands-on software development experience using modern programming languages such as Java, C++, Python, or Scala
• Hands-on experiences designing and implementing RESTful APIs
• Hands-on experience with Kubernetes and container orchestration technologies
• Knowledge and understanding of DevOps tools and technologies such as Terraform, Git, Jenkins, Docker, Harness, Nexus/Artifactory, and CI/CD pipelines
• Knowledge of SQL, data warehousing design concepts, and various data management systems (structured and semi-structured), including integration with relational and NoSQL database technologies
• Experience working with Cloud ecosystems (AWS, Azure, GCP)
• Familiarity with Big Data processing technologies and frameworks such as Presto, Hadoop, MapReduce, and Spark
• Familiarity with stream processing technologies and frameworks such as Kafka, Spark Streaming, Flink
• Familiarity with monitoring tools and frameworks such as Splunk, Elasticsearch, SignalFx, and AppDynamics
• Good understanding of data integration patterns, technologies, and tools
Education and/or Experience:
• BS degree in Computer Science, similar technical field, or equivalent practical experience
• Experience with MuleSoft Anypoint, Apigee Edge, or a similar API management solution is a plus
Certificates or Licenses:
• AWS Certified Solutions Architect (Associate) certification is a plus
• Big data related certification is a plus