As a Software Engineer, you will work as part of the Data Technology Team and will be responsible for designing and developing solutions that cater to a variety of business requirements and data content. This position provides the opportunity to work on a variety of projects across disciplines, developing products that help investors make the right decisions. You will be responsible for ensuring process and quality adherence through peer reviews, test coverage, and best practices. Experience with Java and multi-threaded applications is strongly preferred, in addition to familiarity with cloud technologies such as AWS to build, test, deploy, and host solutions. This position will require you to act as the software guardian, leading non-functional requirements and code quality while addressing technical debt. You’ll interact daily with product managers and business stakeholders to understand our domain and create technical solutions that push us forward. You will write a suite of automated unit tests for individual units of the application, and you will guide and mentor junior members of the team.
This position provides the opportunity to work on exciting analytical and data-quality projects using recent and advanced technologies, including Data Lake, EMR, and Lake Formation, and to build data quality and reconciliation tools that ensure data completeness and availability between source and destination systems.
- Bachelor’s degree in MIS, Computer Science, or a related field.
- Hands-on ETL experience with Apache Spark or AWS Glue.
- Experience in data warehousing.
- 3 years of experience building applications in Java.
- 2 years of experience with development and continuous integration tools such as Bitbucket, GitHub, Jenkins, IntelliJ, and Jira, as well as unit testing.
- 3 years of experience working in an Agile environment.
- 1 year of experience with message-driven architectures, such as Apache Kafka.
- 1 year of experience working with Amazon Web Services, including S3, CloudFront, ELB, SQS, EC2, and AWS Lambda.
- 3 years of experience with databases, including Microsoft SQL Server and Postgres.
- Hands-on experience with an AWS database service, preferably Athena or Redshift.
- Develop efficient, high-throughput data pipelines to push transactional data to the cloud.
- Experience with Python or another scripting language.
- Manage database code consistency across Dev, QA, STG, and PROD environments.
- Strong knowledge of database design principles.
- Troubleshoot production issues related to data and Java/SQL code.
- Design and develop for quick deployment with assistance from the DevOps team.
- Knowledge of, or experience with, adding quality checks to data pipelines.
- 3+ years of experience in design, development, and maintenance of cloud-based data systems.
- 3+ years of experience with Microsoft SQL Server.
- Good communication skills.
- AWS Certification or desire to become AWS certified.
- Ability to work independently, communicate effectively, and produce superior results.
- Experience working with Scrum-based methodologies.
- Ability to write clean code and provide insights in code reviews.