Principal Cloud Data Engineer
Discover. A more rewarding way to work.
At Discover Financial Services, you’ll find yourself in the company of some of the industry’s smartest and most reliable professionals. And at a company that rewards dedication, values innovation and supports growth.
Thrive in an environment that promotes teamwork and shared success. Build on a foundation of mutual respect. Join the company that understands rewarding careers like no other, with this exceptional opportunity:
Job Description:
At Discover, be part of a culture where diversity, teamwork and collaboration reign. Join a company that is just as focused on its employees as it is on its customers, and is consistently awarded for both. We’re all about people, and our employees are why Discover is a great place to work. Be the reason we help millions of consumers build a brighter financial future, and achieve yours along the way with a rewarding career.
We’re a direct banking and payment services company built on a legacy of innovation and customer service. Our employees have always played a big part in our success. We support, challenge and inspire employees to continually develop their skills, advance their career and help grow our business.
Discover Financial Services is seeking a Lead Cloud Data Engineer to join our Data Operations team. Successful candidates will partner with our business partners to understand their data needs and build data pipelines using cutting-edge technologies like Kinesis and Kafka, both in our on-premises environment and in the Cloud. Additionally, you will provide engineering leadership to create and enhance data solutions, enabling seamless integration and flow of data across our data ecosystem. You will also provide lead-level technical consulting to peer data engineers during design and development of highly complex and critical data projects. Some of these projects will include designing and developing data ingestion and processing/transformation frameworks leveraging open-source technologies such as Java, Python, Spark, Scala, PySpark, etc.
Responsibilities
- Be part of the Data Engineering team, translating business and technology requirements into our ETL/ELT architecture.
- Collaborate with cross-functional teams such as AWS cloud & platform engineering, DBAs and business teams.
- Develop real-time and batch data ingestion and stream-analytic solutions leveraging technologies such as Kafka, Apache Spark, Kinesis, Java, NoSQL DBs, AWS EMR.
- Develop data-driven solutions utilizing current and next-generation technologies to meet evolving business needs.
- Quickly identify opportunities and recommend possible technical solutions.
- Develop custom data pipelines (cloud and locally hosted).
- Support deployed data applications and analytical models as a trusted advisor to Data Scientists and other data consumers, identifying data problems and guiding issue resolution with partner Data Engineers and source data providers.
- Provide subject matter expertise in the analysis, preparation of specifications and plans for the development of data processes.
- Ensure proper data governance policies are followed by implementing or validating data lineage, quality checks, classification, etc.
Skills:
Minimum Qualifications
At a minimum, here’s what we need from you:
- BS in Computer Science or related field
- 6+ years of data engineering experience working with structured and unstructured data.
- Experience with Agile methodologies.
- Proficiency in at least one scripting language such as Shell, Python, Scala or Java.
- Good understanding of Big Data technology trends, with knowledge of technologies such as Kinesis, Kafka, Spark, Hive and PySpark.
- Knowledge of Infrastructure as Code, Immutable Infrastructure, & continuous integration/deployment practices
- Experience with one of the ETL tools (Ab Initio, Informatica or DataStage) and with tools/frameworks within the Big Data ecosystem.
- Hands-on experience with AWS-based solutions such as RDS, Lambda, DynamoDB, Snowflake or S3.
- Experience in version control systems such as Git, GitLab, etc.
- Ability to work in a fast-paced, rapidly changing environment.
Leadership Skills
- 4+ years of experience as a lead engineer on a team of peer- or junior-level developers
- 2+ years of experience using Cloud technologies
- Collaborative individual who excels at working within a team and with business partners to identify, develop and deliver innovative data solutions.
- Strong ability to build and leverage external relationships.
- Strong verbal and written skills, with a demonstrated ability to create presentations and persuasive papers and to clearly articulate information and solutions.
- Ability to work independently to complete final research output (white papers, concept decks, architectural diagrams).
- Ability to deliver presentations in a thorough and concise manner to a cross-functional or senior-level audience.
- Passionate learner who enjoys education through classroom training and self-discovery across a variety of emerging technologies.
Preferred Qualifications
If we had our say, we’d also look for:
- Prior experience with Banking or Financial domain is a big plus.
- AWS Cloud Services certification is a big plus.
Discover Financial Services is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran status or status as a qualified individual with a disability.
So, what are you waiting for? Apply today!