DATA ENGINEER
JOB DESCRIPTION
This position will be responsible for building scalable, high-performance infrastructure and data-driven, predictive analytics applications that provide actionable insights across all Caterpillar businesses. The position is part of Caterpillar’s fast-moving, engineering-driven digital organization, whose highly motivated engineers tackle challenges and problems critical to realizing significant business outcomes. Data engineers work with data scientists, business analysts, and others on a team that assembles large, complex data sets that provide competitive advantage.
This position is the third level in a family of three Data Engineer positions. As an incumbent moves through these positions, they will be expected to gain more knowledge and to work across various, progressively more difficult platforms.
Job Related Statistics:
Indeterminate
Job Duties:
• Build infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
• Design, develop, and maintain performant and scalable applications
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability
• Perform debugging, troubleshooting, modifications and unit testing of integration solutions
• Operationalize the developed jobs and processes.
• Create databases and infrastructure to process data at scale
• Create solutions and methods to monitor systems and solutions
• Automate code testing and pipelines
• Engage directly with business partners to participate in design and development of data integration/transformation solutions per functional requirements.
• Work in a scaled Agile environment accountable to deliver results in sprints.
• Engage and actively seek industry perspectives through external engagements such as hackathons, peer groups, etc.
• Generate, prepare, and catalog APIs
• Work with UI Designer to build user interfaces per design specifications
• Employee is also responsible for performing other job duties as assigned by Caterpillar management from time to time.
Qualifications
Basic Qualifications:
• A 4-year degree from an accredited college or university
• 3+ years of software development experience with object-oriented or functional scripting languages: Python, Java, JavaScript, C++, Scala, etc.
• Understanding of data structures, algorithms, profiling & optimization.
• Understanding of SQL, ETL design, and data modeling techniques
Top Candidates Will Also Have:
• Experience managing continuous integration systems (Jenkins, etc.).
• Strong background working with revision control systems (Git, etc.).
• Experience with build automation tools (Maven, etc.).
• Advanced level of experience with object-oriented programming, data structures, and algorithms.
• Passion for acquiring, analyzing, and transforming data to generate insights.
• Ability to thrive in a fast-paced environment that delivers results and has fun.
• Strong analytical ability, judgment and problem analysis techniques.
• Working knowledge of Agile Software development methodology.
• Solid understanding of concepts of cloud computing.
• Strong verbal and written communication skills to collaborate cross-functionally.
• Interpersonal skills with the ability to work effectively in a cross-functional team.
• Knowledge of enterprise data sources and uses
• Ability to build strong working relationships with data owners/stewards
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
• A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
• Experience with at least 5 of the software tools listed below:
• Experience with big data tools: Hadoop, Spark, Kafka, etc.
• Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
• Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
• Experience with AWS cloud services: EC2, EMR, RDS, Redshift
• Experience with stream-processing systems: Storm, Spark Streaming, etc.
• Experience with API managers: Apigee, Azure, Catana, etc.