Arity - Sr. Data Flow Engineer
Founded by The Allstate Corporation in 2016, Arity is a data and analytics company focused on improving transportation. We collect and analyze enormous amounts of data, using predictive analytics to build solutions with a single goal in mind: to make transportation smarter, safer and more useful for everyone.
At the heart of that mission are the people that work here—the dreamers, doers and difference-makers that call this place home. As part of that team, your work will showcase both your intelligence and your creativity as you tackle real problems and put your talents towards transforming transportation.
That’s because at Arity, we believe work and life shouldn’t be at odds with one another. After all, we know that your unique qualities give you a unique perspective. We don’t just want you to see yourself here. We want you to be yourself here.
The Role
As a Sr. Data Flow Engineer, you will be responsible for assisting with the design, architecture, implementation, performance tuning, monitoring, and ongoing support of Arity’s Data Flow Platform and real-time data streaming applications running on AWS.
Key Responsibilities
- Collaborate with Product Managers, Testers, Designers and other Engineers to develop full stack solutions
- Handle all Data Flow and Data Streaming builds, including design, architecture, implementation, performance tuning and ongoing monitoring and support
- Work in an Agile/Scrum environment to design, estimate, develop, and unit test user stories
- Participate in team growth through regular internal technical presentations
- Develop data infrastructure to ingest, sanitize and normalize a diverse set of entities pertinent to the insurance market
- Build high performance, secure and expressive interfaces to the data
- Monitor the data quality and overall data flows and data streaming applications
- Build highly reliable, secure, and optimized data ingestion and data streaming pipelines from disparate sources
- Demonstrate an understanding of databases and large-scale data processing and data streaming frameworks
Technical Experience and Skills
- Strong knowledge of NiFi and Flink Architecture, Administration, Features and Characteristics
- Experience in Python, Shell Scripting or other scripting language
- Knowledge of AWS (CloudFormation templates, EC2, CloudWatch, S3)
- Knowledge and experience working with NiFi User Interface, Building data flows in NiFi and optimizing the data flows
- Good knowledge of NiFi Expression Language, Spring Context Processors in NiFi and exposure to NiFi APIs
- Understanding of NiFi cluster concepts, nodes, and NiFi security
- Strong knowledge of building data pipelines and setting up NiFi infrastructure and NiFi clusters using CloudFormation templates
- Experience troubleshooting data flow and NiFi infrastructure issues and effectively monitoring NiFi data flows
- Ability to build end-to-end data flow pipelines (data ingestion, data routing, data processing, data transformation, and data persistence) using NiFi
- Deep expertise in real-time stream and batch processing with Flink, including an understanding of Job Managers, Task Managers, watermarks, and savepoints
- Experience working with Flink’s DataStream API writing applications that implement transformations on data streams (e.g., filtering, updating state, defining windows, aggregating)
- Experience working with Flink’s DataSet API writing applications that implement transformations on data sets (e.g., filtering, mapping, joining, grouping)
- Knowledge of effectively monitoring, debugging and troubleshooting Flink applications and jobs
- Strong knowledge of Hadoop technologies, Apache Kafka, and Cassandra
- Understand data transformation specifications and validate data against the specs
- Understand standard data formats (JSON, Avro, XML, CSV, UTF-8, Base64-encoded) and standard HTTP response/error codes
- Understand Apigee API Framework for data ingestion and data processing
- Experience generating and setting up test data to test the data processing systems
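The keyed windowing and aggregation concepts called out above (filtering, defining windows, aggregating) can be illustrated with a minimal, self-contained Python sketch. The event data and function name here are hypothetical, and a real implementation would use Flink’s DataStream API with keyed tumbling windows rather than this in-memory simplification:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size_ms):
    """Group (timestamp_ms, key) events into fixed-size tumbling windows
    and count events per (window_start, key) -- a simplified stand-in for
    a keyed tumbling-window aggregation in a stream processor."""
    counts = defaultdict(int)
    for ts, key in events:
        # Each event falls into exactly one window: [window_start, window_start + size)
        window_start = (ts // window_size_ms) * window_size_ms
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical trip events keyed by driver id, counted in 10-second windows
events = [(1000, "driver-a"), (4000, "driver-a"),
          (9000, "driver-b"), (12000, "driver-a")]
result = tumbling_window_counts(events, 10_000)
# The first three events land in the [0, 10000) window; the last in [10000, 20000)
```

In a Flink job, the equivalent shape would typically be expressed with `keyBy(...)` followed by a tumbling window assigner and an aggregate function, which also handles event time and watermarks that this sketch omits.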
Qualifications
- Bachelor’s Degree in Computer Science or a related discipline, or equivalent experience/combined education, with relevant professional experience and/or specialized training commensurate with the assignment
- 4-6 years of experience in application development, ideally in Java, and two years with Apache NiFi and Apache Flink
- Passion for software development and the ability to write source code, reuse existing components, and evaluate and integrate open-source software into solutions
- Proven ability to lead multiple high-priority initiatives with aggressive timelines, leveraging an Agile/Scrum framework
- Comfortable performing in a fast-paced, dynamic, and ambiguous business environment
- Ability to concentrate on a wide range of loosely defined complex situations, which require creativity and originality, where guidance and counsel may be unavailable
- Excellent listening and communication skills
- Strong problem-solving skills, with the ability to explain why one technical solution is preferable to another
- Ability to work collaboratively in a team environment, including with remote teams
- Proven track record of sound, effective decision making
Heads up: when you click “apply now” you’ll be directed to the Allstate careers site. You’re still looking at an Arity job, but because we were founded by Allstate, we share the same application system.