- Design, build, and maintain processes and components of a streaming data/ETL pipeline to support real-time analytics (from requirements to data transformation, data modeling, metric definition, reporting, etc.)
- Ensure the security, confidentiality, and integrity of client information for yourself and your team
- Develop code using big data technologies such as Hadoop and Storm/Spark, writing MapReduce and in-memory applications in Java, SQL, Pig, Hive, Impala, etc.
- Collaborate with data scientists and solution developers to design and develop processes that further the company's common data platform initiatives.
- Develop compelling PoCs for data solutions using emerging technologies for real-time and big data ingestion and processing.
- Help define standards and best practices for enterprise usage.
Desired Skills & Experience:
- 1 to 3 years of applicable experience in Data Analytics, Data Science, or Data Engineering
- Experience programming with R is required
- Strong programming experience with Python, Scala, or Java is preferred.
- Experience with emerging big data processing technologies (Spark, Storm, Kafka, Flume, Pig, Hive, Sqoop, Hadoop/MapReduce, etc.) is preferred.
- Experience with data visualization tools such as Tableau and D3.js is preferred.
- Strong experience in Linux shell scripting
- Professional services consulting experience preferred; payor experience a big plus
- Awareness of business issues as they impact overall project plans
- Excellent communication and interpersonal skills - Ability to clearly and concisely communicate information and ideas orally and in writing
- Eagerness to mentor/train other developers
- Complex Problem Solving - Ability to identify and solve problems by reviewing related information, evaluating options, and implementing solutions
- Critical Thinking - Ability to use logic and reason to identify strengths and weaknesses of alternative solutions, conclusions, or approaches to problems
- Deductive Reasoning – Ability to apply general rules to specific problems to produce answers that make sense
- Inductive Reasoning – Ability to combine pieces of information to form general rules or conclusions (includes finding a relationship among seemingly unrelated information or events)
- Quantitative and Analytical Skills - Ability to apply quantitative and statistical analysis techniques to unstructured problems
- Quality Assurance – Ability to perform systematic self-review to ensure work product is accurate and satisfies its intended purpose
- BA or BS is strongly preferred
- Degree in Computer Science, MIS, Engineering, or another analytical discipline is preferred
Pareto Intelligence is a leading healthcare solutions company modernizing the way health plans and providers succeed in value-based care. These solutions are supported by proprietary algorithms, predictive models, and advanced data science that have analyzed over $100 billion in claims/medical costs and touched over 14 million lives. Through this, Pareto demystifies complex healthcare data and delivers actionable strategies to achieve complete and accurate revenue, communicate critical patient information seamlessly, and inspire more informed business decisions. Pareto was launched in 2013 by HealthScape Advisors, a privately held management consulting firm based in Chicago. Pareto's pedigree, then, is one of deep expertise and a pragmatic approach to executing solutions.
Pareto Intelligence is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability status, protected veteran status, or any protected status as defined by law.