Arity-Hadoop Systems Engineer
Manage large-scale, multi-tenant Hadoop cluster environments hosted on AWS.
Handle all Hadoop environment builds, including design, security, capacity planning, cluster setup, performance tuning, and ongoing monitoring.
Perform day-to-day operational maintenance, support, and upgrades for the Hadoop cluster.
Research and recommend innovative and, where possible, automated approaches to system administration tasks.
Create key performance metrics that measure the utilization, performance, and overall health of the cluster.
Deploy new and upgraded hardware and software releases, and establish proper communication channels.
Work with appropriate stakeholders to ensure solid capacity planning and to manage total cost of ownership (TCO).
Collaborate with product managers, lead engineers, and data scientists on all facets of the Hadoop ecosystem.
Ensure existing data and information assets are secure and adhere to a best-in-class security model.