Educational Qualification: B.S. or higher in Computer Science or Engineering
Experience: minimum 5 years
The person will be responsible for providing Hadoop platform support and performing administration of production Hadoop clusters.
Install and configure Hortonworks clusters
Apply proper architecture guidelines to ensure highly available services
Plan and execute major platform software and operating system upgrades and maintenance across physical environments
Develop and automate processes for maintenance of the environment
Implement security measures for all aspects of the cluster (SSL, disk encryption, Kerberos, role-based access via Apache Ranger policies)
Ensure proper resource utilization between the different development teams and processes
Design and implement a toolset that simplifies provisioning and support of a large cluster environment
Review performance stats and query execution/explain plans; recommend changes for tuning Apache Spark, MapReduce and Hive queries
Create and maintain detailed, up-to-date technical documentation
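The maintenance-automation duties above can be sketched as a small health-check script. This is a minimal, hypothetical example: in production the report would come from `hdfs dfsadmin -report`, but here a sample of that report's format is embedded so the parsing logic is self-contained.

```shell
#!/bin/sh
# Sketch of an automated HDFS health check. In a real cluster, replace the
# heredoc with: report=$(hdfs dfsadmin -report)
report=$(cat <<'EOF'
Live datanodes (3):
Dead datanodes (1):
EOF
)

# Extract live/dead datanode counts from the report header lines.
live=$(printf '%s\n' "$report" | sed -n 's/^Live datanodes (\([0-9]*\)).*/\1/p')
dead=$(printf '%s\n' "$report" | sed -n 's/^Dead datanodes (\([0-9]*\)).*/\1/p')

echo "live=$live dead=$dead"
if [ "${dead:-0}" -gt 0 ]; then
  # In practice this would page an on-call admin or raise an Ambari alert.
  echo "ALERT: $dead dead datanode(s)"
fi
```

A script like this would typically run from cron and feed into the cluster's alerting pipeline.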
5+ years of relevant experience in Hadoop and system administration, with sound knowledge of Unix-based systems.
In-depth knowledge of Apache Hadoop, MapReduce, Hive, Spark, Kerberos, Ranger.
Ability to write shell scripts on Linux.
Day-to-day operational support of Hortonworks Hadoop clusters in dev/test and production, at petabyte scale.
Apply Hadoop patches and upgrades; troubleshoot Hadoop job failures.
Analyse log files to find root causes and take or recommend corrective action.
Hadoop Certification is preferred.
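The log-analysis and job-failure troubleshooting skills above can be illustrated with a short triage sketch. This is a hypothetical example: in practice the log would be fetched with `yarn logs -applicationId <app_id>`, but a sample log fragment is embedded here so the filtering logic is self-contained.

```shell
#!/bin/sh
# Sketch of first-pass root-cause triage on an aggregated YARN application log.
# In a real cluster: log=$(yarn logs -applicationId <app_id>)
log=$(cat <<'EOF'
2024-01-15 10:02:11 INFO  mapreduce.Job: map 100% reduce 0%
2024-01-15 10:02:42 ERROR yarn.YarnUncaughtExceptionHandler: java.lang.OutOfMemoryError: Java heap space
2024-01-15 10:02:43 ERROR yarn.YarnUncaughtExceptionHandler: java.lang.OutOfMemoryError: GC overhead limit exceeded
EOF
)

# Count ERROR lines and surface the most frequent Java error class.
errors=$(printf '%s\n' "$log" | grep -c ' ERROR ')
top=$(printf '%s\n' "$log" | grep -o 'java\.[A-Za-z.]*Error' \
      | sort | uniq -c | sort -rn | head -1)

echo "errors=$errors top=$top"
```

Repeated `OutOfMemoryError` hits like these would typically point toward container/heap sizing (e.g. YARN container memory settings) as the first tuning candidate.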