*Position: Hadoop Administrator*
*Location: Princeton, NJ*
*Duration: 6+ month contract*
*EAD, GC-EAD, GC, USC ONLY*
*MOI: Skype*
The Hadoop System Administrator is responsible for the implementation and support of the Enterprise Hadoop environment. The role involves design, capacity planning, cluster setup, performance tuning, monitoring, structure planning, scaling, and administration.
*Responsibilities*
• Development and production environments – must have administration experience with Hortonworks.
• Understanding of troubleshooting, memory issues, and utilization.
• Administering and maintaining related platforms, such as Syncsort.
• Ongoing administration and maintenance of the platforms.
• Documenting and formalizing processes for the environments and the surrounding platform administration.
• Complete understanding of all applications in the Hadoop ecosystem.
*Requirements*
• Unix/Linux – 5 years.
• Hadoop administration (not necessarily Hortonworks, but must be Hadoop) – 3+ years.
• NiFi – 3+ years.
• Applications that sit on top of Hadoop – a plus.
• Unix and Linux administration is a must – must know how to write shell scripts.
• Experience with security – how to implement and maintain security across these platforms.
• Implementing Apache Ranger and integrating it with LDAP, HDFS, and Hive in particular.
• Ambari is the metrics and monitoring tool – must have.
• Microsoft Azure is their cloud platform.
• Microsoft nodes are a plus.
• Must have the components of the Hadoop ecosystem: Pig, Hive, HBase, Cassandra.
• Scripting – shell scripting at a minimum; Python or another programming language is a plus.
• Jenkins – continuous integration and delivery tool; needs some administration of it.
• Experience with continuous integration/continuous delivery platforms is a plus.
• Ability to perform fresh installs from the ground up – new clusters, adding nodes, and what that entails.
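As a minimal sketch of the shell-scripting bar above, here is the kind of disk-utilization check an administrator might schedule via cron across cluster nodes (the 80% threshold and the WARN message format are illustrative assumptions, not from the posting):

```shell
#!/bin/sh
# disk_check.sh - warn when any mounted filesystem exceeds a utilization
# threshold. The 80% value below is an illustrative assumption.
THRESHOLD=80

df -P | awk -v limit="$THRESHOLD" 'NR > 1 {
    use = $5
    sub(/%/, "", use)   # strip the % sign from the Use% column
    if (use + 0 >= limit)
        printf "WARN: %s at %s%% (mounted on %s)\n", $1, use, $6
}'
```

In practice a check like this would feed an alerting channel rather than stdout, but it demonstrates the portable `df -P` output format and awk field handling that routine Hadoop node administration relies on.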
*Thanks & Regards*
*Geetika Rajpoot || Sr. Technical Recruiter*
*Source Infotech Inc.*
*geetika@sourceinfotech.com || Phone: 609-945-0706*
*Hangout ID:* *gitirajpoot@gmail.com*
*Gmail:* *geetikarajpoot35@gmail.com*
*Yahoo Id:* *geetrajpoot35@yahoo.com*