*Sr Hadoop Admin ( Cloudera )*
*Location: Burlington, NC *
*6+ Months C2H*
*Job Description*
*Hadoop Administrator:*
The consultant is responsible for the implementation and ongoing administration of Hadoop infrastructure on Oracle BDA. This includes setting up Linux users, creating Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for new users.
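As a rough illustration, the onboarding steps above might look like the following on a Kerberized CDH cluster. The user `analyst1`, the realm `EXAMPLE.COM`, and the Hive host are hypothetical placeholders; this is a sketch of the workflow, not a runnable script, since it assumes a live cluster and KDC admin rights:

```shell
# Create the OS account (hypothetical user "analyst1")
useradd -m analyst1

# Create the matching Kerberos principal (assumes an MIT KDC;
# "EXAMPLE.COM" is a placeholder realm)
kadmin -p admin/admin -q "addprinc analyst1@EXAMPLE.COM"

# Verify access as the new user: obtain a ticket, then exercise HDFS and Hive
su - analyst1
kinit analyst1@EXAMPLE.COM
hdfs dfs -ls /user/analyst1    # HDFS access check

# Hive access check via beeline ("hive-host" is a placeholder hostname)
beeline -u "jdbc:hive2://hive-host:10000/default;principal=hive/_HOST@EXAMPLE.COM" \
        -e "SHOW DATABASES;"
```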
*Job Responsibilities:*
* Providing expertise in provisioning physical systems for use in Hadoop
* Installing and configuring systems for use with the Cloudera distribution of Hadoop (with consideration given to other Hadoop distributions and ecosystem components such as Apache Hadoop, MapR, Spark, Hive, Impala, Kafka, Flume, etc.)
* Administering and Maintaining Cloudera Hadoop Clusters
* Provisioning, patching, and maintaining physical Linux systems.
* Performance tuning of Hadoop clusters and Hadoop MapReduce/Spark routines.
* Management and support of Hadoop services including HDFS, Hive, Impala, and Spark, primarily via Cloudera Manager with some command-line work.
* Red Hat Enterprise Linux Operating System support including administration and provisioning of Oracle BDA.
* Answering trouble tickets around the Hadoop ecosystem
* Integration support of tools that need to connect to the OFR Hadoop ecosystem. This may include tools like Bedrock, Tableau, Talend, generic ODBC/JDBC, etc.
* Provisioning of new Hive/Impala databases.
* Provisioning of new folders within HDFS
* Setting up and validating Disaster Recovery replication of data from the Production cluster.
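The last three provisioning and replication bullets could be sketched as the commands below. The database name, HDFS paths, group, and the `dr-nn` DR namenode host are all hypothetical placeholders, and in practice Cloudera BDR (Backup and Disaster Recovery) would usually handle replication rather than a raw `distcp`:

```shell
# Provision a new HDFS folder (hypothetical path and group)
hdfs dfs -mkdir -p /data/projects/ofr_reporting
hdfs dfs -chown hive:analysts /data/projects/ofr_reporting
hdfs dfs -chmod 770 /data/projects/ofr_reporting

# Provision a new Hive/Impala database over that location
beeline -u "jdbc:hive2://hive-host:10000/" \
        -e "CREATE DATABASE ofr_reporting LOCATION '/data/projects/ofr_reporting';"
impala-shell -q "INVALIDATE METADATA;"   # make the new database visible to Impala

# Validate DR replication: copy from Production to the DR cluster
# ("dr-nn" is a placeholder for the DR cluster's namenode)
hadoop distcp -update /data/projects/ofr_reporting \
    hdfs://dr-nn:8020/data/projects/ofr_reporting
```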
*Requirements*
*EXPERIENCE/SKILLS REQUIRED: *
* Bachelor's degree in computer science
* Demonstrated knowledge/experience in all of the areas of responsibility provided above.
* General operational knowledge, such as strong troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networking.
* Must have knowledge of Red Hat Enterprise Linux Systems Administration
* Must have experience with Secure Hadoop - sometimes called Kerberized Hadoop - using Kerberos.
* Knowledge in configuration management and deployment tools such as Puppet or Chef and Linux scripting.
* Must have fundamentals of central, automated configuration management (sometimes called "DevOps").
*Thanks & Regards,*
*Aravind*
*MSR COSMOS*
5250 Claremont Ave, Ste 249 | Stockton, CA – 95207
*Desk : 925-399-7145*
*Fax : 925-219-0934 *
*Email *: aravind@msrcosmos.com
*URL*: http://www.msrcosmos.com