Friday, 16 February 2018

Position: Hadoop Architect

Location: Atlanta, GA

Duration: 6 Months+

Minimum 15+ years of experience

Job description:

Provides architectural and big-picture oversight and creates architectural specifications for the development of new or enhanced products and/or services.

Researches and evaluates new technologies, design patterns, and software products to determine feasibility and desirability of incorporating their capabilities within the company's products.

Identifies opportunities to implement and/or enforce compliance with architectural standards, including the Reference Architecture, in customer and product enhancement or development projects.

Supports development and product teams by providing high-level analysis and design reviews, performance, scalability and benchmark test guidance, and subject matter expertise in technology and design.

Plans, directs and maintains projects. Reviews work requests and estimates scope of projects.

Hands-on experience working with Hortonworks Hadoop (HDP) 2.5 or higher

In-depth knowledge and understanding of Hadoop architecture, including HDFS and YARN

Working knowledge of MapReduce, HBase, Pig, Java, Hive, ZooKeeper, Flume, and Sqoop

Successful big data deployments and implementation strategies utilizing the Hadoop life cycle model

Ability to perform analytics on data stored in Hadoop via SQL queries

Experience working on core big data development projects; able to perform hands-on development in HDFS, Hive, HBase, Spark, Scala, and MapReduce. Hadoop ETL development via tools is a plus.

Strong knowledge of programming and scripting languages and tools such as Java, Python, Spark, Hive, and Pig
Experience with major big data technologies and frameworks, including but not limited to Hadoop, MapReduce, Pig, Hive, HBase, Oozie, Mahout, Flume, ZooKeeper, MongoDB, and Cassandra; experience working on production big data solutions and on client-driven, large-scale data lake implementation projects

Work with data engineering groups to support the deployment of Hadoop and Spark jobs.
Deep understanding of Hadoop and Spark cluster security, networking connectivity, and I/O throughput, along with other factors that affect distributed system performance. Strong working knowledge of disaster recovery, incident management, and security best practices.


Thanks & Regards,

Chaitu

Technical Recruiter | R2 Technologies

6515 Shiloh Rd, Unit 110, Alpharetta, GA 30005

Desk: 470-242-7345 EXT: 302 | Email: chaitu@r2techcorp.com

Gtalk: chaitu.recruiter7@gmail.com

LinkedIn: https://www.linkedin.com/in/chaitanya-chaitu-480003148/
