*Role: Big Data Consultant*
*Location: Charlotte, NC*
*Duration: 6+ Months*
*Description:*
1. Design and build scalable infrastructure and Big Data platforms to collect and process very large volumes of structured and unstructured data, including real-time streaming data where applicable.
2. Work closely with a wide range of teams and organizations across the company and the industry (including partners, customers, and researchers) to:
• Demonstrate thought leadership and guide customers on Big Data adoption, establishing best practices and governance structures
• Act as an internal resource to help teams leverage our Big Data capabilities
• Apply knowledge of information security, data classification, and data masking
• Effectively communicate solution architecture to key stakeholders and project teams
• Contribute to multiple Big Data projects; assign tasks to junior engineers, oversee their execution, and provide mentorship and guidance as needed
• Understand business objectives and suggest technical strategies to meet them
• Contribute to business requirement definition and use case design as a technical expert
• Convert business requirements into architectural designs and detailed technical designs
• Identify tasks, effort, and dependencies based on the software architecture and specifications
• Guide performance testing and recommend solutions for any performance bottlenecks
• Plan and execute technology proofs of concept (POCs) using Big Data technology
• Undertake feasibility studies for major IT developments, incorporating costs and benefits, and present proposals to clients
• Examine existing business models and data flows, and design functional specifications and test plans for new systems to meet clients' needs
3. Given business requirements, use Big Data technologies to design a comprehensive technical architecture, including:
a. Conceptual, Logical, Physical Models
b. Data quality standards
c. Data indexing
d. Extraction and analytic queries
*Skills (mandatory):*
4. A problem solver by nature, with a strong sense of curiosity about technical puzzles, especially large-scale distributed computing
5. Hands-on technical competence in one or more of the following:
a. Programming languages and platforms - Java/J2EE, PHP, Python, C, C++, Linux, and Hadoop ecosystem components such as Hive, HBase, Pig, MapReduce, R on Hadoop, and Mahout
b. Data warehouse, BI, and ETL tools
c. Detailed knowledge of RDBMS data modeling and SQL
d. Some knowledge of non-relational and analytical database types, such as OLAP, graph, key-value, object, XML/JSON document, tuple store, columnar, and in-memory stores
*Skills (nice to have):*
6. Other relevant Apache technologies including:
a. Storm, Kafka, Solr/Lucene
b. Enterprise Application integration
c. Master Data Management tools and methodologies
*Qualifications:*
o 10+ years of solid IT consulting experience in data warehousing, very large databases (VLDBs), operational data stores, and large-scale implementations
o Experience working as a member of a distributed, often global, team of technical and domain experts using agile development methods
o Excellent one-on-one communication and presentation skills, able to convey technical information in a clear and unambiguous manner.
*Thanks*
*Raj*
*Sr. Technical Recruiter*
kraj@mwpartners.net
Direct: *323-316-2055*
MW Partners
IT Professionals - Services & Consulting