Hadoop Platform Administrator for Noida & Bangalore

10 - 18 Years
Bengaluru, Noida

Job Description



Warm greetings from Career Trackers & Consulting.

A product development MNC is looking for a Hadoop Platform Administrator for Noida & Bangalore.

Please apply only if you have worked as a Hadoop Platform Administrator for at least 5 years.

If you are interested, kindly email your resume to neeta@careertrackers.in as soon as possible, along with the details requested below.

Company: A very prestigious product development MNC
Role: Hadoop Platform Administrator
Expertise required: please refer to the detailed JD below.
Experience: 10 - 16 yrs
Work Location: Noida & Bangalore
Package: Best in the Industry
Nature of Job: Permanent & Full-time

PLEASE SHARE THE DETAILS BELOW FOR FURTHER PROCESSING:

1. Full Name:
2. Email ID:
3. Mobile:
4. Current Designation:
5. Current Organization:
6. Explanation of Gap in Employment (if any):
7. Total Experience:
8. Relevant Experience as Hadoop Platform Administrator:
9. Highest Qualification:
10. Current CTC:
11. Expected CTC (in figures):
12. Notice Period:
13. Current Location:
14. Open for Noida / Bangalore:


We are looking for a passionate, high-energy Application Engineer to build pipelines using multiple software tools in a hybrid-cloud business environment. You will be responsible for understanding requirements for tools and dashboards, and for using programming languages and tooling to develop applications that meet clients' data requirements. Engineers are expected to demonstrate skill in one or more programming languages and multiple operating systems, along with attention to detail, time management, accuracy, and good communication and documentation skills.

The successful candidate will work in a high-availability environment. This role provides a unique opportunity to plan, design, and build data-engineering software across an in-house framework and a breadth of cloud technologies. It is a highly visible role in a high-performing team, involving engagement with global IT & Engineering teams and integration with a variety of other applications, platforms, and services. Core development experience in one or more languages such as Java, Python, or Scala is a huge plus, as is full-stack development experience with relevant data-engineering tools and hands-on work in multi-cluster environments such as Hadoop and cloud technologies.

What you will do

Design, develop, monitor, tune, optimize, and govern large-scale Hadoop clusters and Hadoop components in a 24x7 team.
Design and implement new components and emerging technologies in the Hadoop ecosystem, and successfully execute various proof-of-technology (PoT) exercises.
Monitor and analyze job performance, file system/disk-space usage, cluster and database connectivity, and log files.
Perform platform administration on Hadoop services such as YARN, the ResourceManager, ZooKeeper, Cloudera Manager, etc.
Harden the cluster to support use cases and self-service in a 24x7 model, and apply advanced troubleshooting techniques to critical, highly complex customer problems.
Automate deployment and management of Hadoop services, including implementing monitoring (a minimal monitoring sketch follows this list).
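
For illustration only, here is a minimal sketch of the kind of monitoring automation the role calls for: a Python script that polls the Cloudera Manager REST API and reports the state and health of each service in each cluster. The host name, credentials, and API version (v19) are placeholders rather than details from this posting; adjust them to the actual Cloudera Manager endpoint and security setup.

    # Minimal health-check sketch against the Cloudera Manager REST API.
    # CM_HOST, AUTH and the API version are illustrative placeholders.
    import requests

    CM_HOST = "http://cm.example.com:7180"   # hypothetical Cloudera Manager host
    AUTH = ("admin", "admin")                # replace with real credentials
    API = f"{CM_HOST}/api/v19"

    def service_health(cluster_name):
        """Yield (service name, state, health) for every service in the cluster."""
        resp = requests.get(f"{API}/clusters/{cluster_name}/services",
                            auth=AUTH, timeout=30)
        resp.raise_for_status()
        for svc in resp.json()["items"]:
            yield svc["name"], svc.get("serviceState"), svc.get("healthSummary")

    if __name__ == "__main__":
        clusters = requests.get(f"{API}/clusters", auth=AUTH, timeout=30).json()["items"]
        for cluster in clusters:
            print(f"Cluster: {cluster['name']}")
            for name, state, health in service_health(cluster["name"]):
                print(f"  {name}: state={state}, health={health}")

A script like this can be scheduled (for example via cron) and extended to page the on-call administrator whenever a service reports a health summary other than GOOD.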

What you need to succeed
10 - 15 years of strong Linux/Java/Big Data experience, with enterprise data warehousing experience
Must have 5+ years of Big Data administration experience, including upgrades, cluster parameter tuning, configuration changes, etc.
Well versed in Hadoop challenges related to scaling and self-service analytics
Well versed in Cloudera distributions; hands-on experience with the distributions is required
Well versed in Kafka, Hive, Spark, HBase, and the latest developments in the Hadoop ecosystem
Excellent knowledge of Hadoop integration points with enterprise BI and EDW tools
Strong experience with Hadoop cluster management/administration/operations using Oozie, YARN, Ambari, ZooKeeper, Tez, and Slider
Strong experience with Hadoop ETL/data ingestion: Sqoop, Flume, Hive, Spark, HBase
Strong experience with SQL
Good to have: experience in real-time data ingestion using Kafka, Storm, Spark, or Complex Event Processing (CEP); see the ingestion sketch after this list
Experience with Hadoop data consumption and other components: Hive, Hue, HBase, Phoenix, Spark, Mahout, Pig, Impala, Presto
Experience monitoring, troubleshooting, and tuning services and applications, with operational expertise: strong troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks
Able to take the lead and interact with US team members with minimal supervision
Bachelor's degree in Computer Science, Information Science, Information Technology, Engineering, or a related field
Good communication skills across a distributed team environment
Must be self-motivated, responsive, professional, and dedicated to customer success
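
To make the real-time ingestion requirement concrete, the following is a small, hedged PySpark Structured Streaming sketch that reads a Kafka topic and lands it on HDFS as Parquet. The broker address, topic name, and HDFS paths are hypothetical examples, not details from this posting.

    # Sketch: stream a Kafka topic into HDFS with Spark Structured Streaming.
    # Requires the Kafka connector package at submit time, e.g.:
    #   spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:<spark-version> ...
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

    # Read the topic as a streaming DataFrame; broker and topic are placeholders.
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker1:9092")
              .option("subscribe", "clickstream")
              .option("startingOffsets", "latest")
              .load())

    # Kafka delivers binary key/value columns; cast them to strings before landing.
    parsed = events.selectExpr("CAST(key AS STRING) AS key",
                               "CAST(value AS STRING) AS value",
                               "timestamp")

    # Append micro-batches to HDFS as Parquet, with a checkpoint for recovery.
    query = (parsed.writeStream
             .format("parquet")
             .option("path", "hdfs:///data/raw/clickstream")
             .option("checkpointLocation", "hdfs:///checkpoints/clickstream")
             .outputMode("append")
             .start())

    query.awaitTermination()

A similar pattern applies to batch ingestion with the ETL tools listed above (Sqoop, Flume, Hive, Spark, HBase).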


If you are not interested in exploring this role, we would request you to kindly share this email with friends who might be.


Thanks & Warm Regards

Neeta Chachra
Career Trackers & Consulting
8218710071
neeta@careertrackers.in
http://www.careertrackers.in


Perks and Benefits 

Best In The Industry

Salary: Not Disclosed by Recruiter

Industry: IT-Software / Software Services

Functional Area: IT Software - Application Programming, Maintenance

Role Category: Programming & Design

Role: Technical Architect

Desired Candidate Profile

Please refer to the Job description above

Company Profile

Career Trackers and Consulting

A very prestigious US-based product development MNC

Recruiter Name: Neeta

Contact Company: Career Trackers and Consulting

Telephone: 8218710071

Email: neeta@careertrackers.in

Reference ID: Hadoop Platform Administrator

Website: http://www.careertrackers.in
