
Title


Hadoop Developer

Description

We are looking for a skilled Hadoop Developer to join our team. The ideal candidate will have a deep understanding of the Hadoop ecosystem and its components, such as HDFS, YARN, MapReduce, Hive, Pig, HBase, Spark, Oozie, Flume, and Sqoop. You will be responsible for designing, building, installing, configuring, and supporting Hadoop clusters; translating complex functional and technical requirements into detailed designs; performing performance tuning and troubleshooting; and maintaining security and data privacy in a big data environment. You should have excellent analytical and problem-solving skills, the ability to work both in a team and independently, and strong communication skills, as you will interact with other team members and stakeholders.

Responsibilities

  • Designing, building, installing, configuring, and supporting Hadoop clusters.
  • Translating complex functional and technical requirements into detailed design.
  • Performing performance tuning and troubleshooting of Hadoop clusters.
  • Maintaining security and data privacy in a big data environment.
  • Working with data delivery teams to set up new Hadoop users.
  • Testing prototypes and overseeing handover to operational teams.
  • Proposing best practices and standards.
  • Applying an understanding of cloud computing infrastructure and platforms.
  • Creating scalable and high-performance web services for data tracking.

Requirements

  • Bachelor's degree in Computer Science or related field.
  • Proven experience as a Hadoop Developer or similar role.
  • Knowledge of the Hadoop ecosystem and its components.
  • Experience with Hadoop cluster setup, performance tuning, and monitoring.
  • Experience with cloud computing infrastructure and platforms.
  • Knowledge of Java, Python, HiveQL, NoSQL.
  • Experience with building stream-processing systems.
  • Good problem-solving and analytical skills.
  • Excellent communication skills.
  • Ability to work in a team as well as independently.

Potential interview questions

  • What is your experience with the Hadoop ecosystem and its components?
  • Can you describe a project where you used Hadoop to solve a complex problem?
  • How do you ensure data security in a big data environment?
  • Can you describe your experience with cloud computing infrastructure and platforms?
  • How do you handle performance tuning and troubleshooting of Hadoop clusters?