
Hadoop Interview Questions and Answers

Question - What happens if you try to run a Hadoop job with an output directory that is already present?

Answer -

The job fails at submission time with an org.apache.hadoop.mapred.FileAlreadyExistsException, because MapReduce deliberately refuses to overwrite an existing output directory (this protects the results of earlier runs from being clobbered).

Before submitting a MapReduce job, you must therefore ensure that the output directory does not already exist in HDFS.

To delete the directory before running the job, use the shell:

hadoop fs -rm -r /path/to/your/output/
Or the Java API (note that FileSystem.get(conf) is used, not getLocal(conf), since the output directory lives in HDFS):

FileSystem.get(conf).delete(outputDir, true);
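Putting the two pieces together, a minimal driver sketch might look like the following. This is illustrative only: it assumes the Hadoop MapReduce client libraries are on the classpath, and the class name, job name, and path arguments are hypothetical.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path input = new Path(args[0]);
        Path output = new Path(args[1]);

        // Remove the output directory if it already exists, so the job
        // does not fail with FileAlreadyExistsException at submission.
        FileSystem fs = FileSystem.get(conf);
        if (fs.exists(output)) {
            fs.delete(output, true); // true = delete recursively
        }

        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCountDriver.class);
        // ... set mapper, reducer, and key/value classes here ...
        FileInputFormat.addInputPath(job, input);
        FileOutputFormat.setOutputPath(job, output);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Deleting programmatically is convenient for development, but in production it is often safer to fail fast on an existing directory than to silently discard previous results.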
