If you unfortunately and unknowingly delete /hdp/apps/2.3.4.0-3485, either with or without -skipTrash, you will be in trouble and other services will be impacted. You will not be able to run Hive, MapReduce, or Sqoop commands, and you will get the following error.

Case 1: If you deleted it without -skipTrash, then it is very easy to recover:

[root@m1 ranger-hdfs-plugin]# hadoop fs -rmr /hdp/apps/2.3.4.0-3485

rmr: DEPRECATED: Please use 'rm -r' instead.

16/07/28 01:59:22 INFO fs.TrashPolicyDefault: Namenode trash configuration: Deletion interval = 360 minutes, Emptier interval = 0 minutes.

Moved: 'hdfs://HDPTSTHA/hdp/apps/2.3.4.0' to trash at: hdfs://HDPTSTHA/user/hdfs/.Trash/Current

In this case it is very easy to recover because the deleted directory is moved to your user's .Trash/Current directory, and you can restore it from there.

hadoop fs -cp hdfs://HDPTSTHA/user/hdfs/.Trash/Current/hdp/apps/2.3.4.0-3485 /hdp/apps/
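If you are unsure of the exact path inside the trash, you can list it first. The sketch below assumes the trash checkpoint sits under /user/hdfs/.Trash/Current, as shown in the output above, and that the whole 2.3.4.0-3485 directory was deleted; using -mv instead of -cp also clears the copy out of the trash.

hdfs dfs -ls /user/hdfs/.Trash/Current/hdp/apps/

hadoop fs -mv /user/hdfs/.Trash/Current/hdp/apps/2.3.4.0-3485 /hdp/apps/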

Case 2: If you deleted it with -skipTrash, then you need to execute the following steps:

[root@m1 ranger-hdfs-plugin]# hadoop fs -rmr -skipTrash /hdp/apps/2.3.4.0-3485

rmr: DEPRECATED: Please use 'rm -r' instead.

Deleted /hdp/apps/2.3.4.0-3485

So when I try to access Hive, it throws the error below.

[root@m1 admin]# hive

WARNING: Use "yarn jar" to launch YARN applications.

16/07/27 22:05:04 WARN conf.HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist

Logging initialized using configuration in file:/etc/hive/2.3.4.0-3485/0/hive-log4j.properties

Exception in thread "main" java.lang.RuntimeException: java.io.FileNotFoundException: File does not exist: /hdp/apps/2.3.4.0-3485/tez/tez.tar.gz

at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:507)

Resolution: Don't worry, friends, you can resolve this issue by following the steps given below.

Note: Replace <hdp-version> in the commands below with the HDP version installed on your cluster (for example, 2.3.4.0-3485).
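If you are not sure which build is installed, you can check on any cluster node. This is a quick sketch assuming the standard HDP layout under /usr/hdp and the hdp-select tool that ships with HDP; the build name it reports (for example, 2.3.4.0-3485) is the value to use for <hdp-version>.

ls /usr/hdp/

hdp-select versions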

Step 1: First, you will have to create the following required directories:

hdfs dfs -mkdir -p /hdp/apps/<hdp-version>/mapreduce

hdfs dfs -mkdir -p /hdp/apps/<hdp-version>/hive

hdfs dfs -mkdir -p /hdp/apps/<hdp-version>/tez

hdfs dfs -mkdir -p /hdp/apps/<hdp-version>/sqoop

hdfs dfs -mkdir -p /hdp/apps/<hdp-version>/pig
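The same directories can also be created in one loop. This is a minimal sketch assuming a shell variable HDP_VERSION set to your build (for example, 2.3.4.0-3485) and that you run it as a user allowed to write under /hdp (typically hdfs):

HDP_VERSION=2.3.4.0-3485   # replace with your build
for app in mapreduce hive tez sqoop pig; do
  hdfs dfs -mkdir -p /hdp/apps/${HDP_VERSION}/${app}
done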

Step 2: Now you have to copy the required tarballs into their corresponding directories.

hdfs dfs -put /usr/hdp/<hdp-version>/hadoop/mapreduce.tar.gz /hdp/apps/<hdp-version>/mapreduce/

hdfs dfs -put /usr/hdp/<hdp-version>/hive/hive.tar.gz /hdp/apps/<hdp-version>/hive/

hdfs dfs -put /usr/hdp/<hdp-version>/tez/lib/tez.tar.gz /hdp/apps/<hdp-version>/tez/

hdfs dfs -put /usr/hdp/<hdp-version>/sqoop/sqoop.tar.gz /hdp/apps/<hdp-version>/sqoop/

hdfs dfs -put /usr/hdp/<hdp-version>/pig/pig.tar.gz /hdp/apps/<hdp-version>/pig/
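Before moving on, it is worth verifying that each tarball landed where the services expect it, for example:

hdfs dfs -ls -R /hdp/apps/<hdp-version>

You should see mapreduce.tar.gz, hive.tar.gz, tez.tar.gz, sqoop.tar.gz, and pig.tar.gz under their respective directories.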

Step 3: Now you need to change the directory owner and permissions:

hdfs dfs -chown -R hdfs:hadoop /hdp

hdfs dfs -chmod -R 555 /hdp/apps/<hdp-version>
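To confirm the change took effect, list the directory itself; you should see owner hdfs, group hadoop, and mode r-xr-xr-x:

hdfs dfs -ls -d /hdp/apps/<hdp-version>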

Now you will be able to start the Hive CLI and run other jobs.

[root@m1 ~]# hive

WARNING: Use "yarn jar" to launch YARN applications.

16/07/27 23:33:42 WARN conf.HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist

Logging initialized using configuration in file:/etc/hive/2.3.4.0-3485/0/hive-log4j.properties

hive>

I hope this helps you restore your cluster. Please feel free to share your suggestions.
