Hive Pre-Upgrade tool command fails because it is unable to access HDFS

The command cannot find the correct configuration of the Hadoop core components, primarily because the Hadoop configuration directories are not included on the command's classpath.
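Before applying the steps below, it is worth confirming that the Hadoop and Hive client configuration files are actually present on the node where you run the tool. A quick check, assuming the default HDP configuration paths:

ls -l /etc/hadoop/conf/core-site.xml /etc/hadoop/conf/hdfs-site.xml /etc/hive/conf/hive-site.xml

If any of these files are missing, refresh the client configurations from Ambari before proceeding.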

Resolution steps

There are several ways to make the Hadoop core configuration available to the tool; one of them is described below.

1. Follow the HWX document below and make sure you have completed its steps. (Use the appropriate link for your version.)

https://docs.hortonworks.com/HDPDocuments/Ambari-2.7.3.0/bk_ambari-upgrade-major/content/prepare_hiv... 

2. Follow the document through step 3 of "Procedure for compacting Hive tables (no Kerberos)". Before running the pre-upgrade tool command, source the Hadoop and Hive environment files:

source /etc/hadoop/conf/hadoop-env.sh 
source /etc/hive/conf/hive-env.sh 
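To confirm the environment files were sourced correctly, echo a variable they set. On a typical HDP node, hadoop-env.sh sets JAVA_HOME and hive-env.sh sets HIVE_CONF_DIR, but the exact variables depend on your cluster's env files:

echo $JAVA_HOME 
echo $HIVE_CONF_DIR 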

3. When running the pre-upgrade tool command, include two additional paths at the front of the classpath (/etc/hadoop/conf:/etc/hive/conf), so the command looks similar to the one below. The -Djavax.security.auth.useSubjectCredsOnly=false system property is optional (it is generally only relevant on Kerberized clusters).

java -Djavax.security.auth.useSubjectCredsOnly=false -cp \
/etc/hadoop/conf:\
/etc/hive/conf:\
/usr/hdp/$STACK_VERSION/hive/lib/derby-10.10.2.0.jar:\
/usr/hdp/$STACK_VERSION/hive/lib/*:\
/usr/hdp/$STACK_VERSION/hadoop/*:\
/usr/hdp/$STACK_VERSION/hadoop/lib/*:\
/usr/hdp/$STACK_VERSION/hadoop-mapreduce/*:\
/usr/hdp/$STACK_VERSION/hadoop-mapreduce/lib/*:\
/usr/hdp/$STACK_VERSION/hadoop-hdfs/*:\
/usr/hdp/$STACK_VERSION/hadoop-hdfs/lib/*:\
/usr/hdp/$STACK_VERSION/hadoop/etc/hadoop/*:\
/tmp/hive-pre-upgrade-<your version>.jar:\
/usr/hdp/$STACK_VERSION/hive/conf/conf.server \
org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool
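
Note that $STACK_VERSION must be set in the shell before running the command. One way to derive it on an HDP node is via hdp-select; the awk field below assumes the usual "<component> - <version>" output format, so verify it on your node first:

export STACK_VERSION=$(hdp-select status hive-server2 | awk '{ print $3 }') 
echo $STACK_VERSION 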