The Hive upgrade tool won't run as documented in the HDP 3 upgrade docs (see: Hive upgrade tool).
None of the -cp folders exist. You need to find your current HDP version, substitute it for /current/, and add * to every folder, e.g. (in my case, HDP 2.6.x.x-230 — substitute your exact build):
/usr/hdp/current/hive2/lib/ --> /usr/hdp/2.6.x.x-230/hive2/lib/*
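To find the exact version string, one option is to list /usr/hdp itself and filter out the non-version entries (the "current" symlink dir, "share", etc.). A minimal sketch — the sample listing stands in for a real host's directory, and 2.6.x.x-230 is a placeholder:

```shell
# Stand-in for the output of: ls /usr/hdp
# (a real host lists the versioned dir next to "current" and "share")
hdp_dirs="2.6.x.x-230 current share"
# Keep only entries that start with a digit, i.e. the version dirs
ver=$(echo "$hdp_dirs" | tr ' ' '\n' | grep -E '^[0-9]' | head -n 1)
echo "$ver"
```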
Full command that worked for me:
$JAVA_HOME/bin/java -Djavax.security.auth.useSubjectCredsOnly=false -cp /usr/hdp/2.6.x.x-230/hive2/lib/derby-10.10.2.0.jar:/usr/hdp/2.6.x.x-230/hive2/lib/*:/usr/hdp/2.6.x.x-230/hadoop/*:/usr/hdp/2.6.x.x-230/hadoop/lib/*:/usr/hdp/2.6.x.x-230/hadoop-mapreduce/*:/usr/hdp/2.6.x.x-230/hadoop-mapreduce/lib/*:/usr/hdp/2.6.x.x-230/hadoop-hdfs/*:/usr/hdp/2.6.x.x-230/hadoop-hdfs/lib/*:/usr/hdp/2.6.x.x-230/hadoop/etc/hadoop/*:/tmp/hive-pre-upgrade-x.x.x.x-1634.jar:/usr/hdp/2.6.x.x-230/hive/conf/conf.server org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool -execute
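The same command can be written without hand-editing every path by setting the version once. A sketch — 2.6.x.x-230 is a placeholder for your build, and the loop only rebuilds the wildcard -cp entries (the derby jar, the pre-upgrade jar, and conf.server still need to be appended as in the full command above):

```shell
HDP_VER=2.6.x.x-230   # placeholder: substitute your exact build
CP=""
for d in hive2/lib hadoop hadoop/lib hadoop-mapreduce hadoop-mapreduce/lib \
         hadoop-hdfs hadoop-hdfs/lib hadoop/etc/hadoop; do
  CP="${CP}/usr/hdp/${HDP_VER}/${d}/*:"
done
CP="${CP%:}"          # drop the trailing colon
echo "$CP"
```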
Looks like your HDP upgrade has some issues.
Can you run "hdp-select" and make sure it returns the HDP3 version? If it is not pointing to HDP3, you can run "hdp-select set all 3.0.x.x-1634" (with your HDP3 build in place of 3.0.x.x).
Please let me know if you have any questions.
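That check can be scripted as well. A sketch — the sample line below stands in for real hdp-select output on the node, and the versions are placeholders:

```shell
# Stand-in for the output of: hdp-select status hive-server2
status_line="hive-server2 - 2.6.x.x-230"
# hdp-select prints "<component> - <version>"; grab the last field
current="${status_line##* }"
case "$current" in
  3.*) echo "already pointing at HDP3 ($current)" ;;
  *)   echo "still on $current - run: hdp-select set all <hdp3-build>" ;;
esac
```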
This issue occurred well before the HDP3 upgrade was completed: this step is a prerequisite for preparing Hive for the upgrade from HDP 2.6 to HDP 3.0.
After it was resolved, the upgrade completed smoothly. I believe this is purely a documentation issue.
I'm having the same issue here. After changing the folders, I'm still getting this exception:
2018-10-12T12:51:14,010 WARN [main] hive.ql.metadata.Hive - Failed to register all functions.
org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.thrift.TApplicationException: Invalid method name: 'get_all_functions'
Do you have any idea what's happening here?
Thanks in advance!
Never mind. I ran the command without -execute first, and it worked fine. Now I'm a little confused about step 7 of the pre-upgrade documentation. Can you help me with that?
"7. Login to Beeline as the Hive service user, and run each generated script to prepare the cluster for upgrading.
The Hive service user is usually the hive user. This is hive by default. If you don’t know which user is the Hive service user in your cluster, go to the Ambari Web UI and click Cluster Admin > Service Accounts, and then look for Hive User."
I agree that this documentation needs some updates.
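For step 7, assuming the pre-upgrade tool wrote its scripts (if any) into the directory it was run from, the whole set can be fed to Beeline in a loop. This is only a sketch: the compacts*.sql name pattern, the JDBC URL, and the hive user are my assumptions — check what the tool actually produced in your case:

```shell
run_pre_upgrade_scripts() {
  # $1 = directory the pre-upgrade tool was run from.
  # compacts*.sql is an assumed name pattern; URL and user are placeholders.
  for script in "$1"/compacts*.sql; do
    [ -e "$script" ] || { echo "no scripts generated - nothing to run"; return 0; }
    beeline -u "jdbc:hive2://localhost:10000" -n hive -f "$script"
  done
}
```

If the glob matches nothing (as in your case, apparently), the function just reports that and returns without calling Beeline.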
Sorry for the long delay, and thanks for your help! It seems I didn't have to run the Beeline commands, since no scripts were generated (that's my understanding, at least). Now I'm facing another issue that is preventing me from finishing the upgrade. I reported it here if you want to take a look (any help is welcome!):
Thank you very much!