Created 11-05-2018 01:39 PM
I am trying to upgrade my HDP cluster to version 3.0.1. I have already upgraded Ambari to version 2.7.1.0, but I am failing at the next step - running the Hive PreUpgradeTool:
$JAVA_HOME/bin/java -Djavax.security.auth.useSubjectCredsOnly=false -cp /usr/hdp/current/hive-server2-hive2/lib/derby-10.10.2.0.jar:/usr/hdp/current/hive-server2-hive2/lib/*:/usr/hdp/current/hadoop/*:/usr/hdp/current/hadoop/lib/*:/usr/hdp/current/hadoop-mapreduce-client/*:/usr/hdp/current/hadoop-mapreduce-client/lib/*:/usr/hdp/current/hadoop-hdfs/*:/usr/hdp/current/hadoop-hdfs/lib/*:/usr/hdp/current/hadoop/etc/hadoop/*:/tmp/hive-pre-upgrade-3.1.0.3.0.1.0-187.jar:/usr/hdp/current/hive-client/conf/:/usr/hdp/current/hive-metastore/lib/hive-metastore.jar:/usr/hdp/current/hive-metastore/lib/libthrift-0.9.3.jar:/usr/hdp/current/hadoop-client/hadoop-common.jar:/usr/hdp/current/hive-client/lib/hive-common.jar:/usr/hdp/current/hive-client/lib/commons-cli-1.2.jar:/usr/hdp/current/hadoop-client/lib/* org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool -execute
********
END========"new HiveConf()"========
Found Acid table: com.nkscale_vehicle_weighing_result
2018-11-05T13:53:53,044 ERROR [main] org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool - PreUpgradeTool failed
java.io.IOException: No FileSystem for scheme: hdfs
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2786) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2793) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2829) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2811) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:390) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.needsCompaction(PreUpgradeTool.java:480) ~[hive-pre-upgrade-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.getCompactionCommands(PreUpgradeTool.java:400) ~[hive-pre-upgrade-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.getCompactionCommands(PreUpgradeTool.java:390) ~[hive-pre-upgrade-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.prepareAcidUpgradeInternal(PreUpgradeTool.java:251) ~[hive-pre-upgrade-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.main(PreUpgradeTool.java:150) [hive-pre-upgrade-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187]
Exception in thread "main" java.io.IOException: No FileSystem for scheme: hdfs
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2786)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2793)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2829)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2811)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:390)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.needsCompaction(PreUpgradeTool.java:480)
at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.getCompactionCommands(PreUpgradeTool.java:400)
at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.getCompactionCommands(PreUpgradeTool.java:390)
at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.prepareAcidUpgradeInternal(PreUpgradeTool.java:251)
at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.main(PreUpgradeTool.java:150)
The HDP version is 2.6.3. What could be wrong, and how do I get past this error?
Created 11-06-2018 09:13 AM
I managed to solve this issue. The problem was with the /usr/hdp/current/hadoop-hdfs path, which did not exist on my node, so the hadoop-hdfs jars never made it onto the classpath. I had to use the versioned path instead - /usr/hdp/2.6.3.0-235/hadoop-hdfs
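In case it helps others, this is roughly what I checked and changed. The paths are from my cluster (HDP 2.6.3.0-235); the version directory will differ elsewhere, and the classpath line below is just an example fragment of my command above with the broken symlink swapped for the versioned path:

# Check whether the "current" symlink for hadoop-hdfs actually resolves;
# on my node it did not, so the jar providing the hdfs filesystem was missing.
ls -ld /usr/hdp/current/hadoop-hdfs

# Find the concrete HDP version directory installed on the node.
ls /usr/hdp/

# Then replace the broken entries in the -cp argument, e.g.:
#   ...:/usr/hdp/2.6.3.0-235/hadoop-hdfs/*:/usr/hdp/2.6.3.0-235/hadoop-hdfs/lib/*:...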