
Upgrade to HDP 3.0 Stuck

New Contributor

Hi all,

We are facing an issue while upgrading from HDP 2.6.1 to HDP 3.0.0; the upgrade is now stuck.

Failed on: Convert Hive Tables

2018-08-13 05:25:23,354 - Task. Type: EXECUTE, Script: scripts/pre_upgrade.py - Function: convert_tables
2018-08-13 05:25:23,498 - Using hadoop conf dir: /usr/hdp/2.6.1.0-129/hadoop/conf
2018-08-13 05:25:23,509 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2018-08-13 05:25:23,528 - call returned (0, 'hive-server2 - 2.6.1.0-129')
2018-08-13 05:25:23,529 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.1.0-129, Upgrade Direction=upgrade -> 2.6.1.0-129
2018-08-13 05:25:23,554 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://hd01.example.com:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2018-08-13 05:25:23,555 - Not downloading the file from http://hd01.example.com:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2018-08-13 05:25:24,125 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hive.service.keytab hive/hd02.example.com@EXAMPLE.COM; '] {'user': 'hive'}
2018-08-13 05:25:24,186 - Execute['/usr/jdk64/jdk1.8.0_77/bin/java -Djavax.security.auth.useSubjectCredsOnly=false -cp /usr/hdp/2.6.1.0-129/hive2/lib/derby-10.10.2.0.jar:/usr/hdp/2.6.1.0-129/hive2/lib/*:/usr/hdp/2.6.1.0-129/hadoop/*:/usr/hdp/2.6.1.0-129/hadoop/lib/*:/usr/hdp/2.6.1.0-129/hadoop-mapreduce/*:/usr/hdp/2.6.1.0-129/hadoop-mapreduce/lib/*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/lib/*:/usr/hdp/2.6.1.0-129/hadoop/etc/hadoop/:/usr/hdp/3.0.0.0-1634/hive/lib/hive-pre-upgrade.jar:/usr/hdp/2.6.1.0-129/hive/conf/conf.server org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool -execute'] {'user': 'hive'}
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/pre_upgrade.py", line 111, in <module>
    HivePreUpgrade().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 353, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/pre_upgrade.py", line 108, in convert_tables
    Execute(cmd, user = params.hive_user)
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
    returns=self.resource.returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/jdk64/jdk1.8.0_77/bin/java -Djavax.security.auth.useSubjectCredsOnly=false -cp /usr/hdp/2.6.1.0-129/hive2/lib/derby-10.10.2.0.jar:/usr/hdp/2.6.1.0-129/hive2/lib/*:/usr/hdp/2.6.1.0-129/hadoop/*:/usr/hdp/2.6.1.0-129/hadoop/lib/*:/usr/hdp/2.6.1.0-129/hadoop-mapreduce/*:/usr/hdp/2.6.1.0-129/hadoop-mapreduce/lib/*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/*:/usr/hdp/2.6.1.0-129/hadoop-hdfs/lib/*:/usr/hdp/2.6.1.0-129/hadoop/etc/hadoop/:/usr/hdp/3.0.0.0-1634/hive/lib/hive-pre-upgrade.jar:/usr/hdp/2.6.1.0-129/hive/conf/conf.server org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool -execute' returned 1. SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.1.0-129/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.1.0-129/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
2018-08-13T05:25:24,960 INFO [main] org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool - Starting with execute=true, location=.
2018-08-13T05:25:24,965 INFO [main] org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool - Using Hive Version: 2.1.0.2.6.1.0-129 build: 2.1.0.2.6.1.0-129 from f65a7fce6219dbd86a9313bb37944b89fa3551b1 by jenkins source checksum 07716d194826949d38412fbe160ac143
2018-08-13T05:25:24,990 INFO [main] org.apache.hadoop.hive.conf.HiveConf - Found configuration file file:/etc/hive/2.6.1.0-129/0/conf.server/hive-site.xml
2018-08-13T05:25:25,303 WARN [main] org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.enforce.sorting does not exist
2018-08-13T05:25:25,303 WARN [main] org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.enforce.bucketing does not exist
2018-08-13T05:25:25,401 WARN [main] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-08-13T05:25:25,453 INFO [main] org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool - Creating metastore client for PreUpgradeTool
2018-08-13T05:25:25,469 ERROR [main] org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool - PreUpgradeTool failed
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1655) ~[hive-exec-2.1.0.2.6.1.0-129.jar:2.1.0.2.6.1.0-129]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83) ~[hive-exec-2.1.0.2.6.1.0-129.jar:2.1.0.2.6.1.0-129]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133) ~[hive-exec-2.1.0.2.6.1.0-129.jar:2.1.0.2.6.1.0-129]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:89) ~[hive-exec-2.1.0.2.6.1.0-129.jar:2.1.0.2.6.1.0-129]
	at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.getHMS(PreUpgradeTool.java:190) ~[hive-pre-upgrade-3.1.0.3.0.0.0-1634.jar:3.1.0.3.0.0.0-1634]
	at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.prepareAcidUpgradeInternal(PreUpgradeTool.java:204) ~[hive-pre-upgrade-3.1.0.3.0.0.0-1634.jar:3.1.0.3.0.0.0-1634]
	at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.main(PreUpgradeTool.java:148) [hive-pre-upgrade-3.1.0.3.0.0.0-1634.jar:3.1.0.3.0.0.0-1634]
Caused by: java.lang.NoSuchMethodException: org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(org.apache.hadoop.hive.conf.HiveConf, java.lang.Boolean)
	at java.lang.Class.getConstructor0(Class.java:3082) ~[?:1.8.0_77]
	at java.lang.Class.getDeclaredConstructor(Class.java:2178) ~[?:1.8.0_77]
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1651) ~[hive-exec-2.1.0.2.6.1.0-129.jar:2.1.0.2.6.1.0-129]
	... 6 more
Exception in thread "main" java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1655)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:89)
	at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.getHMS(PreUpgradeTool.java:190)
	at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.prepareAcidUpgradeInternal(PreUpgradeTool.java:204)
	at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.main(PreUpgradeTool.java:148)
Caused by: java.lang.NoSuchMethodException: org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(org.apache.hadoop.hive.conf.HiveConf, java.lang.Boolean)
	at java.lang.Class.getConstructor0(Class.java:3082)
	at java.lang.Class.getDeclaredConstructor(Class.java:2178)
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1651)
	... 6 more

Appreciate your help.

Regards,

Ibrahim

1 ACCEPTED SOLUTION

Master Mentor

@Ibrahim Jarrar

You might be hitting the issue described in the HCC thread below: https://community.hortonworks.com/questions/208893/hive-upgrade-tool-failed-at-hdp-3-upgrade.html

That thread points to HIVE-15081 (https://issues.apache.org/jira/browse/HIVE-15081), which appears to be fixed in HDP 2.6.2.

So please try upgrading to HDP-2.6.2 or HDP-2.6.5 first, and then attempt the upgrade to HDP 3.
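If it helps, here is a quick sanity check you can run on a node before and after the intermediate hop to confirm which stack build it is actually on. This is just a sketch; hdp-select is the same tool the upgrade task calls in the log above.

# List the HDP stack versions installed on this node
/usr/bin/hdp-select versions

# Confirm which build hive-server2 currently points at
# (the same check the upgrade task ran in the log above)
/usr/bin/hdp-select status hive-server2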


5 REPLIES


Master Mentor

@Ibrahim Jarrar

The HDP-2.6.2 Release Notes reference this:

  • HIVE-15081: RetryingMetaStoreClient.getProxy(HiveConf, Boolean) doesn't match constructor of HiveMetaStoreClient.

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.2/bk_release-notes/content/patch_hive.html
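If you want to confirm the mismatch on a node directly, a rough check is to dump the constructors of HiveMetaStoreClient with javap and look for one taking (HiveConf, Boolean). This is only a sketch: the javap and jar paths below are assembled from the failed command and stack trace in the question, and the class location may differ on other builds.

# Paths taken from the PreUpgradeTool command in the question; adjust the
# build number and jar version to match your node.
JAVAP=/usr/jdk64/jdk1.8.0_77/bin/javap
HIVE_EXEC_JAR=/usr/hdp/2.6.1.0-129/hive2/lib/hive-exec-2.1.0.2.6.1.0-129.jar

# Print the members of HiveMetaStoreClient and keep only the constructors.
# On HDP 2.6.1 there is no (HiveConf, Boolean) constructor, which matches
# the NoSuchMethodException in the stack trace.
"$JAVAP" -classpath "$HIVE_EXEC_JAR" \
  org.apache.hadoop.hive.metastore.HiveMetaStoreClient | grep 'HiveMetaStoreClient('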


@Jay, regarding the upgrade to version 3.0: we are on version 2.6.4, so I guess it can be upgraded to 3.0. Do we also need to make storage changes on the machines in the cluster, or can we just upgrade HDP without any disk changes?

Michael-Bronson

Master Mentor
@Michael Bronson

The above-mentioned issue should not be seen while upgrading to HDP 3.0 if the current HDP version is equal to or higher than HDP 2.6.2, as HIVE-15081 is marked as fixed in the HDP 2.6.2 release notes.

Regarding the disk space question, as per the prerequisites: https://docs.hortonworks.com/HDPDocuments/Ambari-2.7.0.0/bk_ambari-upgrade/content/upgrading_HDP_pre...

Disk Space:  Be sure to have adequate space on /usr/hdp for the target HDP version. Each complete install of an HDP version will occupy about 2.5 GB of disk space.
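For the space part of that prerequisite, a quick per-node check (standard df plus the hdp-select tool already seen in the logs above; adjust paths if your layout differs):

# Free space on the filesystem that holds /usr/hdp
# (each installed HDP version needs roughly 2.5 GB there)
df -h /usr/hdp

# Stack versions already occupying that space; old ones can typically be
# cleaned up once the upgrade has been finalized
/usr/bin/hdp-select versions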



@Jay, actually I was asking about the disk configuration, i.e. the RAID configuration. Let's say we have an HDP 2.6.4 cluster with the following details:

1) Worker -> (OS) RAID 1 + RAID 5
2) Master -> (OS) RAID 1 + RAID 10
3) Kafka -> (OS) RAID 1 + RAID 10

Based on that configuration, can we upgrade HDP to version 3.0?

Does HDP 3.0 support the above configuration?

Michael-Bronson