
Error installing Hive Client


Hello! I am trying to install HDP 3.1.0 with Ambari 2.7.3.0 on a single-node cluster running Ubuntu 18.04. Everything installs fine except the Hive Client, which fails with the following message:

stderr: 
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 60, in 
    HiveClient().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 38, in install
    import params
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/params.py", line 221, in 
    hive_metastore_user_passwd = PasswordString(get_password_from_credential_store(alias, provider_path, cs_lib_path, java_home, jdk_location))
  File "/usr/lib/ambari-agent/lib/ambari_commons/credential_store_helper.py", line 48, in get_password_from_credential_store
    downloadjar(cs_lib_path, jdk_location)
  File "/usr/lib/ambari-agent/lib/ambari_commons/credential_store_helper.py", line 44, in downloadjar
    mode = 0755,
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 123, in action_create
    content = self._get_content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 160, in _get_content
    return content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 52, in __call__
    return self.get_content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 195, in get_content
    web_file = opener.open(req)
  File "/usr/lib/python2.7/urllib2.py", line 429, in open
    response = self._open(req, data)
  File "/usr/lib/python2.7/urllib2.py", line 447, in _open
    '_open', req)
  File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 1228, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib/python2.7/urllib2.py", line 1198, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error ...>
stdout:
2019-08-07 13:46:13,127 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=None -> 3.1
2019-08-07 13:46:13,134 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-08-07 13:46:13,136 - Group['livy'] {}
2019-08-07 13:46:13,137 - Group['spark'] {}
2019-08-07 13:46:13,137 - Group['hdfs'] {}
2019-08-07 13:46:13,137 - Group['zeppelin'] {}
2019-08-07 13:46:13,137 - Group['hadoop'] {}
2019-08-07 13:46:13,138 - Group['users'] {}
2019-08-07 13:46:13,138 - Group['knox'] {}
2019-08-07 13:46:13,138 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,140 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,141 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,141 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,142 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,143 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-08-07 13:46:13,144 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,145 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,146 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-08-07 13:46:13,147 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2019-08-07 13:46:13,148 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,149 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2019-08-07 13:46:13,150 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,151 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2019-08-07 13:46:13,152 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-08-07 13:46:13,152 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,153 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2019-08-07 13:46:13,154 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,155 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,156 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,157 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-08-07 13:46:13,158 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2019-08-07 13:46:13,159 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-08-07 13:46:13,160 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2019-08-07 13:46:13,166 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2019-08-07 13:46:13,166 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2019-08-07 13:46:13,167 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-08-07 13:46:13,168 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-08-07 13:46:13,169 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2019-08-07 13:46:13,177 - call returned (0, '1022')
2019-08-07 13:46:13,178 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1022'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2019-08-07 13:46:13,184 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1022'] due to not_if
2019-08-07 13:46:13,185 - Group['hdfs'] {}
2019-08-07 13:46:13,185 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2019-08-07 13:46:13,186 - FS Type: HDFS
2019-08-07 13:46:13,186 - Directory['/etc/hadoop'] {'mode': 0755}
2019-08-07 13:46:13,204 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2019-08-07 13:46:13,205 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2019-08-07 13:46:13,223 - Repository['HDP-3.1-repo-11'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/ubuntu18/3.x/updates/3.1.0.0', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-11', 'mirror_list': None}
2019-08-07 13:46:13,229 - Repository['HDP-3.1-GPL-repo-11'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/ubuntu18/3.x/updates/3.1.0.0', 'action': ['prepare'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-11', 'mirror_list': None}
2019-08-07 13:46:13,230 - Repository['HDP-UTILS-1.1.0.22-repo-11'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/ubuntu18', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-11', 'mirror_list': None}
2019-08-07 13:46:13,232 - Repository[None] {'action': ['create']}
2019-08-07 13:46:13,233 - File['/tmp/tmpbaC6cZ'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu18/3.x/updates/3.1.0.0 HDP main\ndeb http://public-repo-1.hortonworks.com/HDP-GPL/ubuntu18/3.x/updates/3.1.0.0 HDP-GPL main\ndeb http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/ubuntu18 HDP-UTILS main'}
2019-08-07 13:46:13,234 - Writing File['/tmp/tmpbaC6cZ'] because contents don't match
2019-08-07 13:46:13,234 - File['/tmp/tmpTe40gS'] {'content': StaticFile('/etc/apt/sources.list.d/ambari-hdp-11.list')}
2019-08-07 13:46:13,235 - Writing File['/tmp/tmpTe40gS'] because contents don't match
2019-08-07 13:46:13,236 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-08-07 13:46:13,276 - Skipping installation of existing package unzip
2019-08-07 13:46:13,277 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-08-07 13:46:13,297 - Skipping installation of existing package curl
2019-08-07 13:46:13,297 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-08-07 13:46:13,332 - Skipping installation of existing package hdp-select
2019-08-07 13:46:13,622 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-08-07 13:46:13,634 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2019-08-07 13:46:13,658 - call returned (0, 'hive-server2 - 3.1.0.0-78')
2019-08-07 13:46:13,659 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=None -> 3.1
2019-08-07 13:46:13,684 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://hortonworks:8085/resources/CredentialUtil.jar'), 'mode': 0755}
2019-08-07 13:46:13,685 - Downloading the file from http://hortonworks:8085/resources/CredentialUtil.jar

Command failed after 1 tries
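
The traceback shows that the command dies at the very last step logged above: the agent tries to download CredentialUtil.jar from http://hortonworks:8085/resources/CredentialUtil.jar and urllib2 raises a URLError, which means the connection itself failed (for example, the host name not resolving or the port being unreachable) rather than the server answering with an HTTP error. The failing download can be reproduced outside of Ambari with the same Python 2 runtime the agent uses; a minimal sketch (the URL is copied verbatim from the log):

import urllib2

url = 'http://hortonworks:8085/resources/CredentialUtil.jar'
try:
    resp = urllib2.urlopen(url, timeout=10)
    print('reachable: HTTP %d, Content-Length=%s'
          % (resp.getcode(), resp.info().getheader('Content-Length')))
except urllib2.URLError as e:
    # e.reason tells a DNS failure apart from a refused connection etc.
    print('unreachable: %r' % e.reason)

If this prints something like "Name or service not known", the agent node cannot resolve the host name hortonworks (e.g. a missing /etc/hosts entry), and no change inside Ambari will fix the download.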

Has anyone faced this problem? Do you have any suggestions on how to solve it? I thought it might have something to do with a proxy and tried inserting

import os
os.environ['http_proxy'] = ''

just before urllib2 is imported in source.py, but it did not help.
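
One thing worth noting: on POSIX systems urllib2 also honours the uppercase HTTP_PROXY variable, so clearing only the lowercase one may not be enough. A more deterministic way to rule a proxy in or out (a sketch only, not a confirmed fix) is to force urllib2 to bypass all proxies with an empty ProxyHandler:

import urllib2

# An opener built with an empty ProxyHandler ignores every proxy setting
# from the environment; install it so later urllib2.urlopen calls use it.
no_proxy_opener = urllib2.build_opener(urllib2.ProxyHandler({}))
urllib2.install_opener(no_proxy_opener)

If the download still fails with this opener installed, a proxy is almost certainly not the culprit.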