
Cannot install Hive clients


New Contributor

This is my error output. I don't know what the problem is; please help me.

stderr:

Traceback (most recent call last):
  File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 195, in get_content
    web_file = opener.open(req)
  File "/usr/lib64/python2.7/urllib2.py", line 437, in open
    response = meth(req, response)
  File "/usr/lib64/python2.7/urllib2.py", line 550, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib64/python2.7/urllib2.py", line 475, in error
    return self._call_chain(*args)
  File "/usr/lib64/python2.7/urllib2.py", line 409, in _call_chain
    result = func(*args)
  File "/usr/lib64/python2.7/urllib2.py", line 558, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
HTTPError: HTTP Error 404: Not Found

The above exception was the cause of the following exception:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 60, in <module>
    HiveClient().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 353, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 40, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 48, in configure
    hive(name='client')
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive.py", line 114, in hive
    jdbc_connector(params.hive_jdbc_target, params.hive_previous_jdbc_jar)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive.py", line 628, in jdbc_connector
    File(params.downloaded_custom_connector, content = DownloadSource(params.driver_curl_source))
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 123, in action_create
    content = self._get_content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 160, in _get_content
    return content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 52, in __call__
    return self.get_content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 197, in get_content
    raise Fail("Failed to download file from {0} due to HTTP error: {1}".format(self.url, str(ex)))
resource_management.core.exceptions.Fail: Failed to download file from http://master:8080/resources/mysql-connector-java.jar due to HTTP error: HTTP Error 404: Not Found

stdout:

2018-08-28 16:39:13,907 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2018-08-28 16:39:13,912 - Using hadoop conf dir: /usr/hdp/3.0.0.0-1634/hadoop/conf
2018-08-28 16:39:13,913 - Group['livy'] {}
2018-08-28 16:39:13,914 - Group['spark'] {}
2018-08-28 16:39:13,914 - Group['hdfs'] {}
2018-08-28 16:39:13,914 - Group['zeppelin'] {}
2018-08-28 16:39:13,914 - Group['hadoop'] {}
2018-08-28 16:39:13,915 - Group['users'] {}
2018-08-28 16:39:13,915 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,916 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,916 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,917 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,918 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-08-28 16:39:13,918 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2018-08-28 16:39:13,919 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2018-08-28 16:39:13,919 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2018-08-28 16:39:13,920 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-08-28 16:39:13,921 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2018-08-28 16:39:13,921 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,922 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,922 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-28 16:39:13,923 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-08-28 16:39:13,924 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-08-28 16:39:13,928 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-08-28 16:39:13,928 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-08-28 16:39:13,928 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-08-28 16:39:13,929 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-08-28 16:39:13,930 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-08-28 16:39:13,935 - call returned (0, '1013')
2018-08-28 16:39:13,936 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1013'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-08-28 16:39:13,940 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1013'] due to not_if
2018-08-28 16:39:13,940 - Group['hdfs'] {}
2018-08-28 16:39:13,941 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2018-08-28 16:39:13,941 - FS Type: HDFS
2018-08-28 16:39:13,941 - Directory['/etc/hadoop'] {'mode': 0755}
2018-08-28 16:39:13,954 - File['/usr/hdp/3.0.0.0-1634/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-08-28 16:39:13,955 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-08-28 16:39:13,968 - Repository['HDP-3.0-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-08-28 16:39:13,974 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-08-28 16:39:13,974 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-08-28 16:39:13,975 - Repository['HDP-3.0-GPL-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.0.0', 'action': ['create'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-08-28 16:39:13,977 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-GPL-repo-1]\nname=HDP-3.0-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-08-28 16:39:13,978 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-08-28 16:39:13,979 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-08-28 16:39:13,983 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-GPL-repo-1]\nname=HDP-3.0-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-08-28 16:39:13,983 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-08-28 16:39:13,984 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-28 16:39:14,050 - Skipping installation of existing package unzip
2018-08-28 16:39:14,050 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-28 16:39:14,060 - Skipping installation of existing package curl
2018-08-28 16:39:14,060 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-28 16:39:14,070 - Skipping installation of existing package hdp-select
2018-08-28 16:39:14,074 - The repository with version 3.0.0.0-1634 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-08-28 16:39:14,347 - Using hadoop conf dir: /usr/hdp/3.0.0.0-1634/hadoop/conf
2018-08-28 16:39:14,365 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2018-08-28 16:39:14,391 - call returned (0, 'hive-server2 - 3.0.0.0-1634')
2018-08-28 16:39:14,391 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2018-08-28 16:39:14,419 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://master:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2018-08-28 16:39:14,421 - Not downloading the file from http://master:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2018-08-28 16:39:15,945 - Package['hive_3_0_0_0_1634'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-28 16:39:16,066 - Skipping installation of existing package hive_3_0_0_0_1634
2018-08-28 16:39:16,068 - Package['hive_3_0_0_0_1634-hcatalog'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-28 16:39:16,102 - Skipping installation of existing package hive_3_0_0_0_1634-hcatalog
2018-08-28 16:39:16,104 - Directories to fill with configs: [u'/usr/hdp/current/hive-client/conf']
2018-08-28 16:39:16,104 - Directory['/etc/hive/3.0.0.0-1634/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}
2018-08-28 16:39:16,105 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/3.0.0.0-1634/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-08-28 16:39:16,124 - Generating config: /etc/hive/3.0.0.0-1634/0/mapred-site.xml
2018-08-28 16:39:16,124 - File['/etc/hive/3.0.0.0-1634/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-08-28 16:39:16,195 - File['/etc/hive/3.0.0.0-1634/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,196 - File['/etc/hive/3.0.0.0-1634/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2018-08-28 16:39:16,209 - File['/etc/hive/3.0.0.0-1634/0/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,211 - File['/etc/hive/3.0.0.0-1634/0/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,214 - File['/etc/hive/3.0.0.0-1634/0/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,216 - File['/etc/hive/3.0.0.0-1634/0/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,229 - File['/etc/hive/3.0.0.0-1634/0/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,230 - XmlConfig['beeline-site.xml'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'conf_dir': '/etc/hive/3.0.0.0-1634/0', 'configurations': {'beeline.hs2.jdbc.url.container': u'jdbc:hive2://master.knu.com:2181,slave1.knu.com:2181,slave2.knu.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2', 'beeline.hs2.jdbc.url.default': 'container'}}
2018-08-28 16:39:16,236 - Generating config: /etc/hive/3.0.0.0-1634/0/beeline-site.xml
2018-08-28 16:39:16,236 - File['/etc/hive/3.0.0.0-1634/0/beeline-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-08-28 16:39:16,238 - File['/etc/hive/3.0.0.0-1634/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-08-28 16:39:16,238 - File['/usr/hdp/current/hive-client/conf/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive_client/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
2018-08-28 16:39:16,239 - Writing File['/usr/hdp/current/hive-client/conf/hive-site.jceks'] because contents don't match
2018-08-28 16:39:16,239 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-client/conf', 'mode': 0644, 'configuration_attributes': {u'hidden': {u'javax.jdo.option.ConnectionPassword': u'HIVE_CLIENT,CONFIG_DOWNLOAD'}}, 'owner': 'hive', 'configurations': ...}
2018-08-28 16:39:16,248 - Generating config: /usr/hdp/current/hive-client/conf/hive-site.xml
2018-08-28 16:39:16,249 - File['/usr/hdp/current/hive-client/conf/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-08-28 16:39:16,465 - File['/usr/hdp/current/hive-client/conf/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2018-08-28 16:39:16,466 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2018-08-28 16:39:16,468 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2018-08-28 16:39:16,468 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://master:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2018-08-28 16:39:16,468 - Not downloading the file from http://master:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2018-08-28 16:39:16,469 - File['/var/lib/ambari-agent/tmp/mysql-connector-java.jar'] {'content': DownloadSource('http://master:8080/resources/mysql-connector-java.jar')}
2018-08-28 16:39:16,469 - Downloading the file from http://master:8080/resources/mysql-connector-java.jar
2018-08-28 16:39:16,497 - The repository with version 3.0.0.0-1634 for this command has been marked as resolved. It will be used to report the version of the component which was installed

Command failed after 1 tries
Accepted Solution

Re: Cannot install Hive clients

Hi @Taehyeon Lee
From your error:

Failed to download file from http://master:8080/resources/mysql-connector-java.jar due to HTTP error: HTTP Error 404: Not Found 

Try this on the Ambari server host:

sudo yum install mysql-connector-java*
ls -al /usr/share/java/mysql-connector-java.jar
cd /var/lib/ambari-server/resources/
ln -s /usr/share/java/mysql-connector-java.jar mysql-connector-java.jar
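A quick way to confirm the fix before re-running the install is to request the jar from the same resources endpoint the agent uses. This is a sketch: the hostname `master` and port 8080 are taken from the error message above; substitute your own Ambari server host and port if they differ.

```shell
# Hypothetical verification, run after creating the symlink.
# Host "master" and port 8080 come from the error message above.
JAR_URL="http://master:8080/resources/mysql-connector-java.jar"

# The symlink should resolve to a real file...
ls -L /var/lib/ambari-server/resources/mysql-connector-java.jar \
  || echo "symlink target missing - re-check /usr/share/java"

# ...and the resources endpoint should now answer 200 instead of 404.
curl -s -o /dev/null -w "%{http_code}\n" "$JAR_URL" \
  || echo "Ambari server unreachable from this host"
```

If the URL still returns 404 after the symlink exists, restarting the Ambari server may be needed so it picks up the new resource.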


Re: Cannot install Hive clients

New Contributor

Run

sudo yum install -y mysql-connector-java

to install mysql-connector-java.jar.

Check the path where mysql-connector-java is installed.

[root@c902f08x05 ~]# rpm -ql mysql-connector-java-*
/usr/share/doc/mysql-connector-java-5.1.25
/usr/share/doc/mysql-connector-java-5.1.25/CHANGES
/usr/share/doc/mysql-connector-java-5.1.25/COPYING
/usr/share/doc/mysql-connector-java-5.1.25/docs
/usr/share/doc/mysql-connector-java-5.1.25/docs/README.txt
/usr/share/doc/mysql-connector-java-5.1.25/docs/connector-j.html
/usr/share/doc/mysql-connector-java-5.1.25/docs/connector-j.pdf
/usr/share/java/mysql-connector-java.jar
/usr/share/maven-fragments/mysql-connector-java
/usr/share/maven-poms/JPP-mysql-connector-java.pom
[root@c902f08x05 ~]#


Check the path, then run ambari-server setup with the JDBC driver path:

ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar

Then retry the Hive client install; it should work.
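Before retrying, two optional sanity checks can save a failed run. This is a sketch using the stock RPM install location shown in the `rpm -ql` output above; the install log also shows `unzip` is present, which this relies on.

```shell
# Hypothetical post-setup checks; paths follow the rpm -ql output above.
# The jar registered with --jdbc-driver should contain the MySQL driver class...
unzip -l /usr/share/java/mysql-connector-java.jar | grep 'com/mysql/jdbc/Driver.class' \
  || echo "driver class not found - wrong or corrupt jar?"

# ...and a copy (or link) should now exist in the resources directory that
# http://<ambari-host>:8080/resources/ is served from.
ls -l /var/lib/ambari-server/resources/mysql-connector-java.jar \
  || echo "jar missing from resources - re-run ambari-server setup"
```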
