
Ambari failed to install

Explorer

Hi, I am trying to install Hadoop via Ambari, but I get this error:

stderr: 
Traceback (most recent call last):
  File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 195, in get_content
    web_file = opener.open(req)
  File "/usr/lib64/python2.7/urllib2.py", line 437, in open
    response = meth(req, response)
  File "/usr/lib64/python2.7/urllib2.py", line 550, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib64/python2.7/urllib2.py", line 475, in error
    return self._call_chain(*args)
  File "/usr/lib64/python2.7/urllib2.py", line 409, in _call_chain
    result = func(*args)
  File "/usr/lib64/python2.7/urllib2.py", line 558, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
HTTPError: HTTP Error 404: Not Found

The above exception was the cause of the following exception:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 60, in 
    HiveClient().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 351, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 40, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 48, in configure
    hive(name='client')
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive.py", line 114, in hive
    jdbc_connector(params.hive_jdbc_target, params.hive_previous_jdbc_jar)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive.py", line 630, in jdbc_connector
    File(params.downloaded_custom_connector, content = DownloadSource(params.driver_curl_source))
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 123, in action_create
    content = self._get_content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 160, in _get_content
    return content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 52, in __call__
    return self.get_content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 197, in get_content
    raise Fail("Failed to download file from {0} due to HTTP error: {1}".format(self.url, str(ex)))
resource_management.core.exceptions.Fail: Failed to download file from http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar due to HTTP error: HTTP Error 404: Not Found
 stdout:
2018-12-13 16:32:15,104 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2018-12-13 16:32:15,107 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2018-12-13 16:32:15,108 - Group['livy'] {}
2018-12-13 16:32:15,109 - Group['spark'] {}
2018-12-13 16:32:15,109 - Group['hdfs'] {}
2018-12-13 16:32:15,109 - Group['zeppelin'] {}
2018-12-13 16:32:15,109 - Group['hadoop'] {}
2018-12-13 16:32:15,109 - Group['users'] {}
2018-12-13 16:32:15,109 - Group['knox'] {}
2018-12-13 16:32:15,109 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,110 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,111 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,111 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,112 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,112 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-12-13 16:32:15,113 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,113 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,114 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-12-13 16:32:15,114 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2018-12-13 16:32:15,115 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,116 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2018-12-13 16:32:15,116 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,117 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2018-12-13 16:32:15,117 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-12-13 16:32:15,118 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,118 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2018-12-13 16:32:15,119 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,119 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,120 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,121 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,121 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2018-12-13 16:32:15,122 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-12-13 16:32:15,123 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-12-13 16:32:15,126 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-12-13 16:32:15,126 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-12-13 16:32:15,127 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-12-13 16:32:15,128 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-12-13 16:32:15,129 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-12-13 16:32:15,133 - call returned (0, '1020')
2018-12-13 16:32:15,134 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1020'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-12-13 16:32:15,137 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1020'] due to not_if
2018-12-13 16:32:15,137 - Group['hdfs'] {}
2018-12-13 16:32:15,137 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2018-12-13 16:32:15,138 - FS Type: HDFS
2018-12-13 16:32:15,138 - Directory['/etc/hadoop'] {'mode': 0755}
2018-12-13 16:32:15,147 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-12-13 16:32:15,147 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-12-13 16:32:15,159 - Repository['HDP-3.0-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-12-13 16:32:15,163 - Repository['HDP-3.0-GPL-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.1.0', 'action': ['prepare'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-12-13 16:32:15,165 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-12-13 16:32:15,166 - Repository[None] {'action': ['create']}
2018-12-13 16:32:15,167 - File['/tmp/tmpbESTEN'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-GPL-repo-1]\nname=HDP-3.0-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.1.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-12-13 16:32:15,167 - Writing File['/tmp/tmpbESTEN'] because contents don't match
2018-12-13 16:32:15,167 - File['/tmp/tmptU9nw5'] {'content': StaticFile('/etc/yum.repos.d/ambari-hdp-1.repo')}
2018-12-13 16:32:15,168 - Writing File['/tmp/tmptU9nw5'] because contents don't match
2018-12-13 16:32:15,168 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:15,221 - Skipping installation of existing package unzip
2018-12-13 16:32:15,221 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:15,227 - Skipping installation of existing package curl
2018-12-13 16:32:15,227 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:15,233 - Skipping installation of existing package hdp-select
2018-12-13 16:32:15,236 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-12-13 16:32:15,424 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2018-12-13 16:32:15,430 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2018-12-13 16:32:15,445 - call returned (0, 'hive-server2 - 3.0.1.0-187')
2018-12-13 16:32:15,447 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2018-12-13 16:32:15,461 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://ambari.hadoop.uom.gr:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2018-12-13 16:32:15,462 - Not downloading the file from http://ambari.hadoop.uom.gr:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2018-12-13 16:32:15,996 - Package['hive_3_0_1_0_187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:16,051 - Skipping installation of existing package hive_3_0_1_0_187
2018-12-13 16:32:16,052 - Package['hive_3_0_1_0_187-hcatalog'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:16,057 - Skipping installation of existing package hive_3_0_1_0_187-hcatalog
2018-12-13 16:32:16,058 - Directories to fill with configs: [u'/usr/hdp/current/hive-client/conf']
2018-12-13 16:32:16,059 - Directory['/etc/hive/3.0.1.0-187/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}
2018-12-13 16:32:16,059 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/3.0.1.0-187/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-12-13 16:32:16,066 - Generating config: /etc/hive/3.0.1.0-187/0/mapred-site.xml
2018-12-13 16:32:16,066 - File['/etc/hive/3.0.1.0-187/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-12-13 16:32:16,092 - File['/etc/hive/3.0.1.0-187/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,093 - File['/etc/hive/3.0.1.0-187/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2018-12-13 16:32:16,095 - File['/etc/hive/3.0.1.0-187/0/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,096 - File['/etc/hive/3.0.1.0-187/0/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,098 - File['/etc/hive/3.0.1.0-187/0/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,099 - File['/etc/hive/3.0.1.0-187/0/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,100 - File['/etc/hive/3.0.1.0-187/0/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,100 - XmlConfig['beeline-site.xml'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'conf_dir': '/etc/hive/3.0.1.0-187/0', 'configurations': {'beeline.hs2.jdbc.url.container': u'jdbc:hive2://namenode.hadoop.uom.gr:2181,resourcemanager.hadoop.uom.gr:2181,hbasemaster.hadoop.uom.gr:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2', 'beeline.hs2.jdbc.url.default': 'container'}}
2018-12-13 16:32:16,106 - Generating config: /etc/hive/3.0.1.0-187/0/beeline-site.xml
2018-12-13 16:32:16,106 - File['/etc/hive/3.0.1.0-187/0/beeline-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-12-13 16:32:16,108 - File['/etc/hive/3.0.1.0-187/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,108 - File['/usr/hdp/current/hive-client/conf/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive_client/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
2018-12-13 16:32:16,108 - Writing File['/usr/hdp/current/hive-client/conf/hive-site.jceks'] because contents don't match
2018-12-13 16:32:16,109 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-client/conf', 'mode': 0644, 'configuration_attributes': {u'hidden': {u'javax.jdo.option.ConnectionPassword': u'HIVE_CLIENT,CONFIG_DOWNLOAD'}}, 'owner': 'hive', 'configurations': ...}
2018-12-13 16:32:16,113 - Generating config: /usr/hdp/current/hive-client/conf/hive-site.xml
2018-12-13 16:32:16,114 - File['/usr/hdp/current/hive-client/conf/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-12-13 16:32:16,211 - Generating Atlas Hook config file /usr/hdp/current/hive-client/conf/atlas-application.properties
2018-12-13 16:32:16,212 - PropertiesFile['/usr/hdp/current/hive-client/conf/atlas-application.properties'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'properties': ...}
2018-12-13 16:32:16,214 - Generating properties file: /usr/hdp/current/hive-client/conf/atlas-application.properties
2018-12-13 16:32:16,214 - File['/usr/hdp/current/hive-client/conf/atlas-application.properties'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-12-13 16:32:16,223 - Writing File['/usr/hdp/current/hive-client/conf/atlas-application.properties'] because contents don't match
2018-12-13 16:32:16,226 - File['/usr/hdp/current/hive-client/conf/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2018-12-13 16:32:16,227 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2018-12-13 16:32:16,230 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2018-12-13 16:32:16,231 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://ambari.hadoop.uom.gr:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2018-12-13 16:32:16,231 - Not downloading the file from http://ambari.hadoop.uom.gr:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2018-12-13 16:32:16,231 - File['/var/lib/ambari-agent/tmp/mysql-connector-java.jar'] {'content': DownloadSource('http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar')}
2018-12-13 16:32:16,231 - Downloading the file from http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar
2018-12-13 16:32:16,241 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
 

Command failed after 1 tries

1 ACCEPTED SOLUTION


Hi @Bill Ferris,

From the logs, I see you are hitting this exception:

resource_management.core.exceptions.Fail: Failed to download file from http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar due to HTTP error: HTTP Error 404: Not Found
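
This 404 means the Ambari server is not serving the MySQL JDBC driver from its resources directory, so the agents cannot fetch it during the Hive client install. A quick way to confirm from any host (just a sketch, using the host and port from your log):

#  curl -I http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar

If this returns 404, the driver has not been registered with the Ambari server yet.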

Can you please try the following on the Ambari server host and see if it helps?

 yum install mysql-connector-java -y
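
On CentOS 7 this package comes from the base repository and drops the driver under /usr/share/java. You can verify the install (a sketch; the packaged version may differ):

#  rpm -ql mysql-connector-java | grep '\.jar$'

The output should include /usr/share/java/mysql-connector-java.jar.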

Or, if you downloaded the mysql-connector-java JAR from a tar.gz archive instead, make sure the JAR is under /usr/share/java and create a symlink named mysql-connector-java.jar pointing to it. You should then see a symlink like the following:

Example:

#  ls -l /usr/share/java/mysql-connector-java.jar
lrwxrwxrwx 1 root root 31 Apr 19  2017 /usr/share/java/mysql-connector-java.jar -> mysql-connector-java-5.1.17.jar
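
If the symlink is missing, create it yourself. A minimal sketch, assuming you extracted the 5.1.17 tarball (the version and paths are examples only):

#  cp mysql-connector-java-5.1.17/mysql-connector-java-5.1.17-bin.jar /usr/share/java/mysql-connector-java-5.1.17.jar
#  cd /usr/share/java && ln -s mysql-connector-java-5.1.17.jar mysql-connector-java.jar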

https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.1.0/bk_ambari-administration/content/using_hive...

Then register the driver with Ambari so it knows where to find the JAR; the setup command copies it into the Ambari resources directory:

# ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar
# ls -l /var/lib/ambari-server/resources/mysql-connector-java.jar
-rw-r--r-- 1 root root 819803 Sep 28 19:52 /var/lib/ambari-server/resources/mysql-connector-java.jar
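
The ls simply verifies that setup copied the driver into /var/lib/ambari-server/resources/, which is the directory served at http://<ambari-server>:8080/resources/. To confirm the 404 is gone before retrying the Hive client install from the Ambari UI, repeat the earlier check:

#  curl -I http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar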

Please accept this answer if you found it helpful.


REPLIES


Explorer

Thanks for the reply. I am on CentOS 7.6 and get this output:

no package mysql-connection-java available

nothing to do


Hi @Bill Ferris,

Please download the MySQL Connector/J JAR from here if you cannot install it via yum: https://dev.mysql.com/downloads/connector/j/
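
A rough end-to-end sketch of the manual route (the version and download URL below are examples only; pick the current Connector/J release from the page above):

#  wget https://dev.mysql.com/get/Downloads/Connector-J/mysql-connector-java-5.1.47.tar.gz
#  tar xzf mysql-connector-java-5.1.47.tar.gz
#  cp mysql-connector-java-5.1.47/mysql-connector-java-5.1.47-bin.jar /usr/share/java/mysql-connector-java.jar
#  ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar

Then retry the failed Hive client install from the Ambari UI.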

Please log in and accept the answer as helpful if it worked for you.