<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: ambari failed to install in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/ambari-failed-to-install/m-p/239073#M200884</link>
    <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/13607/sotak.html" nodeid="13607"&gt;@Bill Ferris&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;Please download the MySQL connector JAR from here if you cannot install it via the yum command: &lt;A href="https://dev.mysql.com/downloads/connector/j/" target="_blank"&gt;https://dev.mysql.com/downloads/connector/j/&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Please log in and accept this answer as helpful if it worked for you.&lt;/P&gt;</description>
    <pubDate>Fri, 14 Dec 2018 01:42:25 GMT</pubDate>
    <dc:creator>akhilsnaik</dc:creator>
    <dc:date>2018-12-14T01:42:25Z</dc:date>
    <item>
      <title>ambari failed to install</title>
      <link>https://community.cloudera.com/t5/Support-Questions/ambari-failed-to-install/m-p/239070#M200881</link>
      <description>&lt;P&gt;Hi, I am trying to install Hadoop via Ambari but I get this error:&lt;/P&gt;&lt;PRE&gt;stderr: 
Traceback (most recent call last):
  File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 195, in get_content
    web_file = opener.open(req)
  File "/usr/lib64/python2.7/urllib2.py", line 437, in open
    response = meth(req, response)
  File "/usr/lib64/python2.7/urllib2.py", line 550, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib64/python2.7/urllib2.py", line 475, in error
    return self._call_chain(*args)
  File "/usr/lib64/python2.7/urllib2.py", line 409, in _call_chain
    result = func(*args)
  File "/usr/lib64/python2.7/urllib2.py", line 558, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
HTTPError: HTTP Error 404: Not Found

The above exception was the cause of the following exception:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 60, in &amp;lt;module&amp;gt;
    HiveClient().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 351, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 40, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 48, in configure
    hive(name='client')
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive.py", line 114, in hive
    jdbc_connector(params.hive_jdbc_target, params.hive_previous_jdbc_jar)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive.py", line 630, in jdbc_connector
    File(params.downloaded_custom_connector, content = DownloadSource(params.driver_curl_source))
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 123, in action_create
    content = self._get_content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 160, in _get_content
    return content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 52, in __call__
    return self.get_content()
  File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 197, in get_content
    raise Fail("Failed to download file from {0} due to HTTP error: {1}".format(self.url, str(ex)))
resource_management.core.exceptions.Fail: Failed to download file from http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar due to HTTP error: HTTP Error 404: Not Found
 stdout:
2018-12-13 16:32:15,104 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -&amp;gt; 3.0
2018-12-13 16:32:15,107 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2018-12-13 16:32:15,108 - Group['livy'] {}
2018-12-13 16:32:15,109 - Group['spark'] {}
2018-12-13 16:32:15,109 - Group['hdfs'] {}
2018-12-13 16:32:15,109 - Group['zeppelin'] {}
2018-12-13 16:32:15,109 - Group['hadoop'] {}
2018-12-13 16:32:15,109 - Group['users'] {}
2018-12-13 16:32:15,109 - Group['knox'] {}
2018-12-13 16:32:15,109 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,110 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,111 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,111 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,112 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,112 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-12-13 16:32:15,113 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,113 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,114 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-12-13 16:32:15,114 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2018-12-13 16:32:15,115 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,116 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2018-12-13 16:32:15,116 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,117 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2018-12-13 16:32:15,117 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-12-13 16:32:15,118 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,118 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2018-12-13 16:32:15,119 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,119 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,120 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,121 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,121 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2018-12-13 16:32:15,122 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-12-13 16:32:15,123 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-12-13 16:32:15,126 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-12-13 16:32:15,126 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-12-13 16:32:15,127 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-12-13 16:32:15,128 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-12-13 16:32:15,129 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-12-13 16:32:15,133 - call returned (0, '1020')
2018-12-13 16:32:15,134 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1020'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-12-13 16:32:15,137 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1020'] due to not_if
2018-12-13 16:32:15,137 - Group['hdfs'] {}
2018-12-13 16:32:15,137 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2018-12-13 16:32:15,138 - FS Type: HDFS
2018-12-13 16:32:15,138 - Directory['/etc/hadoop'] {'mode': 0755}
2018-12-13 16:32:15,147 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-12-13 16:32:15,147 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-12-13 16:32:15,159 - Repository['HDP-3.0-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-12-13 16:32:15,163 - Repository['HDP-3.0-GPL-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.1.0', 'action': ['prepare'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-12-13 16:32:15,165 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-12-13 16:32:15,166 - Repository[None] {'action': ['create']}
2018-12-13 16:32:15,167 - File['/tmp/tmpbESTEN'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-GPL-repo-1]\nname=HDP-3.0-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.1.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-12-13 16:32:15,167 - Writing File['/tmp/tmpbESTEN'] because contents don't match
2018-12-13 16:32:15,167 - File['/tmp/tmptU9nw5'] {'content': StaticFile('/etc/yum.repos.d/ambari-hdp-1.repo')}
2018-12-13 16:32:15,168 - Writing File['/tmp/tmptU9nw5'] because contents don't match
2018-12-13 16:32:15,168 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:15,221 - Skipping installation of existing package unzip
2018-12-13 16:32:15,221 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:15,227 - Skipping installation of existing package curl
2018-12-13 16:32:15,227 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:15,233 - Skipping installation of existing package hdp-select
2018-12-13 16:32:15,236 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-12-13 16:32:15,424 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2018-12-13 16:32:15,430 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2018-12-13 16:32:15,445 - call returned (0, 'hive-server2 - 3.0.1.0-187')
2018-12-13 16:32:15,447 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -&amp;gt; 3.0
2018-12-13 16:32:15,461 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://ambari.hadoop.uom.gr:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2018-12-13 16:32:15,462 - Not downloading the file from http://ambari.hadoop.uom.gr:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2018-12-13 16:32:15,996 - Package['hive_3_0_1_0_187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:16,051 - Skipping installation of existing package hive_3_0_1_0_187
2018-12-13 16:32:16,052 - Package['hive_3_0_1_0_187-hcatalog'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:16,057 - Skipping installation of existing package hive_3_0_1_0_187-hcatalog
2018-12-13 16:32:16,058 - Directories to fill with configs: [u'/usr/hdp/current/hive-client/conf']
2018-12-13 16:32:16,059 - Directory['/etc/hive/3.0.1.0-187/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}
2018-12-13 16:32:16,059 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/3.0.1.0-187/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-12-13 16:32:16,066 - Generating config: /etc/hive/3.0.1.0-187/0/mapred-site.xml
2018-12-13 16:32:16,066 - File['/etc/hive/3.0.1.0-187/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-12-13 16:32:16,092 - File['/etc/hive/3.0.1.0-187/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,093 - File['/etc/hive/3.0.1.0-187/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2018-12-13 16:32:16,095 - File['/etc/hive/3.0.1.0-187/0/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,096 - File['/etc/hive/3.0.1.0-187/0/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,098 - File['/etc/hive/3.0.1.0-187/0/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,099 - File['/etc/hive/3.0.1.0-187/0/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,100 - File['/etc/hive/3.0.1.0-187/0/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,100 - XmlConfig['beeline-site.xml'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'conf_dir': '/etc/hive/3.0.1.0-187/0', 'configurations': {'beeline.hs2.jdbc.url.container': u'jdbc:hive2://namenode.hadoop.uom.gr:2181,resourcemanager.hadoop.uom.gr:2181,hbasemaster.hadoop.uom.gr:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2', 'beeline.hs2.jdbc.url.default': 'container'}}
2018-12-13 16:32:16,106 - Generating config: /etc/hive/3.0.1.0-187/0/beeline-site.xml
2018-12-13 16:32:16,106 - File['/etc/hive/3.0.1.0-187/0/beeline-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-12-13 16:32:16,108 - File['/etc/hive/3.0.1.0-187/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,108 - File['/usr/hdp/current/hive-client/conf/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive_client/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
2018-12-13 16:32:16,108 - Writing File['/usr/hdp/current/hive-client/conf/hive-site.jceks'] because contents don't match
2018-12-13 16:32:16,109 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-client/conf', 'mode': 0644, 'configuration_attributes': {u'hidden': {u'javax.jdo.option.ConnectionPassword': u'HIVE_CLIENT,CONFIG_DOWNLOAD'}}, 'owner': 'hive', 'configurations': ...}
2018-12-13 16:32:16,113 - Generating config: /usr/hdp/current/hive-client/conf/hive-site.xml
2018-12-13 16:32:16,114 - File['/usr/hdp/current/hive-client/conf/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-12-13 16:32:16,211 - Generating Atlas Hook config file /usr/hdp/current/hive-client/conf/atlas-application.properties
2018-12-13 16:32:16,212 - PropertiesFile['/usr/hdp/current/hive-client/conf/atlas-application.properties'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'properties': ...}
2018-12-13 16:32:16,214 - Generating properties file: /usr/hdp/current/hive-client/conf/atlas-application.properties
2018-12-13 16:32:16,214 - File['/usr/hdp/current/hive-client/conf/atlas-application.properties'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-12-13 16:32:16,223 - Writing File['/usr/hdp/current/hive-client/conf/atlas-application.properties'] because contents don't match
2018-12-13 16:32:16,226 - File['/usr/hdp/current/hive-client/conf/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2018-12-13 16:32:16,227 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2018-12-13 16:32:16,230 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2018-12-13 16:32:16,231 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://ambari.hadoop.uom.gr:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2018-12-13 16:32:16,231 - Not downloading the file from http://ambari.hadoop.uom.gr:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2018-12-13 16:32:16,231 - File['/var/lib/ambari-agent/tmp/mysql-connector-java.jar'] {'content': DownloadSource('http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar')}
2018-12-13 16:32:16,231 - Downloading the file from http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar
2018-12-13 16:32:16,241 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
 &lt;/PRE&gt;&lt;P&gt;Command failed after 1 tries&lt;/P&gt;</description>
      <pubDate>Thu, 13 Dec 2018 23:01:30 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/ambari-failed-to-install/m-p/239070#M200881</guid>
      <dc:creator>sotak1</dc:creator>
      <dc:date>2018-12-13T23:01:30Z</dc:date>
    </item>
    <item>
      <title>Re: ambari failed to install</title>
      <link>https://community.cloudera.com/t5/Support-Questions/ambari-failed-to-install/m-p/239071#M200882</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/13607/sotak.html" nodeid="13607"&gt;@Bill Ferris&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;From the logs I see you are hitting this exception:&lt;/P&gt;&lt;PRE&gt;resource_management.core.exceptions.Fail: Failed to download file from http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar due to HTTP error: HTTP Error 404: Not Found&lt;/PRE&gt;&lt;P&gt;Can you please try the following on the Ambari server host and see if it helps?&lt;/P&gt;&lt;PRE&gt;yum install mysql-connection-java -y&lt;/PRE&gt;&lt;P&gt;Alternatively, if you downloaded the mysql-connector-java JAR from a tar.gz archive, check the following location and create a symlink like the one below pointing to your JAR:&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Example:&lt;/STRONG&gt;&lt;/P&gt;&lt;PRE&gt;# ls -l /usr/share/java/mysql-connector-java.jar
lrwxrwxrwx 1 root root 31 Apr 19  2017 /usr/share/java/mysql-connector-java.jar -&amp;gt; mysql-connector-java-5.1.17.jar&lt;/PRE&gt;&lt;P&gt;&lt;A href="https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.1.0/bk_ambari-administration/content/using_hive_with_mysql.html"&gt;https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.1.0/bk_ambari-administration/content/using_hive_with_mysql.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Then register the driver so Ambari knows where to find this JAR; afterwards it will be available under /var/lib/ambari-server/resources:&lt;/P&gt;&lt;PRE&gt;# ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar
# ls -l /var/lib/ambari-server/resources/mysql-connector-java.jar
-rw-r--r-- 1 root root 819803 Sep 28 19:52 /var/lib/ambari-server/resources/mysql-connector-java.jar&lt;/PRE&gt;&lt;P&gt;Please accept this answer if you found it helpful.&lt;/P&gt;</description>
      <pubDate>Fri, 14 Dec 2018 00:18:43 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/ambari-failed-to-install/m-p/239071#M200882</guid>
      <dc:creator>akhilsnaik</dc:creator>
      <dc:date>2018-12-14T00:18:43Z</dc:date>
    </item>
    <item>
      <title>Re: ambari failed to install</title>
      <link>https://community.cloudera.com/t5/Support-Questions/ambari-failed-to-install/m-p/239072#M200883</link>
      <description>&lt;P&gt;Thanks for the reply. I have CentOS 7.6 and get this output:&lt;/P&gt;&lt;P&gt;no package mysql-connection-java available&lt;/P&gt;&lt;P&gt;nothing to do&lt;/P&gt;</description>
      <pubDate>Fri, 14 Dec 2018 01:33:20 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/ambari-failed-to-install/m-p/239072#M200883</guid>
      <dc:creator>sotak1</dc:creator>
      <dc:date>2018-12-14T01:33:20Z</dc:date>
    </item>
    <item>
      <title>Re: ambari failed to install</title>
      <link>https://community.cloudera.com/t5/Support-Questions/ambari-failed-to-install/m-p/239073#M200884</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/13607/sotak.html" nodeid="13607"&gt;@Bill Ferris&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;Please download the MySQL connector JAR from here if you cannot install it via the yum command: &lt;A href="https://dev.mysql.com/downloads/connector/j/" target="_blank"&gt;https://dev.mysql.com/downloads/connector/j/&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Please log in and accept this answer as helpful if it worked for you.&lt;/P&gt;</description>
      <pubDate>Fri, 14 Dec 2018 01:42:25 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/ambari-failed-to-install/m-p/239073#M200884</guid>
      <dc:creator>akhilsnaik</dc:creator>
      <dc:date>2018-12-14T01:42:25Z</dc:date>
    </item>
  </channel>
</rss>