Member since: 10-09-2016
Posts: 7
Kudos Received: 0
Solutions: 0
12-13-2018 05:33 PM
Thanks for the reply. I have CentOS 7.6 and get this output: "No package mysql-connection-java available. Nothing to do."
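A note on the output above: the artifact Ambari actually fetches is mysql-connector-java.jar (connector, not connection), so the spelling of the package name in the yum command matters. If no enabled repo provides mysql-connector-java, the JAR can be downloaded manually instead. A minimal sketch, assuming the host has internet access; the Connector/J version and paths below are illustrative, not taken from this thread:

# Try the correct package name first (connector, not connection):
yum install -y mysql-connector-java

# If no repo provides it, fetch Connector/J directly from MySQL
# (the version in this URL is an example; any current 5.x release works with Hive):
wget https://dev.mysql.com/get/Downloads/Connector-J/mysql-connector-java-5.1.47.tar.gz
tar -xzf mysql-connector-java-5.1.47.tar.gz
# The driver JAR is inside the extracted directory, e.g.
# mysql-connector-java-5.1.47/mysql-connector-java-5.1.47-bin.jar

Once the JAR is on the Ambari server host, it still has to be registered so Ambari can serve it from /resources/; see the sketch after the log in the next post.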
12-13-2018 03:01 PM
Hi, I am trying to install Hadoop via Ambari but I get this error:

stderr:
Traceback (most recent call last):
File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 195, in get_content
web_file = opener.open(req)
File "/usr/lib64/python2.7/urllib2.py", line 437, in open
response = meth(req, response)
File "/usr/lib64/python2.7/urllib2.py", line 550, in http_response
'http', request, response, code, msg, hdrs)
File "/usr/lib64/python2.7/urllib2.py", line 475, in error
return self._call_chain(*args)
File "/usr/lib64/python2.7/urllib2.py", line 409, in _call_chain
result = func(*args)
File "/usr/lib64/python2.7/urllib2.py", line 558, in http_error_default
raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
HTTPError: HTTP Error 404: Not Found
The above exception was the cause of the following exception:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 60, in
HiveClient().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 351, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 40, in install
self.configure(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive_client.py", line 48, in configure
hive(name='client')
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive.py", line 114, in hive
jdbc_connector(params.hive_jdbc_target, params.hive_previous_jdbc_jar)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/HIVE/package/scripts/hive.py", line 630, in jdbc_connector
File(params.downloaded_custom_connector, content = DownloadSource(params.driver_curl_source))
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 123, in action_create
content = self._get_content()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 160, in _get_content
return content()
File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 52, in __call__
return self.get_content()
File "/usr/lib/ambari-agent/lib/resource_management/core/source.py", line 197, in get_content
raise Fail("Failed to download file from {0} due to HTTP error: {1}".format(self.url, str(ex)))
resource_management.core.exceptions.Fail: Failed to download file from http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar due to HTTP error: HTTP Error 404: Not Found
stdout:
2018-12-13 16:32:15,104 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2018-12-13 16:32:15,107 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2018-12-13 16:32:15,108 - Group['livy'] {}
2018-12-13 16:32:15,109 - Group['spark'] {}
2018-12-13 16:32:15,109 - Group['hdfs'] {}
2018-12-13 16:32:15,109 - Group['zeppelin'] {}
2018-12-13 16:32:15,109 - Group['hadoop'] {}
2018-12-13 16:32:15,109 - Group['users'] {}
2018-12-13 16:32:15,109 - Group['knox'] {}
2018-12-13 16:32:15,109 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,110 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,111 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,111 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,112 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,112 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-12-13 16:32:15,113 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,113 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,114 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-12-13 16:32:15,114 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2018-12-13 16:32:15,115 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,116 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2018-12-13 16:32:15,116 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,117 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2018-12-13 16:32:15,117 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-12-13 16:32:15,118 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,118 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2018-12-13 16:32:15,119 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,119 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,120 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,121 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-12-13 16:32:15,121 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2018-12-13 16:32:15,122 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-12-13 16:32:15,123 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-12-13 16:32:15,126 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-12-13 16:32:15,126 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-12-13 16:32:15,127 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-12-13 16:32:15,128 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-12-13 16:32:15,129 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-12-13 16:32:15,133 - call returned (0, '1020')
2018-12-13 16:32:15,134 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1020'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-12-13 16:32:15,137 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1020'] due to not_if
2018-12-13 16:32:15,137 - Group['hdfs'] {}
2018-12-13 16:32:15,137 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2018-12-13 16:32:15,138 - FS Type: HDFS
2018-12-13 16:32:15,138 - Directory['/etc/hadoop'] {'mode': 0755}
2018-12-13 16:32:15,147 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-12-13 16:32:15,147 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-12-13 16:32:15,159 - Repository['HDP-3.0-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-12-13 16:32:15,163 - Repository['HDP-3.0-GPL-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.1.0', 'action': ['prepare'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-12-13 16:32:15,165 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-12-13 16:32:15,166 - Repository[None] {'action': ['create']}
2018-12-13 16:32:15,167 - File['/tmp/tmpbESTEN'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-GPL-repo-1]\nname=HDP-3.0-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.1.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-12-13 16:32:15,167 - Writing File['/tmp/tmpbESTEN'] because contents don't match
2018-12-13 16:32:15,167 - File['/tmp/tmptU9nw5'] {'content': StaticFile('/etc/yum.repos.d/ambari-hdp-1.repo')}
2018-12-13 16:32:15,168 - Writing File['/tmp/tmptU9nw5'] because contents don't match
2018-12-13 16:32:15,168 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:15,221 - Skipping installation of existing package unzip
2018-12-13 16:32:15,221 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:15,227 - Skipping installation of existing package curl
2018-12-13 16:32:15,227 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:15,233 - Skipping installation of existing package hdp-select
2018-12-13 16:32:15,236 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-12-13 16:32:15,424 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2018-12-13 16:32:15,430 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2018-12-13 16:32:15,445 - call returned (0, 'hive-server2 - 3.0.1.0-187')
2018-12-13 16:32:15,447 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2018-12-13 16:32:15,461 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://ambari.hadoop.uom.gr:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2018-12-13 16:32:15,462 - Not downloading the file from http://ambari.hadoop.uom.gr:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2018-12-13 16:32:15,996 - Package['hive_3_0_1_0_187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:16,051 - Skipping installation of existing package hive_3_0_1_0_187
2018-12-13 16:32:16,052 - Package['hive_3_0_1_0_187-hcatalog'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-12-13 16:32:16,057 - Skipping installation of existing package hive_3_0_1_0_187-hcatalog
2018-12-13 16:32:16,058 - Directories to fill with configs: [u'/usr/hdp/current/hive-client/conf']
2018-12-13 16:32:16,059 - Directory['/etc/hive/3.0.1.0-187/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}
2018-12-13 16:32:16,059 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/3.0.1.0-187/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2018-12-13 16:32:16,066 - Generating config: /etc/hive/3.0.1.0-187/0/mapred-site.xml
2018-12-13 16:32:16,066 - File['/etc/hive/3.0.1.0-187/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-12-13 16:32:16,092 - File['/etc/hive/3.0.1.0-187/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,093 - File['/etc/hive/3.0.1.0-187/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2018-12-13 16:32:16,095 - File['/etc/hive/3.0.1.0-187/0/llap-daemon-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,096 - File['/etc/hive/3.0.1.0-187/0/llap-cli-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,098 - File['/etc/hive/3.0.1.0-187/0/hive-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,099 - File['/etc/hive/3.0.1.0-187/0/hive-exec-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,100 - File['/etc/hive/3.0.1.0-187/0/beeline-log4j2.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,100 - XmlConfig['beeline-site.xml'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'conf_dir': '/etc/hive/3.0.1.0-187/0', 'configurations': {'beeline.hs2.jdbc.url.container': u'jdbc:hive2://namenode.hadoop.uom.gr:2181,resourcemanager.hadoop.uom.gr:2181,hbasemaster.hadoop.uom.gr:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2', 'beeline.hs2.jdbc.url.default': 'container'}}
2018-12-13 16:32:16,106 - Generating config: /etc/hive/3.0.1.0-187/0/beeline-site.xml
2018-12-13 16:32:16,106 - File['/etc/hive/3.0.1.0-187/0/beeline-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-12-13 16:32:16,108 - File['/etc/hive/3.0.1.0-187/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2018-12-13 16:32:16,108 - File['/usr/hdp/current/hive-client/conf/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive_client/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
2018-12-13 16:32:16,108 - Writing File['/usr/hdp/current/hive-client/conf/hive-site.jceks'] because contents don't match
2018-12-13 16:32:16,109 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-client/conf', 'mode': 0644, 'configuration_attributes': {u'hidden': {u'javax.jdo.option.ConnectionPassword': u'HIVE_CLIENT,CONFIG_DOWNLOAD'}}, 'owner': 'hive', 'configurations': ...}
2018-12-13 16:32:16,113 - Generating config: /usr/hdp/current/hive-client/conf/hive-site.xml
2018-12-13 16:32:16,114 - File['/usr/hdp/current/hive-client/conf/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-12-13 16:32:16,211 - Generating Atlas Hook config file /usr/hdp/current/hive-client/conf/atlas-application.properties
2018-12-13 16:32:16,212 - PropertiesFile['/usr/hdp/current/hive-client/conf/atlas-application.properties'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644, 'properties': ...}
2018-12-13 16:32:16,214 - Generating properties file: /usr/hdp/current/hive-client/conf/atlas-application.properties
2018-12-13 16:32:16,214 - File['/usr/hdp/current/hive-client/conf/atlas-application.properties'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-12-13 16:32:16,223 - Writing File['/usr/hdp/current/hive-client/conf/atlas-application.properties'] because contents don't match
2018-12-13 16:32:16,226 - File['/usr/hdp/current/hive-client/conf/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0755}
2018-12-13 16:32:16,227 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2018-12-13 16:32:16,230 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2018-12-13 16:32:16,231 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://ambari.hadoop.uom.gr:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2018-12-13 16:32:16,231 - Not downloading the file from http://ambari.hadoop.uom.gr:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2018-12-13 16:32:16,231 - File['/var/lib/ambari-agent/tmp/mysql-connector-java.jar'] {'content': DownloadSource('http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar')}
2018-12-13 16:32:16,231 - Downloading the file from http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar
2018-12-13 16:32:16,241 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
Command failed after 1 tries
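For what it is worth, the HTTP 404 in the stderr above means the Ambari server has nothing to serve at /resources/mysql-connector-java.jar; Ambari only publishes that file after the JDBC driver has been registered on the server host, which copies it into /var/lib/ambari-server/resources/. A minimal check-and-fix sketch, to be run on the Ambari server host; the local JAR path is a placeholder:

# Confirm the resource really is missing on the Ambari server:
ls -l /var/lib/ambari-server/resources/mysql-connector-java.jar

# The same 404 is reproducible with curl:
curl -I http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar

# Register a locally downloaded Connector/J JAR with Ambari, then
# retry the failed Hive install from the Ambari UI:
ambari-server setup --jdbc-db=mysql --jdbc-driver=/path/to/mysql-connector-java.jar

After the setup command succeeds, the JAR should appear under /var/lib/ambari-server/resources/, and the retried install should get past the download step.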
Labels:
- Apache Ambari
- Apache Hive
10-09-2016 03:16 PM
Hi, I followed the latest guide for an automated install with Ambari, configuring 10 CentOS 6.8 VMs. In the last step, all services failed to install; I get "failures and warnings encountered", and when I click Retry Install the same thing happens, but with a different order of failures. I have checked the internet connection and the public repos, and I think everything is done exactly as the guide says. This is a log file from one service:

Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 174, in <module>
DataNode().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 49, in install
self.install_packages(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 567, in install_packages
retry_count=agent_stack_retry_count)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 83, in checked_call_with_retries
return self._call_with_retries(cmd, is_checked=True, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 91, in _call_with_retries
code, out = func(cmd, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 71, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 93, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 141, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 294, in _call
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_2_5_0_0_1245' returned 1. Error: Package: glibc-headers-2.12-1.192.el6.x86_64 (base)
Requires: kernel-headers >= 2.2.1
Error: Package: glibc-headers-2.12-1.192.el6.x86_64 (base)
Requires: kernel-headers
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
stdout: /var/lib/ambari-agent/data/output-2297.txt
2016-10-09 17:45:04,629 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-10-09 17:45:04,630 - Group['livy'] {}
2016-10-09 17:45:04,631 - Group['spark'] {}
2016-10-09 17:45:04,631 - Group['zeppelin'] {}
2016-10-09 17:45:04,631 - Group['hadoop'] {}
2016-10-09 17:45:04,631 - Group['users'] {}
2016-10-09 17:45:04,631 - Group['knox'] {}
2016-10-09 17:45:04,632 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,632 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,632 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,633 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,633 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,633 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-10-09 17:45:04,634 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,634 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-10-09 17:45:04,634 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-10-09 17:45:04,635 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,635 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,635 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,636 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,636 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,636 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-10-09 17:45:04,637 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,637 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,637 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,638 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,638 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,638 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,639 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,639 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,639 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,640 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-10-09 17:45:04,640 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-10-09 17:45:04,644 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-10-09 17:45:04,644 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2016-10-09 17:45:04,645 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-10-09 17:45:04,646 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2016-10-09 17:45:04,649 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2016-10-09 17:45:04,649 - Group['hdfs'] {}
2016-10-09 17:45:04,649 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2016-10-09 17:45:04,650 - FS Type:
2016-10-09 17:45:04,650 - Directory['/etc/hadoop'] {'mode': 0755}
2016-10-09 17:45:04,650 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2016-10-09 17:45:04,661 - Initializing 2 repositories
2016-10-09 17:45:04,662 - Repository['HDP-2.5'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.5.0.0/', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2016-10-09 17:45:04,668 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.5]\nname=HDP-2.5\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.5.0.0/\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-10-09 17:45:04,669 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2016-10-09 17:45:04,671 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-10-09 17:45:04,672 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-10-09 17:45:04,744 - Skipping installation of existing package unzip
2016-10-09 17:45:04,744 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-10-09 17:45:04,750 - Skipping installation of existing package curl
2016-10-09 17:45:04,750 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-10-09 17:45:04,756 - Skipping installation of existing package hdp-select
2016-10-09 17:45:04,889 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-10-09 17:45:04,892 - Stack Feature Version Info: stack_version=2.5, version=None, current_cluster_version=None -> 2.5
2016-10-09 17:45:04,894 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-10-09 17:45:04,897 - checked_call['rpm -q --queryformat '%{version}-%{release}' hdp-select | sed -e 's/\.el[0-9]//g''] {'stderr': -1}
2016-10-09 17:45:04,918 - checked_call returned (0, '2.5.0.0-1245', '')
2016-10-09 17:45:04,920 - Package['hadoop_2_5_0_0_1245'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-10-09 17:45:04,980 - Installing package hadoop_2_5_0_0_1245 ('/usr/bin/yum -d 0 -e 0 -y install hadoop_2_5_0_0_1245')
Command failed after 1 tries
Thank you.
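Regarding the yum error above (glibc-headers requiring kernel-headers): this dependency failure is very commonly caused by an "exclude=kernel*" line in /etc/yum.conf, which many CentOS VM templates ship with. A minimal diagnostic-and-workaround sketch, assuming that is the cause here:

# Check whether kernel packages are excluded from yum transactions:
grep -i '^exclude' /etc/yum.conf

# If the output shows something like exclude=kernel*, either delete that
# line or bypass the excludes for this one transaction:
yum -y install kernel-headers --disableexcludes=all

# Then retry the failed installs from the Ambari UI, or rerun the
# command Ambari ran:
yum -d 0 -e 0 -y install hadoop_2_5_0_0_1245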
Labels:
- Apache Ambari