Created 04-11-2017 12:36 PM
stderr:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/hook.py", line 37, in <module>
    BeforeInstallHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 314, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/hook.py", line 34, in hook
    install_packages()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/shared_initialization.py", line 37, in install_packages
    retry_count=params.agent_stack_retry_count)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 51, in install_package
    self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 86, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 98, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install hdp-select' returned 1.
Error: Nothing to do
stdout:
2017-04-11 16:01:22,304 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-04-11 16:01:22,305 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-04-11 16:01:22,306 - Group['livy'] {}
2017-04-11 16:01:22,307 - Group['spark'] {}
2017-04-11 16:01:22,308 - Group['zeppelin'] {}
2017-04-11 16:01:22,308 - Group['hadoop'] {}
2017-04-11 16:01:22,308 - Group['users'] {}
2017-04-11 16:01:22,308 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,309 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,310 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,310 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,311 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-04-11 16:01:22,311 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,312 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-04-11 16:01:22,312 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-04-11 16:01:22,313 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop']}
2017-04-11 16:01:22,314 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,314 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,315 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,315 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-04-11 16:01:22,316 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,316 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,317 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,318 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,318 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,319 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,319 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,320 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-04-11 16:01:22,320 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-04-11 16:01:22,322 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-04-11 16:01:22,412 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-04-11 16:01:22,413 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2017-04-11 16:01:22,414 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-04-11 16:01:22,415 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-04-11 16:01:22,508 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-04-11 16:01:22,509 - Group['hdfs'] {}
2017-04-11 16:01:22,509 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2017-04-11 16:01:22,510 - FS Type:
2017-04-11 16:01:22,510 - Directory['/etc/hadoop'] {'mode': 0755}
2017-04-11 16:01:22,511 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-04-11 16:01:22,511 - Changing owner for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hdfs
2017-04-11 16:01:22,526 - Repository list is empty. Ambari may not be managing the repositories.
2017-04-11 16:01:22,526 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-04-11 16:01:22,579 - Skipping installation of existing package unzip
2017-04-11 16:01:22,579 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-04-11 16:01:22,587 - Skipping installation of existing package curl
2017-04-11 16:01:22,588 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-04-11 16:01:22,596 - Installing package hdp-select ('/usr/bin/yum -d 0 -e 0 -y install hdp-select')
2017-04-11 16:01:23,830 - Execution of '/usr/bin/yum -d 0 -e 0 -y install hdp-select' returned 1. Error: Nothing to do
2017-04-11 16:01:23,830 - Failed to install package hdp-select. Executing '/usr/bin/yum clean metadata'
2017-04-11 16:01:24,242 - Retrying to install package hdp-select after 30 seconds
Command failed after 1 tries
Created 04-11-2017 12:36 PM
Can you please share the complete stack trace of the error from the Ambari UI, and also from ambari-server.log?
The exact versions of the Ambari server and the HDP stack would also be useful to know.
When exactly did you get this error (was there a specific component or service that failed to install)?
Created 04-11-2017 12:36 PM
I've added the complete stack trace.
Created 04-11-2017 01:58 PM
It looks like there might be a problem with the repository you are using. This error suggests that hdp-select doesn't exist in your repo:
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install hdp-select' returned 1. Error: Nothing to do
Can you verify that /etc/yum.repos.d/HDP.repo exists and has the correct repository listed? You can also try a "yum clean all" on that host.
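For example, here is a minimal check sequence on the failing host (a sketch; adjust the repo file name if yours differs):

cat /etc/yum.repos.d/HDP.repo   # confirm the baseurl points at a reachable HDP repository
yum clean all                   # drop stale cached repo metadata
yum repolist enabled            # the HDP repo should appear with a non-zero package count
yum info hdp-select             # should resolve; "No matching Packages" means the repo contents are wrong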
Created 04-18-2017 11:29 AM
Jay/Jonathan,
Going through the community, I just noticed this is similar to an issue I am facing currently. I don't see any further updates on this issue.
- Local repository is in place.
- The agent installed successfully on the new hosts through the Ambari URL. If I try to install manually on a new host, it works fine using the local repository server.
Host                 Status    Message
test1.cluster.com    100%      Failures encountered
test5.cluster.com    100%      Failures encountered
stderr: /var/lib/ambari-agent/data/errors-1101.txt
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 153, in <module>
    DataNode().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 218, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 34, in install
    self.install_packages(env, params.exclude_packages)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 376, in install_packages
    Package(name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 157, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 45, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
    shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install 'hadoop_2_3_*'' returned 1.
Error: Package: hadoop_2_3_0_0_2557-2.7.1.2.3.0.0-2557.el6.x86_64 (HDP-2.3)
       Requires: nc
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
stdout: /var/lib/ambari-agent/data/output-1101.txt
2017-04-17 23:30:26,664 - Group['hadoop'] {'ignore_failures': False}
2017-04-17 23:30:26,665 - Group['users'] {'ignore_failures': False}
2017-04-17 23:30:26,666 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-04-17 23:30:26,666 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2017-04-17 23:30:26,667 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-04-17 23:30:26,667 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-04-17 23:30:26,668 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-04-17 23:30:26,668 - User['ams'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2017-04-17 23:30:26,669 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-04-17 23:30:26,670 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-04-17 23:30:26,674 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-04-17 23:30:26,674 - Group['hdfs'] {'ignore_failures': False}
2017-04-17 23:30:26,675 - User['hdfs'] {'ignore_failures': False, 'groups': ['hadoop', 'hdfs']}
2017-04-17 23:30:26,675 - Directory['/etc/hadoop'] {'mode': 0755}
2017-04-17 23:30:26,690 - Repository['HDP-2.3'] {'base_url': 'http://192.168.0.100/HDP/centos6/2.x/updates/2.3.0.0', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-04-17 23:30:26,699 - File['/etc/yum.repos.d/HDP.repo'] {'content': InlineTemplate(...)}
2017-04-17 23:30:26,700 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://192.168.0.100/HDP-UTILS-1.1.0.20/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-04-17 23:30:26,703 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': InlineTemplate(...)}
2017-04-17 23:30:26,703 - Package['unzip'] {}
2017-04-17 23:30:26,902 - Skipping installation of existing package unzip
2017-04-17 23:30:26,902 - Package['curl'] {}
2017-04-17 23:30:27,030 - Skipping installation of existing package curl
2017-04-17 23:30:27,030 - Package['hdp-select'] {}
2017-04-17 23:30:27,154 - Skipping installation of existing package hdp-select
2017-04-17 23:30:27,275 - Package['rpcbind'] {}
2017-04-17 23:30:27,471 - Skipping installation of existing package rpcbind
2017-04-17 23:30:27,471 - Package['hadoop_2_3_*'] {}
2017-04-17 23:30:27,589 - Installing package hadoop_2_3_* ('/usr/bin/yum -d 0 -e 0 -y install 'hadoop_2_3_*'')
Thank you
Created 04-18-2017 11:30 AM
Please review the error stack and let me know if you find anything I can fix so I can move forward.
Created 04-21-2017 04:35 AM
The installation required dependent OS packages during the package install. After setting up/mounting the CentOS ISO file on the repo server, it worked fine without any issue.
Installing package hadoop_2_3_* ('/usr/bin/yum -d 0 -e 0 -y install 'hadoop_2_3_*'')
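For anyone hitting the same dependency error, here is a rough sketch of exposing the CentOS install media as a yum repo; the ISO path, mount point, and repo id below are examples, not taken from this thread:

mount -o loop /path/to/CentOS-6-x86_64-bin-DVD1.iso /mnt/centos-media   # loop-mount the install ISO
cat > /etc/yum.repos.d/centos-media.repo <<'EOF'
[centos-media]
name=CentOS install media (local)
baseurl=file:///mnt/centos-media
enabled=1
gpgcheck=0
EOF
yum clean all && yum repolist   # OS dependencies such as 'nc' should now be resolvable

On a dedicated repo server you would instead serve the mounted directory over HTTP and point baseurl at that URL.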
Thank You..
Created 05-16-2017 12:58 PM
Dear all,
I got the same problem here. Below is my stderr:
Kindly help...
2017-05-16 20:46:29,151 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-05-16 20:46:29,154 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-05-16 20:46:29,157 - Group['livy'] {}
2017-05-16 20:46:29,161 - Group['spark'] {}
2017-05-16 20:46:29,162 - Group['zeppelin'] {}
Created 05-16-2017 01:11 PM
@Ahmad,
I had the same issue; the HDP installation requires dependent OS packages. Put the OS repo file in place and retry your HDP installation.
After setting up all the repos (Ambari/HDP/OS), run:
yum repolist
Also, please provide the complete stderr log file, in case it shows any other issue.
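If you don't have an OS repo file yet, it is just a plain yum repo definition. A hypothetical example pointing at a local mirror (the repo id, name, and URL are placeholders for your environment):

[centos-os]
name=CentOS base (local mirror)
baseurl=http://192.168.0.100/centos6/
enabled=1
gpgcheck=0

Save it under /etc/yum.repos.d/ (e.g. as centos-os.repo) on the host before retrying the installation.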
Created 05-17-2017 03:57 AM
Sorry, should I set up the OS repo file on the master or on each host?
Kindly find the stdout attached:
stdout: /var/lib/ambari-agent/data/output-2766.txt
2017-05-16 20:46:29,151 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-05-16 20:46:29,154 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-05-16 20:46:29,157 - Group['livy'] {}
2017-05-16 20:46:29,161 - Group['spark'] {}
2017-05-16 20:46:29,162 - Group['zeppelin'] {}
2017-05-16 20:46:29,162 - Group['hadoop'] {}
2017-05-16 20:46:29,163 - Group['users'] {}
2017-05-16 20:46:29,164 - Group['knox'] {}
2017-05-16 20:46:29,165 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,172 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,174 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,175 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,177 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-05-16 20:46:29,179 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,181 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,183 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,185 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,187 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,188 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,190 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,192 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-05-16 20:46:29,194 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-05-16 20:46:29,196 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop']}
2017-05-16 20:46:29,197 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,199 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,201 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,203 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-05-16 20:46:29,205 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,206 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,208 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,210 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,212 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,214 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,216 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-05-16 20:46:29,221 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-05-16 20:46:29,233 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-05-16 20:46:29,235 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2017-05-16 20:46:29,242 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-05-16 20:46:29,247 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-05-16 20:46:29,259 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-05-16 20:46:29,260 - Group['hdfs'] {}
2017-05-16 20:46:29,261 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2017-05-16 20:46:29,263 - FS Type:
2017-05-16 20:46:29,263 - Directory['/etc/hadoop'] {'mode': 0755}
2017-05-16 20:46:29,266 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-05-16 20:46:29,307 - Initializing 2 repositories
2017-05-16 20:46:29,309 - Repository['HDP-2.6'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.0.3', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-05-16 20:46:29,337 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.6]\nname=HDP-2.6\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.0.3\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-05-16 20:46:29,339 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-05-16 20:46:29,349 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-05-16 20:46:29,350 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-16 20:46:29,658 - Skipping installation of existing package unzip
2017-05-16 20:46:29,659 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-16 20:46:29,697 - Skipping installation of existing package curl
2017-05-16 20:46:29,698 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-16 20:46:29,736 - Skipping installation of existing package hdp-select
2017-05-16 20:46:30,187 - Package['accumulo_2_3_6_0_3796'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-16 20:46:30,462 - Installing package accumulo_2_3_6_0_3796 ('/usr/bin/yum -d 0 -e 0 -y install accumulo_2_3_6_0_3796')
2017-05-16 20:46:32,303 - Execution of '/usr/bin/yum -d 0 -e 0 -y install accumulo_2_3_6_0_3796' returned 1. This system is not registered with RHN Classic or RHN Satellite. You can use rhn_register to register. RHN Satellite or RHN Classic support will be disabled. Error: Nothing to do
2017-05-16 20:46:32,304 - Failed to install package accumulo_2_3_6_0_3796. Executing '/usr/bin/yum clean metadata'
2017-05-16 20:46:32,720 - Retrying to install package accumulo_2_3_6_0_3796 after 30 seconds
Command failed after 1 tries
Created 05-17-2017 12:22 PM
The repo file is not set up correctly.
The Ambari and OS repos are required on all hosts, so set them up (or copy them) everywhere. If you are installing HDP through Ambari, there is no need to copy the HDP/HDP-UTILS repo files yourself; just configure those repos in the Ambari UI, and the file copy is taken care of at installation time for all hosts.
Then share the output of:
ls -ltra /etc/yum.repos.d/
yum repolist
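If both look healthy, a quick sanity check is to ask yum for the exact package that failed in your log (package name copied from the stdout above):

yum clean all
yum list available accumulo_2_3_6_0_3796   # "Error: No matching Packages" means the HDP repo is still not usable on this host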
Created 04-06-2019 06:47 PM
Make sure that you have installed these RPMs on all nodes:
libtirpc-devel-0.2.4-0.15.el7.x86_64
libtirpc-0.2.4-0.15.el7.x86_64
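Assuming your OS/base repos are set up on each node, both can be pulled in with one yum call (a sketch matching the versions listed above):

yum install -y libtirpc libtirpc-devel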
cheers