Ambari all services failed to install

Explorer

Hi, I followed the latest Automated Install with Ambari guide, configuring 10 CentOS 6.8 VMs.

In the last step, all services failed to install. I encountered failures and warnings, and when I click Retry Install the same thing happens, just with a different order of failures.

I have checked the Internet connection and the public repos, and I think everything was done exactly as the guide says.

This is the log file from one service:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 174, in <module>
    DataNode().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 49, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 567, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
    self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 83, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 91, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 71, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 93, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 141, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 294, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_2_5_0_0_1245' returned 1. Error: Package: glibc-headers-2.12-1.192.el6.x86_64 (base)
           Requires: kernel-headers >= 2.2.1
Error: Package: glibc-headers-2.12-1.192.el6.x86_64 (base)
           Requires: kernel-headers
 You could try using --skip-broken to work around the problem
 You could try running: rpm -Va --nofiles --nodigest

stdout: /var/lib/ambari-agent/data/output-2297.txt
2016-10-09 17:45:04,629 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-10-09 17:45:04,630 - Group['livy'] {}
2016-10-09 17:45:04,631 - Group['spark'] {}
2016-10-09 17:45:04,631 - Group['zeppelin'] {}
2016-10-09 17:45:04,631 - Group['hadoop'] {}
2016-10-09 17:45:04,631 - Group['users'] {}
2016-10-09 17:45:04,631 - Group['knox'] {}
2016-10-09 17:45:04,632 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,632 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,632 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,633 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,633 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,633 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-10-09 17:45:04,634 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,634 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-10-09 17:45:04,634 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-10-09 17:45:04,635 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,635 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,635 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,636 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,636 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,636 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-10-09 17:45:04,637 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,637 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,637 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,638 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,638 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,638 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,639 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,639 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,639 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-10-09 17:45:04,640 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-10-09 17:45:04,640 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-10-09 17:45:04,644 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-10-09 17:45:04,644 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2016-10-09 17:45:04,645 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-10-09 17:45:04,646 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2016-10-09 17:45:04,649 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2016-10-09 17:45:04,649 - Group['hdfs'] {}
2016-10-09 17:45:04,649 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2016-10-09 17:45:04,650 - FS Type: 
2016-10-09 17:45:04,650 - Directory['/etc/hadoop'] {'mode': 0755}
2016-10-09 17:45:04,650 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2016-10-09 17:45:04,661 - Initializing 2 repositories
2016-10-09 17:45:04,662 - Repository['HDP-2.5'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.5.0.0/', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2016-10-09 17:45:04,668 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.5]\nname=HDP-2.5\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.5.0.0/\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-10-09 17:45:04,669 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2016-10-09 17:45:04,671 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-10-09 17:45:04,672 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-10-09 17:45:04,744 - Skipping installation of existing package unzip
2016-10-09 17:45:04,744 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-10-09 17:45:04,750 - Skipping installation of existing package curl
2016-10-09 17:45:04,750 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-10-09 17:45:04,756 - Skipping installation of existing package hdp-select
2016-10-09 17:45:04,889 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-10-09 17:45:04,892 - Stack Feature Version Info: stack_version=2.5, version=None, current_cluster_version=None -> 2.5
2016-10-09 17:45:04,894 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-10-09 17:45:04,897 - checked_call['rpm -q --queryformat '%{version}-%{release}' hdp-select | sed -e 's/\.el[0-9]//g''] {'stderr': -1}
2016-10-09 17:45:04,918 - checked_call returned (0, '2.5.0.0-1245', '')
2016-10-09 17:45:04,920 - Package['hadoop_2_5_0_0_1245'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-10-09 17:45:04,980 - Installing package hadoop_2_5_0_0_1245 ('/usr/bin/yum -d 0 -e 0 -y install hadoop_2_5_0_0_1245')

Command failed after 1 tries

Thank you.

1 ACCEPTED SOLUTION

Master Guru

You are missing kernel-headers. Check the /etc/yum.conf file for a line saying "exclude=kernel*" and comment it out. Then, as a test, run "yum install kernel-headers" and "yum install hadoop_2_5_0_0_1245" on one node. If that works, comment out the line on all cluster nodes and retry the install from Ambari.
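
A minimal sketch of what that looks like on one node (assuming the exclude line appears literally as "exclude=kernel*"; adjust the sed pattern if your yum.conf differs):

    # Look for a kernel exclude in the yum configuration
    grep -n 'exclude' /etc/yum.conf

    # Comment out the exclude line, keeping a backup at /etc/yum.conf.bak
    sudo sed -i.bak 's/^exclude=kernel\*/#&/' /etc/yum.conf

    # Verify that the missing dependency and the HDP package now install
    sudo yum -y install kernel-headers
    sudo yum -y install hadoop_2_5_0_0_1245

If the test node succeeds, repeat the sed step on the remaining nodes (for example over SSH) before clicking Retry in Ambari.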


2 REPLIES

Explorer

Yes, I will remove it and try again. Thanks!

Explorer

That was it!