Support Questions

How do I get around the problem below (cint missing)?

Explorer

I'm trying to install a cluster and am using the cluster install wizard.

I am using the "how to" that you have kindly provided at https://docs.hortonworks.com/HDPDocuments/Ambari-2.2.2.0/bk_Installing_HDP_AMB/content/_download_the... with CentOS 7.

As the traceback shows, the wizard reports "cannot import name cint".

Thank you in advance for your assistance.

Traceback (most recent call last):

  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/hook.py", line 37, in <module>
    BeforeInstallHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/hook.py", line 28, in hook
    import params
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/params.py", line 21, in <module>
    from ambari_commons.str_utils import cbool, cint
ImportError: cannot import name cint

stdout: /var/lib/ambari-agent/data/output-70.txt

6 REPLIES

Super Mentor

@j c currey

It looks like your ambari-agent installation is not intact; some of its Python scripts were not updated properly.

Can you try reinstalling the ambari-agent:

# ambari-agent stop
# yum reinstall ambari-agent
# ambari-agent start
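
Once it is reinstalled, you can verify the fix by running the failing import by hand (this is just a sanity check built from the import line in your traceback):

# python -c "from ambari_commons.str_utils import cbool, cint; print 'OK'"

If it prints OK, the agent's Python libraries are consistent again.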


If it still does not work, then can you please share the following details?

1. The exact Ambari package version:

# ambari-agent --version
# rpm -qa | grep ambari*
# yum info ambari-agent

2. The Ambari repo version:

# cat /etc/yum.repos.d/ambari.repo 
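
For comparison, on CentOS 7 the Ambari 2.2.2.0 repo file should look roughly like the following (the baseurl is reconstructed from the usual Hortonworks public repo layout, so treat this as a reference sketch rather than an exact copy):

[Updates-ambari-2.2.2.0]
name=ambari-2.2.2.0 - Updates
baseurl=http://public-repo-1.hortonworks.com/ambari/centos7/2.x/updates/2.2.2.0
gpgcheck=1
gpgkey=http://public-repo-1.hortonworks.com/ambari/centos7/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1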


Explorer

@Jay SenSharma

I performed the stop, reinstall, and start on the headnode.

Same failure.

I then went further and tried something on my own and ended up clobbering everything.

I then reloaded CentOS 7 on the headnode and hopefully cleared the slave nodes with https://community.hortonworks.com/questions/1110/how-to-completely-remove-uninstall-ambari-and-hdp.h...

I then reinstalled Ambari using https://docs.hortonworks.com/HDPDocuments/Ambari-2.2.2.0/bk_Installing_HDP_AMB/content/_download_the...

The confirm hosts step in the wizard now works.

I went through the Customize Services panels. Ambari suggested some tuning parameters, which I ignored.

I clicked past the Derby warning.

I looked at the review page and then clicked Deploy.

Now, when the wizard tries to install the App Timeline Server on the first slave node, I get this:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/application_timeline_server.py", line 147, in <module>
    ApplicationTimelineServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/application_timeline_server.py", line 38, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 410, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
    self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 83, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 91, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install 'hadoop_2_4_*-yarn'' returned 1. Error:  Multilib version problems found. This often means that the root
       cause is something else and multilib version checking is just
       pointing out that there is a problem. Eg.:
       
         1. You have an upgrade for glibc which is missing some
            dependency that another package requires. Yum is trying to
            solve this by installing an older version of glibc of the
            different architecture. If you exclude the bad architecture
            yum will tell you what the root cause is (which package
            requires what). You can try redoing the upgrade with
            --exclude glibc.otherarch ... this should give you an error
            message showing the root cause of the problem.
       
         2. You have multiple architectures of glibc installed, but
            yum can only see an upgrade for one of those architectures.
            If you don't want/need both architectures anymore then you
            can remove the one with the missing update and everything
            will work.
       
         3. You have duplicate versions of glibc installed already.
            You can use "yum check" to get yum show these errors.
       
       ...you can also use --setopt=protected_multilib=false to remove
       this checking, however this is almost never the correct thing to
       do as something else is very likely to go wrong (often causing
       much more problems).
       
       Protected multilib versions: glibc-2.17-157.el7_3.1.x86_64 != glibc-2.17-157.el7.i686
stdout: /var/lib/ambari-agent/data/output-48.txt
2017-03-14 14:15:44,457 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-03-14 14:15:44,460 - Group['spark'] {}
2017-03-14 14:15:44,461 - Group['hadoop'] {}
2017-03-14 14:15:44,462 - Group['users'] {}
2017-03-14 14:15:44,462 - Group['knox'] {}
2017-03-14 14:15:44,462 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,464 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,465 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,466 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-03-14 14:15:44,466 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,467 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,468 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-03-14 14:15:44,469 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-03-14 14:15:44,470 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,471 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,472 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,473 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-03-14 14:15:44,474 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,475 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,476 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,477 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,478 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,479 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,480 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,481 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,482 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:44,483 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-03-14 14:15:44,485 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-03-14 14:15:44,493 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-03-14 14:15:44,494 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2017-03-14 14:15:44,495 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-03-14 14:15:44,497 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-03-14 14:15:44,505 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-03-14 14:15:44,506 - Group['hdfs'] {}
2017-03-14 14:15:44,506 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2017-03-14 14:15:44,507 - FS Type: 
2017-03-14 14:15:44,507 - Directory['/etc/hadoop'] {'mode': 0755}
2017-03-14 14:15:44,508 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2017-03-14 14:15:44,529 - Repository['HDP-2.4'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-03-14 14:15:44,541 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.4]\nname=HDP-2.4\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.3.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-03-14 14:15:44,542 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-03-14 14:15:44,547 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.20]\nname=HDP-UTILS-1.1.0.20\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-03-14 14:15:44,548 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-03-14 14:15:44,686 - Skipping installation of existing package unzip
2017-03-14 14:15:44,686 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-03-14 14:15:44,703 - Skipping installation of existing package curl
2017-03-14 14:15:44,703 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-03-14 14:15:44,719 - Skipping installation of existing package hdp-select
2017-03-14 14:15:45,027 - Package['hadoop_2_4_*-yarn'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-03-14 14:15:45,166 - Installing package hadoop_2_4_*-yarn ('/usr/bin/yum -d 0 -e 0 -y install 'hadoop_2_4_*-yarn'')

I also get this when the wizard tries to install Accumulo TServer on the first slave node:

Traceback (most recent call last):

  File "/var/lib/ambari-agent/cache/common-services/ACCUMULO/1.6.1.2.2.0/package/scripts/accumulo_tserver.py", line 24, in <module>
    AccumuloScript('tserver').execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/ACCUMULO/1.6.1.2.2.0/package/scripts/accumulo_script.py", line 66, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 410, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
    self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 83, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 91, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install 'accumulo_2_4_*'' returned 1. Error:  Multilib version problems found. This often means that the root
       cause is something else and multilib version checking is just
       pointing out that there is a problem. Eg.:
       
         1. You have an upgrade for glibc which is missing some
            dependency that another package requires. Yum is trying to
            solve this by installing an older version of glibc of the
            different architecture. If you exclude the bad architecture
            yum will tell you what the root cause is (which package
            requires what). You can try redoing the upgrade with
            --exclude glibc.otherarch ... this should give you an error
            message showing the root cause of the problem.
       
         2. You have multiple architectures of glibc installed, but
            yum can only see an upgrade for one of those architectures.
            If you don't want/need both architectures anymore then you
            can remove the one with the missing update and everything
            will work.
       
         3. You have duplicate versions of glibc installed already.
            You can use "yum check" to get yum show these errors.
       
       ...you can also use --setopt=protected_multilib=false to remove
       this checking, however this is almost never the correct thing to
       do as something else is very likely to go wrong (often causing
       much more problems).
       
       Protected multilib versions: glibc-2.17-157.el7_3.1.x86_64 != glibc-2.17-157.el7.i686
stdout: /var/lib/ambari-agent/data/output-47.txt
2017-03-14 14:15:20,062 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-03-14 14:15:20,065 - Group['spark'] {}
2017-03-14 14:15:20,068 - Group['hadoop'] {}
2017-03-14 14:15:20,069 - Adding group Group['hadoop']
2017-03-14 14:15:20,117 - Group['users'] {}
2017-03-14 14:15:20,118 - Group['knox'] {}
2017-03-14 14:15:20,118 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:20,119 - Adding user User['hive']
2017-03-14 14:15:20,219 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:20,219 - Adding user User['storm']
2017-03-14 14:15:20,316 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:20,316 - Adding user User['zookeeper']
2017-03-14 14:15:20,418 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-03-14 14:15:20,419 - Adding user User['oozie']
2017-03-14 14:15:20,521 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:20,522 - Adding user User['atlas']
2017-03-14 14:15:20,636 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:20,636 - Adding user User['ams']
2017-03-14 14:15:20,751 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-03-14 14:15:20,752 - Adding user User['falcon']
2017-03-14 14:15:20,842 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-03-14 14:15:20,843 - Adding user User['tez']
2017-03-14 14:15:20,939 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:20,939 - Adding user User['accumulo']
2017-03-14 14:15:21,042 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:21,042 - Adding user User['mahout']
2017-03-14 14:15:21,145 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:21,145 - Adding user User['spark']
2017-03-14 14:15:21,265 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-03-14 14:15:21,266 - Adding user User['ambari-qa']
2017-03-14 14:15:21,380 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:21,381 - Adding user User['flume']
2017-03-14 14:15:21,484 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:21,484 - Adding user User['kafka']
2017-03-14 14:15:21,574 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:21,575 - Adding user User['hdfs']
2017-03-14 14:15:21,677 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:21,678 - Adding user User['sqoop']
2017-03-14 14:15:21,768 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:21,769 - Adding user User['yarn']
2017-03-14 14:15:21,883 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:21,883 - Adding user User['mapred']
2017-03-14 14:15:21,998 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:21,998 - Adding user User['hbase']
2017-03-14 14:15:22,083 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:22,084 - Adding user User['knox']
2017-03-14 14:15:22,174 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-03-14 14:15:22,174 - Adding user User['hcat']
2017-03-14 14:15:22,271 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-03-14 14:15:22,280 - Writing File['/var/lib/ambari-agent/tmp/changeUid.sh'] because it doesn't exist
2017-03-14 14:15:22,280 - Changing permission for /var/lib/ambari-agent/tmp/changeUid.sh from 644 to 555
2017-03-14 14:15:22,281 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-03-14 14:15:22,289 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-03-14 14:15:22,290 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2017-03-14 14:15:22,290 - Creating directory Directory['/tmp/hbase-hbase'] since it doesn't exist.
2017-03-14 14:15:22,290 - Changing owner for /tmp/hbase-hbase from 0 to hbase
2017-03-14 14:15:22,290 - Changing permission for /tmp/hbase-hbase from 755 to 775
2017-03-14 14:15:22,291 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-03-14 14:15:22,293 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-03-14 14:15:22,301 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-03-14 14:15:22,302 - Group['hdfs'] {}
2017-03-14 14:15:22,302 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2017-03-14 14:15:22,303 - Modifying user hdfs
2017-03-14 14:15:22,352 - FS Type: 
2017-03-14 14:15:22,352 - Directory['/etc/hadoop'] {'mode': 0755}
2017-03-14 14:15:22,353 - Creating directory Directory['/etc/hadoop'] since it doesn't exist.
2017-03-14 14:15:22,354 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2017-03-14 14:15:22,354 - Creating directory Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] since it doesn't exist.
2017-03-14 14:15:22,354 - Changing owner for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hdfs
2017-03-14 14:15:22,354 - Changing group for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hadoop
2017-03-14 14:15:22,355 - Changing permission for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 755 to 777
2017-03-14 14:15:22,355 - Directory['/var/lib/ambari-agent/tmp/AMBARI-artifacts/'] {'recursive': True}
2017-03-14 14:15:22,355 - Creating directory Directory['/var/lib/ambari-agent/tmp/AMBARI-artifacts/'] since it doesn't exist.
2017-03-14 14:15:22,356 - File['/var/lib/ambari-agent/tmp/jdk-8u60-linux-x64.tar.gz'] {'content': DownloadSource('http://headnode.gladys.ambari:8080/resources//jdk-8u60-linux-x64.tar.gz'), 'not_if': 'test -f /var/lib/ambari-agent/tmp/jdk-8u60-linux-x64.tar.gz'}
2017-03-14 14:15:22,362 - Downloading the file from http://headnode.gladys.ambari:8080/resources//jdk-8u60-linux-x64.tar.gz
2017-03-14 14:15:24,778 - Directory['/usr/jdk64'] {}
2017-03-14 14:15:24,778 - Execute[('chmod', 'a+x', u'/usr/jdk64')] {'sudo': True}
2017-03-14 14:15:24,788 - Execute['cd /var/lib/ambari-agent/tmp/jdk_tmp_086nF7 && tar -xf /var/lib/ambari-agent/tmp/jdk-8u60-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/tmp/jdk_tmp_086nF7/* /usr/jdk64'] {}
2017-03-14 14:15:29,860 - Directory['/var/lib/ambari-agent/tmp/jdk_tmp_086nF7'] {'action': ['delete']}
2017-03-14 14:15:29,861 - Removing directory Directory['/var/lib/ambari-agent/tmp/jdk_tmp_086nF7'] and all its content
2017-03-14 14:15:30,049 - File['/usr/jdk64/jdk1.8.0_60/bin/java'] {'mode': 0755, 'cd_access': 'a'}
2017-03-14 14:15:30,050 - Execute[('chmod', '-R', '755', u'/usr/jdk64/jdk1.8.0_60')] {'sudo': True}
2017-03-14 14:15:30,093 - Repository['HDP-2.4'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-03-14 14:15:30,106 - File['/etc/yum.repos.d/HDP.repo'] {'content': InlineTemplate(...)}
2017-03-14 14:15:30,107 - Writing File['/etc/yum.repos.d/HDP.repo'] because it doesn't exist
2017-03-14 14:15:30,108 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-03-14 14:15:30,113 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.20]\nname=HDP-UTILS-1.1.0.20\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-03-14 14:15:30,114 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-03-14 14:15:30,255 - Skipping installation of existing package unzip
2017-03-14 14:15:30,256 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-03-14 14:15:30,272 - Skipping installation of existing package curl
2017-03-14 14:15:30,273 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-03-14 14:15:30,289 - Installing package hdp-select ('/usr/bin/yum -d 0 -e 0 -y install hdp-select')
2017-03-14 14:15:39,049 - Package['accumulo_2_4_*'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-03-14 14:15:39,188 - Installing package accumulo_2_4_* ('/usr/bin/yum -d 0 -e 0 -y install 'accumulo_2_4_*'')

Responding to your information requests:

[root@headnode .ssh]# ambari-agent --version

2.2.2.0

***************************************************

[root@headnode .ssh]# rpm -qa | grep ambari*

ambari-server-2.2.2.0-460.x86_64
ambari-agent-2.2.2.0-460.x86_64

****************************************************

[root@headnode .ssh]# yum info ambari-agent

Loaded plugins: fastestmirror, langpacks
Loading mirror speeds from cached hostfile
 * base: centos.mirror.lstn.net
 * extras: mirror.cloud-bricks.net
 * updates: mirror.oss.ou.edu
Installed Packages
Name        : ambari-agent
Arch        : x86_64
Version     : 2.2.2.0
Release     : 460
Size        : 31 M
Repo        : installed
From repo   : Updates-ambari-2.2.2.0
Summary     : Ambari Agent
URL         : http://www.apache.org
License     : 2012, Apache Software Foundation
Description : Maven Recipe: RPM Package.

*****************************************************

Thank you again for your help.

This time I will leave the cluster alone while waiting for your guidance.

jimc


Super Mentor

@j c currey

Regarding the following error:

resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0  -e 0 -y install 'hadoop_2_4_*-yarn'' returned 1. Error:  Multilib version problems found. This often means that the root cause is something else and multilib version checking is just

This indicates a multilib version issue at the yum level. Can you please check the following:

1. The following command will list any duplicate packages:

# yum check

2. The following command will give more information about the installed glibc versions. Example output:

# rpm -qa|grep glibc

glibc-common-2.12-1.192.el6.x86_64
glibc-devel-2.12-1.192.el6.x86_64
glibc-2.12-1.192.el6.x86_64
glibc-headers-2.12-1.192.el6.x86_64
glibc-2.12-1.192.el6.i686

3. Also, please run the following command to see whether the i686 binaries were installed by mistake:

# yum info glibc*

4. If you have an incorrect version of this package installed, then you can remove it:

# yum remove <EXACT_PACKAGE_NAME>
# yum clean all
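
As a side note: if yum keeps trying to pull in i686 packages that you never asked for, a common workaround (a general suggestion, not something your logs confirm yet) is to exclude 32-bit packages globally in /etc/yum.conf:

# echo "exclude=*.i686 *.i386" >> /etc/yum.conf

After that, yum will ignore 32-bit packages entirely, which is usually safe on a pure x86_64 Hadoop node.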


Explorer

@Jay SenSharma

Thank you again.

I will wait for your instructions rather than go off on my own and introduce more variables.

jimc

[root@headnode ~]# yum check

Loaded plugins: fastestmirror, langpacks

check all

[root@headnode ~]#

***************************************************************

[root@headnode ~]# rpm -qa|grep glibc

glibc-devel-2.17-157.el7_3.1.x86_64

glibc-2.17-157.el7_3.1.x86_64

glibc-headers-2.17-157.el7_3.1.x86_64

glibc-common-2.17-157.el7_3.1.x86_64

[root@headnode ~]#

***************************************************************

[root@headnode ~]# yum info glibc*
Loaded plugins: fastestmirror, langpacks
Loading mirror speeds from cached hostfile
 * base: centos.mirror.lstn.net
 * extras: mirror.cloud-bricks.net
 * updates: mirror.oss.ou.edu
Installed Packages
Name        : glibc
Arch        : x86_64
Version     : 2.17
Release     : 157.el7_3.1
Size        : 13 M
Repo        : installed
From repo   : updates
Summary     : The GNU libc libraries
URL         : http://www.gnu.org/software/glibc/
License     : LGPLv2+ and LGPLv2+ with exceptions and GPLv2+
Description : The glibc package contains standard libraries which are used by
            : multiple programs on the system. In order to save disk space and
            : memory, as well as to make upgrading easier, common system code is
            : kept in one place and shared between programs. This particular package
            : contains the most important sets of shared libraries: the standard C
            : library and the standard math library. Without these two libraries, a
            : Linux system will not function.
Name        : glibc-common
Arch        : x86_64
Version     : 2.17
Release     : 157.el7_3.1
Size        : 115 M
Repo        : installed
From repo   : updates
Summary     : Common binaries and locale data for glibc
URL         : http://www.gnu.org/software/glibc/
License     : LGPLv2+ and LGPLv2+ with exceptions and GPLv2+
Description : The glibc-common package includes common binaries for the GNU libc
            : libraries, as well as national language (locale) support.
Name        : glibc-devel
Arch        : x86_64
Version     : 2.17
Release     : 157.el7_3.1
Size        : 1.0 M
Repo        : installed
From repo   : updates
Summary     : Object files for development using standard C libraries.
URL         : http://www.gnu.org/software/glibc/
License     : LGPLv2+ and LGPLv2+ with exceptions and GPLv2+
Description : The glibc-devel package contains the object files necessary
            : for developing programs which use the standard C libraries (which are
            : used by nearly all programs).  If you are developing programs which
            : will use the standard C libraries, your system needs to have these
            : standard object files available in order to create the
            : executables.
            :
            : Install glibc-devel if you are going to develop programs which will
            : use the standard C libraries.
Name        : glibc-headers
Arch        : x86_64
Version     : 2.17
Release     : 157.el7_3.1
Size        : 2.2 M
Repo        : installed
From repo   : updates
Summary     : Header files for development using standard C libraries.
URL         : http://www.gnu.org/software/glibc/
License     : LGPLv2+ and LGPLv2+ with exceptions and GPLv2+
Description : The glibc-headers package contains the header files necessary
            : for developing programs which use the standard C libraries (which are
            : used by nearly all programs).  If you are developing programs which
            : will use the standard C libraries, your system needs to have these
            : standard header files available in order to create the
            : executables.
            :
            : Install glibc-headers if you are going to develop programs which will
            : use the standard C libraries.
Available Packages
Name        : glibc
Arch        : i686
Version     : 2.17
Release     : 157.el7_3.1
Size        : 4.2 M
Repo        : updates/7/x86_64
Summary     : The GNU libc libraries
URL         : http://www.gnu.org/software/glibc/
License     : LGPLv2+ and LGPLv2+ with exceptions and GPLv2+
Description : The glibc package contains standard libraries which are used by
            : multiple programs on the system. In order to save disk space and
            : memory, as well as to make upgrading easier, common system code is
            : kept in one place and shared between programs. This particular package
            : contains the most important sets of shared libraries: the standard C
            : library and the standard math library. Without these two libraries, a
            : Linux system will not function.
Name        : glibc-devel
Arch        : i686
Version     : 2.17
Release     : 157.el7_3.1
Size        : 1.1 M
Repo        : updates/7/x86_64
Summary     : Object files for development using standard C libraries.
URL         : http://www.gnu.org/software/glibc/
License     : LGPLv2+ and LGPLv2+ with exceptions and GPLv2+
Description : The glibc-devel package contains the object files necessary
            : for developing programs which use the standard C libraries (which are
            : used by nearly all programs).  If you are developing programs which
            : will use the standard C libraries, your system needs to have these
            : standard object files available in order to create the
            : executables.
            :
            : Install glibc-devel if you are going to develop programs which will
            : use the standard C libraries.
Name        : glibc-static
Arch        : i686
Version     : 2.17
Release     : 157.el7_3.1
Size        : 1.2 M
Repo        : updates/7/x86_64
Summary     : C library static libraries for -static linking.
URL         : http://www.gnu.org/software/glibc/
License     : LGPLv2+ and LGPLv2+ with exceptions and GPLv2+
Description : The glibc-static package contains the C library static libraries
            : for -static linking.  You don't need these, unless you link statically,
            : which is highly discouraged.
Name        : glibc-static
Arch        : x86_64
Version     : 2.17
Release     : 157.el7_3.1
Size        : 1.5 M
Repo        : updates/7/x86_64
Summary     : C library static libraries for -static linking.
URL         : http://www.gnu.org/software/glibc/
License     : LGPLv2+ and LGPLv2+ with exceptions and GPLv2+
Description : The glibc-static package contains the C library static libraries
            : for -static linking.  You don't need these, unless you link statically,
            : which is highly discouraged.
Name        : glibc-utils
Arch        : x86_64
Version     : 2.17
Release     : 157.el7_3.1
Size        : 209 k
Repo        : updates/7/x86_64
Summary     : Development utilities from GNU C library
URL         : http://www.gnu.org/software/glibc/
License     : LGPLv2+ and LGPLv2+ with exceptions and GPLv2+
Description : The glibc-utils package contains memusage, a memory usage profiler,
            : mtrace, a memory leak tracer and xtrace, a function call tracer
            : which can be helpful during program debugging.
            :
            : If unsure if you need this, don't install this package.
[root@headnode ~]#

Explorer

@Jay SenSharma

Can you suggest where I might go to resolve this issue?

jimc

Super Mentor

@j c currey

1. Have you cleaned up any previously installed HDP packages? (I see that you shared a link for cleaning the Ambari installation.) If this is a fresh cluster you are trying to set up, but HDP packages already existed on the hosts, then you should try cleaning the old HDP packages from them. You can use the "HostCleanup.py" script as described at https://cwiki.apache.org/confluence/display/AMBARI/Host+Cleanup+for+Ambari+and+Stack; a sample invocation is shown below the script path.

/usr/lib/python2.6/site-packages/ambari_agent/HostCleanup.py
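
A typical run (per the options documented on that wiki page; --silent avoids interactive prompts and --skip=users keeps the service accounts in place) would look like:

python /usr/lib/python2.6/site-packages/ambari_agent/HostCleanup.py --silent --skip=users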

2. Remove the conflicting package from all hosts, using either of the following:

rpm -e --nodeps glibc-2.17-157.el7.i686
yum remove glibc-2.17-157.el7.i686

This package needs to be checked, because your error shows:

Protected multilib versions: glibc-2.17-157.el7_3.1.x86_64 != glibc-2.17-157.el7.i686

3. From the agent hosts, please check whether you can access the Hortonworks repo (to rule out an internet connectivity issue):

wget -nv http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.3.0/hdp.repo

4. Clean the yum repo database and check whether there are any conflicting packages:

yum check
yum clean all
yum update
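
If "yum check" does report duplicates, the package-cleanup tool from the yum-utils package (an extra package, so installing it is optional) can list and remove them:

yum install yum-utils
package-cleanup --dupes
package-cleanup --cleandupes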

5. Double-check that "/etc/yum.repos.d/HDP-UTILS.repo" and "/etc/yum.repos.d/HDP.repo" are pointing to the correct repo on all hosts.

6. Now retry the installation.