Unable to install HDP 2.6.2 using Ambari 2.5

Explorer

I am trying to set up Ambari 2.5, HDP 2.6, and HDF 3.x on CentOS 7.x using Python 2.7. The setup is a single server running the Ambari Server, the Ambari agent, and HDP/HDF.

I was able to install the Ambari Server and Ambari agent and to set up local repositories for HDP and HDF, but the installation of the service components fails. Accumulo failed with an error; when I skipped Accumulo, it failed on the App Timeline Server. I believe something is not right. I am very new to this area.

Here is the log file.

stderr: /var/lib/ambari-agent/data/errors-195.txt
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/application_timeline_server.py", line 94, in <module>
    ApplicationTimelineServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/application_timeline_server.py", line 39, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 708, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 53, in install_package
    self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 86, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 98, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_2_4_2_0_258-yarn' returned 1. Error: Nothing to do
stdout: /var/lib/ambari-agent/data/output-195.txt
2017-10-09 08:36:02,773 - Stack Feature Version Info: Cluster Stack=2.6, Cluster Current Version=None, Command Stack=None, Command Version=None -> 2.6
2017-10-09 08:36:02,782 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-10-09 08:36:02,783 - Group['livy'] {}
2017-10-09 08:36:02,784 - Group['spark'] {}
2017-10-09 08:36:02,784 - Group['zeppelin'] {}
2017-10-09 08:36:02,784 - Group['hadoop'] {}
2017-10-09 08:36:02,784 - Group['users'] {}
2017-10-09 08:36:02,785 - Group['knox'] {}
2017-10-09 08:36:02,785 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,786 - call['/var/lib/ambari-agent/tmp/changeUid.sh hive'] {}
2017-10-09 08:36:02,793 - call returned (0, '1001')
2017-10-09 08:36:02,793 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1001}
2017-10-09 08:36:02,795 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,796 - call['/var/lib/ambari-agent/tmp/changeUid.sh storm'] {}
2017-10-09 08:36:02,802 - call returned (0, '1002')
2017-10-09 08:36:02,802 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1002}
2017-10-09 08:36:02,803 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,804 - call['/var/lib/ambari-agent/tmp/changeUid.sh infra-solr'] {}
2017-10-09 08:36:02,810 - call returned (0, '1022')
2017-10-09 08:36:02,811 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1022}
2017-10-09 08:36:02,812 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,812 - call['/var/lib/ambari-agent/tmp/changeUid.sh zookeeper'] {}
2017-10-09 08:36:02,819 - call returned (0, '1003')
2017-10-09 08:36:02,819 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1003}
2017-10-09 08:36:02,820 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,821 - call['/var/lib/ambari-agent/tmp/changeUid.sh atlas'] {}
2017-10-09 08:36:02,827 - call returned (0, '1005')
2017-10-09 08:36:02,828 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1005}
2017-10-09 08:36:02,829 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,830 - call['/var/lib/ambari-agent/tmp/changeUid.sh oozie'] {}
2017-10-09 08:36:02,836 - call returned (0, '1004')
2017-10-09 08:36:02,836 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': 1004}
2017-10-09 08:36:02,837 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,838 - call['/var/lib/ambari-agent/tmp/changeUid.sh ams'] {}
2017-10-09 08:36:02,843 - call returned (0, '1006')
2017-10-09 08:36:02,844 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1006}
2017-10-09 08:36:02,845 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,845 - call['/var/lib/ambari-agent/tmp/changeUid.sh falcon'] {}
2017-10-09 08:36:02,853 - call returned (0, '1007')
2017-10-09 08:36:02,853 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': 1007}
2017-10-09 08:36:02,854 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,855 - call['/var/lib/ambari-agent/tmp/changeUid.sh tez'] {}
2017-10-09 08:36:02,861 - call returned (0, '1008')
2017-10-09 08:36:02,862 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': 1008}
2017-10-09 08:36:02,863 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,864 - call['/var/lib/ambari-agent/tmp/changeUid.sh zeppelin'] {}
2017-10-09 08:36:02,870 - call returned (0, '1023')
2017-10-09 08:36:02,870 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': 1023}
2017-10-09 08:36:02,872 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,873 - call['/var/lib/ambari-agent/tmp/changeUid.sh livy'] {}
2017-10-09 08:36:02,879 - call returned (0, '1024')
2017-10-09 08:36:02,879 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1024}
2017-10-09 08:36:02,881 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,881 - call['/var/lib/ambari-agent/tmp/changeUid.sh mahout'] {}
2017-10-09 08:36:02,888 - call returned (0, '1010')
2017-10-09 08:36:02,888 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1010}
2017-10-09 08:36:02,889 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,890 - call['/var/lib/ambari-agent/tmp/changeUid.sh spark'] {}
2017-10-09 08:36:02,896 - call returned (0, '1011')
2017-10-09 08:36:02,897 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1011}
2017-10-09 08:36:02,898 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2017-10-09 08:36:02,899 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,900 - call['/var/lib/ambari-agent/tmp/changeUid.sh flume'] {}
2017-10-09 08:36:02,906 - call returned (0, '1013')
2017-10-09 08:36:02,907 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1013}
2017-10-09 08:36:02,908 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,909 - call['/var/lib/ambari-agent/tmp/changeUid.sh kafka'] {}
2017-10-09 08:36:02,915 - call returned (0, '1014')
2017-10-09 08:36:02,915 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1014}
2017-10-09 08:36:02,916 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,917 - call['/var/lib/ambari-agent/tmp/changeUid.sh hdfs'] {}
2017-10-09 08:36:02,923 - call returned (0, '1015')
2017-10-09 08:36:02,924 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1015}
2017-10-09 08:36:02,925 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,926 - call['/var/lib/ambari-agent/tmp/changeUid.sh sqoop'] {}
2017-10-09 08:36:02,932 - call returned (0, '1016')
2017-10-09 08:36:02,933 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1016}
2017-10-09 08:36:02,934 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,935 - call['/var/lib/ambari-agent/tmp/changeUid.sh yarn'] {}
2017-10-09 08:36:02,940 - call returned (0, '1017')
2017-10-09 08:36:02,941 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1017}
2017-10-09 08:36:02,942 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,943 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2017-10-09 08:36:02,953 - call returned (0, '1019')
2017-10-09 08:36:02,954 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1019}
2017-10-09 08:36:02,955 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,956 - call['/var/lib/ambari-agent/tmp/changeUid.sh hcat'] {}
2017-10-09 08:36:02,962 - call returned (0, '1021')
2017-10-09 08:36:02,962 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1021}
2017-10-09 08:36:02,965 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,966 - call['/var/lib/ambari-agent/tmp/changeUid.sh mapred'] {}
2017-10-09 08:36:02,973 - call returned (0, '1018')
2017-10-09 08:36:02,973 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1018}
2017-10-09 08:36:02,974 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,975 - call['/var/lib/ambari-agent/tmp/changeUid.sh knox'] {}
2017-10-09 08:36:02,983 - call returned (0, '1020')
2017-10-09 08:36:02,984 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1020}
2017-10-09 08:36:02,984 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,986 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-10-09 08:36:02,990 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2017-10-09 08:36:02,991 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2017-10-09 08:36:02,992 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,993 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-09 08:36:02,994 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2017-10-09 08:36:03,000 - call returned (0, '1019')
2017-10-09 08:36:03,000 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1019'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-10-09 08:36:03,004 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1019'] due to not_if
2017-10-09 08:36:03,005 - Group['hdfs'] {}
2017-10-09 08:36:03,005 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2017-10-09 08:36:03,006 - FS Type: 
2017-10-09 08:36:03,006 - Directory['/etc/hadoop'] {'mode': 0755}
2017-10-09 08:36:03,006 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-10-09 08:36:03,020 - Initializing 2 repositories
2017-10-09 08:36:03,021 - Repository['HDP-2.6'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.2.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-10-09 08:36:03,030 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.6]\nname=HDP-2.6\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.2.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-10-09 08:36:03,031 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-10-09 08:36:03,036 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-10-09 08:36:03,036 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-10-09 08:36:03,141 - Skipping installation of existing package unzip
2017-10-09 08:36:03,142 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-10-09 08:36:03,152 - Skipping installation of existing package curl
2017-10-09 08:36:03,152 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-10-09 08:36:03,163 - Skipping installation of existing package hdp-select
2017-10-09 08:36:03,348 - checked_call['rpm -q --queryformat '%{version}-%{release}' hdp-select | sed -e 's/\.el[0-9]//g''] {'stderr': -1}
2017-10-09 08:36:03,379 - checked_call returned (0, '2.4.2.0-258', '')
2017-10-09 08:36:03,380 - Package['hadoop_2_4_2_0_258-yarn'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-10-09 08:36:03,482 - Installing package hadoop_2_4_2_0_258-yarn ('/usr/bin/yum -d 0 -e 0 -y install hadoop_2_4_2_0_258-yarn')
2017-10-09 08:36:05,880 - Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_2_4_2_0_258-yarn' returned 1. Error: Nothing to do
2017-10-09 08:36:05,881 - Failed to install package hadoop_2_4_2_0_258-yarn. Executing '/usr/bin/yum clean metadata'
2017-10-09 08:36:06,507 - Retrying to install package hadoop_2_4_2_0_258-yarn after 30 seconds

Command failed after 1 tries

9 REPLIES

Master Mentor

@Tarun Patel

You mentioned that you are trying to set up "Ambari 2.5, HDP 2.6 and HDF 3.x".

However, from your error it looks like the installation of the "hadoop-yarn" component is happening for version HDP 2.4.2.0-258:

resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_2_4_2_0_258-yarn' returned 1. Error: Nothing to do
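
A quick way to confirm the mismatch from the failing host is to compare what your repos actually provide with what the host has registered (a hedged sketch using standard yum and hdp-select commands):

# yum list available --showduplicates 'hadoop_*yarn*'
# hdp-select versions

If yum only offers hadoop_2_6_2_0_* packages while Ambari keeps asking for hadoop_2_4_2_0_258-yarn, the agent is resolving the stack version from stale 2.4.2.0-258 information left on the host.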

This can happen if the host where this YARN installation is running has an incorrect repo file present in /etc/yum.repos.d:

# ls -l /etc/yum.repos.d

So please check the yum repos, and share the output of the following command so that we can see the baseurl of your repos on the host where it is failing. We will need to remove any repo file that has a "2_4_2_0_258" entry in the baseurl.

# find "/etc/yum.repos.d/"  -name "*.*" | xargs grep -i 'baseurl'

Also, perform a yum clean all after removing the repo file that points to the "2.4.2.0-258" version:

# yum clean all
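
For reference, a minimal cleanup sketch (OLD_FILE.repo below is a placeholder; use whatever file the grep actually reports, and move it aside rather than deleting it):

# grep -il '2.4.2.0-258' /etc/yum.repos.d/*.repo
# mkdir -p /root/old-repos && mv /etc/yum.repos.d/OLD_FILE.repo /root/old-repos/
# yum clean all
# yum repolist

After that, yum repolist should only show your HDP-2.6.2, HDP-UTILS, and HDF repos.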

Explorer

@username

Thanks Jay.

I do have the right repositories set up in yum, and as I mentioned, my local repository shows the correct HDP and HDF versions. For some reason it internally picks up the HDP 2.4.x version. Here is the baseurl output:

/etc/yum.repos.d/HDP.repo:baseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.2.0
/etc/yum.repos.d/hdp.repo:baseurl=http://myserver/hdp/centos7/HDP-2.6.2.0
/etc/yum.repos.d/hdp.repo:baseurl=http://myserver/hdp/centos7/HDP-UTILS-1.1.0.21
/etc/yum.repos.d/HDP-UTILS.repo:baseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7
/etc/yum.repos.d/hdf.repo:baseurl=http://myserver/hdp/centos7/HDP-UTILS-1.1.0.21

/etc/yum.repos.d/hdf.repo:baseurl=http://myserver/hdf/centos7/HDF-3.0.1.1

I have two entries: 1) the local repository and 2) the public URL. As you can see, both point to the right versions.

I have also checked that Python 2.7 is enabled and active; otherwise Ambari Server 2.5 would not have installed.

Master Mentor (Accepted Solution)

@Tarun Patel

It is possible that the "hdp-select" package was not upgraded properly on that host.

Can you please check the output of the following commands:

# hdp-select
# hdp-select | grep '2_4_2_0_258'


# hdf-select

Also, can you please upgrade the following package to see if that fixes it:

# yum upgrade hdp-select
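
After the upgrade, a quick verification (a sketch only; exact build numbers depend on your local repo):

# rpm -q hdp-select
# hdp-select versions

The rpm query should now report a 2.6.2.0 release of hdp-select, and the second command lists the stack version directories that exist under /usr/hdp.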

Master Mentor

@Tarun Patel

Also, the following command execution from your log shows that hdp-select is still reporting the old 2.4.2.0-258 version:

2017-10-09 08:36:03,348 - checked_call['rpm -q --queryformat '%{version}-%{release}' hdp-select | sed -e 's/\.el[0-9]//g''] {'stderr': -1}
2017-10-09 08:36:03,379 - checked_call returned (0, '2.4.2.0-258', '')

So can you please try running the following command manually to see if it executes fine?

# rpm -q --queryformat '%{version}-%{release}' hdp-select | sed -e 's/\.el[0-9]//g'

If the above command execution fails, then please check the following directory. It should not show a "2.4.2.0-258" directory there; if it does, try to move it to some other location.

Example:

# ls -l /usr/hdp/
total 12
drwxr-xr-x. 26 root root 4096 Oct  4 08:03 2.4.2.0-258
drwxr-xr-x. 31 root root 4096 Oct  4 10:20 2.6.0.0-598
drwxr-xr-x.  2 root root 4096 Oct  4 10:19 current

The "/usr/hdp" directory should ideally have "current" and a directory with the latest version number of your hdp. Presense of any Unwanted directories inside "/usr/hdp" can cause issues.

Master Mentor

@Tarun Patel

As the issue is resolved, it would also be great if you could mark this HCC thread as answered by clicking the "Accept" button. That way other HCC users can quickly find the solution when they encounter the same issue.

Explorer

Thanks Jay.

hdp-select is now upgraded to the 2.6 version. I noticed the following references to HDP-2.4* before upgrading hdp-select:

/var/lib/yum/repos/x86_64/7/HDP-2.4.2.0
/var/cache/yum/x86_64/7/HDP-2.4.2.0
/var/tmp/yum-dbevis-pgC9hP/x86_64/7/HDP-2.4.2.0
/var/tmp/yum-tpatel-o8kIN3/x86_64/7/HDP-2.4.2.0
/var/tmp/yum-adm_djd-2X9Zq0/x86_64/7/HDP-2.4.2.0

After the upgrade I see the following items:

/var/lib/yum/repos/x86_64/7/HDP-2.4.2.0
/var/lib/yum/repos/x86_64/7/HDP-2.4
/var/cache/yum/x86_64/7/HDP-2.4.2.0
/var/cache/yum/x86_64/7/HDP-2.4
/var/tmp/yum-dbevis-pgC9hP/x86_64/7/HDP-2.4.2.0
/var/tmp/yum-tpatel-o8kIN3/x86_64/7/HDP-2.4.2.0
/var/tmp/yum-adm_djd-2X9Zq0/x86_64/7/HDP-2.4.2.0

How do I remove the entries from the /var/lib/yum/repos folder? Do I need to use the yum erase command?

Master Mentor

@Tarun Patel

Manually removing (or erasing) the "/var/lib/yum/repos/x86_64/7/HDP-2.4.2.0" will not be a good idea.

To clear yum cache entries like "/var/cache/yum/x86_64/7/HDP-2.4", you can just try the following:

# yum clean all
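
To confirm afterwards that nothing still references the old stack (a hedged check; the repo IDs come from the repo files you posted earlier):

# yum repolist

This should list only the HDP-2.6, HDP-UTILS-1.1.0.21, and HDF repos, with no HDP-2.4 entries.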

However, I guess that after upgrading "hdp-select" you should be able to install/start the YARN App Timeline Server without any issue. Please try that once.


Explorer

Thanks Jay.

The issue has been resolved. I wanted to understand where I missed the hdp-select upgrade step. Is it because I originally started with 2.4 and cleaned up everything else but missed the hdp-select 2.4 package?

Master Mentor

@Tarun Patel

Good to know that the issue is resolved.

Regarding the root cause, I guess the "hdp-select" package was somehow not upgraded properly. We would need to review the detailed logs (such as the yum logs and the Ambari Server/ambari-agent logs) to find out exactly what went wrong.
