App Timeline Server Installation failed

New Contributor

I'm installing HDP-2.6.5.0-2557 on a single node through ambari-2.6.2.0-155.x86_64 on CentOS 7.5, and the installation fails during cluster launch. Below is the log.

2018-11-12 16:48:38,378 - The 'hadoop-yarn-timelineserver' component did not advertise a version. This may indicate a problem with the component packaging.
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/application_timeline_server.py", line 89, in <module>
    ApplicationTimelineServer().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/application_timeline_server.py", line 38, in install
    self.install_packages(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 821, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/__init__.py", line 53, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/yumrpm.py", line 264, in install_package
    self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/__init__.py", line 266, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/__init__.py", line 283, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_2_6_0_3_8-yarn' returned 1. No Presto metadata available for HDP-2.6-repo-251
Error downloading packages:
  spark2_2_6_0_3_8-yarn-shuffle-2.1.0.2.6.0.3-8.noarch: [Errno 256] No more mirrors to try.
  ranger_2_6_0_3_8-yarn-plugin-0.7.0.2.6.0.3-8.x86_64: [Errno 256] No more mirrors to try.
stdout: /var/lib/ambari-agent/data/output-1100.txt
2018-11-12 16:26:06,617 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-11-12 16:26:06,623 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-11-12 16:26:06,624 - Group['livy'] {}
2018-11-12 16:26:06,625 - Group['spark'] {}
2018-11-12 16:26:06,626 - Group['hdfs'] {}
2018-11-12 16:26:06,626 - Group['hadoop'] {}
2018-11-12 16:26:06,626 - Group['users'] {}
2018-11-12 16:26:06,627 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-11-12 16:26:06,628 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-11-12 16:26:06,628 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-11-12 16:26:06,629 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-11-12 16:26:06,630 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-11-12 16:26:06,631 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-11-12 16:26:06,632 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-11-12 16:26:06,633 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-11-12 16:26:06,633 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-11-12 16:26:06,634 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-11-12 16:26:06,635 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-11-12 16:26:06,636 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-11-12 16:26:06,637 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-11-12 16:26:06,637 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-11-12 16:26:06,639 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-11-12 16:26:06,640 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-11-12 16:26:06,640 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-11-12 16:26:06,641 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-11-12 16:26:06,643 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-11-12 16:26:06,649 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-11-12 16:26:06,649 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-11-12 16:26:06,651 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-11-12 16:26:06,652 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-11-12 16:26:06,653 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-11-12 16:26:06,663 - call returned (0, '1036')
2018-11-12 16:26:06,663 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1036'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-11-12 16:26:06,668 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1036'] due to not_if
2018-11-12 16:26:06,669 - Group['hdfs'] {}
2018-11-12 16:26:06,669 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-11-12 16:26:06,670 - FS Type: 
2018-11-12 16:26:06,670 - Directory['/etc/hadoop'] {'mode': 0755}
2018-11-12 16:26:06,670 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-11-12 16:26:06,687 - Repository['HDP-2.6-repo-251'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.0.3', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-251', 'mirror_list': None}
2018-11-12 16:26:06,694 - File['/etc/yum.repos.d/ambari-hdp-251.repo'] {'content': '[HDP-2.6-repo-251]\nname=HDP-2.6-repo-251\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.0.3\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-11-12 16:26:06,695 - Writing File['/etc/yum.repos.d/ambari-hdp-251.repo'] because contents don't match
2018-11-12 16:26:06,699 - Repository['HDP-UTILS-1.1.0.21-repo-251'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-251', 'mirror_list': None}
2018-11-12 16:26:06,702 - File['/etc/yum.repos.d/ambari-hdp-251.repo'] {'content': '[HDP-2.6-repo-251]\nname=HDP-2.6-repo-251\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.0.3\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-251]\nname=HDP-UTILS-1.1.0.21-repo-251\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-11-12 16:26:06,702 - Writing File['/etc/yum.repos.d/ambari-hdp-251.repo'] because contents don't match
2018-11-12 16:26:06,746 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-11-12 16:26:06,991 - Skipping installation of existing package unzip
2018-11-12 16:26:06,991 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-11-12 16:26:07,100 - Skipping installation of existing package curl
2018-11-12 16:26:07,101 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-11-12 16:26:07,219 - Skipping installation of existing package hdp-select
2018-11-12 16:26:07,284 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2018-11-12 16:26:07,308 - call returned (0, '')
2018-11-12 16:26:07,512 - Command repositories: HDP-2.6-repo-251, HDP-UTILS-1.1.0.21-repo-251
2018-11-12 16:26:07,513 - Applicable repositories: HDP-2.6-repo-251, HDP-UTILS-1.1.0.21-repo-251
2018-11-12 16:26:07,514 - Looking for matching packages in the following repositories: HDP-2.6-repo-251, HDP-UTILS-1.1.0.21-repo-251
2018-11-12 16:26:10,059 - Package['hadoop_2_6_0_3_8-yarn'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-11-12 16:26:10,298 - Installing package hadoop_2_6_0_3_8-yarn ('/usr/bin/yum -d 0 -e 0 -y install hadoop_2_6_0_3_8-yarn')
2018-11-12 16:37:55,518 - Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_2_6_0_3_8-yarn' returned 1. Error downloading packages:
  spark2_2_6_0_3_8-yarn-shuffle-2.1.0.2.6.0.3-8.noarch: [Errno 256] No more mirrors to try.
  ranger_2_6_0_3_8-yarn-plugin-0.7.0.2.6.0.3-8.x86_64: [Errno 256] No more mirrors to try.
2018-11-12 16:37:55,518 - Failed to install package hadoop_2_6_0_3_8-yarn. Executing '/usr/bin/yum clean metadata'
2018-11-12 16:37:55,972 - Retrying to install package hadoop_2_6_0_3_8-yarn after 30 seconds
2018-11-12 16:48:38,353 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2018-11-12 16:48:38,378 - call returned (0, '')
2018-11-12 16:48:38,378 - The 'hadoop-yarn-timelineserver' component did not advertise a version. This may indicate a problem with the component packaging.

Command failed after 1 tries
5 Replies

Super Collaborator

@Abhishek G

Execute the commands below, then try the installation again:

# yum clean all

# yum clean metadata

New Contributor

@scharan

I tried those commands and got the same result.

Can you have a look at the repos configured on your node:

yum repolist

and delete the repos that are not required,

by navigating to

[root@anaikhdp1 javascripts]# cd /etc/yum.repos.d/
[root@anaikhdp1 yum.repos.d]# ls -lh

and delete any repo file that is old and was not created by you.

Then review the repos you provided while registering the cluster version in the Ambari UI and confirm they are correct.
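The inspection described above can be sketched as a small shell loop. This is only an illustration: the sample repo file and the /tmp path are stand-ins so the snippet is self-contained; on the actual node you would run the loop directly in /etc/yum.repos.d.

```shell
# Illustrative sketch: stage a sample repo file under /tmp so the loop is
# self-contained; on the cluster node, run the loop in /etc/yum.repos.d.
mkdir -p /tmp/yum.repos.d && cd /tmp/yum.repos.d
cat > ambari-hdp-251.repo <<'EOF'
[HDP-2.6-repo-251]
baseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.0.3
enabled=1
EOF

# Print each repo file with its baseurl/enabled lines to spot stale entries.
for f in *.repo; do
  echo "== $f =="
  grep -E '^(baseurl|enabled)=' "$f"
done
```

Any file whose baseurl you do not recognize (or that duplicates an HDP repo under a different id) is a candidate for removal before retrying.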

@Abhishek G, see if this helps you.

New Contributor

@Akhil S Naik

There are only two repo files related to Ambari:

1. ambari.repo

#VERSION_NUMBER=2.6.2.0-155
[ambari-2.6.2.0]
name=ambari Version - ambari-2.6.2.0
baseurl=http://public-repo-1.hortonworks.com/ambari/centos7/2.x/updates/2.6.2.0
gpgcheck=1
gpgkey=http://public-repo-1.hortonworks.com/ambari/centos7/2.x/updates/2.6.2.0/RPM-GPG-KEY/RPM-GPG-KEY-Jenk...
enabled=1
priority=1

2. ambari-hdp-251.repo

[HDP-2.6-repo-251]
name=HDP-2.6-repo-251
baseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.0.3
path=/
enabled=1
gpgcheck=0
[HDP-UTILS-1.1.0.21-repo-251]
name=HDP-UTILS-1.1.0.21-repo-251
baseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7
path=/
enabled=1
gpgcheck=0

There is a version difference in the baseurl (the HDP repo points at 2.6.0.3, while I'm installing HDP 2.6.5.0). Is that affecting my installation?
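One quick way to see that mismatch is to pull the version string out of the baseurl. This is a sketch using the repo file content quoted above; the /tmp path is only for illustration (on the node, point it at /etc/yum.repos.d/ambari-hdp-251.repo).

```shell
# Sketch: extract the HDP version embedded in the repo baseurl and compare it
# by eye with the stack version being installed (2.6.5.0 per the question).
cat > /tmp/ambari-hdp-251.repo <<'EOF'
[HDP-2.6-repo-251]
baseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.0.3
EOF
repo_ver=$(grep -oP 'updates/\K[0-9.]+' /tmp/ambari-hdp-251.repo)
echo "repo baseurl version: $repo_ver"   # 2.6.0.3
```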

BR

Abhishek G
