<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Failed to execute command '/usr/bin/yum -y install hadoop_3_1_0_0_78-yarn' in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Failed-to-execute-command-usr-bin-yum-y-install-hadoop-3-1-0/m-p/241654#M203457</link>
    <description>&lt;P&gt;Kindly help:&lt;/P&gt;&lt;P&gt;I encountered the error below when trying to install HDP using Ambari on Oracle Linux 7 with Oracle 12c and MySQL:&lt;/P&gt;&lt;P&gt;"RuntimeError: Failed to execute command '/usr/bin/yum -y install 
hadoop_3_1_0_0_78-yarn', exited with code '1', message: 'Error: Package:
 hadoop_3_1_0_0_78-hdfs-3.1.1.3.1.0.0-78.x86_64 (HDP-3.1-repo-1)&lt;BR /&gt;&lt;BR /&gt;  Requires: libtirpc-devel"&lt;/P&gt;&lt;P&gt;When I tried to install libtirpc-devel, yum reported that the package was already installed.&lt;/P&gt;&lt;P&gt;stderr: &lt;BR /&gt;2019-01-04 14:02:21,247 - The 'hadoop-yarn-timelineserver' component did not advertise a version. This may indicate a problem with the component packaging.&lt;BR /&gt;Traceback (most recent call last):&lt;BR /&gt;  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/YARN/package/scripts/application_timeline_server.py", line 97, in &amp;lt;module&amp;gt;&lt;BR /&gt;  ApplicationTimelineServer().execute()&lt;BR /&gt;  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute&lt;BR /&gt;  method(env)&lt;BR /&gt;  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/YARN/package/scripts/application_timeline_server.py", line 42, in install&lt;BR /&gt;  self.install_packages(env)&lt;BR /&gt;  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 849, in install_packages&lt;BR /&gt;  retry_count=agent_stack_retry_count)&lt;BR /&gt;  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__&lt;BR /&gt;  self.env.run()&lt;BR /&gt;  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run&lt;BR /&gt;  self.run_action(resource, action)&lt;BR /&gt;  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action&lt;BR /&gt;  provider_action()&lt;BR /&gt;  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/packaging.py", line 30, in action_install&lt;BR /&gt;  self._pkg_manager.install_package(package_name, self.__create_context())&lt;BR /&gt;  File "/usr/lib/ambari-agent/lib/ambari_commons/repo_manager/yum_manager.py", line 219, in install_package&lt;BR /&gt;  shell.repository_manager_executor(cmd, self.properties, context)&lt;BR 
/&gt;  File "/usr/lib/ambari-agent/lib/ambari_commons/shell.py", line 753, in repository_manager_executor&lt;BR /&gt;  raise RuntimeError(message)&lt;BR /&gt;RuntimeError: Failed to execute command '/usr/bin/yum -y install hadoop_3_1_0_0_78-yarn', exited with code '1', message: 'Error: Package: hadoop_3_1_0_0_78-hdfs-3.1.1.3.1.0.0-78.x86_64 (HDP-3.1-repo-1)&lt;BR /&gt;&lt;BR /&gt;  Requires: libtirpc-devel&lt;BR /&gt;'&lt;BR /&gt; stdout:&lt;BR /&gt;2019-01-04 14:02:06,212 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=None -&amp;gt; 3.1&lt;BR /&gt;2019-01-04 14:02:06,219 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf&lt;BR /&gt;2019-01-04 14:02:06,220 - Group['kms'] {}&lt;BR /&gt;2019-01-04 14:02:06,221 - Group['livy'] {}&lt;BR /&gt;2019-01-04 14:02:06,221 - Group['spark'] {}&lt;BR /&gt;2019-01-04 14:02:06,222 - Group['ranger'] {}&lt;BR /&gt;2019-01-04 14:02:06,222 - Group['hdfs'] {}&lt;BR /&gt;2019-01-04 14:02:06,222 - Group['zeppelin'] {}&lt;BR /&gt;2019-01-04 14:02:06,222 - Group['hadoop'] {}&lt;BR /&gt;2019-01-04 14:02:06,222 - Group['users'] {}&lt;BR /&gt;2019-01-04 14:02:06,222 - Group['knox'] {}&lt;BR /&gt;2019-01-04 14:02:06,223 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,224 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,225 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,226 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,227 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,228 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': 
None}&lt;BR /&gt;2019-01-04 14:02:06,228 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,229 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,230 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,233 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,233 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,234 - User['kms'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['kms', 'hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,235 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,236 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,237 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,238 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,239 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,240 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,241 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,242 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 
14:02:06,243 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,244 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,245 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}&lt;BR /&gt;2019-01-04 14:02:06,246 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}&lt;BR /&gt;2019-01-04 14:02:06,247 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}&lt;BR /&gt;2019-01-04 14:02:06,251 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if&lt;BR /&gt;2019-01-04 14:02:06,251 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}&lt;BR /&gt;2019-01-04 14:02:06,252 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}&lt;BR /&gt;2019-01-04 14:02:06,254 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}&lt;BR /&gt;2019-01-04 14:02:06,254 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}&lt;BR /&gt;2019-01-04 14:02:06,260 - call returned (0, '1012')&lt;BR /&gt;2019-01-04 14:02:06,260 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1012'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}&lt;BR /&gt;2019-01-04 14:02:06,264 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase 
/home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1012'] due to not_if&lt;BR /&gt;2019-01-04 14:02:06,264 - Group['hdfs'] {}&lt;BR /&gt;2019-01-04 14:02:06,265 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}&lt;BR /&gt;2019-01-04 14:02:06,265 - FS Type: HDFS&lt;BR /&gt;2019-01-04 14:02:06,265 - Directory['/etc/hadoop'] {'mode': 0755}&lt;BR /&gt;2019-01-04 14:02:06,266 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}&lt;BR /&gt;2019-01-04 14:02:06,279 - Repository['HDP-3.1-repo-2'] {'base_url': '&lt;A href="http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.1.0.0" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.1.0.0&lt;/A&gt;', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-2', 'mirror_list': None}&lt;BR /&gt;2019-01-04 14:02:06,285 - Repository['HDP-3.1-GPL-repo-2'] {'base_url': '&lt;A href="http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.1.0.0" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.1.0.0&lt;/A&gt;', 'action': ['prepare'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-2', 'mirror_list': None}&lt;BR /&gt;2019-01-04 14:02:06,287 - Repository['HDP-UTILS-1.1.0.22-repo-2'] {'base_url': '&lt;A href="http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7&lt;/A&gt;', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 
'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-2', 'mirror_list': None}&lt;BR /&gt;2019-01-04 14:02:06,289 - Repository[None] {'action': ['create']}&lt;BR /&gt;2019-01-04 14:02:06,290 - File['/tmp/tmpyxlbch'] {'content': '[HDP-3.1-repo-2]\nname=HDP-3.1-repo-2\nbaseurl=&lt;A href="http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.1.0.0\" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.1.0.0\&lt;/A&gt;n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.1-GPL-repo-2]\nname=HDP-3.1-GPL-repo-2\nbaseurl=&lt;A href="http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.1.0.0\" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.1.0.0\&lt;/A&gt;n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-2]\nname=HDP-UTILS-1.1.0.22-repo-2\nbaseurl=&lt;A href="http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\&lt;/A&gt;n\npath=/\nenabled=1\ngpgcheck=0'}&lt;BR /&gt;2019-01-04 14:02:06,290 - Writing File['/tmp/tmpyxlbch'] because contents don't match&lt;BR /&gt;2019-01-04 14:02:06,291 - File['/tmp/tmpa1YdC7'] {'content': StaticFile('/etc/yum.repos.d/ambari-hdp-2.repo')}&lt;BR /&gt;2019-01-04 14:02:06,291 - Writing File['/tmp/tmpa1YdC7'] because contents don't match&lt;BR /&gt;2019-01-04 14:02:06,292 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;BR /&gt;2019-01-04 14:02:06,374 - Skipping installation of existing package unzip&lt;BR /&gt;2019-01-04 14:02:06,374 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;BR /&gt;2019-01-04 14:02:06,381 - Skipping installation of existing package curl&lt;BR /&gt;2019-01-04 14:02:06,382 - Package['hdp-select'] 
{'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;BR /&gt;2019-01-04 14:02:06,389 - Skipping installation of existing package hdp-select&lt;BR /&gt;2019-01-04 14:02:06,438 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}&lt;BR /&gt;2019-01-04 14:02:06,456 - call returned (0, '')&lt;BR /&gt;2019-01-04 14:02:06,695 - Command repositories: HDP-3.1-repo-2, HDP-3.1-GPL-repo-2, HDP-UTILS-1.1.0.22-repo-2&lt;BR /&gt;2019-01-04 14:02:06,695 - Applicable repositories: HDP-3.1-repo-2, HDP-3.1-GPL-repo-2, HDP-UTILS-1.1.0.22-repo-2&lt;BR /&gt;2019-01-04 14:02:06,696 - Looking for matching packages in the following repositories: HDP-3.1-repo-2, HDP-3.1-GPL-repo-2, HDP-UTILS-1.1.0.22-repo-2&lt;BR /&gt;2019-01-04 14:02:12,791 - Adding fallback repositories: HDP-3.1-repo-1, HDP-UTILS-1.1.0.22-repo-1, HDP-3.1-GPL-repo-1&lt;BR /&gt;2019-01-04 14:02:19,073 - Package['hadoop_3_1_0_0_78-yarn'] {'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;BR /&gt;2019-01-04 14:02:19,148 - Installing package hadoop_3_1_0_0_78-yarn ('/usr/bin/yum -y install hadoop_3_1_0_0_78-yarn')&lt;BR /&gt;2019-01-04 14:02:21,228 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}&lt;BR /&gt;2019-01-04 14:02:21,247 - call returned (0, '')&lt;BR /&gt;2019-01-04 14:02:21,247 - The 'hadoop-yarn-timelineserver' component did not advertise a version. This may indicate a problem with the component packaging.&lt;BR /&gt;&lt;BR /&gt;Command failed after 1 tries&lt;/P&gt;&lt;P&gt;Trying to install HDP through Ambari, I encountered this issue on a single-node server with Oracle Linux 7.5, Oracle 12c, and MySQL.&lt;/P&gt;</description>
    <pubDate>Fri, 16 Sep 2022 14:01:54 GMT</pubDate>
    <dc:creator>rakesh_oth</dc:creator>
    <dc:date>2022-09-16T14:01:54Z</dc:date>
  </channel>
</rss>

