
Failed to execute command '/usr/bin/yum -y install hadoop_3_1_0_0_78-yarn'

Explorer

Kindly help:

I encountered the error below while trying to install HDP using Ambari on a single-node server running Oracle Linux 7.5, with Oracle 12c and MySQL:

"RuntimeError: Failed to execute command '/usr/bin/yum -y install hadoop_3_1_0_0_78-yarn', exited with code '1', message: 'Error: Package: hadoop_3_1_0_0_78-hdfs-3.1.1.3.1.0.0-78.x86_64 (HDP-3.1-repo-1)

Requires: libtirpc-devel"

When I tried to install libtirpc-devel, yum responded that the package already exists.
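A quick sanity check here is to confirm which of the two related packages is actually present, since the runtime library (libtirpc) is often installed while the development package (libtirpc-devel) is not, which can make it look as though the package "exists". These are standard rpm/yum commands; the exact output will vary by system:

# rpm -q libtirpc libtirpc-devel        # shows which of the two packages is really installed
# yum list available libtirpc-devel     # checks whether any enabled repo can provide the -devel package
# yum repolist enabled                  # lists the repos yum is currently allowed to use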

stderr:
2019-01-04 14:02:21,247 - The 'hadoop-yarn-timelineserver' component did not advertise a version. This may indicate a problem with the component packaging.
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/YARN/package/scripts/application_timeline_server.py", line 97, in <module>
ApplicationTimelineServer().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/YARN/package/scripts/application_timeline_server.py", line 42, in install
self.install_packages(env)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 849, in install_packages
retry_count=agent_stack_retry_count)
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/packaging.py", line 30, in action_install
self._pkg_manager.install_package(package_name, self.__create_context())
File "/usr/lib/ambari-agent/lib/ambari_commons/repo_manager/yum_manager.py", line 219, in install_package
shell.repository_manager_executor(cmd, self.properties, context)
File "/usr/lib/ambari-agent/lib/ambari_commons/shell.py", line 753, in repository_manager_executor
raise RuntimeError(message)
RuntimeError: Failed to execute command '/usr/bin/yum -y install hadoop_3_1_0_0_78-yarn', exited with code '1', message: 'Error: Package: hadoop_3_1_0_0_78-hdfs-3.1.1.3.1.0.0-78.x86_64 (HDP-3.1-repo-1)

Requires: libtirpc-devel
'
stdout:
2019-01-04 14:02:06,212 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=None -> 3.1
2019-01-04 14:02:06,219 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-01-04 14:02:06,220 - Group['kms'] {}
2019-01-04 14:02:06,221 - Group['livy'] {}
2019-01-04 14:02:06,221 - Group['spark'] {}
2019-01-04 14:02:06,222 - Group['ranger'] {}
2019-01-04 14:02:06,222 - Group['hdfs'] {}
2019-01-04 14:02:06,222 - Group['zeppelin'] {}
2019-01-04 14:02:06,222 - Group['hadoop'] {}
2019-01-04 14:02:06,222 - Group['users'] {}
2019-01-04 14:02:06,222 - Group['knox'] {}
2019-01-04 14:02:06,223 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-04 14:02:06,224 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-04 14:02:06,225 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-04 14:02:06,226 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-04 14:02:06,227 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-04 14:02:06,228 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-01-04 14:02:06,228 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-04 14:02:06,229 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-04 14:02:06,230 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2019-01-04 14:02:06,233 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-01-04 14:02:06,233 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2019-01-04 14:02:06,234 - User['kms'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['kms', 'hadoop'], 'uid': None}
2019-01-04 14:02:06,235 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2019-01-04 14:02:06,236 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-04 14:02:06,237 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2019-01-04 14:02:06,238 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-01-04 14:02:06,239 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-04 14:02:06,240 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2019-01-04 14:02:06,241 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-04 14:02:06,242 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-04 14:02:06,243 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-04 14:02:06,244 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-01-04 14:02:06,245 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2019-01-04 14:02:06,246 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-01-04 14:02:06,247 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2019-01-04 14:02:06,251 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2019-01-04 14:02:06,251 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2019-01-04 14:02:06,252 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-01-04 14:02:06,254 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-01-04 14:02:06,254 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2019-01-04 14:02:06,260 - call returned (0, '1012')
2019-01-04 14:02:06,260 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1012'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2019-01-04 14:02:06,264 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1012'] due to not_if
2019-01-04 14:02:06,264 - Group['hdfs'] {}
2019-01-04 14:02:06,265 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2019-01-04 14:02:06,265 - FS Type: HDFS
2019-01-04 14:02:06,265 - Directory['/etc/hadoop'] {'mode': 0755}
2019-01-04 14:02:06,266 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2019-01-04 14:02:06,279 - Repository['HDP-3.1-repo-2'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.1.0.0', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-2', 'mirror_list': None}
2019-01-04 14:02:06,285 - Repository['HDP-3.1-GPL-repo-2'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.1.0.0', 'action': ['prepare'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-2', 'mirror_list': None}
2019-01-04 14:02:06,287 - Repository['HDP-UTILS-1.1.0.22-repo-2'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-2', 'mirror_list': None}
2019-01-04 14:02:06,289 - Repository[None] {'action': ['create']}
2019-01-04 14:02:06,290 - File['/tmp/tmpyxlbch'] {'content': '[HDP-3.1-repo-2]\nname=HDP-3.1-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.1.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.1-GPL-repo-2]\nname=HDP-3.1-GPL-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.1.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-2]\nname=HDP-UTILS-1.1.0.22-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2019-01-04 14:02:06,290 - Writing File['/tmp/tmpyxlbch'] because contents don't match
2019-01-04 14:02:06,291 - File['/tmp/tmpa1YdC7'] {'content': StaticFile('/etc/yum.repos.d/ambari-hdp-2.repo')}
2019-01-04 14:02:06,291 - Writing File['/tmp/tmpa1YdC7'] because contents don't match
2019-01-04 14:02:06,292 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-01-04 14:02:06,374 - Skipping installation of existing package unzip
2019-01-04 14:02:06,374 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-01-04 14:02:06,381 - Skipping installation of existing package curl
2019-01-04 14:02:06,382 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-01-04 14:02:06,389 - Skipping installation of existing package hdp-select
2019-01-04 14:02:06,438 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2019-01-04 14:02:06,456 - call returned (0, '')
2019-01-04 14:02:06,695 - Command repositories: HDP-3.1-repo-2, HDP-3.1-GPL-repo-2, HDP-UTILS-1.1.0.22-repo-2
2019-01-04 14:02:06,695 - Applicable repositories: HDP-3.1-repo-2, HDP-3.1-GPL-repo-2, HDP-UTILS-1.1.0.22-repo-2
2019-01-04 14:02:06,696 - Looking for matching packages in the following repositories: HDP-3.1-repo-2, HDP-3.1-GPL-repo-2, HDP-UTILS-1.1.0.22-repo-2
2019-01-04 14:02:12,791 - Adding fallback repositories: HDP-3.1-repo-1, HDP-UTILS-1.1.0.22-repo-1, HDP-3.1-GPL-repo-1
2019-01-04 14:02:19,073 - Package['hadoop_3_1_0_0_78-yarn'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-01-04 14:02:19,148 - Installing package hadoop_3_1_0_0_78-yarn ('/usr/bin/yum -y install hadoop_3_1_0_0_78-yarn')
2019-01-04 14:02:21,228 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2019-01-04 14:02:21,247 - call returned (0, '')
2019-01-04 14:02:21,247 - The 'hadoop-yarn-timelineserver' component did not advertise a version. This may indicate a problem with the component packaging.

Command failed after 1 tries


4 Replies

Master Mentor

@rakesh m

You will see an error like the following:

RuntimeError: Failed to execute command '/usr/bin/yum -y install hadoop_3_1_0_0_78-yarn', exited with code '1', message: 'Error: Package: hadoop_3_1_0_0_78-hdfs-3.1.1.3.1.0.0-78.x86_64 (HDP-3.1-repo-1)

Requires: libtirpc-devel

The HDP package has a dependency on the "libtirpc-devel" package.

You can find this package in the base repository of your OS.

Please see the similar discussion below, and install "libtirpc-devel" on your cluster nodes before installing the components:

https://community.hortonworks.com/questions/109996/hdp-26-requires-libtirpc-devel.html
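For a multi-node cluster, a minimal sketch of installing the package on every host over SSH (the hostnames node1, node2, node3 are placeholders, not from this thread):

# for h in node1 node2 node3; do ssh root@$h "yum -y install libtirpc-devel"; done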

Master Mentor

@rakesh m

You can install this package from the OS repo in either of the following ways, as it ships in the optional-RPMs channel of the rhel-server repositories:

Option-1. Enable the optional RPMs for rhel-server:

# yum-config-manager --enable rhui-REGION-rhel-server-optional
# yum install libtirpc-devel
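Note that the rhui-REGION-rhel-server-optional id above is for RHEL instances on AWS. On Oracle Linux 7, which the asker is using, the optional channel has a different id; a sketch assuming the usual Oracle Linux public-yum repo id (verify the exact id with "yum repolist all"):

# yum install yum-utils                             # provides the yum-config-manager tool
# yum-config-manager --enable ol7_optional_latest   # enable the OL7 optional channel (repo id is an assumption)
# yum install libtirpc-devel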



Option-2. Download the package (the quick way) and install it on all the hosts. (The following is just an example; you might have a slightly different version of RHEL.)

With internet access:

# yum install https://www.rpmfind.net/linux/centos/7.5.1804/os/x86_64/Packages/libtirpc-devel-0.2.4-0.10.el7.x86_6...


- If there is no internet access, download the package on a machine that has access, copy it to the server, and run the command below:

# yum localinstall libtirpc-devel*
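Whichever option you use, it is worth verifying the package before hitting Retry in the Ambari install wizard:

# rpm -q libtirpc-devel    # should print the installed version rather than "package ... is not installed"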


Explorer

Thanks for your answer.

I believe that because the OS is Oracle Linux 5.6, or for some other unknown reason, none of the above worked. It worked when I downloaded

libtirpc-devel-0.2.4-0.15.el7.x86_64 (for CentOS 7.6.1810, x86_64)

from https://rpmfind.net/linux/rpm2html/search.php?query=libtirpc-devel%28x86-64%29. I am not sure if this works well in the long run.
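If you keep a manually downloaded CentOS build on an Oracle Linux host, one option (my assumption, not something suggested in this thread) is to pin the package so a later yum update does not replace it with an incompatible build, using the standard versionlock plugin:

# yum install yum-plugin-versionlock    # provides the "yum versionlock" subcommand
# yum versionlock libtirpc-devel        # pin the currently installed version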

Master Mentor

@rakesh m

I do not see Oracle Linux 5.6 among the tested and certified OS versions for HDP 3.

Please check : https://supportmatrix.hortonworks.com/

It would be good to be on a tested and certified version of the OS.