Error installing components

Hi!

I am trying to deploy a new cluster on AWS with the following specs:

RHEL 7.5, Ambari 2.6.2, HDP 2.6.5

After registering the nodes, the installation fails when I add the components. Below is the log from one of the nodes. How can I skip the failing component and continue installing the others? Please help.

2018-06-27 18:05:21,063 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-06-27 18:05:21,069 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-06-27 18:05:21,070 - Group['livy'] {}
2018-06-27 18:05:21,072 - Adding group Group['livy']
2018-06-27 18:05:21,094 - Group['spark'] {}
2018-06-27 18:05:21,094 - Adding group Group['spark']
2018-06-27 18:05:21,109 - Group['hdfs'] {}
2018-06-27 18:05:21,109 - Adding group Group['hdfs']
2018-06-27 18:05:21,124 - Group['hadoop'] {}
2018-06-27 18:05:21,124 - Adding group Group['hadoop']
2018-06-27 18:05:21,138 - Group['users'] {}
2018-06-27 18:05:21,139 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-27 18:05:21,139 - Adding user User['hive']
2018-06-27 18:05:21,169 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-27 18:05:21,170 - Adding user User['zookeeper']
2018-06-27 18:05:21,193 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-27 18:05:21,193 - Adding user User['infra-solr']
2018-06-27 18:05:21,215 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-06-27 18:05:21,216 - Adding user User['oozie']
2018-06-27 18:05:21,238 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-27 18:05:21,238 - Adding user User['ams']
2018-06-27 18:05:21,262 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-06-27 18:05:21,262 - Adding user User['tez']
2018-06-27 18:05:21,285 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-27 18:05:21,285 - Adding user User['livy']
2018-06-27 18:05:21,309 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-27 18:05:21,309 - Adding user User['spark']
2018-06-27 18:05:21,333 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-06-27 18:05:21,333 - Adding user User['ambari-qa']
2018-06-27 18:05:21,365 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-27 18:05:21,365 - Adding user User['kafka']
2018-06-27 18:05:21,389 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-06-27 18:05:21,389 - Adding user User['hdfs']
2018-06-27 18:05:21,412 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-27 18:05:21,412 - Adding user User['sqoop']
2018-06-27 18:05:21,435 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-27 18:05:21,435 - Adding user User['yarn']
2018-06-27 18:05:21,457 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-27 18:05:21,458 - Adding user User['mapred']
2018-06-27 18:05:21,481 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-06-27 18:05:21,481 - Adding user User['hcat']
2018-06-27 18:05:21,505 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-06-27 18:05:21,509 - Writing File['/var/lib/ambari-agent/tmp/changeUid.sh'] because it doesn't exist
2018-06-27 18:05:21,509 - Changing permission for /var/lib/ambari-agent/tmp/changeUid.sh from 644 to 555
2018-06-27 18:05:21,510 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-06-27 18:05:21,514 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-06-27 18:05:21,515 - Group['hdfs'] {}
2018-06-27 18:05:21,515 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-06-27 18:05:21,515 - FS Type: 
2018-06-27 18:05:21,516 - Directory['/etc/hadoop'] {'mode': 0755}
2018-06-27 18:05:21,516 - Creating directory Directory['/etc/hadoop'] since it doesn't exist.
2018-06-27 18:05:21,516 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-06-27 18:05:21,516 - Creating directory Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] since it doesn't exist.
2018-06-27 18:05:21,517 - Changing owner for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hdfs
2018-06-27 18:05:21,517 - Changing group for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hadoop
2018-06-27 18:05:21,517 - Changing permission for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 755 to 1777
2018-06-27 18:05:21,517 - Directory['/var/lib/ambari-agent/tmp/AMBARI-artifacts/'] {'create_parents': True}
2018-06-27 18:05:21,517 - Creating directory Directory['/var/lib/ambari-agent/tmp/AMBARI-artifacts/'] since it doesn't exist.
2018-06-27 18:05:21,518 - File['/var/lib/ambari-agent/tmp/jdk-8u112-linux-x64.tar.gz'] {'content': DownloadSource('http://ip-172-31-7-134.us-east-2.compute.internal:8080/resources/jdk-8u112-linux-x64.tar.gz'), 'not_if': 'test -f /var/lib/ambari-agent/tmp/jdk-8u112-linux-x64.tar.gz'}
2018-06-27 18:05:21,520 - Downloading the file from http://ip-172-31-7-134.us-east-2.compute.internal:8080/resources/jdk-8u112-linux-x64.tar.gz
2018-06-27 18:05:23,877 - File['/var/lib/ambari-agent/tmp/jdk-8u112-linux-x64.tar.gz'] {'mode': 0755}
2018-06-27 18:05:23,877 - Changing permission for /var/lib/ambari-agent/tmp/jdk-8u112-linux-x64.tar.gz from 644 to 755
2018-06-27 18:05:23,878 - Directory['/usr/jdk64'] {}
2018-06-27 18:05:23,878 - Creating directory Directory['/usr/jdk64'] since it doesn't exist.
2018-06-27 18:05:23,878 - Execute[('chmod', 'a+x', u'/usr/jdk64')] {'sudo': True}
2018-06-27 18:05:23,884 - Execute['cd /var/lib/ambari-agent/tmp/jdk_tmp_LrBHXG && tar -xf /var/lib/ambari-agent/tmp/jdk-8u112-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/tmp/jdk_tmp_LrBHXG/* /usr/jdk64'] {}
2018-06-27 18:05:27,401 - Directory['/var/lib/ambari-agent/tmp/jdk_tmp_LrBHXG'] {'action': ['delete']}
2018-06-27 18:05:27,402 - Removing directory Directory['/var/lib/ambari-agent/tmp/jdk_tmp_LrBHXG'] and all its content
2018-06-27 18:05:28,056 - File['/usr/jdk64/jdk1.8.0_112/bin/java'] {'mode': 0755, 'cd_access': 'a'}
2018-06-27 18:05:28,057 - Execute[('chmod', '-R', '755', u'/usr/jdk64/jdk1.8.0_112')] {'sudo': True}
2018-06-27 18:05:28,088 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-06-27 18:05:28,096 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': InlineTemplate(...)}
2018-06-27 18:05:28,096 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because it doesn't exist
2018-06-27 18:05:28,097 - Repository['HDP-2.6-GPL-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.5.0', 'action': ['create'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-06-27 18:05:28,100 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6-GPL-repo-1]\nname=HDP-2.6-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.5.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-06-27 18:05:28,100 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-06-27 18:05:28,100 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-06-27 18:05:28,103 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6-GPL-repo-1]\nname=HDP-2.6-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.5.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-06-27 18:05:28,103 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-06-27 18:05:28,104 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-06-27 18:05:28,239 - Installing package unzip ('/usr/bin/yum -d 0 -e 0 -y install unzip')
2018-06-27 18:05:30,059 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-06-27 18:05:30,095 - Skipping installation of existing package curl
2018-06-27 18:05:30,096 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-06-27 18:05:30,130 - Installing package hdp-select ('/usr/bin/yum -d 0 -e 0 -y install hdp-select')
2018-06-27 18:05:31,377 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2018-06-27 18:05:31,400 - call returned (1, 'Traceback (most recent call last):\n  File "/usr/bin/hdp-select", line 446, in <module>\n    printVersions()\n  File "/usr/bin/hdp-select", line 286, in printVersions\n    for f in os.listdir(root):\nOSError: [Errno 2] No such file or directory: \'/usr/hdp\'')
2018-06-27 18:05:31,615 - Command repositories: HDP-2.6-repo-1, HDP-2.6-GPL-repo-1, HDP-UTILS-1.1.0.22-repo-1
2018-06-27 18:05:31,615 - Applicable repositories: HDP-2.6-repo-1, HDP-2.6-GPL-repo-1, HDP-UTILS-1.1.0.22-repo-1
2018-06-27 18:05:31,617 - Looking for matching packages in the following repositories: HDP-2.6-repo-1, HDP-2.6-GPL-repo-1, HDP-UTILS-1.1.0.22-repo-1
2018-06-27 18:05:34,280 - Package['hadoop_2_6_5_0_292-yarn'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-06-27 18:05:34,412 - Installing package hadoop_2_6_5_0_292-yarn ('/usr/bin/yum -d 0 -e 0 -y install hadoop_2_6_5_0_292-yarn')
2018-06-27 18:06:05,163 - Package['hadoop_2_6_5_0_292-mapreduce'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-06-27 18:06:05,201 - Installing package hadoop_2_6_5_0_292-mapreduce ('/usr/bin/yum -d 0 -e 0 -y install hadoop_2_6_5_0_292-mapreduce')
2018-06-27 18:06:06,663 - Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_2_6_5_0_292-mapreduce' returned 1. Error: Package: hadoop_2_6_5_0_292-hdfs-2.7.3.2.6.5.0-292.x86_64 (HDP-2.6-repo-1)
           Requires: libtirpc-devel
 You could try using --skip-broken to work around the problem
 You could try running: rpm -Va --nofiles --nodigest
2018-06-27 18:06:06,663 - Failed to install package hadoop_2_6_5_0_292-mapreduce. Executing '/usr/bin/yum clean metadata'
2018-06-27 18:06:06,910 - Retrying to install package hadoop_2_6_5_0_292-mapreduce after 30 seconds

Re: Error installing components

@Prakash Punj

You will need to install libtirpc-devel on all hosts in the cluster. You may first need to enable the RHEL optional repos:

subscription-manager repos --enable=rhel-7-server-optional-rpms
subscription-manager repos --enable=rhel-7-server-eus-optional-rpms
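Once the repo is enabled, you can push the package to every node from one machine. This is only a sketch: the host names are placeholders for your cluster FQDNs, and the leading echo makes it a dry run that just prints each command. Remove the echo to actually run the installs (assumes passwordless SSH as root).

```shell
#!/bin/sh
# Dry-run sketch: print the install command for each cluster node.
# HOSTS is a placeholder list -- substitute your own FQDNs.
HOSTS="node1.example.com node2.example.com node3.example.com"

for h in $HOSTS; do
  # Drop the "echo" to execute for real over SSH.
  echo ssh "root@$h" "yum -y install libtirpc-devel"
done
```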

Here is a link that walks you through it:

https://community.hortonworks.com/answers/99526/view.html

HTH
