Support Questions


Cluster installation fails at the Install, start, test step.

Explorer

Hi Geeks,

I am trying to set up a Hortonworks cluster using Ambari 2.7.0.0 and HDP 3.0.0.0 with the public repositories, but it keeps failing at the Install, Start and Test step with the errors below. None of the components are getting installed.

stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stack-hooks/before-INSTALL/scripts/hook.py", line 37, in <module>
    BeforeInstallHook().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 363, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 222, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 109, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 223, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 147, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/bin/hdp-select

stdout:

2018-08-10 16:30:56,129 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2018-08-10 16:30:56,134 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-08-10 16:30:56,135 - Group['hdfs'] {}
2018-08-10 16:30:56,136 - Group['hadoop'] {}
2018-08-10 16:30:56,136 - Group['users'] {}
2018-08-10 16:30:56,136 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-10 16:30:56,138 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-10 16:30:56,139 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-10 16:30:56,140 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-10 16:30:56,141 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-08-10 16:30:56,143 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-08-10 16:30:56,144 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-08-10 16:30:56,145 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2018-08-10 16:30:56,146 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-10 16:30:56,147 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-10 16:30:56,148 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-08-10 16:30:56,149 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-08-10 16:30:56,151 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-08-10 16:30:56,156 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-08-10 16:30:56,156 - Group['hdfs'] {}
2018-08-10 16:30:56,156 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2018-08-10 16:30:56,157 - FS Type: HDFS
2018-08-10 16:30:56,157 - Directory['/etc/hadoop'] {'mode': 0755}
2018-08-10 16:30:56,158 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-08-10 16:30:56,169 - Repository['HDP-3.0-repo-2'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-2', 'mirror_list': None}
2018-08-10 16:30:56,175 - File['/etc/yum.repos.d/ambari-hdp-2.repo'] {'content': '[HDP-3.0-repo-2]\nname=HDP-3.0-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-08-10 16:30:56,176 - Writing File['/etc/yum.repos.d/ambari-hdp-2.repo'] because contents don't match
2018-08-10 16:30:56,176 - Repository['HDP-3.0-GPL-repo-2'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.0.0', 'action': ['create'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-2', 'mirror_list': None}
2018-08-10 16:30:56,179 - File['/etc/yum.repos.d/ambari-hdp-2.repo'] {'content': '[HDP-3.0-repo-2]\nname=HDP-3.0-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-GPL-repo-2]\nname=HDP-3.0-GPL-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-08-10 16:30:56,179 - Writing File['/etc/yum.repos.d/ambari-hdp-2.repo'] because contents don't match
2018-08-10 16:30:56,180 - Repository['HDP-UTILS-1.1.0.22-repo-2'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-2', 'mirror_list': None}
2018-08-10 16:30:56,182 - File['/etc/yum.repos.d/ambari-hdp-2.repo'] {'content': '[HDP-3.0-repo-2]\nname=HDP-3.0-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-GPL-repo-2]\nname=HDP-3.0-GPL-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-2]\nname=HDP-UTILS-1.1.0.22-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-08-10 16:30:56,183 - Writing File['/etc/yum.repos.d/ambari-hdp-2.repo'] because contents don't match
2018-08-10 16:30:56,183 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-10 16:30:56,429 - Skipping installation of existing package unzip
2018-08-10 16:30:56,429 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-10 16:30:56,529 - Skipping installation of existing package curl
2018-08-10 16:30:56,530 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-10 16:30:56,630 - Skipping installation of existing package hdp-select

Command failed after 1 tries

1 ACCEPTED SOLUTION

Explorer

Hi @Akhil, thank you for the response. Yes, this answer was really helpful. One thing I wanted to add: hdp-select was already installed with the same version, yet the step was still failing. Once I uninstalled hdp-select, Ambari installed a fresh copy and it started working. Thank you once again!


3 REPLIES


Hi @Alpesh Ghori,

Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-08-10 16:30:56,630 - Skipping installation of existing package hdp-select
Command failed after 1 tries

This error indicates that the hdp-select command is not working on your nodes.
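As a quick first check, a small guarded script like the one below (a sketch, not an official Ambari tool) distinguishes "not installed" from "installed but broken". On your failing nodes I would expect the second case:

```shell
# Sketch: quick health check for hdp-select on a node.
# Prints one of three diagnoses based on whether the binary exists
# and whether it can actually be executed.
if ! command -v hdp-select >/dev/null 2>&1; then
  echo "hdp-select is not installed"
elif ! hdp-select versions >/dev/null 2>&1; then
  echo "hdp-select is installed but failing"
else
  echo "hdp-select looks healthy"
fi
```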

Can you manually SSH into one of the nodes and try running that command yourself?

Additionally, run the commands below to check the hdp-select installation. For comparison, here is the output from a working node:

[root@anaik1 /]# hdp-select
[root@anaik1 /]# ls -lart /usr/hdp/
total 12
drwxr-xr-x.  3 root root   16 Jul 17 05:56 share
drwxr-xr-x. 15 root root 4096 Jul 17 05:56 ..
drwxr-xr-x.  5 root root   51 Jul 17 05:57 .
drwxr-xr-x. 26 root root 4096 Aug  8 08:42 3.0.0.0-1634
drwxr-xr-x.  3 root root 4096 Aug  8 08:48 current
[root@anaik1 /]# rpm -qa | grep hdp-select
hdp-select-3.0.0.0-1634.el7.centos.noarch

In the output above, the packages are installed from the HDP 3.0.0 repo only.

The "rpm -qa | grep hdp-select" command shows whether the hdp-select binary was installed from the HDP 3.0.0 repo or from an older repo. If the versions do not match, you may need to reinstall hdp-select so that it matches your current HDP version (3.0.0).
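To make the comparison concrete, here is a sketch that pulls the version out of the rpm package name and compares it with the stack version the wizard is installing. The sample package string is the one from my session above; on your node substitute the output of "rpm -qa | grep hdp-select":

```shell
# On a real node: pkg=$(rpm -qa | grep hdp-select)
# Here we use the package name from the session above as sample input.
pkg="hdp-select-3.0.0.0-1634.el7.centos.noarch"

installed="${pkg#hdp-select-}"   # strip name prefix -> 3.0.0.0-1634.el7.centos.noarch
installed="${installed%%-*}"     # keep version only  -> 3.0.0.0
expected="3.0.0.0"               # the HDP stack version being installed

if [ "$installed" = "$expected" ]; then
  echo "hdp-select matches stack $expected"
else
  echo "version mismatch: installed $installed, expected $expected"
fi
# -> hdp-select matches stack 3.0.0.0
```

If this reports a mismatch, that old hdp-select package is the likely culprit for the "Unable to query for supported packages" failure.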

You can also check whether any yum repo file under "/etc/yum.repos.d" is still pointing to an old HDP repo. If so, edit those files and set "enabled=0" in them, so that only the HDP 3.0.0 repo file is enabled when multiple HDP repos are present.

Then run

# yum clean all
# yum repolist
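If you prefer not to edit the repo files by hand, the enabled flag can be flipped with sed. The sketch below works on a throwaway copy in /tmp so it is safe to try; the section name and baseurl are made up for illustration, and on a real node you would target the stale files under /etc/yum.repos.d/ instead:

```shell
# Create a throwaway repo file standing in for a stale HDP 2.x repo
# (section name and baseurl are illustrative, not from this thread).
cat > /tmp/old-hdp.repo <<'EOF'
[HDP-2.6-repo-1]
name=HDP-2.6-repo-1
baseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0
enabled=1
gpgcheck=0
EOF

# Disable the repo by flipping enabled=1 to enabled=0 in place:
sed -i 's/^enabled=1$/enabled=0/' /tmp/old-hdp.repo
grep '^enabled=' /tmp/old-hdp.repo   # -> enabled=0
```

After disabling the old repos for real, run "yum clean all" and "yum repolist" as above so yum picks up the change.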

Please feel free to log in and accept the answer if you found it helpful; it will encourage other users to follow this troubleshooting when they face the same issue.



Hi @Alpesh Ghori ,

Glad that it helped.

Cheers