
Ambari 2.6.5 Install Fail - Multiple Ambari-HDP repos

New Contributor

Hi all,

I have reinstalled Ambari and am trying to set up a three-node cluster. However, I keep getting the same error on the last step about multiple ambari-hdp repo files (ambari-hdp-1 and -2 before, and now -51). I am running CentOS 7 and the install is being done offline through local repositories.

I know this has been covered in another thread, but the answer there is not clear to me. My HDP, HDP-UTILS and SOLR repo URLs are correct.

Any help please?

Thank you!

2020-04-15 15:50:40,562 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2020-04-15 15:50:40,566 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2020-04-15 15:50:40,567 - Group['livy'] {}
2020-04-15 15:50:40,609 - Adding group Group['livy']
2020-04-15 15:50:42,177 - Group['spark'] {}
2020-04-15 15:50:42,177 - Adding group Group['spark']
2020-04-15 15:50:42,200 - Group['solr'] {}
2020-04-15 15:50:42,201 - Adding group Group['solr']
2020-04-15 15:50:42,229 - Group['hdfs'] {}
2020-04-15 15:50:42,230 - Adding group Group['hdfs']
2020-04-15 15:50:42,257 - Group['hadoop'] {}
2020-04-15 15:50:42,257 - Group['users'] {}
2020-04-15 15:50:42,258 - Group['knox'] {}
2020-04-15 15:50:42,258 - Adding group Group['knox']
2020-04-15 15:50:42,283 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-04-15 15:50:42,284 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-04-15 15:50:42,285 - Adding user User['zookeeper']
2020-04-15 15:50:42,374 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2020-04-15 15:50:42,377 - Adding user User['oozie']
2020-04-15 15:50:42,453 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-04-15 15:50:42,455 - Adding user User['ams']
2020-04-15 15:50:42,543 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2020-04-15 15:50:42,544 - Adding user User['tez']
2020-04-15 15:50:42,671 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-04-15 15:50:42,672 - Adding user User['livy']
2020-04-15 15:50:42,791 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-04-15 15:50:42,793 - Adding user User['spark']
2020-04-15 15:50:42,900 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2020-04-15 15:50:42,901 - Adding user User['ambari-qa']
2020-04-15 15:50:43,027 - User['solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-04-15 15:50:43,028 - Adding user User['solr']
2020-04-15 15:50:43,159 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-04-15 15:50:43,161 - Adding user User['kafka']
2020-04-15 15:50:43,270 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2020-04-15 15:50:43,271 - Adding user User['hdfs']
2020-04-15 15:50:43,365 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-04-15 15:50:43,366 - Adding user User['sqoop']
2020-04-15 15:50:43,447 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-04-15 15:50:43,449 - Adding user User['yarn']
2020-04-15 15:50:43,537 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-04-15 15:50:43,538 - Adding user User['mapred']
2020-04-15 15:50:43,615 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-04-15 15:50:43,617 - Adding user User['hbase']
2020-04-15 15:50:43,696 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-04-15 15:50:43,697 - Adding user User['knox']
2020-04-15 15:50:43,784 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-04-15 15:50:43,786 - Adding user User['hcat']
2020-04-15 15:50:43,871 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-04-15 15:50:43,873 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2020-04-15 15:50:43,876 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2020-04-15 15:50:43,876 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2020-04-15 15:50:43,876 - Changing owner for /tmp/hbase-hbase from 59918 to hbase
2020-04-15 15:50:43,876 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-04-15 15:50:43,877 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-04-15 15:50:43,878 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2020-04-15 15:50:43,882 - call returned (0, '59912')
2020-04-15 15:50:43,883 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 59912'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2020-04-15 15:50:43,886 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 59912'] due to not_if
2020-04-15 15:50:43,886 - Group['hdfs'] {}
2020-04-15 15:50:43,886 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2020-04-15 15:50:43,886 - FS Type: 
2020-04-15 15:50:43,887 - Directory['/etc/hadoop'] {'mode': 0755}
2020-04-15 15:50:43,887 - Creating directory Directory['/etc/hadoop'] since it doesn't exist.
2020-04-15 15:50:43,887 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2020-04-15 15:50:43,887 - Changing owner for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 59914 to hdfs
2020-04-15 15:50:43,898 - Repository['HDP-2.6-repo-51'] {'append_to_file': False, 'base_url': 'http://ohdpmgrl01.ont.hostname.nl/web/HDP/centos7/2.6.5.0-292/', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-51', 'mirror_list': None}
2020-04-15 15:50:43,903 - File['/etc/yum.repos.d/ambari-hdp-51.repo'] {'content': InlineTemplate(...)}
2020-04-15 15:50:43,903 - Writing File['/etc/yum.repos.d/ambari-hdp-51.repo'] because it doesn't exist
2020-04-15 15:50:43,903 - Repository['HDP-SOLR-3.0.0-100-repo-51'] {'append_to_file': True, 'base_url': 'http://ohdpmgrl01.ont.hostname.nl/web/centos7/', 'action': ['create'], 'components': [u'HDP-SOLR', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-51', 'mirror_list': None}
2020-04-15 15:50:43,906 - File['/etc/yum.repos.d/ambari-hdp-51.repo'] {'content': '[HDP-2.6-repo-51]\nname=HDP-2.6-repo-51\nbaseurl=http://ohdpmgrl01.ont.hostname.nl/web/HDP/centos7/2.6.5.0-292/\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-SOLR-3.0.0-100-repo-51]\nname=HDP-SOLR-3.0.0-100-repo-51\nbaseurl=http://ohdpmgrl01.ont.hostname.nl/web/centos7/\n\npath=/\nenabled=1\ngpgcheck=0'}
2020-04-15 15:50:43,906 - Writing File['/etc/yum.repos.d/ambari-hdp-51.repo'] because contents don't match
2020-04-15 15:50:43,906 - Repository['HDP-UTILS-1.1.0.21-repo-51'] {'append_to_file': True, 'base_url': 'http://ohdpmgrl01.ont.hostname.nl/web/HDP-UTILS/centos7/1.1.0.22/', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-51', 'mirror_list': None}
2020-04-15 15:50:43,908 - File['/etc/yum.repos.d/ambari-hdp-51.repo'] {'content': '[HDP-2.6-repo-51]\nname=HDP-2.6-repo-51\nbaseurl=http://ohdpmgrl01.ont.hostname.nl/web/HDP/centos7/2.6.5.0-292/\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-SOLR-3.0.0-100-repo-51]\nname=HDP-SOLR-3.0.0-100-repo-51\nbaseurl=http://ohdpmgrl01.ont.hostname.nl/web/centos7/\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-51]\nname=HDP-UTILS-1.1.0.21-repo-51\nbaseurl=http://ohdpmgrl01.ont.hostname.nl/web/HDP-UTILS/centos7/1.1.0.22/\n\npath=/\nenabled=1\ngpgcheck=0'}
2020-04-15 15:50:43,908 - Writing File['/etc/yum.repos.d/ambari-hdp-51.repo'] because contents don't match
2020-04-15 15:50:43,909 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-04-15 15:50:44,326 - Skipping installation of existing package unzip
2020-04-15 15:50:44,327 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-04-15 15:50:44,645 - Skipping installation of existing package curl
2020-04-15 15:50:44,645 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-04-15 15:50:44,970 - Skipping installation of existing package hdp-select
2 ACCEPTED SOLUTIONS

Super Guru

@Soc You should go to

/etc/yum.repos.d/

and clean up your repo files. Delete any that you do not need.

Next, execute:

yum clean all

Then try again.
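The steps above can be sketched as a short shell session. This is a hypothetical cleanup, assuming (per the log) that ambari-hdp-51.repo is the file the current Ambari run writes and that the older ambari-hdp-*.repo files are the stale duplicates; adjust the kept filename to match your cluster:

```shell
# Run as root on each affected node.
cd /etc/yum.repos.d

# Inspect the Ambari-generated repo files first (read-only, safe):
ls -l ambari-hdp-*.repo

# Remove every ambari-hdp repo file except the one for the current run
# (ambari-hdp-51.repo in the log above -- an assumption, verify yours):
for f in ambari-hdp-*.repo; do
  [ "$f" = "ambari-hdp-51.repo" ] || rm -f "$f"
done

# Flush yum's cached repo metadata so the deleted repos disappear:
yum clean all
```

After this, re-run the failed install step from Ambari; it will regenerate the single repo file it needs.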


New Contributor

I solved this one. It had to do with hdp-select: it had to be removed from the manager and the agents (and then, of course, the yum repos cleaned up), and the same hdp-select version installed on all of them to resolve the repo issue.
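A minimal sketch of that fix, assuming root access and that it is run on the Ambari server and every agent host (the exact package name hdp-select is taken from the log above; whether your local repo serves it must be verified):

```shell
# Remove the existing hdp-select package:
yum remove -y hdp-select

# Clear the stale Ambari-generated repo files and yum's cache:
rm -f /etc/yum.repos.d/ambari-hdp-*.repo
yum clean all

# Reinstall hdp-select so every node ends up on the same version
# (served from the local offline repository):
yum install -y hdp-select
```

Running the same sequence on all nodes keeps the hdp-select versions in sync, which is what resolved the repo error here.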

