
hcat client install failure with Ambari 2.6

New Contributor

I am using the public repository.

But "No package found for hive2_${stack_version}(hive2_(\d|_)+$)" appeared in ambari-server.log and the hcat client failed to install:

stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hcat_client.py", line 79, in <module>
    HCatClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hcat_client.py", line 35, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 803, in install_packages
    name = self.format_package_name(package['name'])
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 538, in format_package_name
    raise Fail("Cannot match package for regexp name {0}. Available packages: {1}".format(name, self.available_packages_in_repos))
resource_management.core.exceptions.Fail: Cannot match package for regexp name hive2_${stack_version}. Available packages: ['accumulo', 'accumulo-conf-standalone', 'accumulo-source', 'accumulo_2_6_3_0_235', 'accumulo_2_6_3_0_235-conf-standalone', 'accumulo_2_6_3_0_235-source', 'atlas-metadata', 'atlas-metadata-falcon-plugin', 'atlas-metadata-hive-plugin', 'atlas-metadata-sqoop-plugin', 'atlas-metadata-storm-plugin', 'atlas-metadata_2_6_3_0_235', 'atlas-metadata_2_6_3_0_235-falcon-plugin', 'atlas-metadata_2_6_3_0_235-sqoop-plugin', 'atlas-metadata_2_6_3_0_235-storm-plugin', 'bigtop-tomcat', 'datafu', 'datafu_2_6_3_0_235', 'druid', 'druid_2_6_3_0_235', 'falcon', 'falcon-doc', 'falcon_2_6_3_0_235', 'falcon_2_6_3_0_235-doc', 'flume', 'flume-agent', 'flume_2_6_3_0_235', 'flume_2_6_3_0_235-agent', 'hadoop', 'hadoop-client', 'hadoop-conf-pseudo', 'hadoop-doc', 'hadoop-hdfs', 'hadoop-hdfs-datanode', 'hadoop-hdfs-fuse', 'hadoop-hdfs-journalnode', 'hadoop-hdfs-namenode', 'hadoop-hdfs-secondarynamenode', 'hadoop-hdfs-zkfc', 'hadoop-httpfs', 'hadoop-httpfs-server', 'hadoop-libhdfs', 'hadoop-mapreduce', 'hadoop-mapreduce-historyserver', 'hadoop-source', 'hadoop-yarn', 'hadoop-yarn-nodemanager', 'hadoop-yarn-proxyserver', 'hadoop-yarn-resourcemanager', 'hadoop-yarn-timelineserver', 'hadoop_2_6_3_0_235-conf-pseudo', 'hadoop_2_6_3_0_235-doc', 'hadoop_2_6_3_0_235-hdfs-datanode', 'hadoop_2_6_3_0_235-hdfs-fuse', 'hadoop_2_6_3_0_235-hdfs-journalnode', 'hadoop_2_6_3_0_235-hdfs-namenode', 'hadoop_2_6_3_0_235-hdfs-secondarynamenode', 'hadoop_2_6_3_0_235-hdfs-zkfc', 'hadoop_2_6_3_0_235-httpfs', 'hadoop_2_6_3_0_235-httpfs-server', 'hadoop_2_6_3_0_235-libhdfs', 'hadoop_2_6_3_0_235-mapreduce-historyserver', 'hadoop_2_6_3_0_235-source', 'hadoop_2_6_3_0_235-yarn-nodemanager', 'hadoop_2_6_3_0_235-yarn-proxyserver', 'hadoop_2_6_3_0_235-yarn-resourcemanager', 'hadoop_2_6_3_0_235-yarn-timelineserver', 'hadooplzo', 'hadooplzo-native', 'hadooplzo_2_6_3_0_235', 'hadooplzo_2_6_3_0_235-native', 'hbase', 'hbase-doc', 'hbase-master', 'hbase-regionserver', 'hbase-rest', 'hbase-thrift', 'hbase-thrift2', 'hbase_2_6_3_0_235', 'hbase_2_6_3_0_235-doc', 'hbase_2_6_3_0_235-master', 'hbase_2_6_3_0_235-regionserver', 'hbase_2_6_3_0_235-rest', 'hbase_2_6_3_0_235-thrift', 'hbase_2_6_3_0_235-thrift2', 'hive', 'hive-hcatalog', 'hive-hcatalog-server', 'hive-jdbc', 'hive-metastore', 'hive-server', 'hive-server2', 'hive-webhcat', 'hive-webhcat-server', 'hue', 'hue-beeswax', 'hue-common', 'hue-hcatalog', 'hue-oozie', 'hue-pig', 'hue-server', 'kafka', 'kafka_2_6_3_0_235', 'knox', 'knox_2_6_3_0_235', 'livy', 'livy2', 'livy2_2_6_3_0_235', 'livy_2_6_3_0_235', 'mahout', 'mahout-doc', 'mahout_2_6_3_0_235', 'mahout_2_6_3_0_235-doc', 'oozie', 'oozie-client', 'oozie-common', 'oozie-sharelib', 'oozie-sharelib-distcp', 'oozie-sharelib-hcatalog', 'oozie-sharelib-hive', 'oozie-sharelib-hive2', 'oozie-sharelib-mapreduce-streaming', 'oozie-sharelib-pig', 'oozie-sharelib-spark', 'oozie-sharelib-sqoop', 'oozie-webapp', 'oozie_2_6_3_0_235', 'oozie_2_6_3_0_235-client', 'oozie_2_6_3_0_235-common', 'oozie_2_6_3_0_235-sharelib', 'oozie_2_6_3_0_235-sharelib-distcp', 'oozie_2_6_3_0_235-sharelib-hcatalog', 'oozie_2_6_3_0_235-sharelib-hive', 'oozie_2_6_3_0_235-sharelib-hive2', 'oozie_2_6_3_0_235-sharelib-mapreduce-streaming', 'oozie_2_6_3_0_235-sharelib-pig', 'oozie_2_6_3_0_235-sharelib-spark', 'oozie_2_6_3_0_235-sharelib-sqoop', 'oozie_2_6_3_0_235-webapp', 'phoenix', 'phoenix_2_6_3_0_235', 'pig', 'ranger-admin', 'ranger-atlas-plugin', 'ranger-hbase-plugin', 'ranger-hdfs-plugin', 'ranger-hive-plugin', 'ranger-kafka-plugin', 'ranger-kms', 'ranger-knox-plugin', 'ranger-solr-plugin', 'ranger-storm-plugin', 'ranger-tagsync', 'ranger-usersync', 'ranger-yarn-plugin', 'ranger_2_6_3_0_235-admin', 'ranger_2_6_3_0_235-atlas-plugin', 'ranger_2_6_3_0_235-hbase-plugin', 'ranger_2_6_3_0_235-kafka-plugin', 'ranger_2_6_3_0_235-kms', 'ranger_2_6_3_0_235-knox-plugin', 'ranger_2_6_3_0_235-solr-plugin', 'ranger_2_6_3_0_235-storm-plugin', 'ranger_2_6_3_0_235-tagsync', 'ranger_2_6_3_0_235-usersync', 'shc', 'shc_2_6_3_0_235', 'slider', 'slider_2_6_3_0_235', 'spark', 'spark-master', 'spark-python', 'spark-worker', 'spark-yarn-shuffle', 'spark2', 'spark2-master', 'spark2-python', 'spark2-worker', 'spark2-yarn-shuffle', 'spark2_2_6_3_0_235', 'spark2_2_6_3_0_235-master', 'spark2_2_6_3_0_235-python', 'spark2_2_6_3_0_235-worker', 'spark_2_6_3_0_235', 'spark_2_6_3_0_235-master', 'spark_2_6_3_0_235-python', 'spark_2_6_3_0_235-worker', 'spark_llap', 'spark_llap_2_6_3_0_235', 'sqoop', 'sqoop-metastore', 'sqoop_2_6_3_0_235', 'sqoop_2_6_3_0_235-metastore', 'storm', 'storm-slider-client', 'storm_2_6_3_0_235', 'storm_2_6_3_0_235-slider-client', 'superset', 'superset_2_6_3_0_235', 'tez', 'tez_hive2', 'zeppelin', 'zeppelin_2_6_3_0_235', 'zookeeper', 'zookeeper-server', 'zookeeper_2_6_3_0_235-server', 'R', 'R-core', 'R-core-devel', 'R-devel', 'R-java', 'R-java-devel', 'compat-readline5', 'extjs', 'fping', 'ganglia-debuginfo', 'ganglia-devel', 'ganglia-gmetad', 'ganglia-gmond', 'ganglia-gmond-modules-python', 'ganglia-web', 'hadoop-lzo', 'hadoop-lzo-native', 'libRmath', 'libRmath-devel', 'libconfuse', 'libganglia', 'libgenders', 'lua-rrdtool', 'lucidworks-hdpsearch', 'lzo-debuginfo', 'lzo-devel', 'lzo-minilzo', 'mysql-community-release', 'mysql-connector-java', 'nagios', 'nagios-debuginfo', 'nagios-devel', 'nagios-plugins', 'nagios-plugins-debuginfo', 'nagios-www', 'openblas', 'openblas-Rblas', 'openblas-devel', 'openblas-openmp', 'openblas-openmp64', 'openblas-openmp64_', 'openblas-serial64', 'openblas-serial64_', 'openblas-static', 'openblas-threads', 'openblas-threads64', 'openblas-threads64_', 'pdsh', 'perl-Crypt-DES', 'perl-Net-SNMP', 'perl-rrdtool', 'python-rrdtool', 'rrdtool', 'rrdtool-debuginfo', 'rrdtool-devel', 'ruby-rrdtool', 'snappy', 'snappy-devel', 'snappy-devel', 'tcl-rrdtool', 'hdp-select', 'hive2', 'hive2-jdbc', 'hive_2_6_3_0_235', 'hive_2_6_3_0_235-hcatalog', 'hive_2_6_3_0_235-hcatalog-server', 'hive_2_6_3_0_235-jdbc', 'hive_2_6_3_0_235-metastore', 'hive_2_6_3_0_235-server', 'hive_2_6_3_0_235-server2', 'hive_2_6_3_0_235-webhcat', 'hive_2_6_3_0_235-webhcat-server', 'tez_2_6_3_0_235']

stdout:

2017-11-09 10:05:49,139 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2017-11-09 10:05:49,144 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2017-11-09 10:05:49,145 - Group['hdfs'] {}
2017-11-09 10:05:49,146 - Group['hadoop'] {}
2017-11-09 10:05:49,146 - Group['users'] {}
2017-11-09 10:05:49,147 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-11-09 10:05:49,147 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-11-09 10:05:49,148 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-11-09 10:05:49,149 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2017-11-09 10:05:49,149 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2017-11-09 10:05:49,150 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2017-11-09 10:05:49,151 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-11-09 10:05:49,151 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-11-09 10:05:49,152 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2017-11-09 10:05:49,152 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-11-09 10:05:49,154 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-11-09 10:05:49,158 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2017-11-09 10:05:49,158 - Group['hdfs'] {}
2017-11-09 10:05:49,159 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2017-11-09 10:05:49,159 - FS Type:
2017-11-09 10:05:49,159 - Directory['/etc/hadoop'] {'mode': 0755}
2017-11-09 10:05:49,173 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-11-09 10:05:49,174 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-11-09 10:05:49,187 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2017-11-09 10:05:49,193 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-11-09 10:05:49,194 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2017-11-09 10:05:49,194 - Repository['HDP-UTILS-1.1.0.21-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2017-11-09 10:05:49,197 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-1]\nname=HDP-UTILS-1.1.0.21-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-11-09 10:05:49,197 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2017-11-09 10:05:49,198 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-11-09 10:05:49,375 - Skipping installation of existing package unzip
2017-11-09 10:05:49,375 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-11-09 10:05:49,469 - Skipping installation of existing package curl
2017-11-09 10:05:49,469 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-11-09 10:05:49,566 - Skipping installation of existing package hdp-select
2017-11-09 10:05:49,570 - The repository with version 2.6.3.0-235 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2017-11-09 10:05:49,781 - MariaDB RedHat Support: false
2017-11-09 10:05:49,785 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2017-11-09 10:05:49,796 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2017-11-09 10:05:49,818 - call returned (0, 'hive-server2 - 2.6.3.0-235')
2017-11-09 10:05:49,818 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2017-11-09 10:05:49,850 - Command repositories: HDP-2.6-repo-1, HDP-UTILS-1.1.0.21-repo-1
2017-11-09 10:05:49,850 - Applicable repositories: HDP-2.6-repo-1, HDP-UTILS-1.1.0.21-repo-1
2017-11-09 10:05:49,852 - Looking for matching packages in the following repositories: HDP-2.6-repo-1, HDP-UTILS-1.1.0.21-repo-1
2017-11-09 10:05:51,537 - Package['hive_2_6_3_0_235'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-11-09 10:05:51,707 - Skipping installation of existing package hive_2_6_3_0_235
2017-11-09 10:05:51,710 - Package['hive_2_6_3_0_235-hcatalog'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-11-09 10:05:51,804 - Skipping installation of existing package hive_2_6_3_0_235-hcatalog
2017-11-09 10:05:51,806 - Package['hive_2_6_3_0_235-webhcat'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-11-09 10:05:51,907 - Skipping installation of existing package hive_2_6_3_0_235-webhcat
2017-11-09 10:05:51,909 - No package found for hive2_${stack_version}(hive2_(\d|_)+$)
2017-11-09 10:05:51,911 - The repository with version 2.6.3.0-235 for this command has been marked as resolved. It will be used to report the version of the component which was installed

Command failed after 1 tries
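For context on what the error means: Ambari expands the package stub hive2_${stack_version} into the regexp hive2_(\d|_)+$ and matches it against the available-package list above. No versioned hive2 build (e.g. hive2_2_6_3_0_235) appears in that list -- only the unversioned hive2 and hive2-jdbc -- so the match fails. You can run the same check from the shell on the affected node (a minimal sketch; this yum invocation is an editorial suggestion, not part of the original post):

    # Show every hive2 package the enabled repos advertise; if no
    # hive2_2_6_... name appears, Ambari's regexp has nothing to match.
    yum list available --showduplicates 'hive2*'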

1 ACCEPTED SOLUTION

Contributor

Try running the steps below on the node where you're installing hive2.

1) yum list installed | grep hive -- make sure the repo is listed as @HDP-2.6-repo-1. If a package just says "installed" (with no repo tag), do the steps below.

2) yum-complete-transaction -- it is important to run this.

3) yum remove hive2_ .... -- remove every component that shows "installed" but without a proper repo tag.

4) Go back to Ambari and run the install again.

This issue can happen with almost any component when a yum transaction is interrupted or killed, leaving packages in a quasi-installed state.
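Put together, the recovery sequence looks roughly like this (a sketch assuming a CentOS/RHEL 7 node; yum-complete-transaction comes from the yum-utils package, and the hive2_* glob is illustrative -- remove whichever components showed up without a repo tag):

    # 1) A healthy package shows @HDP-2.6-repo-1 in the repo column;
    #    a package from a broken transaction just says "installed".
    yum list installed | grep hive

    # 2) Finish or clean up any interrupted yum transactions.
    yum-complete-transaction

    # 3) Remove the quasi-installed versioned hive2 packages.
    yum remove 'hive2_*'

    # 4) Then retry the component install from the Ambari UI.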


16 REPLIES

Super Guru

@sk kumar,

Did you try running the steps @bmasna mentioned? That should work.


@sk kumar,
How did you fix the issue with zookeeper?

Contributor

Thanks, @Aditya Sirna. I've just tried that. It worked.

Thanks @bmasna, it worked for me.


Hello, when I install Ambari 2.6, the zookeeper package is not available. The error says:

=======================================================================================

stderr: /var/lib/ambari-agent/data/errors-29.txt
2018-10-28 22:45:43,494 - ============lljjyy1 2.6.5.0
2018-10-28 22:45:43,494 - ============lljjyy2 2.6.5.0
2018-10-28 22:45:43,494 - ============lljjyy3 2.6.5.0
2018-10-28 22:45:45,497 - The 'zookeeper-client' component did not advertise a version. This may indicate a problem with the component packaging.
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/ZOOKEEPER/3.4.5/package/scripts/zookeeper_client.py", line 79, in <module>
    ZookeeperClient().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/ZOOKEEPER/3.4.5/package/scripts/zookeeper_client.py", line 59, in install
    self.install_packages(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 812, in install_packages
    name = self.format_package_name(package['name'])
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 547, in format_package_name
    raise Fail("Cannot match package for regexp name {0}. Available packages: {1}".format(name, self.available_packages_in_repos))
resource_management.core.exceptions.Fail: Cannot match package for regexp name zookeeper_${stack_version}. Available packages: ['openblas', 'openblas-Rblas', 'openblas-devel', 'openblas-openmp', 'openblas-openmp64', 'openblas-openmp64_', 'openblas-serial64', 'openblas-serial64_', 'openblas-static', 'openblas-threads', 'openblas-threads64', 'openblas-threads64_', 'snappy', 'snappy-devel', 'snappy-devel']
stdout: /var/lib/ambari-agent/data/output-29.txt
2018-10-28 22:45:43,032 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-10-28 22:45:43,035 - Group['hadoop'] {}
2018-10-28 22:45:43,036 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-10-28 22:45:43,036 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-10-28 22:45:43,037 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-10-28 22:45:43,037 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-10-28 22:45:43,038 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-10-28 22:45:43,044 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-10-28 22:45:43,056 - Repository for HDP/2.6.5.0/HDP-2.6 is not managed by Ambari
2018-10-28 22:45:43,057 - Repository for HDP/2.6.5.0/HDP-2.6-GPL is not managed by Ambari
2018-10-28 22:45:43,057 - Repository for HDP/2.6.5.0/HDP-UTILS-1.1.0.22 is not managed by Ambari
2018-10-28 22:45:43,057 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-10-28 22:45:43,178 - Skipping installation of existing package unzip
2018-10-28 22:45:43,178 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-10-28 22:45:43,233 - Skipping installation of existing package curl
2018-10-28 22:45:43,233 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-10-28 22:45:43,291 - Skipping installation of existing package hdp-select
2018-10-28 22:45:43,343 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2018-10-28 22:45:43,363 - call returned (0, '')
2018-10-28 22:45:43,494 - ============lljjyy1 2.6.5.0
2018-10-28 22:45:43,494 - ============lljjyy2 2.6.5.0
2018-10-28 22:45:43,494 - ============lljjyy3 2.6.5.0
2018-10-28 22:45:43,494 - Command repositories: HDP-2.6, HDP-2.6-GPL, HDP-UTILS-1.1.0.22
2018-10-28 22:45:43,494 - Applicable repositories: HDP-2.6, HDP-2.6-GPL, HDP-UTILS-1.1.0.22
2018-10-28 22:45:43,496 - Looking for matching packages in the following repositories: HDP-2.6, HDP-2.6-GPL, HDP-UTILS-1.1.0.22
2018-10-28 22:45:45,422 - No package found for zookeeper_${stack_version}(zookeeper_(\d|_)+$)
2018-10-28 22:45:45,478 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2018-10-28 22:45:45,497 - call returned (0, '')
2018-10-28 22:45:45,497 - The 'zookeeper-client' component did not advertise a version. This may indicate a problem with the component packaging.

Command failed after 1 tries
==================================================================================================
Can you help me?
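One thing worth checking before editing anything: the log above says the HDP repositories are "not managed by Ambari", and the only visible packages are leftovers like openblas and snappy, so the node's yum configuration may simply not be serving the HDP 2.6.5.0 packages at all. A quick check (a sketch, assuming shell access to the node):

    # List the repos yum actually has enabled; an HDP-2.6 repo should appear.
    yum repolist enabled

    # See whether any versioned zookeeper build is visible at all.
    yum list available 'zookeeper*'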

Contributor

@Li Jiayan Can you try the change below and then retry the install?

Edit "/usr/lib/python2.6/site-packages/resource_management/libraries/script/http://script.py" python script and comment line "package_version = None" (line# 541).


When I installed HDP-GPL earlier, I used my own local yum repository; I solved it by going back to the defaults. Thanks.

Contributor

@Aditya Sirna & @sk kumar, good to know it worked for you 🙂