Ambari Services fail to start with resource_management.core.exceptions.Fail: Cannot match package for regexp name hadooplzo_${stack_version}.

Explorer

 

Ambari services are not coming up because the repo is no longer accessible and is returning an access denied error.

Please advise.

 

File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 538, in format_package_name
raise Fail("Cannot match package for regexp name {0}. Available packages: {1}".format(name, self.available_packages_in_repos))
resource_management.core.exceptions.Fail: Cannot match package for regexp name hadooplzo_${stack_version}.

 

Is this because of the recent repo migration for all HDP versions? How does that impact the functioning of an existing cluster? Please advise. My production cluster is down now, and I would appreciate quick help on this.

My ambari.repo file:

#VERSION_NUMBER=2.6.0.0-267
[ambari-2.6.0.0]
name=ambari Version - ambari-2.6.0.0
baseurl=http://public-repo-1.hortonworks.com/ambari/centos7/2.x/updates/2.6.0.0
gpgcheck=1
gpgkey=http://public-repo-1.hortonworks.com/ambari/centos7/2.x/updates/2.6.0.0/RPM-GPG-KEY/RPM-GPG-KEY-Jenk...
enabled=1
priority=1

 

My ambari-hdp-1.repo file:

[HDP-2.6-repo-1]
name=HDP-2.6-repo-1
baseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0
path=/
enabled=1
gpgcheck=0

[HDP-UTILS-1.1.0.21-repo-1]
name=HDP-UTILS-1.1.0.21-repo-1
baseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7
path=/
enabled=1

 

yum repolist throws 403 Forbidden errors:

http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0/repodata/repomd.xml: [Errno 14] HTTP Error 403 - Forbidden
Trying other mirror.
http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7/repodata/repomd.xml: [Errno 14] HTTP Error 403 - Forbidden
Trying other mirror.
http://public-repo-1.hortonworks.com/ambari/centos7/2.x/updates/2.6.0.0/repodata/repomd.xml: [Errno 14] HTTP Error 403 - Forbidden
Trying other mirror.

 

I registered as a Cloudera user and then tried to access the new repo with those credentials, but again an error is thrown:

403 Forbidden (varnish) the provided credentials were incorrect


Please advise, as my cluster is down now and the Ambari services are not coming up.

 

4 REPLIES

Explorer

@cjervis how does the transition of the repos from public to private have an impact on existing Ambari cluster services?

We are using Ambari version 2.6. We stopped all the Ambari services this morning and are now unable to bring them back up, as they are looking for packages that are not available on the server.

And the public repo access is no longer available.

 

My production cluster is down now. Any help on this? We just stopped the cluster, and now while starting it up it is looking for the package hadooplzo_${stack_version}.

Is there a way to bring the services up?

 

File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 538, in format_package_name
raise Fail("Cannot match package for regexp name {0}. Available packages: {1}".format(name, self.available_packages_in_repos))
resource_management.core.exceptions.Fail: Cannot match package for regexp name hadooplzo_${stack_version}. Available packages: ['atlas-metadata_2_6_3_0_235-falcon-plugin', 'atlas-metadata_2_6_3_0_235-hive-plugin', 'bigtop-jsvc', 'bigtop-tomcat', 'datafu_2_6_3_0_235', 'falcon_2_6_3_0_235', 'hadoop_2_6_3_0_235', 'hadoop_2_6_3_0_235-client', 'hadoop_2_6_3_0_235-hdfs', 'hadoop_2_6_3_0_235-libhdfs', 'hadoop_2_6_3_0_235-mapreduce', 'hadoop_2_6_3_0_235-yarn', 'hbase_2_6_3_0_235', 'hdp-select', 'hive2_2_6_3_0_235', 'hive2_2_6_3_0_235-jdbc', 'hive_2_6_3_0_235', 'hive_2_6_3_0_235-hcatalog', 'hive_2_6_3_0_235-jdbc', 'hive_2_6_3_0_235-webhcat', 'livy2_2_6_3_0_235', 'oozie_2_6_3_0_235', 'oozie_2_6_3_0_235-client', 'oozie_2_6_3_0_235-common', 'oozie_2_6_3_0_235-sharelib', 'oozie_2_6_3_0_235-sharelib-distcp', 'oozie_2_6_3_0_235-sharelib-hcatalog', 'oozie_2_6_3_0_235-sharelib-hive', 'oozie_2_6_3_0_235-sharelib-hive2', 'oozie_2_6_3_0_235-sharelib-mapreduce-streaming', 'oozie_2_6_3_0_235-sharelib-pig', 'oozie_2_6_3_0_235-sharelib-spark', 'oozie_2_6_3_0_235-sharelib-sqoop', 'oozie_2_6_3_0_235-webapp', 'pig_2_6_3_0_235', 'ranger_2_6_3_0_235-hbase-plugin', 'ranger_2_6_3_0_235-hdfs-plugin', 'ranger_2_6_3_0_235-hive-plugin', 'ranger_2_6_3_0_235-yarn-plugin', 'shc_2_6_3_0_235', 'slider_2_6_3_0_235', 'spark2_2_6_3_0_235', 'spark2_2_6_3_0_235-python', 'spark2_2_6_3_0_235-yarn-shuffle', 'spark_2_6_3_0_235-yarn-shuffle', 'spark_llap_2_6_3_0_235', 'storm_2_6_3_0_235-slider-client', 'tez_2_6_3_0_235', 'tez_hive2_2_6_3_0_235', 'zookeeper_2_6_3_0_235', 'zookeeper_2_6_3_0_235-server', 'extjs', 'snappy-devel']


@manojSinghK 

You are probably receiving the "access denied" error because your baseurl settings reference http://public-repo-1.hortonworks.com/ and, as you seem to be aware, authentication is now required for that host. Cloudera recently changed its download policy, and access to HDP-related software now requires a valid subscription. Please see the announcement here: Transition to private repositories for CDH, HDP and HDF.
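For illustration only, a repo stanza pointed at the paywalled archive would look roughly like the sketch below. The archive.cloudera.com/p/ path and the <username>/<password> placeholders are assumptions based on that announcement; confirm the exact URL for your stack version against the documentation before editing anything.

# /etc/yum.repos.d/ambari-hdp-1.repo -- illustrative sketch only, URL not verified
[HDP-2.6-repo-1]
name=HDP-2.6-repo-1
baseurl=https://<username>:<password>@archive.cloudera.com/p/HDP/centos7/2.x/updates/2.6.3.0
path=/
enabled=1
gpgcheck=0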

 

The same announcement describes the new patch releases of Ambari that are required to access Cloudera's private repositories, which now exclusively host both the new and legacy releases of HDP and other HDP-related assets. Toward the end of that announcement you will find a section titled Frequently Asked Questions that explains the required credentials and how to obtain them if you don't have them already (they are generally not the same credentials used to access Cloudera's website).
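As a quick sanity check before editing any repo files, you can verify that the credentials work at all. A minimal sketch, assuming the private archive mirrors the old public path layout (verify the exact URL for your version):

# Expect HTTP 200 if the credentials are valid; 401/403 otherwise.
curl -I -u '<username>:<password>' \
  https://archive.cloudera.com/p/HDP/centos7/2.x/updates/2.6.3.0/repodata/repomd.xml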

 

 

Bill Brooks, Community Moderator
Was your question answered? Make sure to mark the answer as the accepted solution.
If you find a reply useful, say thanks by clicking on the thumbs up button.

Explorer

@ask_bill_brooks Hi Bill, Thanks! I got this.

But why does this have an impact on an already existing setup/cluster?

We have restarted services in the past, but never faced the issue below at the HDFS client install step during Ambari services startup.

 

resource_management.core.exceptions.Fail: Cannot match package for regexp name hadooplzo_${stack_version}.

 

Does Ambari keep updating itself from the repos in the background?

If yes, is there a way to disable it?

 


@manojSinghK 

I obviously don't have access to your logs to prove it, but I strongly suspect you haven't encountered this issue on service restarts in the past because, back then, the repo that Ambari is set to retrieve files from was publicly accessible, and now it is not. Since the file retrieval needed to complete the HDFS client install step fails, your existing cluster appears to be impacted.
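You can reproduce the same failure outside of Ambari. A minimal sketch of the check, using the package name pattern from the error above:

# Refresh the yum metadata and look for the package Ambari is trying to match.
# With the repo returning 403, the metadata fetch fails and no hadooplzo_*
# package shows up, which is exactly what the Ambari error is reporting.
yum clean all
yum repolist enabled
yum list available 'hadooplzo*'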

 

If you want your installation of Ambari to continue retrieving files from the location where it appears you have it configured to retrieve them (i.e., Cloudera's repositories) after January 31, 2021, you will have to have a valid Cloudera subscription.

 

Again, please read the announcement here: Transition to private repositories for CDH, HDP and HDF.

 

The same announcement describes the new patch releases of Ambari that are required to access Cloudera's private repositories, which now contain both the new and legacy releases and other assets, such as those necessary to add a new host to an existing HDP cluster.
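Once you have working credentials, moving to one of those patch releases generally comes down to repointing ambari.repo at the private archive and upgrading the Ambari packages. This is only a rough sketch, not the authoritative procedure; the target version and URL for your environment need to be confirmed against the announcement and the Ambari upgrade documentation:

# 1. Edit /etc/yum.repos.d/ambari.repo so that baseurl points at the private
#    archive (for example https://<username>:<password>@archive.cloudera.com/p/ambari/...).
# 2. Upgrade the Ambari packages and run the server upgrade step.
yum clean all
yum upgrade ambari-server          # on the Ambari server host
yum upgrade ambari-agent           # on every cluster host
ambari-server upgrade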

 

 

Bill Brooks, Community Moderator
Was your question answered? Make sure to mark the answer as the accepted solution.
If you find a reply useful, say thanks by clicking on the thumbs up button.