Support Questions


Hello,

This is not the first time I have run into this issue, and I need a solution to reduce the downtime.

After each OS upgrade I am unable to start any service. Each time I get this error message:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/yarn_client.py", line 62, in <module>
    YarnClient().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/yarn_client.py", line 34, in install
    self.install_packages(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 811, in install_packages
    name = self.format_package_name(package['name'])
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 546, in format_package_name
    raise Fail("Cannot match package for regexp name {0}. Available packages: {1}".format(name, self.available_packages_in_repos))
resource_management.core.exceptions.Fail: Cannot match package for regexp name hadoop_${stack_version}-yarn. Available packages: ['accumulo', 'accumulo-conf-standalone', 'accumulo-source', 'accumulo_2_6_5_0_292', 'accumulo_2_6_5_0_292-conf-standalone', 'accumulo_2_6_5_0_292-source', 'atlas-metadata', 'atlas-metadata-falcon-plugin', 'atlas-metadata-hive-plugin', 'atlas-metadata-sqoop-plugin', 'atlas-metadata-storm-plugin', 'atlas-metadata_2_6_5_0_292', 'atlas-metadata_2_6_5_0_292-falcon-plugin', 'atlas-metadata_2_6_5_0_292-storm-plugin', 'bigtop-jsvc', 'bigtop-tomcat', 'datafu', 'druid', 'druid_2_6_5_0_292', 'falcon', 'falcon-doc', 'falcon_2_6_5_0_292', 'falcon_2_6_5_0_292-doc', 'flume', 'flume-agent', 'flume_2_6_5_0_292', 'flume_2_6_5_0_292-agent', 'hadoop', 'hadoop-client', 'hadoop-conf-pseudo', 'hadoop-doc', 'hadoop-hdfs', 'hadoop-hdfs-datanode', 'hadoop-hdfs-fuse', 'hadoop-hdfs-journalnode', 'hadoop-hdfs-namenode', 'hadoop-hdfs-secondarynamenode', 'hadoop-hdfs-zkfc', 'hadoop-httpfs', 'hadoop-httpfs-server', 'hadoop-libhdfs', 'hadoop-mapreduce', 'hadoop-mapreduce-historyserver', 'hadoop-source', 'hadoop-yarn', 'hadoop-yarn-nodemanager', 'hadoop-yarn-proxyserver', 'hadoop-yarn-resourcemanager', 'hadoop-yarn-timelineserver', 'hadoop_2_6_5_0_292-conf-pseudo', 'hadoop_2_6_5_0_292-doc', 'hadoop_2_6_5_0_292-hdfs-datanode', 'hadoop_2_6_5_0_292-hdfs-fuse', 'hadoop_2_6_5_0_292-hdfs-journalnode', 'hadoop_2_6_5_0_292-hdfs-namenode', 'hadoop_2_6_5_0_292-hdfs-secondarynamenode', 'hadoop_2_6_5_0_292-hdfs-zkfc', 'hadoop_2_6_5_0_292-httpfs', 'hadoop_2_6_5_0_292-httpfs-server', 'hadoop_2_6_5_0_292-mapreduce-historyserver', 'hadoop_2_6_5_0_292-source', 'hadoop_2_6_5_0_292-yarn-nodemanager', 'hadoop_2_6_5_0_292-yarn-proxyserver', 'hadoop_2_6_5_0_292-yarn-resourcemanager', 'hadoop_2_6_5_0_292-yarn-timelineserver', 'hbase', 'hbase-doc', 'hbase-master', 'hbase-regionserver', 'hbase-rest', 'hbase-thrift', 'hbase-thrift2', 'hbase_2_6_5_0_292-doc', 'hbase_2_6_5_0_292-master', 'hbase_2_6_5_0_292-regionserver', 'hbase_2_6_5_0_292-rest', 'hbase_2_6_5_0_292-thrift', 'hbase_2_6_5_0_292-thrift2', 'hive', 'hive-hcatalog', 'hive-hcatalog-server', 'hive-jdbc', 'hive-metastore', 'hive-server', 'hive-server2', 'hive-webhcat', 'hive-webhcat-server', 'hive2', 'hive2-jdbc', 'hive_2_6_5_0_292-hcatalog-server', 'hive_2_6_5_0_292-metastore', 'hive_2_6_5_0_292-server', 'hive_2_6_5_0_292-server2', 'hive_2_6_5_0_292-webhcat-server', 'hue', 'hue-beeswax', 'hue-common', 'hue-hcatalog', 'hue-oozie', 'hue-pig', 'hue-server', 'kafka', 'kafka_2_6_5_0_292', 'knox', 'knox_2_6_5_0_292', 'livy', 'livy2', 'livy_2_6_5_0_292', 'mahout', 'mahout-doc', 'mahout_2_6_5_0_292', 'mahout_2_6_5_0_292-doc', 'oozie', 'oozie-client', 'oozie-common', 'oozie-sharelib', 'oozie-sharelib-distcp', 'oozie-sharelib-hcatalog', 'oozie-sharelib-hive', 'oozie-sharelib-hive2', 'oozie-sharelib-mapreduce-streaming', 'oozie-sharelib-pig', 'oozie-sharelib-spark', 'oozie-sharelib-sqoop', 'oozie-webapp', 'oozie_2_6_5_0_292', 'oozie_2_6_5_0_292-client', 'oozie_2_6_5_0_292-common', 'oozie_2_6_5_0_292-sharelib', 'oozie_2_6_5_0_292-sharelib-distcp', 'oozie_2_6_5_0_292-sharelib-hcatalog', 'oozie_2_6_5_0_292-sharelib-hive', 'oozie_2_6_5_0_292-sharelib-hive2', 'oozie_2_6_5_0_292-sharelib-mapreduce-streaming', 'oozie_2_6_5_0_292-sharelib-pig', 'oozie_2_6_5_0_292-sharelib-spark', 'oozie_2_6_5_0_292-sharelib-sqoop', 'oozie_2_6_5_0_292-webapp', 'phoenix', 'phoenix-queryserver', 'phoenix_2_6_5_0_292', 'phoenix_2_6_5_0_292-queryserver', 'pig', 'ranger-admin', 'ranger-atlas-plugin', 
'ranger-hbase-plugin', 'ranger-hdfs-plugin', 'ranger-hive-plugin', 'ranger-kafka-plugin', 'ranger-kms', 'ranger-knox-plugin', 'ranger-solr-plugin', 'ranger-storm-plugin', 'ranger-tagsync', 'ranger-usersync', 'ranger-yarn-plugin', 'ranger_2_6_5_0_292-admin', 'ranger_2_6_5_0_292-atlas-plugin', 'ranger_2_6_5_0_292-kafka-plugin', 'ranger_2_6_5_0_292-kms', 'ranger_2_6_5_0_292-knox-plugin', 'ranger_2_6_5_0_292-solr-plugin', 'ranger_2_6_5_0_292-storm-plugin', 'ranger_2_6_5_0_292-tagsync', 'ranger_2_6_5_0_292-usersync', 'shc', 'slider', 'spark', 'spark-history-server', 'spark-master', 'spark-python', 'spark-worker', 'spark-yarn-shuffle', 'spark2', 'spark2-history-server', 'spark2-master', 'spark2-python', 'spark2-worker', 'spark2-yarn-shuffle', 'spark2_2_6_5_0_292-history-server', 'spark2_2_6_5_0_292-master', 'spark2_2_6_5_0_292-worker', 'spark_2_6_5_0_292', 'spark_2_6_5_0_292-history-server', 'spark_2_6_5_0_292-master', 'spark_2_6_5_0_292-python', 'spark_2_6_5_0_292-worker', 'spark_llap', 'sqoop', 'sqoop-metastore', 'sqoop_2_6_5_0_292-metastore', 'storm', 'storm-slider-client', 'storm_2_6_5_0_292', 'superset', 'superset_2_6_5_0_292', 'tez', 'tez_hive2', 'zeppelin', 'zeppelin_2_6_5_0_292', 'zookeeper', 'zookeeper-server', 'Uploading', 'Report', 'langpacks,', 'R', 'R-core', 'R-core-devel', 'R-devel', 'R-java', 'R-java-devel', 'compat-readline5', 'epel-release', 'extjs', 'fping', 'ganglia-debuginfo', 'ganglia-devel', 'ganglia-gmetad', 'ganglia-gmond', 'ganglia-gmond-modules-python', 'ganglia-web', 'hadoop-lzo', 'hadoop-lzo-native', 'libRmath', 'libRmath-devel', 'libconfuse', 'libganglia', 'libgenders', 'lua-rrdtool', 'lucidworks-hdpsearch', 'lzo-debuginfo', 'lzo-devel', 'lzo-minilzo', 'nagios', 'nagios-debuginfo', 'nagios-devel', 'nagios-plugins', 'nagios-plugins-debuginfo', 'nagios-www', 'pdsh', 'perl-rrdtool', 'python-rrdtool', 'rrdtool', 'rrdtool-debuginfo', 'rrdtool-devel', 'ruby-rrdtool', 'snappy', 'snappy-devel', 'tcl-rrdtool', 'Uploading', 'Report', 'langpacks,', 'R', 'R-core', 'R-core-devel', 'R-devel', 'R-java', 'R-java-devel', 'compat-readline5', 'epel-release', 'extjs', 'fping', 'ganglia-debuginfo', 'ganglia-devel', 'ganglia-gmetad', 'ganglia-gmond', 'ganglia-gmond-modules-python', 'ganglia-web', 'hadoop-lzo', 'hadoop-lzo-native', 'libRmath', 'libRmath-devel', 'libconfuse', 'libganglia', 'libgenders', 'lua-rrdtool', 'lucidworks-hdpsearch', 'lzo-debuginfo', 'lzo-devel', 'lzo-minilzo', 'nagios', 'nagios-debuginfo', 'nagios-devel', 'nagios-plugins', 'nagios-plugins-debuginfo', 'nagios-www', 'pdsh', 'perl-rrdtool', 'python-rrdtool', 'rrdtool', 'rrdtool-debuginfo', 'rrdtool-devel', 'ruby-rrdtool', 'snappy', 'snappy-devel', 'tcl-rrdtool', 'Uploading', 'Report', 'langpacks,']

This error means that the package name built from the hadoop_${stack_version}-yarn pattern is not found in the list of packages Ambari considers available.
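
To double-check that this is a matching problem rather than a genuinely missing package, I verify on the node that the concrete name the pattern should resolve to is still known to yum (the package name below is just the one from my cluster, and the repo check is simply how I look at it, not an Ambari procedure):

yum list installed 'hadoop_2_6_5_0_292-yarn*'
yum repolist enabled | grep HDP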

In my opinion the error comes from the name of the yum repository: when we upgrade the OS, the release version of the previous OS gets appended to the name of the (old) yum repository.

Example:

Before OS upgrade:

yum list installed | grep  hadoop_2_6_5
hadoop_2_6_5_0_292.x86_64         2.7.3.2.6.5.0-292      @HDP-2.6-repo-101
hadoop_2_6_5_0_292-client.x86_64  2.7.3.2.6.5.0-292      @HDP-2.6-repo-101
hadoop_2_6_5_0_292-hdfs.x86_64    2.7.3.2.6.5.0-292      @HDP-2.6-repo-101
hadoop_2_6_5_0_292-libhdfs.x86_64 2.7.3.2.6.5.0-292      @HDP-2.6-repo-101
hadoop_2_6_5_0_292-yarn.x86_64    2.7.3.2.6.5.0-292      @HDP-2.6-repo-101

After OS upgrade:

yum list installed | grep  hadoop_2_6_5
hadoop_2_6_5_0_292.x86_64         2.7.3.2.6.5.0-292      @HDP-2.6-repo-101/7.4
hadoop_2_6_5_0_292-client.x86_64  2.7.3.2.6.5.0-292      @HDP-2.6-repo-101/7.4
hadoop_2_6_5_0_292-hdfs.x86_64    2.7.3.2.6.5.0-292      @HDP-2.6-repo-101/7.4
hadoop_2_6_5_0_292-libhdfs.x86_64 2.7.3.2.6.5.0-292      @HDP-2.6-repo-101/7.4
hadoop_2_6_5_0_292-yarn.x86_64    2.7.3.2.6.5.0-292      @HDP-2.6-repo-101/7.4
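
As far as I understand, the extra "/7.4" reflects the per-package install metadata that yum keeps in its yumdb (from_repo, releasever, and so on), which I can inspect like this; whether rewriting those entries would be enough for Ambari is only a guess on my part, I have not tried it:

yumdb info hadoop_2_6_5_0_292-yarn
# shows, among others, the from_repo and releasever recorded at install time
# hypothetical and untested: realign the recorded value with the current release
# yumdb set releasever <current_releasever> hadoop_2_6_5_0_292-yarn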

Do you have a solution? Is it possible to tell Ambari not to check the repository version? Is it possible to refresh the yum database?
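
By refreshing I mean roughly the following; it rebuilds the yum metadata cache, but I do not know whether it changes what the Ambari agent sees (the agent restart is just my assumption about forcing it to rebuild its package listing):

yum clean all
yum makecache
ambari-agent restart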

The only solution I have found so far is to reinstall all the HDP packages 😞
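
For reference, that workaround looks roughly like this on each node (the package set varies per host, and this is exactly the downtime I am trying to avoid):

yum reinstall 'hadoop_2_6_5_0_292*'
# ... and likewise for the other HDP stack packages installed on the host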

1 REPLY


I am hitting this issue again after another OS upgrade.

Do you have any idea how to fix it?