<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Add Service on Hortonworks Data Platform in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Add-Service-on-Hortonworks-Data-Platform/m-p/368724#M240243</link>
    <description>&lt;P&gt;I just clicked the "Proceed Anyway" button, and got this error pop-up.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xedonedron_0-1681755464488.png" style="width: 400px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/37315iB16026BC43CD1B66/image-size/medium?v=v2&amp;amp;px=400" role="button" title="xedonedron_0-1681755464488.png" alt="xedonedron_0-1681755464488.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="xedonedron_1-1681755470041.png" style="width: 400px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/37316i7853F5C7AD4E6990/image-size/medium?v=v2&amp;amp;px=400" role="button" title="xedonedron_1-1681755470041.png" alt="xedonedron_1-1681755470041.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;And here is a copy of the output:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;stderr:&lt;BR /&gt;Traceback (most recent call last):&lt;BR /&gt;File "/var/lib/ambari-agent/cache/common-services/AIRFLOW/1.10.0/package/scripts/airflow_scheduler_control.py", line 61, in &amp;lt;module&amp;gt;&lt;BR /&gt;AirflowScheduler().execute()&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute&lt;BR /&gt;method(env)&lt;BR /&gt;File "/var/lib/ambari-agent/cache/common-services/AIRFLOW/1.10.0/package/scripts/airflow_scheduler_control.py", line 14, in install&lt;BR /&gt;self.install_packages(env)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 821, in install_packages&lt;BR /&gt;retry_count=agent_stack_retry_count)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__&lt;BR /&gt;self.env.run()&lt;BR /&gt;File 
"/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run&lt;BR /&gt;self.run_action(resource, action)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action&lt;BR /&gt;provider_action()&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/__init__.py", line 53, in action_install&lt;BR /&gt;self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/yumrpm.py", line 264, in install_package&lt;BR /&gt;self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/__init__.py", line 266, in checked_call_with_retries&lt;BR /&gt;return self._call_with_retries(cmd, is_checked=True, **kwargs)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/__init__.py", line 283, in _call_with_retries&lt;BR /&gt;code, out = func(cmd, **kwargs)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner&lt;BR /&gt;result = function(command, **kwargs)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call&lt;BR /&gt;tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper&lt;BR /&gt;result = _call(command, **kwargs_copy)&lt;BR /&gt;File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 303, in _call&lt;BR /&gt;raise ExecutionFailed(err_msg, code, out, err)&lt;BR /&gt;resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-devel' returned 1. 
One of the configured repositories failed (HDP-2.6-repo-1),&lt;BR /&gt;and yum doesn't have enough cached data to continue. At this point the only&lt;BR /&gt;safe thing yum can do is fail. There are a few ways to work "fix" this:&lt;/P&gt;&lt;P&gt;1. Contact the upstream for the repository and get them to fix the problem.&lt;/P&gt;&lt;P&gt;2. Reconfigure the baseurl/etc. for the repository, to point to a working&lt;BR /&gt;upstream. This is most often useful if you are using a newer&lt;BR /&gt;distribution release than is supported by the repository (and the&lt;BR /&gt;packages for the previous distribution release still work).&lt;/P&gt;&lt;P&gt;3. Run the command with the repository temporarily disabled&lt;BR /&gt;yum --disablerepo=HDP-2.6-repo-1 ...&lt;/P&gt;&lt;P&gt;4. Disable the repository permanently, so yum won't use it by default. Yum&lt;BR /&gt;will then just ignore the repository until you permanently enable it&lt;BR /&gt;again or use --enablerepo for temporary usage:&lt;/P&gt;&lt;P&gt;yum-config-manager --disable HDP-2.6-repo-1&lt;BR /&gt;or&lt;BR /&gt;subscription-manager repos --disable=HDP-2.6-repo-1&lt;/P&gt;&lt;P&gt;5. Configure the failing repository to be skipped, if it is unavailable.&lt;BR /&gt;Note that yum will try to contact the repo. when it runs most commands,&lt;BR /&gt;so will have to try and fail each time (and thus. yum will be be much&lt;BR /&gt;slower). 
If it is a very temporary problem though, this is often a nice&lt;BR /&gt;compromise:&lt;/P&gt;&lt;P&gt;yum-config-manager --save --setopt=HDP-2.6-repo-1.skip_if_unavailable=true&lt;/P&gt;&lt;P&gt;failure: repodata/repomd.xml from HDP-2.6-repo-1: [Errno 256] No more mirrors to try.&lt;BR /&gt;&lt;A href="http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0/repodata/repomd.xml" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0/repodata/repomd.xml&lt;/A&gt;: [Errno 14] HTTP Error 403 - Forbidden&lt;BR /&gt;stdout:&lt;BR /&gt;2023-04-17 18:06:58,026 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -&amp;gt; 2.6&lt;BR /&gt;2023-04-17 18:06:58,039 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf&lt;BR /&gt;2023-04-17 18:06:58,041 - Group['livy'] {}&lt;BR /&gt;2023-04-17 18:06:58,043 - Group['spark'] {}&lt;BR /&gt;2023-04-17 18:06:58,043 - Group['ranger'] {}&lt;BR /&gt;2023-04-17 18:06:58,043 - Group['hdfs'] {}&lt;BR /&gt;2023-04-17 18:06:58,043 - Group['zeppelin'] {}&lt;BR /&gt;2023-04-17 18:06:58,044 - Group['hadoop'] {}&lt;BR /&gt;2023-04-17 18:06:58,044 - Group['users'] {}&lt;BR /&gt;2023-04-17 18:06:58,044 - Group['knox'] {}&lt;BR /&gt;2023-04-17 18:06:58,045 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,047 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,049 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,051 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR 
/&gt;2023-04-17 18:06:58,053 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,055 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,057 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,059 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,061 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'ranger'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,063 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,065 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,068 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,070 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,071 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,073 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,075 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,077 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,079 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,080 - User['sqoop'] {'gid': 'hadoop', 
'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,082 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,084 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,087 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,089 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,091 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}&lt;BR /&gt;2023-04-17 18:06:58,092 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}&lt;BR /&gt;2023-04-17 18:06:58,097 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}&lt;BR /&gt;2023-04-17 18:06:58,102 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if&lt;BR /&gt;2023-04-17 18:06:58,103 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}&lt;BR /&gt;2023-04-17 18:06:58,104 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}&lt;BR /&gt;2023-04-17 18:06:58,107 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}&lt;BR /&gt;2023-04-17 18:06:58,108 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}&lt;BR /&gt;2023-04-17 18:06:58,115 - call returned (0, '1014')&lt;BR /&gt;2023-04-17 18:06:58,116 - 
Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}&lt;BR /&gt;2023-04-17 18:06:58,121 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] due to not_if&lt;BR /&gt;2023-04-17 18:06:58,121 - Group['hdfs'] {}&lt;BR /&gt;2023-04-17 18:06:58,122 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}&lt;BR /&gt;2023-04-17 18:06:58,123 - FS Type:&lt;BR /&gt;2023-04-17 18:06:58,123 - Directory['/etc/hadoop'] {'mode': 0755}&lt;BR /&gt;2023-04-17 18:06:58,153 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}&lt;BR /&gt;2023-04-17 18:06:58,154 - Writing File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] because contents don't match&lt;BR /&gt;2023-04-17 18:06:58,155 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}&lt;BR /&gt;2023-04-17 18:06:58,176 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': '&lt;A href="http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0&lt;/A&gt;', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}&lt;BR /&gt;2023-04-17 18:06:58,190 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=&lt;A href="http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0\" 
target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0\&lt;/A&gt;n\npath=/\nenabled=1\ngpgcheck=0'}&lt;BR /&gt;2023-04-17 18:06:58,191 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match&lt;BR /&gt;2023-04-17 18:06:58,191 - Repository with url &lt;A href="http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.5.0" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.5.0&lt;/A&gt; is not created due to its tags: set([u'GPL'])&lt;BR /&gt;2023-04-17 18:06:58,192 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': '&lt;A href="http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7&lt;/A&gt;', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}&lt;BR /&gt;2023-04-17 18:06:58,198 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=&lt;A href="http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0\" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0\&lt;/A&gt;n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=&lt;A href="http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\&lt;/A&gt;n\npath=/\nenabled=1\ngpgcheck=0'}&lt;BR /&gt;2023-04-17 18:06:58,198 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match&lt;BR /&gt;2023-04-17 18:06:58,199 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 
5}&lt;BR /&gt;2023-04-17 18:06:58,375 - Skipping installation of existing package unzip&lt;BR /&gt;2023-04-17 18:06:58,375 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;BR /&gt;2023-04-17 18:06:58,392 - Skipping installation of existing package curl&lt;BR /&gt;2023-04-17 18:06:58,392 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;BR /&gt;2023-04-17 18:06:58,411 - Skipping installation of existing package hdp-select&lt;BR /&gt;2023-04-17 18:06:58,419 - The repository with version 2.6.5.0-292 for this command has been marked as resolved. It will be used to report the version of the component which was installed&lt;BR /&gt;2023-04-17 18:06:58,431 - Skipping stack-select on AIRFLOW because it does not exist in the stack-select package structure.&lt;BR /&gt;2023-04-17 18:06:58,816 - Package['krb5-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;BR /&gt;2023-04-17 18:06:58,918 - Skipping installation of existing package krb5-devel&lt;BR /&gt;2023-04-17 18:06:58,920 - Package['python-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;BR /&gt;2023-04-17 18:06:58,938 - Installing package python-devel ('/usr/bin/yum -d 0 -e 0 -y install python-devel')&lt;BR /&gt;2023-04-17 18:10:11,740 - Package['sqlite-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;BR /&gt;2023-04-17 18:10:11,764 - Installing package sqlite-devel ('/usr/bin/yum -d 0 -e 0 -y install sqlite-devel')&lt;BR /&gt;2023-04-17 18:10:20,870 - Package['openssl-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;BR /&gt;2023-04-17 18:10:20,888 - Skipping installation of existing package openssl-devel&lt;BR /&gt;2023-04-17 18:10:20,889 - Package['mysql-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;BR /&gt;2023-04-17 18:10:20,909 - Installing package mysql-devel ('/usr/bin/yum -d 0 -e 0 -y install mysql-devel')&lt;BR /&gt;2023-04-17 18:10:30,423 - 
Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-devel' returned 1. Error: mariadb101u-libs conflicts with mysql-community-libs-5.7.42-1.el7.x86_64&lt;BR /&gt;Error: mariadb101u-config conflicts with mysql-community-server-5.7.42-1.el7.x86_64&lt;BR /&gt;Error: mariadb101u-common conflicts with mysql-community-common-5.7.42-1.el7.x86_64&lt;BR /&gt;You could try using --skip-broken to work around the problem&lt;BR /&gt;You could try running: rpm -Va --nofiles --nodigest&lt;BR /&gt;2023-04-17 18:10:30,424 - Failed to install package mysql-devel. Executing '/usr/bin/yum clean metadata'&lt;BR /&gt;2023-04-17 18:10:30,755 - Retrying to install package mysql-devel after 30 seconds&lt;BR /&gt;2023-04-17 18:11:02,195 - The repository with version 2.6.5.0-292 for this command has been marked as resolved. It will be used to report the version of the component which was installed&lt;BR /&gt;2023-04-17 18:11:02,208 - Skipping stack-select on AIRFLOW because it does not exist in the stack-select package structure.&lt;/P&gt;&lt;P&gt;Command failed after 1 tries&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I'm using the Airflow mpack from a public GitHub repository; could that be causing the error? I'm running HDP 2.6.5, and I can switch to HDP 3 if necessary.&lt;/P&gt;</description>
    <pubDate>Mon, 17 Apr 2023 18:24:09 GMT</pubDate>
    <dc:creator>xedonedron</dc:creator>
    <dc:date>2023-04-17T18:24:09Z</dc:date>
  </channel>
</rss>

