Created 04-14-2023 09:06 PM
I have installed HDP 2.6.5 and 3.0.1 and want to add other services such as Ambari Metrics and Airflow, and to integrate with the HDF/CDF service, i.e. NiFi. I have already chosen the service I want to add (Ambari Metrics or NiFi) and completed the Assign Masters, Slaves, and Clients steps, but I am stuck at Customize Services (which is where the passwords are set).
For the recommended changes, I have already replaced the current values with the recommended values, but I still get an error.
Any advice? Thanks in advance.
Created 04-17-2023 04:52 AM
@xedonedron, Welcome to our community! To help you get the best possible answer, I have tagged in our HDP experts @Kartik_Agarwal @nikhilm @ywu @shehbazk who may be able to assist you further.
Please feel free to provide any additional information or details about your query, and we hope that you will find a satisfactory solution to your question.
Regards,
Vidya Sargur
Created 04-17-2023 05:06 AM
Hello @xedonedron, what error are you receiving? You can safely click "Proceed Anyway" and install the service through Ambari.
Created 04-17-2023 11:24 AM
I just clicked the "Proceed Anyway" button and got this error pop-up.
Here is a copy:
stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/AIRFLOW/1.10.0/package/scripts/airflow_scheduler_control.py", line 61, in <module>
AirflowScheduler().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/AIRFLOW/1.10.0/package/scripts/airflow_scheduler_control.py", line 14, in install
self.install_packages(env)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 821, in install_packages
retry_count=agent_stack_retry_count)
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/__init__.py", line 53, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/yumrpm.py", line 264, in install_package
self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/__init__.py", line 266, in checked_call_with_retries
return self._call_with_retries(cmd, is_checked=True, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/__init__.py", line 283, in _call_with_retries
code, out = func(cmd, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-devel' returned 1. One of the configured repositories failed (HDP-2.6-repo-1),
and yum doesn't have enough cached data to continue. At this point the only
safe thing yum can do is fail. There are a few ways to work "fix" this:
1. Contact the upstream for the repository and get them to fix the problem.
2. Reconfigure the baseurl/etc. for the repository, to point to a working
upstream. This is most often useful if you are using a newer
distribution release than is supported by the repository (and the
packages for the previous distribution release still work).
3. Run the command with the repository temporarily disabled
yum --disablerepo=HDP-2.6-repo-1 ...
4. Disable the repository permanently, so yum won't use it by default. Yum
will then just ignore the repository until you permanently enable it
again or use --enablerepo for temporary usage:
yum-config-manager --disable HDP-2.6-repo-1
or
subscription-manager repos --disable=HDP-2.6-repo-1
5. Configure the failing repository to be skipped, if it is unavailable.
Note that yum will try to contact the repo. when it runs most commands,
so will have to try and fail each time (and thus. yum will be be much
slower). If it is a very temporary problem though, this is often a nice
compromise:
yum-config-manager --save --setopt=HDP-2.6-repo-1.skip_if_unavailable=true
failure: repodata/repomd.xml from HDP-2.6-repo-1: [Errno 256] No more mirrors to try.
http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0/repodata/repomd.xml: [Errno 14] HTTP Error 403 - Forbidden
stdout:
2023-04-17 18:06:58,026 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2023-04-17 18:06:58,039 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2023-04-17 18:06:58,041 - Group['livy'] {}
2023-04-17 18:06:58,043 - Group['spark'] {}
2023-04-17 18:06:58,043 - Group['ranger'] {}
2023-04-17 18:06:58,043 - Group['hdfs'] {}
2023-04-17 18:06:58,043 - Group['zeppelin'] {}
2023-04-17 18:06:58,044 - Group['hadoop'] {}
2023-04-17 18:06:58,044 - Group['users'] {}
2023-04-17 18:06:58,044 - Group['knox'] {}
2023-04-17 18:06:58,045 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,047 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,049 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,051 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,053 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,055 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,057 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2023-04-17 18:06:58,059 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2023-04-17 18:06:58,061 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'ranger'], 'uid': None}
2023-04-17 18:06:58,063 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2023-04-17 18:06:58,065 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2023-04-17 18:06:58,068 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,070 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,071 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,073 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2023-04-17 18:06:58,075 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,077 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,079 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2023-04-17 18:06:58,080 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,082 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,084 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,087 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,089 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,091 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2023-04-17 18:06:58,092 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2023-04-17 18:06:58,097 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2023-04-17 18:06:58,102 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2023-04-17 18:06:58,103 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2023-04-17 18:06:58,104 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2023-04-17 18:06:58,107 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2023-04-17 18:06:58,108 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2023-04-17 18:06:58,115 - call returned (0, '1014')
2023-04-17 18:06:58,116 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2023-04-17 18:06:58,121 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] due to not_if
2023-04-17 18:06:58,121 - Group['hdfs'] {}
2023-04-17 18:06:58,122 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2023-04-17 18:06:58,123 - FS Type:
2023-04-17 18:06:58,123 - Directory['/etc/hadoop'] {'mode': 0755}
2023-04-17 18:06:58,153 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2023-04-17 18:06:58,154 - Writing File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] because contents don't match
2023-04-17 18:06:58,155 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2023-04-17 18:06:58,176 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2023-04-17 18:06:58,190 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2023-04-17 18:06:58,191 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2023-04-17 18:06:58,191 - Repository with url http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.5.0 is not created due to its tags: set([u'GPL'])
2023-04-17 18:06:58,192 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2023-04-17 18:06:58,198 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2023-04-17 18:06:58,198 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2023-04-17 18:06:58,199 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2023-04-17 18:06:58,375 - Skipping installation of existing package unzip
2023-04-17 18:06:58,375 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2023-04-17 18:06:58,392 - Skipping installation of existing package curl
2023-04-17 18:06:58,392 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2023-04-17 18:06:58,411 - Skipping installation of existing package hdp-select
2023-04-17 18:06:58,419 - The repository with version 2.6.5.0-292 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2023-04-17 18:06:58,431 - Skipping stack-select on AIRFLOW because it does not exist in the stack-select package structure.
2023-04-17 18:06:58,816 - Package['krb5-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2023-04-17 18:06:58,918 - Skipping installation of existing package krb5-devel
2023-04-17 18:06:58,920 - Package['python-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2023-04-17 18:06:58,938 - Installing package python-devel ('/usr/bin/yum -d 0 -e 0 -y install python-devel')
2023-04-17 18:10:11,740 - Package['sqlite-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2023-04-17 18:10:11,764 - Installing package sqlite-devel ('/usr/bin/yum -d 0 -e 0 -y install sqlite-devel')
2023-04-17 18:10:20,870 - Package['openssl-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2023-04-17 18:10:20,888 - Skipping installation of existing package openssl-devel
2023-04-17 18:10:20,889 - Package['mysql-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2023-04-17 18:10:20,909 - Installing package mysql-devel ('/usr/bin/yum -d 0 -e 0 -y install mysql-devel')
2023-04-17 18:10:30,423 - Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-devel' returned 1. Error: mariadb101u-libs conflicts with mysql-community-libs-5.7.42-1.el7.x86_64
Error: mariadb101u-config conflicts with mysql-community-server-5.7.42-1.el7.x86_64
Error: mariadb101u-common conflicts with mysql-community-common-5.7.42-1.el7.x86_64
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
2023-04-17 18:10:30,424 - Failed to install package mysql-devel. Executing '/usr/bin/yum clean metadata'
2023-04-17 18:10:30,755 - Retrying to install package mysql-devel after 30 seconds
2023-04-17 18:11:02,195 - The repository with version 2.6.5.0-292 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2023-04-17 18:11:02,208 - Skipping stack-select on AIRFLOW because it does not exist in the stack-select package structure.
Command failed after 1 tries
I am using the Airflow mpack from a public GitHub repository; could that be causing the error? I am on HDP 2.6.5, and I can switch to HDP 3 if necessary.
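As an aside, the pop-up surfaces only the repository failure, but further down in the stdout log the actual blocker for mysql-devel is a package conflict. A small sketch that pulls the conflicting package names out of a saved copy of the log (log.txt is a hypothetical path), followed by the removal that should clear the conflict, assuming nothing else on the host depends on the mariadb101u packages:

```shell
# Save the three conflict lines from the stdout log above into a file:
cat > log.txt <<'EOF'
Error: mariadb101u-libs conflicts with mysql-community-libs-5.7.42-1.el7.x86_64
Error: mariadb101u-config conflicts with mysql-community-server-5.7.42-1.el7.x86_64
Error: mariadb101u-common conflicts with mysql-community-common-5.7.42-1.el7.x86_64
EOF

# List the conflicting package names, one per line:
grep -oE 'mariadb101u-[a-z]+' log.txt | sort -u

# Assuming the mariadb101u packages are not needed, removing them should
# let yum install mysql-devel (destructive, so verify dependents first):
#   sudo yum remove mariadb101u-libs mariadb101u-config mariadb101u-common
#   sudo yum -y install mysql-devel
```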
Created 04-20-2023 01:01 AM
@nikhilm can you help me? I am really waiting for your answer, many thanks in advance.
Created 04-20-2023 05:09 AM
Hello @xedonedron, I believe something is wrong with the repository configured on the hosts. The 403 Forbidden response indicates that either you do not have access to the repository or it is no longer available.
As far as I know, the public repositories for HDP have been disabled.
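Since the stderr shows a 403 on the public Hortonworks URL, it is worth confirming it directly and, if the repo really is gone, repointing the repo file that Ambari writes at a mirror you host yourself. A minimal sketch; the mirror URL is a placeholder, and this works on a local copy of the file (on the cluster hosts it lives in /etc/yum.repos.d/):

```shell
# Confirm the 403 from any cluster host (URL taken from the stderr above):
#   curl -I http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0/repodata/repomd.xml

# Recreate the repo file Ambari writes (contents from the stdout log above):
REPO_FILE=ambari-hdp-1.repo
cat > "$REPO_FILE" <<'EOF'
[HDP-2.6-repo-1]
name=HDP-2.6-repo-1
baseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0
path=/
enabled=1
gpgcheck=0
EOF

# Swap the dead public baseurl for your own mirror ("http://my-local-mirror"
# is a placeholder for wherever you host the HDP 2.6.5.0 packages):
sed 's|^baseurl=.*|baseurl=http://my-local-mirror/HDP/centos7/2.6.5.0|' "$REPO_FILE" > "$REPO_FILE.tmp" \
  && mv "$REPO_FILE.tmp" "$REPO_FILE"

grep '^baseurl=' "$REPO_FILE"
# Then on the real host: sudo yum clean metadata
```

Note that Ambari rewrites this file from its own repository settings on each install, so the repository Base URL should also be updated under Admin > Stack and Versions in the Ambari UI, or the change will be overwritten.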
Created 04-21-2023 09:07 PM
So there is no way to integrate Airflow and NiFi into HDP?