
Ambari failed to install spark-atlas-connector


New Contributor

Hi,

I am currently stuck in the Ambari 2.7.4 cluster install wizard with HDP 3.0.1.0. One of the machines is failing to install the spark-atlas-connector:

 

stderr: 
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/spark_client.py", line 55, in <module>
    SparkClient().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/spark_client.py", line 34, in install
    self.install_packages(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 849, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/packaging.py", line 30, in action_install
    self._pkg_manager.install_package(package_name, self.__create_context())
  File "/usr/lib/ambari-agent/lib/ambari_commons/repo_manager/yum_manager.py", line 219, in install_package
    shell.repository_manager_executor(cmd, self.properties, context)
  File "/usr/lib/ambari-agent/lib/ambari_commons/shell.py", line 753, in repository_manager_executor
    raise RuntimeError(message)
RuntimeError: Failed to execute command '/usr/bin/yum -y install spark-atlas-connector_3_0_1_0_187', exited with code '1', message: 'Error: Nothing to do
'
 stdout:
2020-01-24 00:03:53,623 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2020-01-24 00:03:53,626 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2020-01-24 00:03:53,628 - Group['kms'] {}
2020-01-24 00:03:53,628 - Group['livy'] {}
2020-01-24 00:03:53,628 - Group['spark'] {}
2020-01-24 00:03:53,629 - Group['ranger'] {}
2020-01-24 00:03:53,629 - Group['hdfs'] {}
2020-01-24 00:03:53,629 - Group['zeppelin'] {}
2020-01-24 00:03:53,629 - Group['hadoop'] {}
2020-01-24 00:03:53,629 - Group['users'] {}
2020-01-24 00:03:53,629 - Group['knox'] {}
2020-01-24 00:03:53,629 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,630 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,631 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,631 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,632 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,633 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2020-01-24 00:03:53,633 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,634 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,635 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2020-01-24 00:03:53,635 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2020-01-24 00:03:53,636 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2020-01-24 00:03:53,637 - User['kms'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['kms', 'hadoop'], 'uid': None}
2020-01-24 00:03:53,637 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,638 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2020-01-24 00:03:53,638 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,639 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2020-01-24 00:03:53,640 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2020-01-24 00:03:53,640 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,641 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2020-01-24 00:03:53,642 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,642 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,643 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,643 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2020-01-24 00:03:53,644 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2020-01-24 00:03:53,645 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-01-24 00:03:53,646 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2020-01-24 00:03:53,656 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2020-01-24 00:03:53,656 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2020-01-24 00:03:53,657 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-01-24 00:03:53,658 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-01-24 00:03:53,658 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2020-01-24 00:03:53,674 - call returned (0, '1045')
2020-01-24 00:03:53,675 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1045'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2020-01-24 00:03:53,685 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1045'] due to not_if
2020-01-24 00:03:53,685 - Group['hdfs'] {}
2020-01-24 00:03:53,686 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2020-01-24 00:03:53,687 - FS Type: HDFS
2020-01-24 00:03:53,687 - Directory['/etc/hadoop'] {'mode': 0755}
2020-01-24 00:03:53,705 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2020-01-24 00:03:53,705 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2020-01-24 00:03:53,720 - Repository['HDP-3.0-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2020-01-24 00:03:53,730 - Repository['HDP-3.0-GPL-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.1.0', 'action': ['prepare'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2020-01-24 00:03:53,732 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2020-01-24 00:03:53,734 - Repository[None] {'action': ['create']}
2020-01-24 00:03:53,735 - File['/tmp/tmp6siNUu'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0\n\npath=/\nenabled=1\ngpgcheck=...'}
2020-01-24 00:03:53,735 - Writing File['/tmp/tmp6siNUu'] because contents don't match
2020-01-24 00:03:53,735 - File['/tmp/tmpZE7pez'] {'content': StaticFile('/etc/yum.repos.d/ambari-hdp-1.repo')}
2020-01-24 00:03:53,736 - Writing File['/tmp/tmpZE7pez'] because contents don't match
2020-01-24 00:03:53,736 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-01-24 00:03:53,867 - Skipping installation of existing package unzip
2020-01-24 00:03:53,867 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-01-24 00:03:53,939 - Skipping installation of existing package curl
2020-01-24 00:03:53,939 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-01-24 00:03:54,011 - Skipping installation of existing package hdp-select
2020-01-24 00:03:54,014 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2020-01-24 00:03:54,252 - Package['spark2_3_0_1_0_187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-01-24 00:03:54,380 - Skipping installation of existing package spark2_3_0_1_0_187
2020-01-24 00:03:54,381 - Package['spark2_3_0_1_0_187-python'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-01-24 00:03:54,452 - Skipping installation of existing package spark2_3_0_1_0_187-python
2020-01-24 00:03:54,453 - Package['livy2_3_0_1_0_187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-01-24 00:03:54,526 - Skipping installation of existing package livy2_3_0_1_0_187
2020-01-24 00:03:54,527 - Package['spark-atlas-connector_3_0_1_0_187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2020-01-24 00:03:54,598 - Installing package spark-atlas-connector_3_0_1_0_187 ('/usr/bin/yum -y install spark-atlas-connector_3_0_1_0_187')
2020-01-24 00:03:57,994 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed

Command failed after 1 tries

 

I tried to install the spark-atlas-connector manually with yum, but the package does not seem to be available:

 

yum -y install spark-atlas-connector_3_0_1_0_187

No package spark-atlas-connector_3_0_1_0_187 available.
Error: Nothing to do
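
Searching the enabled repos more broadly also turns up nothing for the connector; these are the checks I ran:

yum search spark-atlas
yum list available 'spark*'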

 

Checking the repo list, all the expected repos appear to be enabled:

 

yum repolist

repo id                                repo name                              status
HDP-3.0-GPL-repo-1                     HDP-3.0-GPL-repo-1                        4
HDP-3.0-repo-1                         HDP-3.0-repo-1                          197
HDP-UTILS-1.1.0.22-repo-1              HDP-UTILS-1.1.0.22-repo-1                16
ambari-2.7.4.0                         ambari Version - ambari-2.7.4.0          13 
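
To rule out a simple naming mismatch, the HDP repo contents can also be enumerated directly (this assumes the repoquery tool from yum-utils is installed):

repoquery --repoid=HDP-3.0-repo-1 --all | grep -i atlas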

Do you have any workaround for this issue?

2 REPLIES

Re: Ambari failed to install spark-atlas-connector

Mentor

@Rahai 

 

It seems your HDP and HDP-UTILS repos have somehow been renamed, or had a "-1" suffix appended,

i.e. HDP-3.0-GPL-repo-1, HDP-3.0-repo-1, HDP-UTILS-1.1.0.22-repo-1.

Can you go to the Ambari GUI and navigate to Admin --> Stack and Versions --> Versions, and ensure the repo versions are correct (note the -1 suffix)? After correcting the error, restart the Ambari server; it will redistribute the new configuration to all hosts, and you can then retry the install.

Hope that helps
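
For reference, you can also inspect the repo file Ambari generated on the failing host (the file name ambari-hdp-1.repo appears in your install log) and verify the baseurl entries point at valid repositories:

cat /etc/yum.repos.d/ambari-hdp-1.repo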


Re: Ambari failed to install spark-atlas-connector

New Contributor

Hello,

 

Strangely, the package spark-atlas-connector_3_0_1_0_187 is missing from the repository http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0 ...
So we must first bypass its installation so that the deployment can continue, and then, if necessary, install the spark-atlas-connector manually.

 

# Fake RPM creation

 

[root@ambari ~]# yum install rpm-build

[root@ambari ~]# mkdir -p rpmbuild/SPECS/

[root@ambari ~]# vi rpmbuild/SPECS/spark-atlas-connector.spec
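# Minimal no-op spec: Name/Version/Release reproduce the exact package
# string Ambari tries to install; all build sections are left empty so
# the resulting RPM ships no files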
Name: spark-atlas-connector_3_0_1_0_187
Version: 3_0_1_0
Release: 187
Summary: -
Group: -
License: -
URL: http://
Vendor: -
Source: http://
Prefix: %{_prefix}
Packager: Karthik
BuildRoot: %{_tmppath}/%{name}-root

%description
Fake
%prep
%build
%install
%clean
%files
%changelog


[root@ambari ~]# rpmbuild -ba rpmbuild/SPECS/spark-atlas-connector.spec
[root@ambari ~]# mkdir -p repo/spark-atlas-connector
[root@ambari ~]# cp rpmbuild/RPMS/x86_64/spark-atlas-connector_3_0_1_0_187-3_0_1_0-187.x86_64.rpm repo/spark-atlas-connector/
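
Optionally, sanity-check the generated RPM before publishing it (the x86_64 path matches the cp above; adjust if your build architecture differs):

[root@ambari ~]# rpm -qpi repo/spark-atlas-connector/spark-atlas-connector_3_0_1_0_187-3_0_1_0-187.x86_64.rpm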


# Local YUM repository creation & registration

 
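(If createrepo is not already installed, it is available from the standard CentOS repos.)

[root@ambari ~]# yum -y install createrepo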

[root@ambari ~]# createrepo repo/spark-atlas-connector

[root@ambari ~]# vi /etc/yum.repos.d/localrepo.repo
[localrepo]
name=spark-atlas-connector localrepo
baseurl=file:///root/repo/spark-atlas-connector
gpgcheck=0
enabled=1


[root@ambari ~]# yum makecache
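
At this point yum should resolve the fake package; a quick check before retrying in Ambari:

[root@ambari ~]# yum info spark-atlas-connector_3_0_1_0_187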


# Retry deployment

 

Have a nice day!
