
Ambari Spark2 installation failed on Ubuntu 16.04 with RuntimeError: Failed to execute command '/usr/bin/apt-get -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install spark-atlas-connector-3-0-1-0-187', exited with code '100', message: 'E: Unable to locate package spark-atlas-connector-3-0-1-0-187'



Tried installing the Spark2 History Server from the Ambari UI.

stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/job_history_server.py", line 102, in <module>
    JobHistoryServer().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/job_history_server.py", line 42, in install
    self.install_packages(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 849, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/packaging.py", line 30, in action_install
    self._pkg_manager.install_package(package_name, self.__create_context())
  File "/usr/lib/ambari-agent/lib/ambari_commons/repo_manager/apt_manager.py", line 35, in wrapper
    return function_to_decorate(self, name, *args[2:], **kwargs)
  File "/usr/lib/ambari-agent/lib/ambari_commons/repo_manager/apt_manager.py", line 279, in install_package
    shell.repository_manager_executor(cmd, self.properties, context, env=self.properties.install_cmd_env)
  File "/usr/lib/ambari-agent/lib/ambari_commons/shell.py", line 753, in repository_manager_executor
    raise RuntimeError(message)
RuntimeError: Failed to execute command '/usr/bin/apt-get -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install spark-atlas-connector-3-0-1-0-187', exited with code '100', message: 'E: Unable to locate package spark-atlas-connector-3-0-1-0-187
'

stdout:

2019-12-24 14:48:57,890 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2019-12-24 14:48:57,894 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-12-24 14:48:57,895 - Group['livy'] {}
2019-12-24 14:48:57,896 - Group['ubuntu'] {}
2019-12-24 14:48:57,896 - Group['spark'] {}
2019-12-24 14:48:57,896 - Group['hdfs'] {}
2019-12-24 14:48:57,897 - User['livy'] {'gid': 'ubuntu', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'ubuntu'], 'uid': None}
2019-12-24 14:48:57,897 - User['ubuntu'] {'gid': 'ubuntu', 'fetch_nonlocal_groups': True, 'groups': ['ubuntu'], 'uid': None}
2019-12-24 14:48:57,898 - User['spark'] {'gid': 'ubuntu', 'fetch_nonlocal_groups': True, 'groups': ['ubuntu', 'spark'], 'uid': None}
2019-12-24 14:48:57,898 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-12-24 14:48:57,899 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ubuntu /tmp/hadoop-ubuntu,/tmp/hsperfdata_ubuntu,/home/ubuntu,/tmp/ubuntu,/tmp/sqoop-ubuntu 0'] {'not_if': '(test $(id -u ubuntu) -gt 1000) || (false)'}
2019-12-24 14:48:57,921 - Group['ubuntu'] {}
2019-12-24 14:48:57,922 - User['ubuntu'] {'fetch_nonlocal_groups': True, 'groups': ['ubuntu', u'ubuntu']}
2019-12-24 14:48:57,922 - FS Type: HDFS
2019-12-24 14:48:57,922 - Directory['/etc/hadoop'] {'mode': 0755}
2019-12-24 14:48:57,933 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'ubuntu', 'group': 'ubuntu'}
2019-12-24 14:48:57,934 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'ubuntu', 'group': 'ubuntu', 'mode': 01777}
2019-12-24 14:48:57,947 - Repository['HDP-3.0-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/ubuntu16/3.x/updates/3.0.1.0', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2019-12-24 14:48:57,951 - Repository with url http://public-repo-1.hortonworks.com/HDP-GPL/ubuntu16/3.x/updates/3.0.1.0 is not created due to its tags: set([u'GPL'])
2019-12-24 14:48:57,951 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/ubuntu16', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2019-12-24 14:48:57,952 - Repository[None] {'action': ['create']}
2019-12-24 14:48:57,954 - File['/tmp/tmpv32KZc'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu16/3.x/updates/3.0.1.0 HDP main\ndeb http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/ubuntu16 HDP-UTILS main'}
2019-12-24 14:48:57,954 - Writing File['/tmp/tmpv32KZc'] because contents don't match
2019-12-24 14:48:57,954 - File['/tmp/tmpaUg0V6'] {'content': StaticFile('/etc/apt/sources.list.d/ambari-hdp-1.list')}
2019-12-24 14:48:57,955 - Writing File['/tmp/tmpaUg0V6'] because contents don't match
2019-12-24 14:48:57,955 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:57,993 - Skipping installation of existing package unzip
2019-12-24 14:48:57,994 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,028 - Skipping installation of existing package curl
2019-12-24 14:48:58,029 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,063 - Skipping installation of existing package hdp-select
2019-12-24 14:48:58,067 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2019-12-24 14:48:58,214 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-12-24 14:48:58,223 - Package['spark2-3-0-1-0-187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,261 - Skipping installation of existing package spark2-3-0-1-0-187
2019-12-24 14:48:58,261 - Package['spark2-3-0-1-0-187-python'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,295 - Skipping installation of existing package spark2-3-0-1-0-187-python
2019-12-24 14:48:58,296 - Package['livy2-3-0-1-0-187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,322 - Skipping installation of existing package livy2-3-0-1-0-187
2019-12-24 14:48:58,322 - Package['spark-atlas-connector-3-0-1-0-187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,357 - Installing package spark-atlas-connector-3-0-1-0-187 ('/usr/bin/apt-get -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install spark-atlas-connector-3-0-1-0-187')
2019-12-24 14:49:30,887 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed

Command failed after 1 tries

 

Ambari: HDP-3.0.1.0

Ubuntu 16.04

repo: http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.7.4.0 Ambari main

Am I missing anything to check or configure?
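Before digging into Ambari itself, it may help to confirm by hand that the package apt is being asked for is actually published in the configured repos. A minimal sketch, using the sources-list contents copied verbatim from the log above (the `/tmp` path is illustrative, and the apt commands at the end assume network access on the failing node):

```shell
# Recreate the repo file Ambari wrote (contents copied from the log above)
# and count the configured 'deb' lines before asking apt for the package.
cat > /tmp/ambari-hdp-1.list <<'EOF'
deb http://public-repo-1.hortonworks.com/HDP/ubuntu16/3.x/updates/3.0.1.0 HDP main
deb http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/ubuntu16 HDP-UTILS main
EOF
grep -c '^deb ' /tmp/ambari-hdp-1.list   # → 2 repo lines configured
# On the failing node (network access assumed) one would then verify the
# package is actually published before retrying the install from Ambari:
#   sudo apt-get update
#   apt-cache policy spark-atlas-connector-3-0-1-0-187
```

If `apt-cache policy` still reports the package as unknown after a clean `apt-get update`, the HDP repo on the first line simply does not ship `spark-atlas-connector-3-0-1-0-187` for this stack version, and the problem is on the repo side rather than the node.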

 


Re: Ambari Spark2 installation failed on Ubuntu 16.04 with RuntimeError: 'E: Unable to locate package spark-atlas-connector-3-0-1-0-187'


Same issue here. Any updates?
