Member since: 12-13-2019
Posts: 5
Kudos Received: 0
Solutions: 0
12-28-2019 12:09 PM
1. Stop the Hive service.
2. Delete the Hive service, followed by its dependencies.
3. Restart HDFS, YARN, and ZooKeeper.
4. Install Hive and refresh the Hive client configs on every host.
5. Start the Hive service.

Check the back-end logs: sometimes you will see "permission denied" when removing "/var/lib/spark2/shs_db/listing.ldb". Just remove it manually and restart the service.
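The manual cleanup in the last note can be sketched as below. This is a hedged illustration, not the exact procedure: it uses a temporary path so it is runnable anywhere, while on a real host the target would be /var/lib/spark2/shs_db/listing.ldb (removed as root).

```shell
# Sketch of the manual cleanup step; a temp dir stands in for the real
# path /var/lib/spark2/shs_db/listing.ldb so the example runs anywhere.
SHS_DB="$(mktemp -d)/shs_db/listing.ldb"
mkdir -p "$SHS_DB"
touch "$SHS_DB/CURRENT"   # LevelDB marker file, mimicking what the history server leaves behind

# The manual removal; on a real host: sudo rm -rf /var/lib/spark2/shs_db/listing.ldb
rm -rf "$SHS_DB"
echo "stale listing.ldb removed"
```

After removing the directory, restart the Spark2 History Server from Ambari so it rebuilds the listing database cleanly.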
12-24-2019 07:06 AM
Tried installing Spark2 History Server from the Ambari UI. stderr:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/job_history_server.py", line 102, in <module>
    JobHistoryServer().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/job_history_server.py", line 42, in install
    self.install_packages(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 849, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/packaging.py", line 30, in action_install
    self._pkg_manager.install_package(package_name, self.__create_context())
  File "/usr/lib/ambari-agent/lib/ambari_commons/repo_manager/apt_manager.py", line 35, in wrapper
    return function_to_decorate(self, name, *args[2:], **kwargs)
  File "/usr/lib/ambari-agent/lib/ambari_commons/repo_manager/apt_manager.py", line 279, in install_package
    shell.repository_manager_executor(cmd, self.properties, context, env=self.properties.install_cmd_env)
  File "/usr/lib/ambari-agent/lib/ambari_commons/shell.py", line 753, in repository_manager_executor
    raise RuntimeError(message)
RuntimeError: Failed to execute command '/usr/bin/apt-get -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install spark-atlas-connector-3-0-1-0-187', exited with code '100', message: 'E: Unable to locate package spark-atlas-connector-3-0-1-0-187
'
stdout:
2019-12-24 14:48:57,890 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2019-12-24 14:48:57,894 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-12-24 14:48:57,895 - Group['livy'] {}
2019-12-24 14:48:57,896 - Group['ubuntu'] {}
2019-12-24 14:48:57,896 - Group['spark'] {}
2019-12-24 14:48:57,896 - Group['hdfs'] {}
2019-12-24 14:48:57,897 - User['livy'] {'gid': 'ubuntu', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'ubuntu'], 'uid': None}
2019-12-24 14:48:57,897 - User['ubuntu'] {'gid': 'ubuntu', 'fetch_nonlocal_groups': True, 'groups': ['ubuntu'], 'uid': None}
2019-12-24 14:48:57,898 - User['spark'] {'gid': 'ubuntu', 'fetch_nonlocal_groups': True, 'groups': ['ubuntu', 'spark'], 'uid': None}
2019-12-24 14:48:57,898 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-12-24 14:48:57,899 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ubuntu /tmp/hadoop-ubuntu,/tmp/hsperfdata_ubuntu,/home/ubuntu,/tmp/ubuntu,/tmp/sqoop-ubuntu 0'] {'not_if': '(test $(id -u ubuntu) -gt 1000) || (false)'}
2019-12-24 14:48:57,921 - Group['ubuntu'] {}
2019-12-24 14:48:57,922 - User['ubuntu'] {'fetch_nonlocal_groups': True, 'groups': ['ubuntu', u'ubuntu']}
2019-12-24 14:48:57,922 - FS Type: HDFS
2019-12-24 14:48:57,922 - Directory['/etc/hadoop'] {'mode': 0755}
2019-12-24 14:48:57,933 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'ubuntu', 'group': 'ubuntu'}
2019-12-24 14:48:57,934 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'ubuntu', 'group': 'ubuntu', 'mode': 01777}
2019-12-24 14:48:57,947 - Repository['HDP-3.0-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/ubuntu16/3.x/updates/3.0.1.0', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2019-12-24 14:48:57,951 - Repository with url http://public-repo-1.hortonworks.com/HDP-GPL/ubuntu16/3.x/updates/3.0.1.0 is not created due to its tags: set([u'GPL'])
2019-12-24 14:48:57,951 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/ubuntu16', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2019-12-24 14:48:57,952 - Repository[None] {'action': ['create']}
2019-12-24 14:48:57,954 - File['/tmp/tmpv32KZc'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu16/3.x/updates/3.0.1.0 HDP main\ndeb http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/ubuntu16 HDP-UTILS main'}
2019-12-24 14:48:57,954 - Writing File['/tmp/tmpv32KZc'] because contents don't match
2019-12-24 14:48:57,954 - File['/tmp/tmpaUg0V6'] {'content': StaticFile('/etc/apt/sources.list.d/ambari-hdp-1.list')}
2019-12-24 14:48:57,955 - Writing File['/tmp/tmpaUg0V6'] because contents don't match
2019-12-24 14:48:57,955 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:57,993 - Skipping installation of existing package unzip
2019-12-24 14:48:57,994 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,028 - Skipping installation of existing package curl
2019-12-24 14:48:58,029 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,063 - Skipping installation of existing package hdp-select
2019-12-24 14:48:58,067 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2019-12-24 14:48:58,214 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-12-24 14:48:58,223 - Package['spark2-3-0-1-0-187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,261 - Skipping installation of existing package spark2-3-0-1-0-187
2019-12-24 14:48:58,261 - Package['spark2-3-0-1-0-187-python'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,295 - Skipping installation of existing package spark2-3-0-1-0-187-python
2019-12-24 14:48:58,296 - Package['livy2-3-0-1-0-187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,322 - Skipping installation of existing package livy2-3-0-1-0-187
2019-12-24 14:48:58,322 - Package['spark-atlas-connector-3-0-1-0-187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,357 - Installing package spark-atlas-connector-3-0-1-0-187 ('/usr/bin/apt-get -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install spark-atlas-connector-3-0-1-0-187')
2019-12-24 14:49:30,887 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
Command failed after 1 tries
Ambari: HDP-3.0.1.0
Ubuntu 16.04
repo: http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.7.4.0 Ambari main
Am I missing anything to check or configure?
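Since the failure is apt reporting "Unable to locate package", one thing to verify is whether any of the configured repos actually serves that package. A hedged diagnostic sketch (the package name is taken from the error above; run it on the failing host, where Ambari writes its repo files under /etc/apt/sources.list.d/):

```shell
# Check whether any configured apt repo ships the package the install asked for.
PKG="spark-atlas-connector-3-0-1-0-187"
if command -v apt-cache >/dev/null 2>&1; then
  apt-cache policy "$PKG"                                            # "Candidate: (none)" or empty output means no repo ships it
  grep -rh hortonworks /etc/apt/sources.list.d/ 2>/dev/null || true  # show the HDP repo lines Ambari wrote
else
  echo "apt-cache not available here; run this on the failing host"
fi
RESULT="checked $PKG"
echo "$RESULT"
```

If no candidate is found, the HDP 3.0.1 public repo likely does not ship spark-atlas-connector at all, and the fix is on the repo/stack side (matching repo URLs to the installed stack version) rather than in the Spark2 service config.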
Labels:
- Apache Ambari
- Apache Spark
12-13-2019 08:51 AM
Enter advanced database configuration [y/n] (n)? n
Configuring database...
INFO: Loading properties from /etc/ambari-server/conf/ambari.properties
INFO: Loading properties from /etc/ambari-server/conf/ambari.properties
INFO: Loading properties from /etc/ambari-server/conf/ambari.properties
INFO: Loading properties from /etc/ambari-server/conf/ambari.properties
Traceback (most recent call last):
  File "/usr/sbin/ambari-server.py", line 1078, in <module>
    mainBody()
  File "/usr/sbin/ambari-server.py", line 1048, in mainBody
    main(options, args, parser)
  File "/usr/sbin/ambari-server.py", line 998, in main
    action_obj.execute()
  File "/usr/sbin/ambari-server.py", line 78, in execute
    self.fn(*self.args, **self.kwargs)
  File "/usr/lib/ambari-server/lib/ambari_server/serverSetup.py", line 1208, in setup
    _setup_database(options)
  File "/usr/lib/ambari-server/lib/ambari_server/serverSetup.py", line 1014, in _setup_database
    dbmsAmbari = factory.create(options, properties, "Ambari")
  File "/usr/lib/ambari-server/lib/ambari_server/dbConfiguration.py", line 511, in create
    dbmsConfig = desc.create_config(options, properties, dbId)
  File "/usr/lib/ambari-server/lib/ambari_server/dbConfiguration.py", line 81, in create_config
    return self.fn_create_config(options, properties, self.storage_key, dbId)
  File "/usr/lib/ambari-server/lib/ambari_server/dbConfiguration_linux.py", line 899, in createPGConfig
    return PGConfig(options, properties, storage_type)
  File "/usr/lib/ambari-server/lib/ambari_server/dbConfiguration_linux.py", line 435, in __init__
    PGConfig.PG_STATUS_RUNNING = get_postgre_running_status()
  File "/usr/lib/ambari-server/lib/ambari_server/utils.py", line 296, in get_postgre_running_status
    return os.path.join(get_ubuntu_pg_version(), "main")
  File "/usr/lib/python2.7/posixpath.py", line 70, in join
    elif path == '' or path.endswith('/'):
AttributeError: 'list' object has no attribute 'endswith'

Running as root user in ubuntu-16.04
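The traceback shows os.path.join being handed a list where it expects a string: Ambari's get_ubuntu_pg_version() apparently returned a list of versions, possibly because more than one PostgreSQL version is installed on the host. A minimal reproduction (the helper below is a hypothetical stand-in, not Ambari's actual code):

```python
import os.path

# Hypothetical stand-in for Ambari's get_ubuntu_pg_version(); the assumption
# here is that multiple PostgreSQL installs make it return a list of versions
# rather than a single version string.
def get_ubuntu_pg_version():
    return ["9.5", "10"]

# os.path.join cannot take a list as its first argument: Python 2 raises
# AttributeError (as in the log above), Python 3 raises TypeError.
try:
    os.path.join(get_ubuntu_pg_version(), "main")
    failed = None
except (AttributeError, TypeError) as exc:
    failed = type(exc).__name__

print("os.path.join failed with:", failed)

# The shape the code expects: a single version string.
ok = os.path.join("9.5", "main")
print("expected result:", ok)  # 9.5/main
```

If that diagnosis holds, removing the extra PostgreSQL version (or pointing ambari-server setup at a single, explicit database) before rerunning setup is worth trying.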