Member since 02-20-2018 · 25 Posts · 2 Kudos Received · 0 Solutions
02-26-2018
04:02 PM
I figured out why the /usr/hdp/current/oozie-client/conf directory keeps coming back. oozie_client.py tries to create the conf directory under /usr/hdp/current/oozie-client, which is supposed to be a symbolic link but is missing. Therefore an actual /usr/hdp/current/oozie-client directory is created, with a conf subdirectory. This in turn causes the error "symlink target /usr/hdp/current/oozie-client for oozie already exists and it is not a symlink". I have now created the symbolic link for oozie-client: /usr/hdp/current/oozie-client -> /usr/hdp/2.6.4.0-91/oozie. But oozie is missing under /usr/hdp/2.6.4.0-91. How do I make Ambari create it? Will it help if I delete /var/lib/ambari-agent/cache? @Aditya Sirna
02-26-2018
04:00 PM
1 Kudo
Thank you, @Aditya Sirna. How can I tell from the following output whether the OS is running on ppc? $ uname -a
Linux my.hostname 3.10.0-327.36.3.el7.x86_64 #1 SMP Thu Oct 20 04:56:07 EDT 2016 x86_64 x86_64 x86_64 GNU/Linux
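One way to check (assuming standard `uname` behavior) is the machine hardware field: `uname -m` prints `ppc64` or `ppc64le` on IBM PowerPC and `x86_64` on Intel/AMD. The `x86_64` fields in the output above indicate this is not a ppc system. A small sketch:

```shell
# Classify the host architecture from uname -m.
# x86_64 -> Intel/AMD; ppc64/ppc64le -> IBM PowerPC.
arch=$(uname -m)
case "$arch" in
  ppc64|ppc64le) echo "PowerPC host: use the ppc repositories" ;;
  x86_64)        echo "x86_64 host: use the standard redhat7 repositories" ;;
  *)             echo "Other architecture: $arch" ;;
esac
```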
02-26-2018
03:48 PM
1 Kudo
I figured out why the /usr/hdp/current/oozie-client/conf directory keeps coming back. oozie_client.py tries to create the conf directory under /usr/hdp/current/oozie-client, which is supposed to be a symbolic link but is missing. Therefore an actual /usr/hdp/current/oozie-client directory is created, with a conf subdirectory. This in turn causes the error "symlink target /usr/hdp/current/oozie-client for oozie already exists and it is not a symlink".
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie_client.py", line 71, in <module>
OozieClient().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie_client.py", line 34, in install
self.configure(env)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 120, in locking_configure
original_configure(obj, *args, **kw)
File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie_client.py", line 41, in configure
oozie(is_server=False)
File "/usr/lib/ambari-agent/lib/ambari_commons/os_family_impl.py", line 89, in thunk
return fn(*args, **kwargs)
File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie.py", line 117, in oozie
group = params.user_group
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 185, in action_create
sudo.makedirs(path, self.resource.mode or 0755)
File "/usr/lib/ambari-agent/lib/resource_management/core/sudo.py", line 107, in makedirs
raise Fail("Cannot create directory '{0}' as '{1}' is a broken symlink".format(path, dirname))
resource_management.core.exceptions.Fail: Cannot create directory '/usr/hdp/current/oozie-client/conf' as '/usr/hdp/current/oozie-client' is a broken symlink
I have now created the symbolic link for oozie-client: /usr/hdp/current/oozie-client -> /usr/hdp/2.6.4.0-91/oozie. But oozie is missing under /usr/hdp/2.6.4.0-91. How do I make Ambari create it? Will it help if I delete /var/lib/ambari-agent/cache?
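Since the versioned directory /usr/hdp/2.6.4.0-91/oozie is laid down by the versioned rpm rather than by Ambari itself, one plausible way to get it back (an assumption based on the package names in the logs, not official guidance) is to reinstall that rpm and then re-run hdp-select:

```shell
# Sketch: derive the versioned package name from the stack version and
# reinstall it so /usr/hdp/2.6.4.0-91/oozie is recreated, then rebuild
# the /usr/hdp/current symlink. Package naming follows the pattern seen
# in the logs above (dots and dashes become underscores).
VERSION="2.6.4.0-91"
PKG="oozie_$(echo "$VERSION" | tr '.-' '__')"
echo "$PKG"                                  # oozie_2_6_4_0_91
# yum -y reinstall "$PKG"                    # restores /usr/hdp/2.6.4.0-91/oozie
# hdp-select set oozie-client "$VERSION"     # recreates the current/ symlink
```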
02-26-2018
02:41 PM
My initial cluster installation failed due to lack of space under the / directory. I freed up disk space under / and retried the cluster installation via the Ambari Cluster Install Wizard, but every retry failed at oozie-client installation. I can see the error is actually caused by the following (the entire error log is at the end of this post): $ hdp-select set oozie-client 2.6.4.0-91
symlink target /usr/hdp/current/oozie-client for oozie already exists and it is not a symlink.
$ echo $?
1
I removed /usr/hdp/current/oozie-client, removed the oozie-client rpm, deleted the service component oozie-client from the database via the Ambari REST API, and even removed the hdp-select rpm, then retried the cluster installation. But the Ambari Cluster Install Wizard still failed at oozie-client installation with exactly the same error message, and the /usr/hdp/current/oozie-client directory somehow came back even though it had been removed. $ find /usr/hdp/current/oozie-client
/usr/hdp/current/oozie-client
/usr/hdp/current/oozie-client/conf
/usr/hdp/current/oozie-client/conf/oozie-site.jceks
/usr/hdp/current/oozie-client/conf/oozie-site.xml
/usr/hdp/current/oozie-client/conf/oozie-env.sh
/usr/hdp/current/oozie-client/conf/oozie-log4j.properties
/usr/hdp/current/oozie-client/conf/adminusers.txt
/usr/hdp/current/oozie-client/conf/hadoop-config.xml
/usr/hdp/current/oozie-client/conf/oozie-default.xml
/usr/hdp/current/oozie-client/conf/action-conf
/usr/hdp/current/oozie-client/conf/action-conf/hive.xml
How do I do a fresh re-install of oozie-client so that I can continue the cluster installation via Ambari? The following is the entire error log from when the Cluster Install Wizard failed.
2018-02-25 21:06:31,433 - The 'oozie-client' component did not advertise a version. This may indicate a problem with the component packaging. However, the stack-select tool was able to report a single version installed (2.6.4.0-91). This is the version that will be reported.
2018-02-25 21:08:48,022 - Could not determine stack version for component oozie-client by calling '/usr/bin/hdp-select status oozie-client > /tmp/tmpITYi8Q'. Return Code: 1, Output: .
2018-02-25 21:08:48,062 - The 'oozie-client' component did not advertise a version. This may indicate a problem with the component packaging. However, the stack-select tool was able to report a single version installed (2.6.4.0-91). This is the version that will be reported.
2018-02-25 21:08:48,633 - Could not determine stack version for component oozie-client by calling '/usr/bin/hdp-select status oozie-client > /tmp/tmpKWJkBC'. Return Code: 1, Output: .
2018-02-25 21:08:48,659 - The 'oozie-client' component did not advertise a version. This may indicate a problem with the component packaging. However, the stack-select tool was able to report a single version installed (2.6.4.0-91). This is the version that will be reported.
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 37, in <module>
AfterInstallHook().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 31, in hook
setup_stack_symlinks(self.stroutfile)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 62, in setup_stack_symlinks
stack_select.select(package, json_version)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 313, in select
Execute(command, sudo=True)
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-python-wrap /usr/bin/hdp-select set oozie-client 2.6.4.0-91' returned 1. symlink target /usr/hdp/current/oozie-client for oozie already exists and it is not a symlink.
stdout: /var/lib/ambari-agent/data/output-671.txt
2018-02-25 21:06:30,584 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-02-25 21:06:30,589 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-02-25 21:06:30,590 - Group['hdfs'] {}
2018-02-25 21:06:30,592 - Group['hadoop'] {}
2018-02-25 21:06:30,592 - Group['users'] {}
2018-02-25 21:06:30,593 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-25 21:06:30,595 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-25 21:06:30,597 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-02-25 21:06:30,599 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-02-25 21:06:30,601 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-02-25 21:06:30,602 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-25 21:06:30,604 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-25 21:06:30,606 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-02-25 21:06:30,608 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-02-25 21:06:30,615 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-02-25 21:06:30,615 - Group['hdfs'] {}
2018-02-25 21:06:30,616 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-02-25 21:06:30,617 - FS Type:
2018-02-25 21:06:30,617 - Directory['/etc/hadoop'] {'mode': 0755}
2018-02-25 21:06:30,634 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-02-25 21:06:30,635 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-02-25 21:06:30,652 - Repository['HDP-2.6-repo-11'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-11', 'mirror_list': None}
2018-02-25 21:06:30,668 - File['/etc/yum.repos.d/ambari-hdp-11.repo'] {'content': '[HDP-2.6-repo-11]\nname=HDP-2.6-repo-11\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-02-25 21:06:30,669 - Writing File['/etc/yum.repos.d/ambari-hdp-11.repo'] because contents don't match
2018-02-25 21:06:30,669 - Repository with url http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.4.0 is not created due to its tags: set([u'GPL'])
2018-02-25 21:06:30,669 - Repository['HDP-UTILS-1.1.0.22-repo-11'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-11', 'mirror_list': None}
2018-02-25 21:06:30,673 - File['/etc/yum.repos.d/ambari-hdp-11.repo'] {'content': '[HDP-2.6-repo-11]\nname=HDP-2.6-repo-11\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-11]\nname=HDP-UTILS-1.1.0.22-repo-11\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-02-25 21:06:30,673 - Writing File['/etc/yum.repos.d/ambari-hdp-11.repo'] because contents don't match
2018-02-25 21:06:30,674 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-25 21:06:31,196 - Skipping installation of existing package unzip
2018-02-25 21:06:31,197 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-25 21:06:31,250 - Skipping installation of existing package curl
2018-02-25 21:06:31,251 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-25 21:06:31,301 - Skipping installation of existing package hdp-select
2018-02-25 21:06:31,390 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2018-02-25 21:06:31,433 - call returned (0, '2.6.4.0-91')
2018-02-25 21:06:31,433 - The 'oozie-client' component did not advertise a version. This may indicate a problem with the component packaging. However, the stack-select tool was able to report a single version installed (2.6.4.0-91). This is the version that will be reported.
2018-02-25 21:06:31,713 - Package['zip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-25 21:06:32,173 - Skipping installation of existing package zip
2018-02-25 21:06:32,174 - Command repositories: HDP-2.6-repo-11, HDP-2.6-GPL-repo-11, HDP-UTILS-1.1.0.22-repo-11
2018-02-25 21:06:32,174 - Applicable repositories: HDP-2.6-repo-11, HDP-2.6-GPL-repo-11, HDP-UTILS-1.1.0.22-repo-11
2018-02-25 21:06:32,175 - Looking for matching packages in the following repositories: HDP-2.6-repo-11, HDP-2.6-GPL-repo-11, HDP-UTILS-1.1.0.22-repo-11
2018-02-25 21:06:37,649 - Adding fallback repositories: HDP-2.6-repo-7, HDP-2.6-repo-10, HDP-UTILS-1.1.0.22, HDP-UTILS-1.1.0.22-repo-10, HDP-2.6-repo-6, HDP-2.6-repo-5, HDP-2.6-repo-4, HDP-2.6-repo-3, HDP-2.6-repo-2, HDP-2.6-repo-1, HDP-UTILS-1.1.0.22-repo-8, HDP-UTILS-1.1.0.22-repo-4, HDP-2.6-repo-9, HDP-2.6-repo-8, HDP-UTILS-1.1.0.22-repo-6, HDP-UTILS-1.1.0.22-repo-7, HDP-UTILS-1.1.0.22-repo-5, HDP-UTILS-1.1.0.22-repo-2, HDP-UTILS-1.1.0.22-repo-3, HDP-UTILS-1.1.0.22-repo-1, HDP-UTILS-1.1.0.22-repo-9, HDP-2.6.4.0
2018-02-25 21:07:14,720 - Package['oozie_2_6_4_0_91'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-25 21:07:14,785 - Skipping installation of existing package oozie_2_6_4_0_91
2018-02-25 21:07:14,788 - Package['falcon_2_6_4_0_91'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-25 21:07:14,850 - Installing package falcon_2_6_4_0_91 ('/usr/bin/yum -d 0 -e 0 -y install falcon_2_6_4_0_91')
2018-02-25 21:08:47,036 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-02-25 21:08:47,039 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-02-25 21:08:47,050 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://r00pvdn0c.bnymellon.net:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2018-02-25 21:08:47,051 - Not downloading the file from http://r00pvdn0c.bnymellon.net:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2018-02-25 21:08:47,051 - checked_call[('/usr/java/default/bin/java', '-cp', u'/var/lib/ambari-agent/cred/lib/*', 'org.apache.ambari.server.credentialapi.CredentialUtil', 'get', 'oozie.service.JPAService.jdbc.password', '-provider', u'jceks://file/var/lib/ambari-agent/cred/conf/oozie_client/oozie-site.jceks')] {}
2018-02-25 21:08:47,656 - checked_call returned (0, 'SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".\nSLF4J: Defaulting to no-operation (NOP) logger implementation\nSLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.\nFeb 25, 2018 9:08:47 PM org.apache.hadoop.util.NativeCodeLoader <clinit>\nWARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable\noozie')
2018-02-25 21:08:47,662 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-02-25 21:08:47,669 - Directory['/usr/hdp/current/oozie-client/conf'] {'owner': 'oozie', 'create_parents': True, 'group': 'hadoop'}
2018-02-25 21:08:47,669 - Creating directory Directory['/usr/hdp/current/oozie-client/conf'] since it doesn't exist.
2018-02-25 21:08:47,670 - Changing owner for /usr/hdp/current/oozie-client/conf from 0 to oozie
2018-02-25 21:08:47,670 - Changing group for /usr/hdp/current/oozie-client/conf from 0 to hadoop
2018-02-25 21:08:47,671 - File['/usr/hdp/current/oozie-client/conf/oozie-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/oozie_client/oozie-site.jceks'), 'owner': 'oozie', 'group': 'hadoop', 'mode': 0640}
2018-02-25 21:08:47,671 - Writing File['/usr/hdp/current/oozie-client/conf/oozie-site.jceks'] because it doesn't exist
2018-02-25 21:08:47,672 - Changing owner for /usr/hdp/current/oozie-client/conf/oozie-site.jceks from 0 to oozie
2018-02-25 21:08:47,672 - Changing group for /usr/hdp/current/oozie-client/conf/oozie-site.jceks from 0 to hadoop
2018-02-25 21:08:47,672 - Changing permission for /usr/hdp/current/oozie-client/conf/oozie-site.jceks from 644 to 640
2018-02-25 21:08:47,673 - XmlConfig['oozie-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/oozie-client/conf', 'mode': 0664, 'configuration_attributes': {}, 'owner': 'oozie', 'configurations': ...}
2018-02-25 21:08:47,683 - Generating config: /usr/hdp/current/oozie-client/conf/oozie-site.xml
2018-02-25 21:08:47,683 - File['/usr/hdp/current/oozie-client/conf/oozie-site.xml'] {'owner': 'oozie', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0664, 'encoding': 'UTF-8'}
2018-02-25 21:08:47,695 - Writing File['/usr/hdp/current/oozie-client/conf/oozie-site.xml'] because it doesn't exist
2018-02-25 21:08:47,696 - Changing owner for /usr/hdp/current/oozie-client/conf/oozie-site.xml from 0 to oozie
2018-02-25 21:08:47,696 - Changing group for /usr/hdp/current/oozie-client/conf/oozie-site.xml from 0 to hadoop
2018-02-25 21:08:47,696 - Changing permission for /usr/hdp/current/oozie-client/conf/oozie-site.xml from 644 to 664
2018-02-25 21:08:47,702 - File['/usr/hdp/current/oozie-client/conf/oozie-env.sh'] {'content': InlineTemplate(...), 'owner': 'oozie', 'group': 'hadoop'}
2018-02-25 21:08:47,702 - Writing File['/usr/hdp/current/oozie-client/conf/oozie-env.sh'] because it doesn't exist
2018-02-25 21:08:47,702 - Changing owner for /usr/hdp/current/oozie-client/conf/oozie-env.sh from 0 to oozie
2018-02-25 21:08:47,703 - Changing group for /usr/hdp/current/oozie-client/conf/oozie-env.sh from 0 to hadoop
2018-02-25 21:08:47,703 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2018-02-25 21:08:47,705 - File['/etc/security/limits.d/oozie.conf'] {'content': Template('oozie.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2018-02-25 21:08:47,708 - File['/usr/hdp/current/oozie-client/conf/oozie-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'oozie', 'group': 'hadoop', 'mode': 0644}
2018-02-25 21:08:47,709 - Writing File['/usr/hdp/current/oozie-client/conf/oozie-log4j.properties'] because it doesn't exist
2018-02-25 21:08:47,709 - Changing owner for /usr/hdp/current/oozie-client/conf/oozie-log4j.properties from 0 to oozie
2018-02-25 21:08:47,710 - Changing group for /usr/hdp/current/oozie-client/conf/oozie-log4j.properties from 0 to hadoop
2018-02-25 21:08:47,715 - File['/usr/hdp/current/oozie-client/conf/adminusers.txt'] {'content': Template('adminusers.txt.j2'), 'owner': 'oozie', 'group': 'hadoop', 'mode': 0644}
2018-02-25 21:08:47,715 - Writing File['/usr/hdp/current/oozie-client/conf/adminusers.txt'] because it doesn't exist
2018-02-25 21:08:47,716 - Changing owner for /usr/hdp/current/oozie-client/conf/adminusers.txt from 0 to oozie
2018-02-25 21:08:47,716 - Changing group for /usr/hdp/current/oozie-client/conf/adminusers.txt from 0 to hadoop
2018-02-25 21:08:47,717 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://r00pvdn0c.bnymellon.net:8080/resources/DBConnectionVerification.jar')}
2018-02-25 21:08:47,717 - Not downloading the file from http://r00pvdn0c.bnymellon.net:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2018-02-25 21:08:47,717 - File['/usr/hdp/current/oozie-client/conf/hadoop-config.xml'] {'owner': 'oozie', 'group': 'hadoop'}
2018-02-25 21:08:47,717 - Writing File['/usr/hdp/current/oozie-client/conf/hadoop-config.xml'] because it doesn't exist
2018-02-25 21:08:47,718 - Changing owner for /usr/hdp/current/oozie-client/conf/hadoop-config.xml from 0 to oozie
2018-02-25 21:08:47,718 - Changing group for /usr/hdp/current/oozie-client/conf/hadoop-config.xml from 0 to hadoop
2018-02-25 21:08:47,718 - File['/usr/hdp/current/oozie-client/conf/oozie-default.xml'] {'owner': 'oozie', 'group': 'hadoop'}
2018-02-25 21:08:47,718 - Writing File['/usr/hdp/current/oozie-client/conf/oozie-default.xml'] because it doesn't exist
2018-02-25 21:08:47,720 - Changing owner for /usr/hdp/current/oozie-client/conf/oozie-default.xml from 0 to oozie
2018-02-25 21:08:47,720 - Changing group for /usr/hdp/current/oozie-client/conf/oozie-default.xml from 0 to hadoop
2018-02-25 21:08:47,721 - Directory['/usr/hdp/current/oozie-client/conf/action-conf'] {'owner': 'oozie', 'group': 'hadoop'}
2018-02-25 21:08:47,721 - Creating directory Directory['/usr/hdp/current/oozie-client/conf/action-conf'] since it doesn't exist.
2018-02-25 21:08:47,722 - Changing owner for /usr/hdp/current/oozie-client/conf/action-conf from 0 to oozie
2018-02-25 21:08:47,723 - Changing group for /usr/hdp/current/oozie-client/conf/action-conf from 0 to hadoop
2018-02-25 21:08:47,723 - File['/usr/hdp/current/oozie-client/conf/action-conf/hive.xml'] {'owner': 'oozie', 'group': 'hadoop'}
2018-02-25 21:08:47,724 - Writing File['/usr/hdp/current/oozie-client/conf/action-conf/hive.xml'] because it doesn't exist
2018-02-25 21:08:47,724 - Changing owner for /usr/hdp/current/oozie-client/conf/action-conf/hive.xml from 0 to oozie
2018-02-25 21:08:47,725 - Changing group for /usr/hdp/current/oozie-client/conf/action-conf/hive.xml from 0 to hadoop
2018-02-25 21:08:48,022 - Could not determine stack version for component oozie-client by calling '/usr/bin/hdp-select status oozie-client > /tmp/tmpITYi8Q'. Return Code: 1, Output: .
2018-02-25 21:08:48,022 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2018-02-25 21:08:48,062 - call returned (0, '2.6.4.0-91')
2018-02-25 21:08:48,062 - The 'oozie-client' component did not advertise a version. This may indicate a problem with the component packaging. However, the stack-select tool was able to report a single version installed (2.6.4.0-91). This is the version that will be reported.
2018-02-25 21:08:48,351 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-02-25 21:08:48,388 - Execute[('ambari-python-wrap', u'/usr/bin/hdp-select', 'set', u'oozie-client', u'2.6.4.0-91')] {'sudo': True}
2018-02-25 21:08:48,633 - Could not determine stack version for component oozie-client by calling '/usr/bin/hdp-select status oozie-client > /tmp/tmpKWJkBC'. Return Code: 1, Output: .
2018-02-25 21:08:48,633 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2018-02-25 21:08:48,659 - call returned (0, '2.6.4.0-91')
2018-02-25 21:08:48,659 - The 'oozie-client' component did not advertise a version. This may indicate a problem with the component packaging. However, the stack-select tool was able to report a single version installed (2.6.4.0-91). This is the version that will be reported.
Command failed after 1 tries
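hdp-select expects every entry under /usr/hdp/current to be a symlink into a versioned directory; a real directory there is exactly what produces the "already exists and it is not a symlink" failure. A quick diagnostic loop (a sketch, not part of the Ambari tooling) to list any offending entries:

```shell
# List entries under /usr/hdp/current that are real directories rather
# than symlinks; these are the ones hdp-select refuses to overwrite.
for p in /usr/hdp/current/*; do
  if [ -d "$p" ] && [ ! -L "$p" ]; then
    echo "not a symlink: $p"
  fi
done
```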
Labels: Apache Ambari, Apache Oozie
02-26-2018
01:42 PM
I am unable to use yum to remove the Oozie server. # yum remove oozie_2_6_4_0_91 -y
Loaded plugins: langpacks, priorities, product-id, rhnplugin, search-disabled-
: repos, subscription-manager
This system is receiving updates from RHN Classic or Red Hat Satellite.
Resolving Dependencies
--> Running transaction check
---> Package oozie_2_6_4_0_91.noarch 0:4.2.0.2.6.4.0-91 will be erased
--> Finished Dependency Resolution
Dependencies Resolved
================================================================================
Package Arch Version Repository Size
================================================================================
Removing:
oozie_2_6_4_0_91 noarch 4.2.0.2.6.4.0-91 @HDP-2.6.4.0 0.0
Transaction Summary
================================================================================
Remove 1 Package
Installed size: 0
Downloading packages:
Running transaction check
Running transaction test
Transaction test succeeded
Running transaction
/var/tmp/rpm-tmp.4p0Hg2: line 1: pushd: /usr/hdp/2.6.4.0-91/oozie: No such file or directory
/var/tmp/rpm-tmp.4p0Hg2: line 3: popd: directory stack empty
error: %preun(oozie_2_6_4_0_91-4.2.0.2.6.4.0-91.noarch) scriptlet failed, exit status 1
Error in PREUN scriptlet in rpm package oozie_2_6_4_0_91-4.2.0.2.6.4.0-91.noarch
3529 packages excluded due to repository priority protections
Verifying : oozie_2_6_4_0_91-4.2.0.2.6.4.0-91.noarch 1/1
Failed:
oozie_2_6_4_0_91.noarch 0:4.2.0.2.6.4.0-91
Complete!
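The %preun scriptlet fails because it does a pushd into /usr/hdp/2.6.4.0-91/oozie, which no longer exists. Two possible workarounds (assumptions based on the error above, use with care): recreate the directory so the scriptlet can run, or bypass scriptlets entirely with rpm:

```shell
# Workaround sketch: satisfy the scriptlet's pushd by recreating the
# directory it expects, then retry the normal removal.
mkdir -p /usr/hdp/2.6.4.0-91/oozie
yum -y remove oozie_2_6_4_0_91
# Last resort: skip the package's scriptlets entirely (this bypasses
# the package's own cleanup logic):
# rpm -e --noscripts oozie_2_6_4_0_91
```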
Labels: Apache Oozie
02-26-2018
04:11 AM
Googling "redhat-ppc7" doesn't really return any meaningful explanation. What is redhat-ppc7, and what is the difference between redhat-ppc7 and redhat7?
Labels: Apache Ambari
02-25-2018
02:26 AM
I removed /usr/hdp/current/oozie-client and /usr/hdp/current/slider-client and re-ran Step 1 and Step 3. Then I retried the cluster installation in the Ambari console. The same error happened: resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-python-wrap /usr/bin/hdp-select set oozie-client 2.6.4.0-91' returned 1. symlink target /usr/hdp/current/oozie-client for oozie already exists and it is not a symlink. And the /usr/hdp/current/oozie-client and /usr/hdp/current/slider-client directories came back. The ambari database in postgresql shows the following: select * from hostcomponentstate where service_name='OOZIE';
259 8 OOZIE_SERVER UNKNOWN INSTALL_FAILED 1 OOZIE NONE UNSECURED
285 8 OOZIE_CLIENT UNKNOWN INSTALL_FAILED 1 OOZIE NONE UNSECURED
02-24-2018
05:28 PM
Thank you, @Aditya Sirna. I followed your instructions above and got the following error message.
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 37, in <module>
AfterInstallHook().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 31, in hook
setup_stack_symlinks(self.stroutfile)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 62, in setup_stack_symlinks
stack_select.select(package, json_version)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 313, in select
Execute(command, sudo=True)
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-python-wrap /usr/bin/hdp-select set oozie-client 2.6.4.0-91' returned 1. symlink target /usr/hdp/current/oozie-client for oozie already exists and it is not a symlink.
Step 1 and Step 3 went fine, but when I tried Step 2 and Step 4 I got the following responses, and retrying the cluster installation failed. Should I actually delete those components first?
{
"status" : 409,
"message" : "Attempted to create a host_component which already exists: [clusterName=bxp_singlenode_cluster, hostName=r00pvdn0c.bnymellon.net, componentName=OOZIE_CLIENT]"
}
{
"status" : 409,
"message" : "Attempted to create a host_component which already exists: [clusterName=bxp_singlenode_cluster, hostName=r00pvdn0c.bnymellon.net, componentName=SLIDER]"
}
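A 409 from the POST means the host_component row already exists, so the create call conflicts with the leftover entry. One option (a sketch; the admin credentials, port, and names below are taken from the messages above and may differ in your environment) is to DELETE the existing host_component first, then re-run the create step:

```shell
# Delete the leftover OOZIE_CLIENT host_component so a subsequent POST
# (create) no longer returns 409. The component generally must not be
# in a STARTED state for the DELETE to succeed.
AMBARI="http://r00pvdn0c.bnymellon.net:8080"
CLUSTER="bxp_singlenode_cluster"
HOST="r00pvdn0c.bnymellon.net"
URL="$AMBARI/api/v1/clusters/$CLUSTER/hosts/$HOST/host_components/OOZIE_CLIENT"
echo "$URL"
# curl -u admin:admin -H 'X-Requested-By: ambari' -X DELETE "$URL"
```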
02-24-2018
04:18 AM
When I try to re-install the cluster via Ambari, it fails with the following error: parent directory /usr/hdp/current/slider-client/lib doesn't exist
Checking /usr/hdp/current/, I find two directories that are not symlinks: drwxr-xr-x 3 root root 17 Feb 23 21:39 oozie-client
drwxr-xr-x 3 root root 17 Feb 23 22:39 slider-client
Labels: Apache Ambari, Apache Oozie, Apache Slider
02-23-2018
09:46 PM
My machine has 200G of disk space in total but only close to 10G under the / directory. I keep getting errors that say "needs xx MB in the / file system" when installing HDP via Ambari. Is it possible to change the default HDP installation directories to something other than "/"?
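The HDP package paths under / are fixed by the rpms, but a common workaround (an assumption, not official Hortonworks guidance) is to put the heavy trees on a larger filesystem and symlink them into place before installing:

```shell
# Sketch: relocate /usr/hdp onto a larger filesystem (here assumed to
# be mounted at /data) via a symlink created before installation. The
# rpms then install through the link onto the big disk.
mkdir -p /data/usr-hdp
ln -s /data/usr-hdp /usr/hdp
ls -ld /usr/hdp        # should show the symlink -> /data/usr-hdp
```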
02-23-2018
09:31 PM
Ok. All under /? The entire disk space under / is only 9.5G, although the total space across all disks on the host is 200G. Is there any way to configure the HDP installation to use other directories as the installation root?
02-23-2018
09:25 PM
Thank you, Jay, for the reply. How much disk space under the / file system does the entire HDP 2.6.4 installation require? Can we designate another disk drive for the installation?
02-23-2018
07:55 PM
Tried to install HDP 2.6.4 via Ambari. Everything else is fine except Storm, because the slider installation failed. The root cause is: installing package storm_2_6_4_0_91-slider-client-1.1.0.2.6.4.0-91.x86_64 needs 8MB on the / filesystem
See the following log for more info. stderr:
2018-02-23 14:40:35,378 - The 'slider-client' component did not advertise a version. This may indicate a problem with the component packaging. However, the stack-select tool was able to report a single version installed (2.6.4.0-91). This is the version that will be reported.
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/SLIDER/0.60.0.2.2/package/scripts/slider_client.py", line 62, in <module>
SliderClient().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/SLIDER/0.60.0.2.2/package/scripts/slider_client.py", line 45, in install
self.install_packages(env)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 821, in install_packages
retry_count=agent_stack_retry_count)
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/__init__.py", line 53, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/yumrpm.py", line 264, in install_package
self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/__init__.py", line 266, in checked_call_with_retries
return self._call_with_retries(cmd, is_checked=True, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/package/__init__.py", line 283, in _call_with_retries
code, out = func(cmd, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install storm_2_6_4_0_91-slider-client' returned 1. Transaction check error:
installing package storm_2_6_4_0_91-slider-client-1.1.0.2.6.4.0-91.x86_64 needs 8MB on the / filesystem
Error Summary
-------------
Disk Requirements:
At least 8MB more space needed on the / filesystem.
stdout:
2018-02-23 14:40:34,577 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-02-23 14:40:34,583 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-02-23 14:40:34,584 - Group['livy'] {}
2018-02-23 14:40:34,585 - Group['spark'] {}
2018-02-23 14:40:34,586 - Group['hdfs'] {}
2018-02-23 14:40:34,586 - Group['zeppelin'] {}
2018-02-23 14:40:34,586 - Group['hadoop'] {}
2018-02-23 14:40:34,587 - Group['users'] {}
2018-02-23 14:40:34,587 - Group['knox'] {}
2018-02-23 14:40:34,587 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,590 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,592 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,594 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,596 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-02-23 14:40:34,598 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,600 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,602 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-02-23 14:40:34,604 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2018-02-23 14:40:34,607 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,609 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,611 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-02-23 14:40:34,613 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,615 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,617 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-02-23 14:40:34,619 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,621 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,623 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,625 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,627 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,629 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-02-23 14:40:34,631 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-02-23 14:40:34,633 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-02-23 14:40:34,643 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-02-23 14:40:34,644 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-02-23 14:40:34,646 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-02-23 14:40:34,650 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-02-23 14:40:34,652 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-02-23 14:40:34,668 - call returned (0, '55022')
2018-02-23 14:40:34,669 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 55022'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-02-23 14:40:34,676 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 55022'] due to not_if
2018-02-23 14:40:34,676 - Group['hdfs'] {}
2018-02-23 14:40:34,677 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-02-23 14:40:34,679 - FS Type:
2018-02-23 14:40:34,679 - Directory['/etc/hadoop'] {'mode': 0755}
2018-02-23 14:40:34,693 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-02-23 14:40:34,694 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-02-23 14:40:34,711 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-02-23 14:40:34,722 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-02-23 14:40:34,723 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-02-23 14:40:34,723 - Repository with url http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.4.0 is not created due to its tags: set([u'GPL'])
2018-02-23 14:40:34,723 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-02-23 14:40:34,727 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-02-23 14:40:34,727 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-02-23 14:40:34,728 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-23 14:40:35,163 - Skipping installation of existing package unzip
2018-02-23 14:40:35,163 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-23 14:40:35,206 - Skipping installation of existing package curl
2018-02-23 14:40:35,206 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-23 14:40:35,253 - Skipping installation of existing package hdp-select
2018-02-23 14:40:35,346 - call[('ambari-python-wrap', u'/usr/bin/hdp-select', 'versions')] {}
2018-02-23 14:40:35,378 - call returned (0, '2.6.4.0-91')
2018-02-23 14:40:35,378 - The 'slider-client' component did not advertise a version. This may indicate a problem with the component packaging. However, the stack-select tool was able to report a single version installed (2.6.4.0-91). This is the version that will be reported.
2018-02-23 14:40:35,649 - Command repositories: HDP-2.6-repo-1, HDP-2.6-GPL-repo-1, HDP-UTILS-1.1.0.22-repo-1
2018-02-23 14:40:35,650 - Applicable repositories: HDP-2.6-repo-1, HDP-2.6-GPL-repo-1, HDP-UTILS-1.1.0.22-repo-1
2018-02-23 14:40:35,653 - Looking for matching packages in the following repositories: HDP-2.6-repo-1, HDP-2.6-GPL-repo-1, HDP-UTILS-1.1.0.22-repo-1
2018-02-23 14:40:41,709 - Adding fallback repositories: HDP-UTILS-1.1.0.22, HDP-2.6.4.0
2018-02-23 14:40:45,021 - Package['slider_2_6_4_0_91'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-23 14:40:45,485 - Installing package slider_2_6_4_0_91 ('/usr/bin/yum -d 0 -e 0 -y install slider_2_6_4_0_91')
2018-02-23 14:41:34,694 - Package['storm_2_6_4_0_91-slider-client'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-23 14:41:34,732 - Installing package storm_2_6_4_0_91-slider-client ('/usr/bin/yum -d 0 -e 0 -y install storm_2_6_4_0_91-slider-client')
2018-02-23 14:42:59,109 - Execution of '/usr/bin/yum -d 0 -e 0 -y install storm_2_6_4_0_91-slider-client' returned 1. Transaction check error:
installing package storm_2_6_4_0_91-slider-client-1.1.0.2.6.4.0-91.x86_64 needs 8MB on the / filesystem
Error Summary
-------------
Disk Requirements:
At least 8MB more space needed on the / filesystem.
2018-02-23 14:42:59,110 - Failed to install package storm_2_6_4_0_91-slider-client. Executing '/usr/bin/yum clean metadata'
2018-02-23 14:42:59,767 - Retrying to install package storm_2_6_4_0_91-slider-client after 30 seconds
Command failed after 1 tries
... View more
Labels:
- Apache Ambari
- Apache Slider
- Apache Storm
02-23-2018
06:40 PM
When I try to use Ambari's install cluster wizard, it requires installing Ambari Metrics, which in turn requires a "Grafana Admin Password". Am I supposed to install and run Grafana before creating a cluster ? If not, is Ambari going to install and configure Grafana using the "Grafana Admin Password" I give it arbitrarily ? How do I know Grafana's Admin Password ?
... View more
Labels:
- Apache Ambari
02-22-2018
11:14 PM
I am trying to create a cluster which installs Hive. I am using the Ambari embedded Postgres db and created the database hive. But testing the db connection failed in the Ambari console with the following error message. I wonder whether I need to add an entry to pg_hba.conf for the newly created hive user or not. Feb 22, 2018 6:05:59 PM org.postgresql.Driver connect
SEVERE: Connection error:
org.postgresql.util.PSQLException: FATAL: no pg_hba.conf entry for host "10.63.6.134", user "hive", database "hive", SSL off
at org.postgresql.core.v3.ConnectionFactoryImpl.doAuthentication(ConnectionFactoryImpl.java:475)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:207)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:195)
at org.postgresql.Driver.makeConnection(Driver.java:452)
at org.postgresql.Driver.connect(Driver.java:254)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.apache.ambari.server.DBConnectionVerification.main(DBConnectionVerification.java:37)
ERROR: Unable to connect to the DB. Please check DB connection properties.
org.postgresql.util.PSQLException: FATAL: no pg_hba.conf entry for host "10.63.6.134", user "hive", database "hive", SSL off
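The error message names everything the missing rule needs (host, user, database). A hypothetical pg_hba.conf entry built from those values, assuming password (md5) authentication is what you want, would look like:

```
# hypothetical rule in /var/lib/pgsql/data/pg_hba.conf, built from the
# host/user/database named in the error above
host    hive    hive    10.63.6.134/32    md5
```

The rule only takes effect once postgres reloads its configuration (for example via a service reload).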
... View more
Labels:
- Apache Ambari
- Apache Hive
02-22-2018
10:42 PM
Thank you, Jay, for the prompt response. I used the following cmd to run it from a script and it worked cmd="echo ${sql}|psql -U postgres"
su - postgres -c "$cmd"
... View more
02-22-2018
03:43 AM
I installed Ambari as the root user with its default/embedded PostgreSQL db and would like to use it to create the hive and oozie databases. Ambari was installed successfully and postgres is running. But when running the following command to create the hive database as the root user echo "CREATE DATABASE hive;" | psql -U postgres I get the following error: psql: FATAL: Peer authentication failed for user "postgres" Here is /var/lib/pgsql/data/pg_hba.conf local all postgres peer
# IPv4 local connections:
host all postgres 127.0.0.1/32 ident
local all ambari,mapred md5
host all ambari,mapred 0.0.0.0/0 md5
host all ambari,mapred ::/0 md5
... View more
Labels:
- Apache Ambari
- Apache Hive
02-22-2018
02:49 AM
Thank you, Jay, for the update
... View more
02-21-2018
10:37 PM
I am following the instructions from the following doc https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.1.3/bk_ambari-administration/content/using_hive_with_postgresql.html The confusing part is Step 1: Installing JDBC driver. ambari-server setup --jdbc-db=postgres --jdbc-driver=/path/to/postgres/postgresql.jar
ls /usr/share/java/postgresql-jdbc.jar
ambari-server setup --jdbc-db=postgres --jdbc-driver=/usr/share/java/postgresql-jdbc.jar The instructions ask to run ambari-server setup twice, each time with a different path to the jdbc jar. The first run copies the jdbc jar to /var/lib/ambari-server/resources. But then the doc asks to verify the jdbc jar at /usr/share/java/, which doesn't exist: ls /usr/share/java/postgresql-jdbc.jar The second execution of ambari-server setup simply copies /usr/share/java/postgresql-jdbc.jar (which doesn't exist) to /var/lib/ambari-server/resources again, which doesn't make sense
... View more
Labels:
- Apache Ambari
02-21-2018
05:24 PM
Ok. I have the agent running now. The "Install the Ambari Agents manually" doc is missing a lot of properties required to start the agent. Ambari assumes most users will choose the automatic installation option with an SSH private key, which is true, but our company won't grant root access
... View more
02-21-2018
02:43 PM
@Jay Kumar SenSharma, Thank you for your answer. I added the property, prefix, as you suggested and that fixed the original error. Then I noticed that ambari-agent needs more properties which are not mentioned in the Hortonworks Ambari-Agent manual installation doc. I ran into a different issue now with Ambari agent start. Will ask a different question
... View more
02-20-2018
10:54 PM
The source code in /usr/lib/ambari-agent/lib/ambari_agent/main.py shows that the following config is required to start the agent. Can someone give me an example of how to configure [agent] prefix and what it is used for ? [agent] prefix=xxx
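Not authoritative, but on a default package install the agent's ini (under /etc/ambari-agent/conf/) typically carries the prefix as the agent's data directory; something like:

```
[agent]
; working/data directory of the agent; value assumed from a default install
prefix=/var/lib/ambari-agent/data
```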
... View more
02-20-2018
10:30 PM
Thank you, Geoffrey. The issue remains after installing the epel-release rpm. From /var/log/ambari-agent/ambari-agent.log INFO 2018-02-20 17:15:51,895 main.py:145 - loglevel=logging.INFO
INFO 2018-02-20 17:15:51,895 main.py:145 - loglevel=logging.INFO
INFO 2018-02-20 17:15:51,895 main.py:145 - loglevel=logging.INFO
ERROR 2018-02-20 17:15:51,897 main.py:259 - Ambari prefix dir %s not configured, can't continue INFO 2018-02-20 17:15:51,897 ExitHelper.py:56 - Performing cleanup before exiting... Is this because I am installing ambari-agent on the same host as ambari-server ? Also, if ambari-agent and ambari-server are on the same host, can the ambari cluster installation wizard continue without an SSH private key ?
... View more
02-20-2018
09:13 PM
Our company has strict restrictions on granting root access. So I had to manually install ambari-agent to register it with the server. But the agent failed with the following error message: "tput: No value for $TERM and no -T specified
ambari-agent currently not running
tput: No value for $TERM and no -T specified
Verifying Python version compatibility... Using python /usr/bin/python Checking for previously running Ambari Agent... Checking ambari-common dir... Starting ambari-agent" "Ambari prefix dir %s not configured, can't continue" my ambari-agent.ini is like this hostname=some.host.name url_port=8440 secured_url_port=8441 piddir=/var/run/ambari-agent logdir=/var/log/ambari-agent keysdir=/var/lib/ambari-agent
... View more
Labels:
- Apache Ambari