
The problem of installing some components

New Contributor

I could not install some components.

What does this error mean?

If you know the answer, please share it.

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 35, in <module>
    BeforeAnyHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 32, in hook
    setup_java()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 216, in setup_java
    raise Fail(format("Unable to access {java_exec}. Confirm you have copied jdk to this host."))
resource_management.core.exceptions.Fail: Unable to access /usr/lib/jvm/java-1.8.0-openjdk/bin/java. Confirm you have copied jdk to this host.
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-3955.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-3955.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/hook.py", line 37, in <module>
    BeforeInstallHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 382, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 244, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 110, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 224, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 148, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/bin/hdp-select

stdout: /var/lib/ambari-agent/data/output-3955.txt

2018-05-21 09:44:33,525 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-05-21 09:44:33,530 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-05-21 09:44:33,531 - Group['livy'] {}
2018-05-21 09:44:33,533 - Adding group Group['livy']
2018-05-21 09:44:33,557 - Group['spark'] {}
2018-05-21 09:44:33,557 - Adding group Group['spark']
2018-05-21 09:44:33,573 - Group['hdfs'] {}
2018-05-21 09:44:33,573 - Adding group Group['hdfs']
2018-05-21 09:44:33,589 - Group['zeppelin'] {}
2018-05-21 09:44:33,589 - Adding group Group['zeppelin']
2018-05-21 09:44:33,604 - Group['hadoop'] {}
2018-05-21 09:44:33,605 - Adding group Group['hadoop']
2018-05-21 09:44:33,620 - Group['users'] {}
2018-05-21 09:44:33,621 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,621 - Adding user User['hive']
2018-05-21 09:44:33,655 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,655 - Adding user User['zookeeper']
2018-05-21 09:44:33,682 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,682 - Adding user User['ams']
2018-05-21 09:44:33,708 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-05-21 09:44:33,708 - Adding user User['tez']
2018-05-21 09:44:33,736 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2018-05-21 09:44:33,737 - Adding user User['zeppelin']
2018-05-21 09:44:33,764 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,764 - Adding user User['livy']
2018-05-21 09:44:33,790 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,790 - Adding user User['spark']
2018-05-21 09:44:33,818 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-05-21 09:44:33,818 - Adding user User['ambari-qa']
2018-05-21 09:44:33,846 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,846 - Adding user User['kafka']
2018-05-21 09:44:33,873 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-05-21 09:44:33,874 - Adding user User['hdfs']
2018-05-21 09:44:33,901 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,901 - Adding user User['sqoop']
2018-05-21 09:44:33,928 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,929 - Adding user User['yarn']
2018-05-21 09:44:33,956 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,956 - Adding user User['mapred']
2018-05-21 09:44:33,984 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:33,984 - Adding user User['hbase']
2018-05-21 09:44:34,011 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-21 09:44:34,011 - Adding user User['hcat']
2018-05-21 09:44:34,036 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-21 09:44:34,040 - Writing File['/var/lib/ambari-agent/tmp/changeUid.sh'] because it doesn't exist
2018-05-21 09:44:34,040 - Changing permission for /var/lib/ambari-agent/tmp/changeUid.sh from 644 to 555
2018-05-21 09:44:34,041 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-05-21 09:44:34,045 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-05-21 09:44:34,046 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-05-21 09:44:34,046 - Creating directory Directory['/tmp/hbase-hbase'] since it doesn't exist.
2018-05-21 09:44:34,046 - Changing owner for /tmp/hbase-hbase from 0 to hbase
2018-05-21 09:44:34,047 - Changing permission for /tmp/hbase-hbase from 755 to 775
2018-05-21 09:44:34,047 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-21 09:44:34,048 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-21 09:44:34,049 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-05-21 09:44:34,056 - call returned (0, '1014')
2018-05-21 09:44:34,057 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-05-21 09:44:34,061 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] due to not_if
2018-05-21 09:44:34,062 - Group['hdfs'] {}
2018-05-21 09:44:34,062 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-05-21 09:44:34,062 - FS Type:
2018-05-21 09:44:34,063 - Directory['/etc/hadoop'] {'mode': 0755}
2018-05-21 09:44:34,063 - Creating directory Directory['/etc/hadoop'] since it doesn't exist.
2018-05-21 09:44:34,063 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-05-21 09:44:34,063 - Creating directory Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] since it doesn't exist.
2018-05-21 09:44:34,064 - Changing owner for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hdfs
2018-05-21 09:44:34,064 - Changing group for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hadoop
2018-05-21 09:44:34,064 - Changing permission for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 755 to 1777
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-3955.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-3955.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']
2018-05-21 09:44:34,082 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
Command failed after 1 tries

Regards,

2 REPLIES

Re: The problem of installing some components

Super Mentor

@Hiroshi Shidara

Are you running the Ambari Agent as a non-root user? If so, you will need to make sure that the sudoers permissions are set correctly and that the ambari agent user is able to run the "hdp-select" command without any issue.
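
As a quick check (a minimal sketch, assuming the agent runs as a user named "ambari"; substitute your actual non-root account), you can verify the sudo rule directly on the affected host:

# Run as root on the failing host; "ambari" is an assumed user name
sudo -u ambari sudo -n /usr/bin/hdp-select versions

# Success prints the installed stack versions (e.g. 2.6.4.0-91).
# A "sudo: a password is required" or "not allowed" error means the
# sudoers entries shown below are missing or incorrect.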

We are seeing the failure here:

File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py",
 line 148, in get_supported_packages raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path)) 
resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/bin/hdp-select 

.

Please refer to the following link and double-check whether the command permissions are set up correctly for the non-root user:

https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.1.0/bk_ambari-security/content/commands_agent.h...

Example:

# Ambari: Hadoop and Configuration Commands
ambari ALL=(ALL) NOPASSWD:SETENV: /usr/bin/hdp-select, /usr/bin/conf-select, /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh, /usr/lib/hadoop/bin/hadoop-daemon.sh, /usr/lib/hadoop/sbin/hadoop-daemon.sh, /usr/bin/ambari-python-wrap *

# Ambari: Core System Commands
ambari ALL=(ALL) NOPASSWD:SETENV: /usr/bin/yum,/usr/bin/zypper,/usr/bin/apt-get, /bin/mkdir, /usr/bin/test, /bin/ln, /bin/ls, /bin/chown, /bin/chmod, /bin/chgrp, /bin/cp, /usr/sbin/setenforce, /usr/bin/test, /usr/bin/stat, /bin/mv, /bin/sed, /bin/rm, /bin/kill, /bin/readlink, /usr/bin/pgrep, /bin/cat, /usr/bin/unzip, /bin/tar, /usr/bin/tee, /bin/touch, /usr/bin/mysql, /sbin/service mysqld *, /usr/bin/dpkg *, /bin/rpm *, /usr/sbin/hst *, /sbin/service rpcbind *, /sbin/service portmap *
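
To apply these entries, a common approach (a sketch; the file name under /etc/sudoers.d is an assumption, and the user name should match your agent account) is to edit them with visudo and then list the effective rules:

# Edit a dedicated sudoers fragment; visudo validates the syntax on save
visudo -f /etc/sudoers.d/ambari-agent

# List the commands the "ambari" user may run via sudo
sudo -l -U ambari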


Re: The problem of installing some components

New Contributor

@Jay Kumar SenSharma

Thank you for replying.

I have solved the problem.

The cause was that Java, snappy, and a few other packages had not been installed on the host.

I appreciate your reply.
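
For anyone hitting the same error, the missing pieces can be installed on a yum-based host with something like the following (a sketch; the exact package names are assumptions and vary by OS):

# Install the JDK at the path the error message expects
yum install -y java-1.8.0-openjdk java-1.8.0-openjdk-devel

# Install the snappy compression libraries
yum install -y snappy snappy-devel

# Confirm the binary from the original error message now exists
ls -l /usr/lib/jvm/java-1.8.0-openjdk/bin/java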

Regards,
