Support Questions


Ambari cluster deploy

Rising Star

Hi.

I'm deploying HDP 2.6.1 on RHEL 7.

I have installed Ambari, and when deploying the cluster I get the following error at the first step of the deploy:

----

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 35, in <module>
    BeforeAnyHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 29, in hook
    setup_users()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 51, in setup_users
    fetch_nonlocal_groups = params.fetch_nonlocal_groups,
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/accounts.py", line 84, in action_create
    shell.checked_call(command, sudo=True)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'usermod -G hdfs -g hadoop hdfs' returned 6. usermod: user 'hdfs' does not exist in /etc/passwd

Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-3626.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-3626.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']

Traceback (most recent call last):

  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/hook.py", line 37, in <module>
    BeforeInstallHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 382, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 244, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 110, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 224, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 148, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/bin/hdp-select

stdout: /var/lib/ambari-agent/data/output-3626.txt

2018-05-14 14:54:16,835 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-05-14 14:54:16,841 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-05-14 14:54:16,843 - Group['livy'] {}
2018-05-14 14:54:16,844 - Group['spark'] {}
2018-05-14 14:54:16,844 - Group['hdfs'] {}
2018-05-14 14:54:16,844 - Group['hadoop'] {}
2018-05-14 14:54:16,844 - Group['users'] {}
2018-05-14 14:54:16,845 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-14 14:54:16,853 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-14 14:54:16,860 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-05-14 14:54:16,866 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-14 14:54:16,872 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-05-14 14:54:16,879 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-14 14:54:16,885 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-14 14:54:16,891 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-05-14 14:54:16,898 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-05-14 14:54:16,904 - Modifying user hdfs

Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-3626.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-3626.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']

Command failed after 1

---------------------------------------------

All the users exist except the hdfs user.

Any ideas?

Thanks

1 ACCEPTED SOLUTION

Master Collaborator

I was facing a similar error and resolved it by adding the missing Hadoop users to the passwd file.

resource_management.core.exceptions.ExecutionFailed: Execution of 'usermod -G hadoop -g hadoop hive' returned 6. usermod: user 'hive' does not exist in /etc/passwd
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-59009.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-59009.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1_2', '']

>> File location: /etc/passwd
>> Create the missing users with adduser (e.g. adduser hadoop)
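As a minimal sketch of that fix (assuming you run it as root on each agent host, and that the account names match the stack defaults logged above; adjust them if your cluster uses customized service accounts), you can first report which accounts are actually missing from the passwd database before creating them:

```shell
#!/bin/sh
# For each expected HDP service account, check the passwd database with
# getent (which also sees LDAP/SSSD users, not just /etc/passwd) and
# print the useradd command that would create a missing local account.
for u in hdfs yarn mapred hive oozie zookeeper ambari-qa; do
  if getent passwd "$u" >/dev/null 2>&1; then
    echo "OK:      $u"
  else
    echo "MISSING: $u  (create with: useradd -g hadoop $u)"
  fi
done
```

Once the user exists in the passwd database, the hook's `usermod -G hdfs -g hadoop hdfs` call can succeed instead of returning exit code 6.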

 

 


2 REPLIES

Master Mentor

@Marcelo Natalio Saied

The problem seems to be caused by the following error (the other errors look like side effects of it):

resource_management.core.exceptions.ExecutionFailed: Execution of 'usermod -G hdfs -g hadoop hdfs' returned 6. usermod: user 'hdfs' does not exist in /etc/passwd

Cause: The above error indicates that you might be running the Ambari Agent as a non-root user whose privileges are not set up correctly. Because this user lacks the privileges to create or modify the "hdfs" user, the "usermod" / "useradd" commands fail.

So please check the following:

1. Are you running the Ambari Agent as a non-root user?

# grep 'run_as_user' /etc/ambari-agent/conf/ambari-agent.ini
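For reference, a non-root agent is indicated by a `run_as_user` entry in that ini file; it looks roughly like the fragment below (the account name "ambari" is an assumption here, and root setups omit the entry or set it to root):

```
[agent]
run_as_user=ambari
```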


2. If you are running the Ambari Agent as a non-root user, make sure you have customized the "/etc/sudoers" file as described in https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.4/bk_security/content/_sudoer_configuration.h...

3. Have you configured the "su" commands and the corresponding Hadoop service accounts as mentioned in https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.4/bk_security/content/_customizable_users.htm... and the sudoer defaults: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.4/bk_security/content/_sudo_defaults.html

# Ambari Customizable Users
ambari ALL=(ALL) NOPASSWD:SETENV: /bin/su hdfs *,/bin/su ambari-qa *,/bin/su ranger *,/bin/su zookeeper *,/bin/su knox *,/bin/su falcon *,/bin/su ams *, /bin/su flume *,/bin/su hbase *,/bin/su spark *,/bin/su accumulo *,/bin/su hive *,/bin/su hcat *,/bin/su kafka *,/bin/su mapred *,/bin/su oozie *,/bin/su sqoop *,/bin/su storm *,/bin/su tez *,/bin/su atlas *,/bin/su yarn *,/bin/su kms *,/bin/su activity_analyzer *,/bin/su livy *,/bin/su zeppelin *,/bin/su infra-solr *,/bin/su logsearch *<br>


4. Most important, the following doc lists all the commands the agent needs access to: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.4/bk_security/content/_commands.html

