Support Questions

Ambari cluster deploy


I'm deploying HDP 2.6.1 on RHEL 7.

I have installed Ambari, and when deploying the cluster I get the following error at the first step of the deployment:


Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/", line 35, in <module>
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/", line 375, in execute
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/", line 29, in hook
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/", line 51, in setup_users
    fetch_nonlocal_groups = params.fetch_nonlocal_groups,
  File "/usr/lib/python2.6/site-packages/resource_management/core/", line 166, in __init__
  File "/usr/lib/python2.6/site-packages/resource_management/core/", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/", line 124, in run_action
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/", line 84, in action_create
    shell.checked_call(command, sudo=True)
  File "/usr/lib/python2.6/site-packages/resource_management/core/", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'usermod -G hdfs -g hadoop hdfs' returned 6. usermod: user 'hdfs' does not exist in /etc/passwd

Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/', 'ANY', '/var/lib/ambari-agent/data/command-3626.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-3626.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']

Traceback (most recent call last):

  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/", line 37, in <module>
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/", line 382, in execute
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/", line 244, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/", line 110, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/", line 224, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/", line 148, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/bin/hdp-select

stdout: /var/lib/ambari-agent/data/output-3626.txt

2018-05-14 14:54:16,835 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-05-14 14:54:16,841 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-05-14 14:54:16,843 - Group['livy'] {}
2018-05-14 14:54:16,844 - Group['spark'] {}
2018-05-14 14:54:16,844 - Group['hdfs'] {}
2018-05-14 14:54:16,844 - Group['hadoop'] {}
2018-05-14 14:54:16,844 - Group['users'] {}
2018-05-14 14:54:16,845 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-14 14:54:16,853 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-14 14:54:16,860 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-05-14 14:54:16,866 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-14 14:54:16,872 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-05-14 14:54:16,879 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-14 14:54:16,885 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-14 14:54:16,891 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-05-14 14:54:16,898 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-05-14 14:54:16,904 - Modifying user hdfs

Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/', 'ANY', '/var/lib/ambari-agent/data/command-3626.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-3626.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']

Command failed after 1


All of the users exist except the hdfs user.

Any ideas?



Re: Ambari cluster deploy

Super Mentor

@Marcelo Natalio Saied

The problem seems to be caused by the following error (the other errors appear to be side effects of it):

resource_management.core.exceptions.ExecutionFailed: Execution of 'usermod -G hdfs -g hadoop hdfs' returned 6. usermod: user 'hdfs' does not exist in /etc/passwd

Cause: The above error indicates that you might be running the Ambari Agent as a non-root user whose privileges are not set up correctly. Because that user lacks the privileges to create or modify the "hdfs" user, the "usermod" / "useradd" commands fail.
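A quick way to confirm the state usermod complained about is to check, on the failing agent host, whether the account and its groups actually exist locally (a minimal diagnostic sketch; the names are the ones from the error):

```shell
# Does the account that 'usermod -G hdfs -g hadoop hdfs' tried to modify
# exist, and are the groups it references in place?
id hdfs >/dev/null 2>&1        && echo "user hdfs: present"    || echo "user hdfs: MISSING"
getent group hadoop >/dev/null && echo "group hadoop: present" || echo "group hadoop: MISSING"
getent group hdfs >/dev/null   && echo "group hdfs: present"   || echo "group hdfs: MISSING"
```

On the poster's host this should report "user hdfs: MISSING" while the groups are present, matching the log (the Group entries were created but "Modifying user hdfs" failed).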

So please check the following:

1. Are you running the Ambari Agent as a non-root user?

# grep 'run_as_user' /etc/ambari-agent/conf/ambari-agent.ini

2. If you are running the Ambari Agent as a non-root user, please make sure that you have customized the "/etc/sudoers" file as described in the Ambari documentation for non-root agents.

3. Have you configured the "su" commands and the corresponding Hadoop service accounts as described in the documentation? The sudoers defaults:

# Ambari Customizable Users
ambari ALL=(ALL) NOPASSWD:SETENV: /bin/su hdfs *,/bin/su ambari-qa *,/bin/su ranger *,/bin/su zookeeper *,/bin/su knox *,/bin/su falcon *,/bin/su ams *, /bin/su flume *,/bin/su hbase *,/bin/su spark *,/bin/su accumulo *,/bin/su hive *,/bin/su hcat *,/bin/su kafka *,/bin/su mapred *,/bin/su oozie *,/bin/su sqoop *,/bin/su storm *,/bin/su tez *,/bin/su atlas *,/bin/su yarn *,/bin/su kms *,/bin/su activity_analyzer *,/bin/su livy *,/bin/su zeppelin *,/bin/su infra-solr *,/bin/su logsearch *

4. Most importantly, the documentation lists all of the commands the agent needs access to.
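The checks above can be sketched in one script. This reads run_as_user from the agent config path quoted in item 1 (defaulting to root when the key is absent) and, for a non-root agent, greps the sudoers files for the user-management commands the hook needed. The sudoers paths and the grep pattern are illustrative assumptions, not the official verification procedure:

```shell
# Determine which user the agent runs as (root when run_as_user is unset).
ini=/etc/ambari-agent/conf/ambari-agent.ini
run_user=$(grep -E '^run_as_user' "$ini" 2>/dev/null | cut -d= -f2 | tr -d ' ')
echo "agent run_as_user: ${run_user:-root}"

# For a non-root agent, usermod/useradd must be whitelisted via sudo;
# a rough check that some sudoers file mentions them (needs root to read):
if [ -n "$run_user" ] && [ "$run_user" != root ]; then
  grep -E 'usermod|useradd' /etc/sudoers /etc/sudoers.d/* 2>/dev/null \
    || echo "usermod/useradd not found in sudoers -- agent cannot manage users"
fi
```

If the first line reports a non-root user and the grep finds nothing, that matches the failure mode described above: the hook's `usermod -G hdfs -g hadoop hdfs` (and the preceding useradd) cannot run with elevated privileges.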