HDP installation fails when installing through Ambari

New Contributor

I have been trying to install HDP but keep getting the error below. Can someone help me understand what is causing it?


Please find the error:


2019-05-08 09:42:03,338 - The 'hadoop-yarn-client' component did not advertise a version. This may indicate a problem with the component packaging.
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/yarn_client.py", line 62, in <module>
    YarnClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/yarn_client.py", line 35, in install
    self.configure(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 120, in locking_configure
    original_configure(obj, *args, **kw)
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/yarn_client.py", line 40, in configure
    yarn()
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/yarn.py", line 407, in yarn
    mode=params.container_executor_mode
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 120, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-yarn-client/bin/container-executor'] failed, parent directory /usr/hdp/current/hadoop-yarn-client/bin doesn't exist

stdout: /var/lib/ambari-agent/data/output-163.txt


2019-05-08 09:41:58,836 - Stack Feature Version Info: Cluster Stack=2.5, Command Stack=None, Command Version=None -> 2.5
2019-05-08 09:41:58,837 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-05-08 09:41:58,839 - Group['hdfs'] {}
2019-05-08 09:41:58,841 - Group['hadoop'] {}
2019-05-08 09:41:58,841 - Group['users'] {}
2019-05-08 09:41:58,842 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-05-08 09:41:58,843 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-05-08 09:41:58,844 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2019-05-08 09:41:58,846 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2019-05-08 09:41:58,848 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-05-08 09:41:58,849 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-05-08 09:41:58,850 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-05-08 09:41:58,852 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2019-05-08 09:41:58,859 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2019-05-08 09:41:58,861 - Group['hdfs'] {}
2019-05-08 09:41:58,861 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hdfs']}
2019-05-08 09:41:58,862 - FS Type:
2019-05-08 09:41:58,863 - Directory['/etc/hadoop'] {'mode': 0755}
2019-05-08 09:41:58,888 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2019-05-08 09:41:58,889 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2019-05-08 09:41:58,916 - Repository['HDP-2.5-repo-2'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.5.3.0', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-2', 'mirror_list': None}
2019-05-08 09:41:58,928 - File['/etc/yum.repos.d/ambari-hdp-2.repo'] {'content': '[HDP-2.5-repo-2]\nname=HDP-2.5-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.5.3.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2019-05-08 09:41:58,929 - Writing File['/etc/yum.repos.d/ambari-hdp-2.repo'] because contents don't match
2019-05-08 09:41:58,930 - Repository['HDP-UTILS-1.1.0.21-repo-2'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-2', 'mirror_list': None}
2019-05-08 09:41:58,935 - File['/etc/yum.repos.d/ambari-hdp-2.repo'] {'content': '[HDP-2.5-repo-2]\nname=HDP-2.5-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.5.3.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-2]\nname=HDP-UTILS-1.1.0.21-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2019-05-08 09:41:58,936 - Writing File['/etc/yum.repos.d/ambari-hdp-2.repo'] because contents don't match
2019-05-08 09:41:58,936 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-08 09:41:59,018 - Skipping installation of existing package unzip
2019-05-08 09:41:59,019 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-08 09:41:59,037 - Skipping installation of existing package curl
2019-05-08 09:41:59,037 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-08 09:41:59,055 - Skipping installation of existing package hdp-select
2019-05-08 09:41:59,130 - call[('ambari-python-wrap', '/usr/bin/hdp-select', 'versions')] {}
2019-05-08 09:41:59,167 - call returned (0, '2.5.3.0-37\n2.6.1.0-129')
2019-05-08 09:41:59,390 - Command repositories: HDP-2.5-repo-2, HDP-UTILS-1.1.0.21-repo-2
2019-05-08 09:41:59,390 - Applicable repositories: HDP-2.5-repo-2, HDP-UTILS-1.1.0.21-repo-2
2019-05-08 09:41:59,395 - Looking for matching packages in the following repositories: HDP-2.5-repo-2, HDP-UTILS-1.1.0.21-repo-2
2019-05-08 09:42:01,398 - Adding fallback repositories: HDP-UTILS-1.1.0.21-repo-1
2019-05-08 09:42:02,372 - Package['hadoop_2_5_3_0_37-yarn'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-08 09:42:02,454 - Skipping installation of existing package hadoop_2_5_3_0_37-yarn
2019-05-08 09:42:02,458 - Package['hadoop_2_5_3_0_37-mapreduce'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-08 09:42:02,484 - Skipping installation of existing package hadoop_2_5_3_0_37-mapreduce
2019-05-08 09:42:02,487 - Package['hadoop_2_5_3_0_37-hdfs'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-08 09:42:02,505 - Skipping installation of existing package hadoop_2_5_3_0_37-hdfs
2019-05-08 09:42:02,517 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-05-08 09:42:02,518 - Stack Feature Version Info: Cluster Stack=2.5, Command Stack=None, Command Version=None -> 2.5
2019-05-08 09:42:02,519 - call['ambari-python-wrap /usr/bin/hdp-select status hadoop-yarn-resourcemanager'] {'timeout': 20}
2019-05-08 09:42:02,556 - call returned (0, 'hadoop-yarn-resourcemanager - None')
2019-05-08 09:42:02,558 - Failed to get extracted version with /usr/bin/hdp-select
2019-05-08 09:42:02,605 - call['ambari-python-wrap /usr/bin/hdp-select status hadoop-yarn-client'] {'timeout': 20}
2019-05-08 09:42:02,642 - call returned (0, 'hadoop-yarn-client - None')
2019-05-08 09:42:02,643 - Failed to get extracted version with /usr/bin/hdp-select
2019-05-08 09:42:02,643 - Unable to determine hdp-select version for hadoop-yarn-client
2019-05-08 09:42:02,685 - call['ambari-python-wrap /usr/bin/hdp-select status hadoop-yarn-client'] {'timeout': 20}
2019-05-08 09:42:02,723 - call returned (0, 'hadoop-yarn-client - None')
2019-05-08 09:42:02,724 - Failed to get extracted version with /usr/bin/hdp-select
2019-05-08 09:42:02,724 - Unable to determine hdp-select version for hadoop-yarn-client
2019-05-08 09:42:02,730 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-05-08 09:42:02,736 - Stack Feature Version Info: Cluster Stack=2.5, Command Stack=None, Command Version=None -> 2.5
2019-05-08 09:42:02,743 - Directory['/var/log/hadoop-yarn/nodemanager/recovery-state'] {'owner': 'yarn', 'group': 'hadoop', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2019-05-08 09:42:02,746 - Directory['/var/run/hadoop-yarn'] {'owner': 'yarn', 'create_parents': True, 'group': 'hadoop', 'cd_access': 'a'}
2019-05-08 09:42:02,747 - Directory['/var/run/hadoop-yarn/yarn'] {'owner': 'yarn', 'create_parents': True, 'group': 'hadoop', 'cd_access': 'a'}
2019-05-08 09:42:02,748 - Directory['/var/log/hadoop-yarn/yarn'] {'owner': 'yarn', 'group': 'hadoop', 'create_parents': True, 'cd_access': 'a'}
2019-05-08 09:42:02,749 - Directory['/var/run/hadoop-mapreduce'] {'owner': 'mapred', 'create_parents': True, 'group': 'hadoop', 'cd_access': 'a'}
2019-05-08 09:42:02,750 - Directory['/var/run/hadoop-mapreduce/mapred'] {'owner': 'mapred', 'create_parents': True, 'group': 'hadoop', 'cd_access': 'a'}
2019-05-08 09:42:02,751 - Directory['/var/log/hadoop-mapreduce'] {'owner': 'mapred', 'create_parents': True, 'group': 'hadoop', 'cd_access': 'a'}
2019-05-08 09:42:02,752 - Directory['/var/log/hadoop-mapreduce/mapred'] {'owner': 'mapred', 'group': 'hadoop', 'create_parents': True, 'cd_access': 'a'}
2019-05-08 09:42:02,752 - Directory['/var/log/hadoop-yarn'] {'owner': 'yarn', 'group': 'hadoop', 'ignore_failures': True, 'create_parents': True, 'cd_access': 'a'}
2019-05-08 09:42:02,753 - XmlConfig['core-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'mode': 0644, 'configuration_attributes': {'final': {'fs.defaultFS': 'true'}}, 'owner': 'hdfs', 'configurations': ...}
2019-05-08 09:42:02,767 - Generating config: /usr/hdp/current/hadoop-client/conf/core-site.xml
2019-05-08 09:42:02,767 - File['/usr/hdp/current/hadoop-client/conf/core-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2019-05-08 09:42:02,797 - XmlConfig['hdfs-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'mode': 0644, 'configuration_attributes': {'final': {'dfs.support.append': 'true', 'dfs.datanode.data.dir': 'true', 'dfs.namenode.http-address': 'true', 'dfs.namenode.name.dir': 'true', 'dfs.webhdfs.enabled': 'true', 'dfs.datanode.failed.volumes.tolerated': 'true'}}, 'owner': 'hdfs', 'configurations': ...}
2019-05-08 09:42:02,809 - Generating config: /usr/hdp/current/hadoop-client/conf/hdfs-site.xml
2019-05-08 09:42:02,810 - File['/usr/hdp/current/hadoop-client/conf/hdfs-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2019-05-08 09:42:02,883 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'yarn', 'configurations': ...}
2019-05-08 09:42:02,897 - Generating config: /usr/hdp/current/hadoop-client/conf/mapred-site.xml
2019-05-08 09:42:02,898 - File['/usr/hdp/current/hadoop-client/conf/mapred-site.xml'] {'owner': 'yarn', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2019-05-08 09:42:02,966 - XmlConfig['yarn-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'yarn', 'configurations': ...}
2019-05-08 09:42:02,980 - Generating config: /usr/hdp/current/hadoop-client/conf/yarn-site.xml
2019-05-08 09:42:02,981 - File['/usr/hdp/current/hadoop-client/conf/yarn-site.xml'] {'owner': 'yarn', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2019-05-08 09:42:03,134 - XmlConfig['capacity-scheduler.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'yarn', 'configurations': ...}
2019-05-08 09:42:03,148 - Generating config: /usr/hdp/current/hadoop-client/conf/capacity-scheduler.xml
2019-05-08 09:42:03,148 - File['/usr/hdp/current/hadoop-client/conf/capacity-scheduler.xml'] {'owner': 'yarn', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2019-05-08 09:42:03,177 - File['/etc/security/limits.d/yarn.conf'] {'content': Template('yarn.conf.j2'), 'mode': 0644}
2019-05-08 09:42:03,180 - File['/etc/security/limits.d/mapreduce.conf'] {'content': Template('mapreduce.conf.j2'), 'mode': 0644}
2019-05-08 09:42:03,194 - File['/usr/hdp/current/hadoop-client/conf/yarn-env.sh'] {'content': InlineTemplate(...), 'owner': 'yarn', 'group': 'hadoop', 'mode': 0755}
2019-05-08 09:42:03,196 - File['/usr/hdp/current/hadoop-yarn-client/bin/container-executor'] {'group': 'hadoop', 'mode': 02050}
2019-05-08 09:42:03,298 - call[('ambari-python-wrap', '/usr/bin/hdp-select', 'versions')] {}
2019-05-08 09:42:03,337 - call returned (0, '2.5.3.0-37\n2.6.1.0-129')
2019-05-08 09:42:03,338 - The 'hadoop-yarn-client' component did not advertise a version. This may indicate a problem with the component packaging.
Command failed after 1 tries


7 Replies

Re: HDP installation fails when installing through Ambari

New Contributor

Please find the error message from installing the MapReduce client:


2019-05-08 09:56:58,153 - Writing File['/usr/hdp/2.6.1.0-129/hadoop/conf/yarn-env.sh'] because contents don't match
2019-05-08 09:56:58,155 - File['/usr/hdp/current/hadoop-yarn-client/bin/container-executor'] {'group': 'hadoop', 'mode': 02050}
2019-05-08 09:56:58,156 - The repository with version 2.6.1.0-129 for this command has been marked as resolved. It will be used to report the version of the component which was installed
Command failed after 1 tries

Re: HDP installation fails when installing through Ambari

Super Mentor

@sai venkatesh

We see messages like the following:

2019-05-08 09:42:02,724 - Failed to get extracted version with /usr/bin/hdp-select 
2019-05-08 09:42:02,724 - Unable to determine hdp-select version for hadoop-yarn-client



Followed by:

2019-05-08 09:42:03,298 - call[('ambari-python-wrap', '/usr/bin/hdp-select', 'versions')] {} 
2019-05-08 09:42:03,337 - call returned (0, '2.5.3.0-37\n2.6.1.0-129')


So it looks like you might need to update "hadoop-yarn-client", as it may not have been updated properly.

Can you please run the following on the problematic host and let us know the output, to verify whether it shows the "2.6.1.0-129" version or the old one?

# /usr/bin/hdp-select | grep yarn
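
For example, on a healthy node you would expect every YARN component to report the target build, along these lines (the component list and build number below are only illustrative; yours may differ):

# /usr/bin/hdp-select | grep yarn
hadoop-yarn-client - 2.5.3.0-37
hadoop-yarn-nodemanager - 2.5.3.0-37
hadoop-yarn-resourcemanager - 2.5.3.0-37

On the problematic host, your install log instead shows "hadoop-yarn-client - None", which is consistent with the missing /usr/hdp/current/hadoop-yarn-client/bin directory in the traceback.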


If you see that hdp-select shows the correct version for other components but not for "hadoop-yarn-client", then you will need to update it:


Check whether the binaries are updated (better to first compare those binaries and the hdp-select output against other working nodes):

# yum info hadoop-yarn
# yum update hadoop-yarn
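
If yum reports the package as already current but the /usr/hdp/current/hadoop-yarn-client symlink is still missing, a minimal sketch of repairing the link with hdp-select would be the following (the 2.5.3.0-37 build is taken from your log as an example; substitute whichever build "hdp-select versions" reports for the stack Ambari is actually deploying, since your logs show both 2.5.3.0-37 and 2.6.1.0-129):

# /usr/bin/hdp-select versions
# /usr/bin/hdp-select set hadoop-yarn-client 2.5.3.0-37
# ls -ld /usr/hdp/current/hadoop-yarn-client/bin

After "hdp-select set" succeeds, the last command should resolve through the recreated symlink instead of failing, and the retried install should get past the container-executor step.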




Re: HDP installation fails when installing through Ambari

New Contributor

@sensharma I will post the output soon. I just want to mention that this is the first time I am installing the HDP packages on a VMware machine through Ambari.

Re: HDP installation fails when installing through Ambari

New Contributor

Please find the screenshot:

[screenshot attached: 108622-1557404700757.png]

Re: HDP installation fails when installing through Ambari

New Contributor

The HDFS Client has failed on all three nodes:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 120, in <module>
    HdfsClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 36, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 41, in configure
    hdfs()
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs.py", line 61, in hdfs
    group=params.user_group
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 66, in action_create
    encoding = self.resource.encoding
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist

Re: HDP installation fails when installing through Ambari

New Contributor

Please find the output:

[root@edge conf]# /usr/bin/hdp-select | grep hdfs

hadoop-hdfs-datanode - 2.4.3.0-227

hadoop-hdfs-journalnode - 2.4.3.0-227

hadoop-hdfs-namenode - 2.4.3.0-227

hadoop-hdfs-nfs3 - 2.4.3.0-227

hadoop-hdfs-portmap - 2.4.3.0-227

hadoop-hdfs-secondarynamenode - 2.4.3.0-227




[root@node1 ~]# /usr/bin/hdp-select | grep hdfs

hadoop-hdfs-datanode - 2.4.3.0-227

hadoop-hdfs-journalnode - 2.4.3.0-227

hadoop-hdfs-namenode - 2.4.3.0-227

hadoop-hdfs-nfs3 - 2.4.3.0-227

hadoop-hdfs-portmap - 2.4.3.0-227

hadoop-hdfs-secondarynamenode - 2.4.3.0-227

[root@node1 ~]#

[root@edge conf]# yum info hadoop-yarn^C

[root@edge conf]# yum info hadoop-hdfs

Loaded plugins: fastestmirror, security

Repository HDP-UTILS-1.1.0.20 is listed more than once in the configuration

Loading mirror speeds from cached hostfile

Available Packages

Name : hadoop-hdfs

Arch : noarch

Version : 2.7.1.2.4.3.0

Release : 227.el6

Size : 2.5 k

Repo : HDP-2.4

Summary : hadoop-hdfs Distro virtual package

License : APL2

Description : hadoop-hdfs-2.7.1.2.4.3.0 virtual package


Re: HDP installation fails when installing through Ambari

New Contributor

The HDFS Client failed on all three nodes with the same traceback as above: Applying File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] failed because parent directory /usr/hdp/current/hadoop-client/conf doesn't exist.

Please find the stdout log:

stdout: /var/lib/ambari-agent/data/output-133.txt

2019-05-09 05:23:24,188 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-05-09 05:23:24,190 - Group['hadoop'] {}
2019-05-09 05:23:24,192 - Group['users'] {}
2019-05-09 05:23:24,193 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2019-05-09 05:23:24,195 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2019-05-09 05:23:24,196 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2019-05-09 05:23:24,197 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-05-09 05:23:24,200 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2019-05-09 05:23:24,207 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2019-05-09 05:23:24,208 - Group['hdfs'] {}
2019-05-09 05:23:24,209 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2019-05-09 05:23:24,211 - FS Type:
2019-05-09 05:23:24,211 - Directory['/etc/hadoop'] {'mode': 0755}
2019-05-09 05:23:24,212 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2019-05-09 05:23:24,238 - Repository['HDP-2.4'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.4.3.0', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2019-05-09 05:23:24,253 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.4]\nname=HDP-2.4\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.4.3.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2019-05-09 05:23:24,255 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2019-05-09 05:23:24,263 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.20]\nname=HDP-UTILS-1.1.0.20\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2019-05-09 05:23:24,264 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-09 05:23:24,378 - Skipping installation of existing package unzip
2019-05-09 05:23:24,378 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-09 05:23:24,403 - Skipping installation of existing package curl
2019-05-09 05:23:24,404 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-09 05:23:24,429 - Skipping installation of existing package hdp-select
2019-05-09 05:23:24,701 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-05-09 05:23:24,711 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-05-09 05:23:24,719 - Package['rpcbind'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-09 05:23:24,837 - Skipping installation of existing package rpcbind
2019-05-09 05:23:24,839 - Package['hadoop_2_4_*'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-09 05:23:24,867 - Skipping installation of existing package hadoop_2_4_*
2019-05-09 05:23:24,868 - Package['snappy'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-09 05:23:24,897 - Skipping installation of existing package snappy
2019-05-09 05:23:24,899 - Package['snappy-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-09 05:23:24,926 - Skipping installation of existing package snappy-devel
2019-05-09 05:23:24,927 - Package['hadoop_2_4_*-libhdfs'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-05-09 05:23:24,954 - Skipping installation of existing package hadoop_2_4_*-libhdfs
2019-05-09 05:23:24,964 - Directory['/etc/security/limits.d'] {'owner': 'root', 'group': 'root', 'recursive': True}
2019-05-09 05:23:24,978 - File['/etc/security/limits.d/hdfs.conf'] {'content': Template('hdfs.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2019-05-09 05:23:24,980 - XmlConfig['hadoop-policy.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {}, 'configurations': ...}
2019-05-09 05:23:25,007 - Generating config: /usr/hdp/current/hadoop-client/conf/hadoop-policy.xml
2019-05-09 05:23:25,007 - File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}