
/usr/hdp/current/hadoop-client/conf doesn't exist error

Expert Contributor

I am asking this question after struggling for two days to find a solution. I was using the Hortonworks 2.3 Sandbox to create a two-node cluster: Node1 (the Sandbox) and Node2 (a second node).

I wanted to build the cluster manually, so I stopped using the VM and installed Ambari (version 2.1.2) on a fresh CentOS 6.4 machine (Node1). The Ambari installation was successful; I created a cluster 'mycluster', made Node1 the master node, and also set up a local repository successfully. I am now trying to add Node2 (the node from the earlier cluster, without reinstalling CentOS) to this cluster. The installation check was 'green', but I encountered some warnings, which suggested that I run the following script:

python /usr/lib/python2.6/site-packages/ambari_agent/HostCleanup.py --silent --skip=users

I executed the above script and then proceeded with the node add. Under "Assign Slaves and Clients" I selected all the clients displayed. However, on the "Install, Start and Test" wizard window I saw the following error; it seems the HDFS client installation failed. The error message tells me that "/usr/hdp/current/hadoop-client/conf" doesn't exist. I don't understand why Ambari is not creating this folder when I have run the Python script and cleaned up the previous installation. What am I missing? Please help. Here is the full error message:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 120, in <module>
    HdfsClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 36, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 41, in configure
    hdfs()
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs.py", line 61, in hdfs
    group=params.user_group
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 67, in action_create
    encoding = self.resource.encoding
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist
1 ACCEPTED SOLUTION

Contributor

Hi,

I faced the same issue while starting up Nimbus (Ambari 2.2.0.0, HDP 2.3.2.0).

To resolve it, I installed the HDFS client on the host, and Nimbus was able to start up since it then found

"/usr/hdp/current/hadoop-client/conf"

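In case it helps anyone, one way to add and install the HDFS client on a host outside the wizard is Ambari's REST API. This is only a sketch: it assumes the Ambari server runs on node1:8080 with the default admin/admin credentials and that the cluster is named 'mycluster' as in the question, so adjust those values for your environment.

curl -u admin:admin -H "X-Requested-By: ambari" -X POST http://node1:8080/api/v1/clusters/mycluster/hosts/node2/host_components/HDFS_CLIENT

curl -u admin:admin -H "X-Requested-By: ambari" -X PUT -d '{"HostRoles":{"state":"INSTALLED"}}' http://node1:8080/api/v1/clusters/mycluster/hosts/node2/host_components/HDFS_CLIENT

The POST adds the HDFS_CLIENT component to the host, and the PUT asks Ambari to install it, which is what lays down /usr/hdp/current/hadoop-client/conf.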

REPLIES


@PRADEEP /usr/hdp/current contains symlinks to the versioned HDP directories.


As always, verify correct permissions exist on the directories.
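A quick way to check the links and their permissions on the problem node (assuming HDP 2.2 or later, where the hdp-select tool manages these symlinks) is:

ls -ld /usr/hdp/current/hadoop-client /usr/hdp/current/hadoop-client/conf

hdp-select status hadoop-client

If the hadoop-client link is missing or dangling, the client package (or hdp-select itself) isn't installed on that node.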


Contributor

I also found that if you clean up the Hadoop directories on the filesystem, you might need to force Ambari to re-install the packages by removing hdp-select. This works when you're going through "Install, Start and Test": run the following on each affected node, then retry the failures.

yum -y erase hdp-select
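As a sanity check (just a sketch, assuming an rpm/yum-based install), you can run the following before and after the retry:

rpm -q hdp-select

hdp-select status hadoop-client

After the erase, the first should report that hdp-select is not installed; once the retry succeeds, the second should show the active HDP version again.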

Contributor

I installed the HDFS client package manually and the problem was solved; of course, I also had to manually install the nc rpm.
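For reference, on CentOS 6 that amounts to something like the lines below; the HDP client package names are versioned per release, so the wildcard is only an assumption for an HDP 2.3 repository and may need adjusting:

yum -y install nc

yum -y install 'hadoop_2_3_*-client'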

Explorer

I faced a similar issue with HDP 2.5.3. The article is good, but it might be worth having Ambari itself run:

yum -y erase hdp-select

Otherwise, each time the installation fails and you have to retry, you get the same issue again.

Explorer

I got similar issues to the above with HDP 2.6.

When I run yum -y erase hdp-select on each host, there is still a problem:

Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams

Kindly advise.