HDFS Client failed to install due to bad symlink
- Labels: Apache Ambari
Created 02-01-2016 08:56 PM
We are installing HDP 2.3 using Ambari 2.1.2 on CentOS 6.5 and running into an issue where the HDFS client fails to install due to what appears to be a bad symlink.
The symlink at /usr/hdp/current/hadoop-client/conf appears to be broken: it links to /etc/hadoop/conf, which in turn links back to /usr/hdp/current/hadoop-client/conf, producing a cyclical link. We are importing a blueprint to create the cluster, and this same blueprint works fine in another environment. I am not sure what to look for in order to resolve this error.
```
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist
```
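One way to confirm a symlink cycle like the one described above is to canonicalize the path: GNU `readlink -f` exits non-zero when resolution hits a loop. A minimal, self-contained sketch — the temp-dir links below are stand-ins for /etc/hadoop/conf and /usr/hdp/current/hadoop-client/conf, not the real paths:

```shell
# Self-contained demo: two symlinks pointing at each other stand in for
# /etc/hadoop/conf <-> /usr/hdp/current/hadoop-client/conf.
tmp=$(mktemp -d)
ln -s "$tmp/conf_b" "$tmp/conf_a"
ln -s "$tmp/conf_a" "$tmp/conf_b"

# readlink -f fails (non-zero exit) when it hits a symlink loop (ELOOP)
if readlink -f "$tmp/conf_a" >/dev/null 2>&1; then
  result="link resolves cleanly"
else
  result="symlink loop detected"
fi
echo "$result"
rm -rf "$tmp"
```

On an affected host you would run the same `readlink -f` check against /usr/hdp/current/hadoop-client/conf itself.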
Created 02-02-2016 06:21 PM
I figured it out. We had multiple versions of the RPMs in our local yum repo (build 2434 and build 2557), so Ambari was pulling the older RPMs, which caused this error.
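If you suspect the same cause, a quick check is to strip the build number from each RPM filename in the repo directory and look for names that then repeat — a repeat means two builds of the same package coexist. A hedged sketch using mock filenames (the `hadoop_2_3_4_0_*` names are illustrative, not exact HDP package names); on a live host, `yum --showduplicates list <package>` gives the authoritative view:

```shell
# Mock repo dir; the filenames are illustrative stand-ins for HDP build rpms.
tmp=$(mktemp -d)
touch "$tmp/hadoop_2_3_4_0_2434-client.rpm" \
      "$tmp/hadoop_2_3_4_0_2557-client.rpm" \
      "$tmp/zookeeper_2_3_4_0_2557.rpm"

# Drop the trailing build number (digits before '-' or '.'), then report any
# filename that now repeats: a repeat means two builds of the same package.
dupes=$(ls "$tmp" | sed 's/_[0-9]\{1,\}\([.-]\)/\1/' | sort | uniq -d)
echo "duplicate builds: $dupes"
rm -rf "$tmp"
```

Removing the stale builds from the repo and running `yum clean all` on the hosts would be the follow-up.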
Created 02-01-2016 09:57 PM
This happened to me when I reused an environment for a fresh install after cleaning up the previous install. If that's your situation, you need to make sure the cleanup actually completed.
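One way to verify the cleanup is to check that the previous install's artifacts are really gone before reinstalling. A hedged, self-contained sketch — the temp-dir paths below stand in for real locations such as /usr/hdp and /etc/hadoop, and the exact list of paths to check depends on what the stack installed:

```shell
# check_leftovers prints any path that still exists and returns non-zero
# if the cleanup left something behind.
check_leftovers() {
  status=0
  for p in "$@"; do
    if [ -e "$p" ]; then
      echo "leftover: $p"
      status=1
    fi
  done
  return $status
}

# Demo: a temp dir stands in for real paths such as /usr/hdp and /etc/hadoop.
tmp=$(mktemp -d)
mkdir "$tmp/usr_hdp"   # simulate a directory the cleanup missed
if check_leftovers "$tmp/usr_hdp" "$tmp/etc_hadoop"; then
  verdict="cleanup complete"
else
  verdict="cleanup incomplete"
fi
echo "$verdict"
rm -rf "$tmp"
```

Checking `rpm -qa` for leftover stack packages is a useful companion step on a real host.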
Created 05-08-2017 04:39 PM
Hey @Jason Reslock, I am facing a similar issue while trying to reinstall. How do I make sure that the cleanup happened completely?
Created 02-01-2016 10:46 PM
@Jason Reslock
Is hdfs-client installed on that host?
As always, verify correct permissions exist on the directories.
Example:

```
lrwxrwxrwx 1 root root 28 Oct 14 18:24 hadoop-client -> /usr/hdp/2.2.0.0-2041/hadoop
```
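To check that a link like the one above actually resolves to an existing versioned directory (rather than dangling or looping), a minimal self-contained sketch — the temp-dir layout stands in for /usr/hdp/current/hadoop-client pointing at /usr/hdp/2.2.0.0-2041/hadoop:

```shell
# Demo: a versioned dir plus a symlink to it, standing in for
# /usr/hdp/2.2.0.0-2041 and /usr/hdp/current/hadoop-client.
tmp=$(mktemp -d)
mkdir "$tmp/2.2.0.0-2041"
ln -s "$tmp/2.2.0.0-2041" "$tmp/hadoop-client"

# readlink -f canonicalizes the link; -d then confirms the target directory
# really exists (a dangling or cyclical link would fail this check).
if [ -d "$(readlink -f "$tmp/hadoop-client")" ]; then
  check="target exists"
else
  check="dangling link"
fi
echo "$check"
rm -rf "$tmp"
```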
