Support Questions

Too many levels of symbolic links when running basic installation

Using the latest Ambari 2.2 download (5/22/16) to install HDP 2.3 on a ten-node cluster.

The HDFS Client and HBase Client installs fail with: Applying File['/usr/hdp/current/hadoop-client/conf/hdfs-site.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist (and similarly for the HBase client). The symbolic link appears to be circular:

$ cd /usr/hdp/current/hadoop-client

$ ls -al conf
lrwxrwxrwx 1 root root 16 May 22 05:44 conf -> /etc/hadoop/conf
$ ls -al /etc/hadoop/conf
lrwxrwxrwx 1 root root 35 May 22 05:44 /etc/hadoop/conf -> /usr/hdp/current/hadoop-client/conf

Is this a known issue? Is there a way out of it?
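
For anyone hitting the same thing: any attempt to resolve the looping path fails with the kernel's ELOOP error, which is where the message in the title comes from. For example, ls on the looping path should report something like:

$ ls /usr/hdp/current/hadoop-client/conf/
ls: cannot access /usr/hdp/current/hadoop-client/conf/: Too many levels of symbolic links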


Re: Too many levels of symbolic links when running basic installation

Super Collaborator

Could you please provide the full version of HDP, including the build? The default chain of links is supposed to be:

/etc/hadoop/conf -> /usr/hdp/current/hadoop-client/conf
/usr/hdp/current/hadoop-client/conf -> /etc/hadoop/<HDP version>/0

You may manually restore it.
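
If the chain on your node is broken that way, a rough, untested sketch of the manual fix (substitute your actual version directory for <HDP version>, and confirm /etc/hadoop/<HDP version>/0 exists before re-linking):

$ unlink /usr/hdp/current/hadoop-client/conf
$ unlink /etc/hadoop/conf
$ ln -s /etc/hadoop/<HDP version>/0 /usr/hdp/current/hadoop-client/conf
$ ln -s /usr/hdp/current/hadoop-client/conf /etc/hadoop/conf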

Re: Too many levels of symbolic links when running basic installation

How do you check that?

Re: Too many levels of symbolic links when running basic installation

Super Collaborator

Check the list of directories in /usr/hdp:

[root@t1 ~]# ls -la /usr/hdp
total 16
drwxr-xr-x  4 root root 4096 May 15 14:55 .
drwxr-xr-x 15 root root 4096 May 15 14:53 ..
drwxr-xr-x 16 root root 4096 May 15 15:00 2.4.2.0-258
drwxr-xr-x  2 root root 4096 May 19 23:41 current

2.4.2.0-258 in my case.
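
If hdp-select is installed on the node (it normally ships with HDP), I believe it can report the installed stack versions directly; the output below is what I'd expect rather than a capture:

[root@t1 ~]# hdp-select versions
2.4.2.0-258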

Re: Too many levels of symbolic links when running basic installation

drwxr-xr-x 22 root root 4096 May 22 05:48 2.3.4.7-4

Re: Too many levels of symbolic links when running basic installation

Guru

You can remove the soft link with unlink (unlink /etc/hadoop/conf), then reinstall the client and restart services on that node. This should take care of the circular soft link issue.
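
A concrete sequence for the setup described above (the hbase path is my assumption, by analogy with the hadoop one, since both client installs failed):

$ unlink /etc/hadoop/conf
$ unlink /etc/hbase/conf

Then reinstall the HDFS Client and HBase Client on that host from Ambari and restart the affected services; the reinstall should lay the conf links back down.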
