<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: HDP-2.5.0: Too many levels of symbolic links when installing Clients in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138697#M101326</link>
    <description>Support Questions thread: HDP-2.5.0 Client installs via Ambari 2.4.0.1 fail with 'Too many levels of symbolic links' errors.</description>
    <pubDate>Wed, 07 Sep 2016 02:09:02 GMT</pubDate>
    <dc:creator>jonathanhurley</dc:creator>
    <dc:date>2016-09-07T02:09:02Z</dc:date>
    <item>
      <title>HDP-2.5.0: Too many levels of symbolic links when installing Clients</title>
      <link>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138693#M101322</link>
      <description>&lt;P&gt;Installing HDP-2.5.0 Clients using Ambari 2.4.0.1 fails with Warnings/Errors regarding 'Too many levels of symbolic links' when trying to populate /usr/hdp/current/[appName]/conf with configs.&lt;/P&gt;&lt;P&gt;This error is thrown when a targeted working directory is symlinked to a folder that is in turn symlinked back to the original targeted working directory, i.e. /usr/hdp/current/[appName]/conf is symlinked to /etc/[appName]/conf, but /etc/[appName]/conf is also symlinked back to /usr/hdp/current/[appName]/conf.&lt;/P&gt;&lt;PRE&gt;# hadoop-clients example:

[b84cb1ae teal:hadoop-clients ~] # ll /usr/hdp/current/hadoop-client/conf
lrwxrwxrwx. 1 root root 16 Sep  6 05:31 /usr/hdp/current/hadoop-client/conf -&amp;gt; /etc/hadoop/conf
[b84cb1ae teal:hadoop-clients ~] # ll /etc/hadoop/conf
lrwxrwxrwx. 1 root root 35 Sep  6 05:34 /etc/hadoop/conf -&amp;gt; /usr/hdp/current/hadoop-client/conf&lt;/PRE&gt;&lt;PRE&gt;# error thrown in /var/log/ambari-agent/ambari-agent.log:

INFO 2016-09-06 05:43:17,682 ActionQueue.py:104 - Adding STATUS_COMMAND for component HDFS_CLIENT of service HDFS of cluster hart to the queue.
INFO 2016-09-06 05:43:17,690 ActionQueue.py:104 - Adding STATUS_COMMAND for component YARN_CLIENT of service YARN of cluster hart to the queue.
INFO 2016-09-06 05:43:17,699 ActionQueue.py:104 - Adding STATUS_COMMAND for component MAPREDUCE2_CLIENT of service MAPREDUCE2 of cluster hart to the queue.
INFO 2016-09-06 05:43:17,707 ActionQueue.py:104 - Adding STATUS_COMMAND for component TEZ_CLIENT of service TEZ of cluster hart to the queue.
INFO 2016-09-06 05:43:17,715 ActionQueue.py:104 - Adding STATUS_COMMAND for component HCAT of service HIVE of cluster hart to the queue.
INFO 2016-09-06 05:43:17,723 ActionQueue.py:104 - Adding STATUS_COMMAND for component HIVE_CLIENT of service HIVE of cluster hart to the queue.
INFO 2016-09-06 05:43:17,732 ActionQueue.py:104 - Adding STATUS_COMMAND for component PIG of service PIG of cluster hart to the queue.
INFO 2016-09-06 05:43:17,740 ActionQueue.py:104 - Adding STATUS_COMMAND for component SQOOP of service SQOOP of cluster hart to the queue.
INFO 2016-09-06 05:43:17,748 ActionQueue.py:104 - Adding STATUS_COMMAND for component ZOOKEEPER_CLIENT of service ZOOKEEPER of cluster hart to the queue.
INFO 2016-09-06 05:43:17,756 ActionQueue.py:104 - Adding STATUS_COMMAND for component SPARK_CLIENT of service SPARK of cluster hart to the queue.
INFO 2016-09-06 05:43:17,766 ActionQueue.py:104 - Adding STATUS_COMMAND for component SLIDER of service SLIDER of cluster hart to the queue.
INFO 2016-09-06 05:43:17,774 ActionQueue.py:104 - Adding STATUS_COMMAND for component METRICS_MONITOR of service AMBARI_METRICS of cluster hart to the queue.
INFO 2016-09-06 05:43:17,782 ActionQueue.py:104 - Adding STATUS_COMMAND for component HST_AGENT of service SMARTSENSE of cluster hart to the queue.
INFO 2016-09-06 05:43:19,232 PythonReflectiveExecutor.py:65 - Reflective command failed with exception:
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/ambari_agent/PythonReflectiveExecutor.py", line 57, in run_file
    imp.load_source('__main__', script)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 123, in &amp;lt;module&amp;gt;
    HdfsClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 84, in security_status
    {'core-site.xml': FILE_TYPE_XML})
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/security_commons.py", line 129, in get_params_from_filesystem
    configuration = ET.parse(conf_dir + os.sep + config_file)
  File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 1182, in parse
    tree.parse(source, parser)
  File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 647, in parse
    source = open(source, "rb")
IOError: [Errno 40] Too many levels of symbolic links: u'/usr/hdp/current/hadoop-client/conf/core-site.xml'&lt;/PRE&gt;&lt;P&gt;This happens on brand-new HDP-2.5.0 clusters and is repeatable. Failures occur on nodes running only Clients, with no additional applications. I can go through and resolve these issues manually, but with the number of applications I am installing and the number of client-only nodes I am creating, manual intervention is not practical.&lt;/P&gt;&lt;P&gt;Is this a known issue? Possibly a directory creation race condition?&lt;/P&gt;</description>
      <pubDate>Wed, 07 Sep 2016 00:44:58 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138693#M101322</guid>
      <dc:creator>hansohn</dc:creator>
      <dc:date>2016-09-07T00:44:58Z</dc:date>
    </item>
    <item>
      <title>Re: HDP-2.5.0: Too many levels of symbolic links when installing Clients</title>
      <link>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138694#M101323</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/12045/ryanhansohn.html" nodeid="12045"&gt;@Ryan Hanson&lt;/A&gt;&lt;/P&gt;&lt;P&gt;The obvious issue is the circular symlink references. Have you created symlinks prior to running the installer?&lt;/P&gt;</description>
      <pubDate>Wed, 07 Sep 2016 01:03:44 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138694#M101323</guid>
      <dc:creator>emaxwell</dc:creator>
      <dc:date>2016-09-07T01:03:44Z</dc:date>
    </item>
    <item>
      <title>Re: HDP-2.5.0: Too many levels of symbolic links when installing Clients</title>
      <link>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138695#M101324</link>
      <description>&lt;P&gt;This seems like a bug, perhaps caused by client-only hosts. &lt;/P&gt;&lt;P&gt;/etc/&amp;lt;component&amp;gt;/conf -&amp;gt; /usr/hdp/current/hadoop-client/conf is correct.&lt;/P&gt;&lt;P&gt;What should have happened is that conf-select changed /usr/hdp/current/hadoop-client/conf to point to something like /usr/hdp/2.5.0.0-1234/hadoop/conf/0.&lt;/P&gt;&lt;P&gt;I'm guessing that the conf-select step failed. If you could post the entire output from your client install command, that will help us determine why it failed.&lt;/P&gt;</description>
      <pubDate>Wed, 07 Sep 2016 01:07:10 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138695#M101324</guid>
      <dc:creator>jonathanhurley</dc:creator>
      <dc:date>2016-09-07T01:07:10Z</dc:date>
    </item>
    <item>
      <title>Re: HDP-2.5.0: Too many levels of symbolic links when installing Clients</title>
      <link>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138696#M101325</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/98/emaxwell.html" nodeid="98"&gt;@emaxwell&lt;/A&gt; - Nope, I haven't tampered with any of the HDP directories prior to, or post, installation.&lt;/P&gt;&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/485/jhurley.html" nodeid="485"&gt;@Jonathan Hurley&lt;/A&gt; - Will do, I destroyed the cluster so will need to respin it back up. Takes a bit. Will post back in a few hours. Also is this the best venue for these logs or should I email/post them elsewhere?&lt;/P&gt;</description>
      <pubDate>Wed, 07 Sep 2016 01:14:51 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138696#M101325</guid>
      <dc:creator>hansohn</dc:creator>
      <dc:date>2016-09-07T01:14:51Z</dc:date>
    </item>
    <item>
      <title>Re: HDP-2.5.0: Too many levels of symbolic links when installing Clients</title>
      <link>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138697#M101326</link>
      <description>&lt;P&gt;The message boards here are just fine. You can either copy/paste them in a code block or compress them and upload them directly. &lt;/P&gt;&lt;P&gt;What I'm looking for is something like this as part of the hadoop client install on a host with the problem:&lt;/P&gt;&lt;PRE&gt;2016-08-31 15:50:29,421 - Checking if need to create versioned conf dir /etc/hadoop/2.4.2.0-236/0
2016-08-31 15:50:29,422 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'dry-run-create', '--package', 'hadoop', '--stack-version', '2.4.2.0-236', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-08-31 15:50:29,439 - call returned (0, '/etc/hadoop/2.4.2.0-236/0', '')
2016-08-31 15:50:29,439 - Package hadoop will have new conf directories: /etc/hadoop/2.4.2.0-236/0
2016-08-31 15:50:29,439 - Checking if need to create versioned conf dir /etc/hadoop/2.4.2.0-236/0
2016-08-31 15:50:29,440 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.4.2.0-236', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-08-31 15:50:29,457 - call returned (0, '/etc/hadoop/2.4.2.0-236/0', '')
...

2016-08-31 15:50:29,492 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.4.2.0-236', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-08-31 15:50:29,509 - checked_call returned (0, '/usr/hdp/2.4.2.0-236/hadoop/conf -&amp;gt; /etc/hadoop/2.4.2.0-236/0')
2016-08-31 15:50:29,510 - Ensuring that hadoop has the correct symlink structure&lt;/PRE&gt;</description>
      <pubDate>Wed, 07 Sep 2016 02:09:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138697#M101326</guid>
      <dc:creator>jonathanhurley</dc:creator>
      <dc:date>2016-09-07T02:09:02Z</dc:date>
    </item>
    <item>
      <title>Re: HDP-2.5.0: Too many levels of symbolic links when installing Clients</title>
      <link>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138698#M101327</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/485/jhurley.html" nodeid="485"&gt;@Jonathan Hurley&lt;/A&gt; Sorry for the late reply but based on your lead I was able to figure out the root cause of my issue.&lt;/P&gt;&lt;P&gt;When I pulled the logs you mentioned I found the following:&lt;/P&gt;&lt;PRE&gt;2016-09-06 20:44:23,410 - Backing up /etc/hadoop/conf to /etc/hadoop/conf.backup if destination doesn't exist already.
2016-09-06 20:44:23,411 - Execute[('cp', '-R', '-p', '/etc/hadoop/conf', '/etc/hadoop/conf.backup')] {'not_if': 'test -e /etc/hadoop/conf.backup', 'sudo': True}
2016-09-06 20:44:23,436 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-09-06 20:44:23,438 - call[('ambari-python-wrap', u'/usr/bin/conf-select', 'dry-run-create', '--package', 'hadoop', '--stack-version', u'2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-09-06 20:44:23,466 - call returned (1, '', "Sorry, user ambari is not allowed to execute '/bin/ambari-python-wrap /usr/bin/conf-select dry-run-create --package hadoop --stack-version 2.5.0.0-1245 --conf-version 0' as root on myserver.mydomain.com.")
2016-09-06 20:44:23,466 - Package hadoop will have new conf directories:
2016-09-06 20:44:23,468 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-09-06 20:44:23,470 - call[('ambari-python-wrap', u'/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', u'2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-09-06 20:44:23,496 - call returned (1, '', "Sorry, user ambari is not allowed to execute '/bin/ambari-python-wrap /usr/bin/conf-select create-conf-dir --package hadoop --stack-version 2.5.0.0-1245 --conf-version 0' as root on myserver.mydomain.com.")
2016-09-06 20:44:23,496 - checked_call[('ambari-python-wrap', u'/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', u'2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-09-06 20:44:23,524 - Could not select the directory for package hadoop. Error: Execution of 'ambari-python-wrap /usr/bin/conf-select set-conf-dir --package hadoop --stack-version 2.5.0.0-1245 --conf-version 0' returned 1. Sorry, user ambari is not allowed to execute '/bin/ambari-python-wrap /usr/bin/conf-select set-conf-dir --package hadoop --stack-version 2.5.0.0-1245 --conf-version 0' as root on myserver.mydomain.com.
2016-09-06 20:44:23,524 - Directory['/etc/hadoop/conf'] {'action': ['delete']}
2016-09-06 20:44:23,563 - Removing directory Directory['/etc/hadoop/conf'] and all its content
2016-09-06 20:44:23,593 - Link['/etc/hadoop/conf'] {'to': '/usr/hdp/current/hadoop-client/conf'}
2016-09-06 20:44:23,862 - Warning: linking to nonexistent location /usr/hdp/current/hadoop-client/conf
2016-09-06 20:44:23,862 - Creating symbolic Link['/etc/hadoop/conf'] to /usr/hdp/current/hadoop-client/conf&lt;/PRE&gt;&lt;P&gt;In particular, one line stood out:&lt;/P&gt;&lt;PRE&gt;2016-09-06 20:44:23,466 - call returned (1, '', "Sorry, user ambari is not allowed to execute '/bin/ambari-python-wrap /usr/bin/conf-select dry-run-create --package hadoop --stack-version 2.5.0.0-1245 --conf-version 0' as root on myserver.mydomain.com.")&lt;/PRE&gt;&lt;P&gt;In my instance of HDP, I chose to run Ambari-Agent under a non-root user account called 'ambari' with sudoers permissions. I based my sudoers file on the documentation found here: &lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.hortonworks.com/HDPDocuments/Ambari-2.2.0.0/bk_Ambari_Security_Guide/content/_how_to_configure_an_ambari_agent_for_non-root.html" target="_blank"&gt;https://docs.hortonworks.com/HDPDocuments/Ambari-2.2.0.0/bk_Ambari_Security_Guide/content/_how_to_configure_an_ambari_agent_for_non-root.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;What I hadn't included was '/bin/ambari-python-wrap' in the list of approved commands. After adding it and recreating the cluster, all of my issues were resolved. So if anyone is having similar issues in HDP-2.5.0 and is running Ambari-Agent under a non-root user account, make sure '/bin/ambari-python-wrap' is listed in the commands section of your sudoers file.&lt;/P&gt;</description>
      <pubDate>Thu, 08 Sep 2016 00:30:05 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138698#M101327</guid>
      <dc:creator>hansohn</dc:creator>
      <dc:date>2016-09-08T00:30:05Z</dc:date>
    </item>
    <item>
      <title>Re: HDP-2.5.0: Too many levels of symbolic links when installing Clients</title>
      <link>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138699#M101328</link>
      <description>&lt;P&gt;Very nice! That's exactly what I was looking for, and the cause was spot-on. Perhaps Ambari shouldn't fail silently anymore; conf-select used to have a ton of issues, which is why we ignored errors when invoking it.&lt;/P&gt;</description>
      <pubDate>Thu, 08 Sep 2016 02:30:24 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/HDP-2-5-0-Too-many-levels-of-symbolic-links-when-installing/m-p/138699#M101328</guid>
      <dc:creator>jonathanhurley</dc:creator>
      <dc:date>2016-09-08T02:30:24Z</dc:date>
    </item>
  </channel>
</rss>