
/usr/hdp/current/hadoop-client/conf doesn't exist error


Explorer

I am trying to install a Hortonworks cluster of 10 nodes with local repositories using Ambari.

OS: Red Hat Enterprise Linux Server release 6.4 (Santiago).

HDP 2.3.4.0-3485, HDP-UTILS-1.1.0.20, Ambari 2.2.0.0-1310.

On step 12 (Install, Start and Test) the installation failed: it was unable to install the HDFS Client. Here is the full error message:

stderr:   /var/lib/ambari-agent/data/errors-47.txt 
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 120, in <module>
    HdfsClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 36, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 41, in configure
    hdfs()
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs.py", line 61, in hdfs
    group=params.user_group
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 67, in action_create
    encoding = self.resource.encoding
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist

The directory /usr/hdp contains only the subdirectory "current"; there is no "2.3.4.0-3485" subdirectory.
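For context, on a healthy HDP node /usr/hdp normally holds a versioned directory (here it would be 2.3.4.0-3485) plus a "current" directory of symlinks into it, maintained by hdp-select. A minimal sketch of that layout, recreated in a scratch directory so the failing check is easy to see (the paths are illustrative, not taken from a real node):

```shell
# Recreate the expected /usr/hdp layout in a temp dir (illustrative only)
root=$(mktemp -d)
mkdir -p "$root/2.3.4.0-3485/hadoop/conf"
mkdir -p "$root/current"
# hdp-select would normally create symlinks like this one:
ln -s "$root/2.3.4.0-3485/hadoop" "$root/current/hadoop-client"
# This is effectively the check Ambari's File resource performs
# before writing hadoop-policy.xml:
test -d "$root/current/hadoop-client/conf" && echo "conf exists"
```

On the failing node the equivalent check against the real /usr/hdp fails because neither the versioned directory nor the symlink target exists.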

Before this installation these nodes hosted another cluster, PHD. I removed all of its repos from /etc/yum.repos.d/ and removed all links to the PHD repositories from /var/www/html.

But during the installation, subfolders were still being created under /usr/phd.

There is no phd.repo in /etc/yum.repos.d/:

# ls -l /etc/yum.repos.d/
total 84
-rw-r--r--. 1 root root   456 Feb  9 00:59 ambari.repo
-rw-r--r--. 1 root root  1059 Dec 19 02:07 epel.repo
-rw-r--r--. 1 root root   940 Feb  9 00:54 hdp.repo
-rw-r--r--. 1 root root   125 Feb  9 01:08 HDP.repo
-rw-r--r--. 1 root root   148 Feb  9 01:08 HDP-UTILS.repo
-rw-r--r--. 1 root root 58473 Feb  9 09:56 redhat.repo
-rw-r--r--. 1 root root   529 Dec 18 14:58 rhel-source.repo

yum repolist doesn't show any other repositories:

# yum repolist
Loaded plugins: priorities, product-id, security, subscription-manager
This system is receiving updates from Red Hat Subscription Management.
Repository HDP-UTILS-1.1.0.20 is listed more than once in the configuration
Trying other mirror.
rhel-6-server-rpms                                      | 3.7 kB  00:00
rhel-server-dts-6-rpms                                  | 2.9 kB  00:00
rhel-server-dts2-6-rpms                                 | 2.9 kB  00:00
175 packages excluded due to repository priority protections
repo id                  repo name                                                            status
HDP-2.3                  HDP-2.3                                                               0+175
HDP-2.3.4.0              HDP Version - HDP-2.3.4.0                                               175
HDP-UTILS-1.1.0.20       HDP-UTILS-1.1.0.20                                                       43
Updates-ambari-2.2.0.0   ambari-2.2.0.0 - Updates                                                  7
epel                     Extra Packages for Enterprise Linux 6 - x86_64                       11,992
rhel-6-server-rpms       Red Hat Enterprise Linux 6 Server (RPMs)                             16,510
rhel-server-dts-6-rpms   Red Hat Developer Toolset RPMs for Red Hat Enterprise Linux 6 Serv       84
rhel-server-dts2-6-rpms  Red Hat Developer Toolset 2 RPMs for Red Hat Enterprise Linux 6 Se      469
repolist: 29,280

What settings do I need to install a Hortonworks cluster in this case?

3 REPLIES

Re: /usr/hdp/current/hadoop-client/conf doesn't exist error

Mentor
@Leonid Zavadskiy

Firstly, I see you have two HDP repo files (hdp.repo and HDP.repo); remove one. Secondly, this could mean a couple of things: either your upgrade didn't complete, or the HDFS client was never installed on the machine. Try reinstalling the client on that machine through Ambari first; if that doesn't work, go to the admin page and confirm all hosts are completely upgraded/installed. Thirdly, make sure yum repolist returns only one HDP repo; if it doesn't, remove the second HDP entry and run yum clean all.
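The duplicate-repo check above can be scripted. A hypothetical sketch that finds repo ids defined more than once across *.repo files (the same condition behind yum's "listed more than once" warning), demonstrated on a scratch directory; on a real node you would point repodir at /etc/yum.repos.d:

```shell
# Build a scratch repo dir that reproduces the duplicate-id situation
repodir=$(mktemp -d)
printf '[HDP-2.3]\nname=HDP\n' > "$repodir/hdp.repo"
printf '[HDP-2.3]\nname=HDP duplicate\n' > "$repodir/HDP.repo"
# List repo ids (section headers) that appear in more than one place
dups=$(grep -h '^\[' "$repodir"/*.repo | sort | uniq -d)
echo "$dups"   # prints [HDP-2.3]
```

Any id printed here is defined twice; delete one of the offending .repo files and then run yum clean all.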

Re: /usr/hdp/current/hadoop-client/conf doesn't exist error

@Leonid Zavadskiy

"Before this installation I had another cluster on those nodes – phd. I removed all repos from /etc/yum.repos.d/ and remove all links to PHD repositories from /var/www/html."

This is a known issue, and you have to be very careful when reinstalling the cluster.

You have to do the cleanup and make sure that you don't miss anything.

A note from my own notes: "Only removing the existing conf directories (the /etc/[component]/conf dirs) would make it truly 'clean'. I don't think rpm cleanup will handle the conf dirs."
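A hedged sketch of that conf-dir cleanup, run against a scratch root so it is safe to try; on a real node the root would be / and the component list would match whatever the old PHD cluster had installed (hadoop and hive below are just examples):

```shell
# Simulate leftover conf dirs from a previous cluster in a temp root
root=$(mktemp -d)
mkdir -p "$root/etc/hadoop/conf" "$root/etc/hive/conf"
# Remove each component's conf dir; rpm erase does not do this for you
for comp in hadoop hive; do
    rm -rf "$root/etc/$comp/conf"
done
ls "$root/etc/hadoop"   # prints nothing: conf dir is gone
```

Double-check /usr/phd (and /usr/hdp, if a failed install left partial state there) for leftovers as well before retrying the install.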

Re: /usr/hdp/current/hadoop-client/conf doesn't exist error

New Contributor

Try the following command:

yum -y erase hdp-select

Reference
