
Error when installing any service: OSError: [Errno 17] File exists


Contributor

Hi! I'm trying to add some services on HDP 2.5 with Ambari 2.4, but whatever I try to install, I get this kind of error. Does anyone have an idea where the problem is?

File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 293, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.5.3.0-37 | tail -1`' returned 1. Traceback (most recent call last):
  File "/usr/bin/hdp-select", line 391, in <module>
    setPackages(pkgs, args[2], options.rpm_mode)
  File "/usr/bin/hdp-select", line 290, in setPackages
    os.symlink(target + "/" + dir, linkname)
OSError: [Errno 17] File exists
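
In other words, the os.symlink call inside hdp-select fails because something already exists at the path where it wants to create a link. A minimal sketch of the same condition, using hypothetical /tmp paths:

mkdir -p /tmp/demo/target
mkdir -p /tmp/demo/linkname                  # a real directory already occupies the link's path
ln -sT /tmp/demo/target /tmp/demo/linkname   # fails with "File exists", the same Errno 17 hdp-select hits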


Re: Error when installing any service: OSError: [Errno 17] File exists

Super Collaborator

You most likely have a real directory (rather than a symlink) in /usr/hdp/current. If you list that directory, you should be able to see any children which are not symlinks; they should be removed:

ls -l /usr/hdp/current
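
As a rough sketch, a find one-liner will surface the offending entries directly (-mindepth/-maxdepth and ! -type l are standard GNU find options):

# list only the children of /usr/hdp/current that are NOT symlinks
find /usr/hdp/current -mindepth 1 -maxdepth 1 ! -type l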


Re: Error when installing any service: OSError: [Errno 17] File exists

Contributor

Hmm, I'll post the output first thing in the morning. Would you please check it out then? Thank you :)


Re: Error when installing any service: OSError: [Errno 17] File exists

Contributor

This is the output of what you requested:

lrwxrwxrwx.  1 root root   28 Dec 15 18:00 accumulo-client -> /usr/hdp/2.5.3.0-37/accumulo
lrwxrwxrwx.  1 root root   28 Dec 15 18:00 accumulo-gc -> /usr/hdp/2.5.3.0-37/accumulo
lrwxrwxrwx.  1 root root   28 Dec 15 18:00 accumulo-master -> /usr/hdp/2.5.3.0-37/accumulo
lrwxrwxrwx.  1 root root   28 Dec 15 18:00 accumulo-monitor -> /usr/hdp/2.5.3.0-37/accumulo
lrwxrwxrwx.  1 root root   28 Dec 15 18:00 accumulo-tablet -> /usr/hdp/2.5.3.0-37/accumulo
lrwxrwxrwx.  1 root root   28 Dec 15 18:00 accumulo-tracer -> /usr/hdp/2.5.3.0-37/accumulo
lrwxrwxrwx.  1 root root   25 Dec 15 18:00 atlas-client -> /usr/hdp/2.5.3.0-37/atlas
lrwxrwxrwx.  1 root root   25 Dec 15 18:00 atlas-server -> /usr/hdp/2.5.3.0-37/atlas
lrwxrwxrwx.  1 root root   26 Dec 15 18:00 falcon-client -> /usr/hdp/2.5.3.0-37/falcon
lrwxrwxrwx.  1 root root   26 Dec 15 18:00 falcon-server -> /usr/hdp/2.5.3.0-37/falcon
lrwxrwxrwx.  1 root root   25 Dec 15 18:00 flume-server -> /usr/hdp/2.5.3.0-37/flume
lrwxrwxrwx.  1 root root   26 Dec 15 18:00 hadoop-client -> /usr/hdp/2.5.3.0-37/hadoop
lrwxrwxrwx.  1 root root   31 Dec 15 18:00 hadoop-hdfs-client -> /usr/hdp/2.5.3.0-37/hadoop-hdfs
lrwxrwxrwx.  1 root root   32 Dec 14 10:50 hadoop-hdfs-datanode -> /usr/hdp/2.4.2.0-258/hadoop-hdfs
lrwxrwxrwx.  1 root root   32 Dec 14 10:50 hadoop-hdfs-journalnode -> /usr/hdp/2.4.2.0-258/hadoop-hdfs
lrwxrwxrwx.  1 root root   32 Dec 14 10:50 hadoop-hdfs-namenode -> /usr/hdp/2.4.2.0-258/hadoop-hdfs
lrwxrwxrwx.  1 root root   32 Dec 14 10:50 hadoop-hdfs-nfs3 -> /usr/hdp/2.4.2.0-258/hadoop-hdfs
lrwxrwxrwx.  1 root root   32 Dec 14 10:50 hadoop-hdfs-portmap -> /usr/hdp/2.4.2.0-258/hadoop-hdfs
lrwxrwxrwx.  1 root root   32 Dec 14 10:50 hadoop-hdfs-secondarynamenode -> /usr/hdp/2.4.2.0-258/hadoop-hdfs
lrwxrwxrwx.  1 root root   33 Dec 15 18:00 hadoop-httpfs -> /usr/hdp/2.5.3.0-37/hadoop-httpfs
lrwxrwxrwx.  1 root root   36 Dec 15 18:00 hadoop-mapreduce-client -> /usr/hdp/2.5.3.0-37/hadoop-mapreduce
lrwxrwxrwx.  1 root root   37 Dec 14 10:50 hadoop-mapreduce-historyserver -> /usr/hdp/2.4.2.0-258/hadoop-mapreduce
lrwxrwxrwx.  1 root root   31 Dec 15 18:00 hadoop-yarn-client -> /usr/hdp/2.5.3.0-37/hadoop-yarn
lrwxrwxrwx.  1 root root   32 Dec 14 10:50 hadoop-yarn-nodemanager -> /usr/hdp/2.4.2.0-258/hadoop-yarn
lrwxrwxrwx.  1 root root   32 Dec 14 10:50 hadoop-yarn-resourcemanager -> /usr/hdp/2.4.2.0-258/hadoop-yarn
lrwxrwxrwx.  1 root root   32 Dec 14 10:50 hadoop-yarn-timelineserver -> /usr/hdp/2.4.2.0-258/hadoop-yarn
lrwxrwxrwx.  1 root root   26 Dec 14 10:50 hbase-client -> /usr/hdp/2.4.2.0-258/hbase
lrwxrwxrwx.  1 root root   26 Dec 14 10:50 hbase-master -> /usr/hdp/2.4.2.0-258/hbase
lrwxrwxrwx.  1 root root   26 Dec 14 10:50 hbase-regionserver -> /usr/hdp/2.4.2.0-258/hbase
lrwxrwxrwx.  1 root root   24 Dec 15 18:00 hive-client -> /usr/hdp/2.5.3.0-37/hive
lrwxrwxrwx.  1 root root   25 Dec 14 10:50 hive-metastore -> /usr/hdp/2.4.2.0-258/hive
lrwxrwxrwx.  1 root root   25 Dec 14 10:50 hive-server2 -> /usr/hdp/2.4.2.0-258/hive
lrwxrwxrwx.  1 root root   25 Dec 14 12:12 hive-server2-hive2 -> /usr/hdp/2.5.3.0-37/hive2
lrwxrwxrwx.  1 root root   34 Dec 14 10:50 hive-webhcat -> /usr/hdp/2.4.2.0-258/hive-hcatalog
lrwxrwxrwx.  1 root root   25 Dec 15 14:06 kafka-broker -> /usr/hdp/2.5.3.0-37/kafka
lrwxrwxrwx.  1 root root   25 Dec 14 10:50 knox-server -> /usr/hdp/2.4.2.0-258/knox
lrwxrwxrwx.  1 root root   24 Dec 15 18:00 livy-client -> /usr/hdp/2.5.3.0-37/livy
lrwxrwxrwx.  1 root root   25 Dec 14 10:50 livy-server -> /usr/hdp/2.4.2.0-258/livy
lrwxrwxrwx.  1 root root   27 Dec 14 10:50 mahout-client -> /usr/hdp/2.4.2.0-258/mahout
lrwxrwxrwx.  1 root root   26 Dec 14 10:50 oozie-client -> /usr/hdp/2.4.2.0-258/oozie
lrwxrwxrwx.  1 root root   26 Dec 14 10:50 oozie-server -> /usr/hdp/2.4.2.0-258/oozie
lrwxrwxrwx.  1 root root   28 Dec 14 10:50 phoenix-client -> /usr/hdp/2.4.2.0-258/phoenix
lrwxrwxrwx.  1 root root   28 Dec 14 10:50 phoenix-server -> /usr/hdp/2.4.2.0-258/phoenix
lrwxrwxrwx.  1 root root   23 Dec 15 18:00 pig-client -> /usr/hdp/2.5.3.0-37/pig
lrwxrwxrwx.  1 root root   33 Dec 14 10:50 ranger-admin -> /usr/hdp/2.4.2.0-258/ranger-admin
lrwxrwxrwx.  1 root root   31 Dec 14 10:50 ranger-kms -> /usr/hdp/2.4.2.0-258/ranger-kms
lrwxrwxrwx.  1 root root   36 Dec 14 10:50 ranger-usersync -> /usr/hdp/2.4.2.0-258/ranger-usersync
lrwxrwxrwx.  1 root root   27 Dec 14 10:50 slider-client -> /usr/hdp/2.4.2.0-258/slider
drwxr-xr-x. 10  500  500 4096 Dec 14 11:28 spark2-client
lrwxrwxrwx.  1 root root   26 Dec 15 15:37 spark2-historyserver -> /usr/hdp/2.5.3.0-37/spark2
lrwxrwxrwx.  1 root root   26 Dec 15 15:37 spark2-thriftserver -> /usr/hdp/2.5.3.0-37/spark2
lrwxrwxrwx.  1 root root   26 Dec 14 10:50 spark-historyserver -> /usr/hdp/2.4.2.0-258/spark
lrwxrwxrwx.  1 root root   26 Dec 14 10:50 spark-thriftserver -> /usr/hdp/2.4.2.0-258/spark
lrwxrwxrwx.  1 root root   26 Dec 14 10:50 sqoop-client -> /usr/hdp/2.4.2.0-258/sqoop
lrwxrwxrwx.  1 root root   26 Dec 14 10:50 sqoop-server -> /usr/hdp/2.4.2.0-258/sqoop
lrwxrwxrwx.  1 root root   26 Dec 14 10:50 storm-client -> /usr/hdp/2.4.2.0-258/storm
lrwxrwxrwx.  1 root root   26 Dec 14 10:50 storm-nimbus -> /usr/hdp/2.4.2.0-258/storm
lrwxrwxrwx.  1 root root   40 Dec 14 10:50 storm-slider-client -> /usr/hdp/2.4.2.0-258/storm-slider-client
lrwxrwxrwx.  1 root root   26 Dec 14 10:50 storm-supervisor -> /usr/hdp/2.4.2.0-258/storm
lrwxrwxrwx.  1 root root   23 Dec 15 18:00 tez-client -> /usr/hdp/2.5.3.0-37/tez
lrwxrwxrwx.  1 root root   29 Dec 14 10:50 zeppelin-server -> /usr/hdp/2.4.2.0-258/zeppelin
lrwxrwxrwx.  1 root root   29 Dec 15 14:06 zookeeper-client -> /usr/hdp/2.5.3.0-37/zookeeper
lrwxrwxrwx.  1 root root   30 Dec 14 10:50 zookeeper-server -> /usr/hdp/2.4.2.0-258/zookeeper
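
Note that in this listing spark2-client is a real directory (drwxr-xr-x) rather than a symlink, which matches the earlier advice. A hedged sketch of clearing it so hdp-select can recreate the link; the 2.5.3.0-37 version is inferred from the neighbouring spark2 links:

mv /usr/hdp/current/spark2-client /tmp/spark2-client.bak   # move the stray real directory out of the way
hdp-select set spark2-client 2.5.3.0-37                    # let hdp-select recreate the symlink
ls -l /usr/hdp/current/spark2-client                       # should now point at /usr/hdp/2.5.3.0-37/spark2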

Re: Error when installing any service: OSError: [Errno 17] File exists

Contributor

@Jonathan Hurley

Also, I removed the spark2-client folder as you said, but if I run

hdp-select set all <version>

and afterwards

hdp-select status

nothing actually happens; all of them still show "None" where the version should be set.

If I run ls in /usr/hdp/current/ and look for hive, it shows hive-client, so it exists. But if I run

hdp-select set hive-client 2.5.3.0-37

As a result I get:

ERROR: Invalid package - hive-client


Packages:
  accumulo-client
  accumulo-gc
  accumulo-master
  accumulo-monitor
  accumulo-tablet
  accumulo-tracer
  atlas-client
  atlas-server
  falcon-client
  falcon-server
  flume-server
  hadoop-client
  hadoop-hdfs-datanode
  hadoop-hdfs-journalnode
  hadoop-hdfs-namenode
  hadoop-hdfs-nfs3
  hadoop-hdfs-portmap
  hadoop-hdfs-secondarynamenode
  hadoop-hdfs-zkfc
  hadoop-httpfs
  hadoop-mapreduce-historyserver
  hadoop-yarn-nodemanager
  hadoop-yarn-resourcemanager
  hadoop-yarn-timelineserver
  hbase-client
  hbase-master
  hbase-regionserver
  hive-metastore
  hive-server2
  hive-server2-hive2
  hive-webhcat
  kafka-broker
  knox-server
  livy-server
  mahout-client
  oozie-client
  oozie-server
  phoenix-client
  phoenix-server
  ranger-admin
  ranger-kms
  ranger-tagsync
  ranger-usersync
  slider-client
  spark-client
  spark-historyserver
  spark-thriftserver
  spark2-client
  spark2-historyserver
  spark2-thriftserver
  sqoop-client
  sqoop-server
  storm-client
  storm-nimbus
  storm-slider-client
  storm-supervisor
  zeppelin-server
  zookeeper-client
  zookeeper-server
Aliases:
  accumulo-server
  all
  client
  hadoop-hdfs-server
  hadoop-mapreduce-server
  hadoop-yarn-server
  hive-server

Re: Error when installing any service: OSError: [Errno 17] File exists

Super Collaborator

hive-client isn't a valid target for hdp-select. In this case it would just be "hive".
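
Per this reply, the call would look roughly like the following. Note that "hive" as the package name comes from the reply itself, not from the Packages list printed above, so treat it as an assumption:

hdp-select set hive 2.5.3.0-37    # package name per the reply above; hive-client is only a symlink name
hdp-select status | grep hive     # check which version the hive-* components now report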


Re: Error when installing any service: OSError: [Errno 17] File exists

Contributor

And now, when I run the command

ls -l /usr/hdp/current

I get output where every directory points to

/usr/hdp/2.5.3.0-37/

One small update. This is the error I'm getting now (stderr):

@Artem Ervits Maybe you know?

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 35, in <module>
    BeforeAnyHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 26, in hook
    import params
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/params.py", line 191, in <module>
    hadoop_conf_dir = conf_select.get_hadoop_conf_dir(force_latest_on_upgrade=True)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/conf_select.py", line 477, in get_hadoop_conf_dir
    select(stack_name, "hadoop", version)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/conf_select.py", line 315, in select
    shell.checked_call(_get_cmd("set-conf-dir", package, version), logoutput=False, quiet=False, sudo=True)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 293, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-python-wrap /usr/bin/conf-select set-conf-dir --package hadoop --stack-version 2.5.3.0-37 --conf-version 0' returned 1. Traceback (most recent call last):
  File "/usr/bin/conf-select", line 178, in <module>
    setConfDir(options.pname, options.sver, options.cver)
  File "/usr/bin/conf-select", line 136, in setConfDir
    raise Exception("Expected confdir %s to be a symlink." % confdir)
Exception: Expected confdir /usr/hdp/2.5.3.0-37/hadoop/conf to be a symlink.
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-3023.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-3023.json', 'INFO', '/var/lib/ambari-agent/tmp']


Re: Error when installing any service: OSError: [Errno 17] File exists

Super Collaborator

/usr/hdp/2.5.3.0-37/hadoop/conf

That does need to be a symlink:

/etc/hadoop/conf -> /usr/hdp/current/hadoop/conf

/usr/hdp/current/hadoop -> /usr/hdp/2.5.3.0-37/hadoop

/usr/hdp/2.5.3.0-37/hadoop/conf -> /etc/hadoop/2.5.3.0-37/0
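
A hedged sketch of restoring that chain by hand, using only the paths from this reply and the conf-select invocation already shown in the traceback. It assumes the current config files live in the real /usr/hdp/2.5.3.0-37/hadoop/conf directory and should be preserved under /etc/hadoop/2.5.3.0-37/0 first:

mkdir -p /etc/hadoop/2.5.3.0-37/0
cp -r /usr/hdp/2.5.3.0-37/hadoop/conf/* /etc/hadoop/2.5.3.0-37/0/      # keep the existing configs
mv /usr/hdp/2.5.3.0-37/hadoop/conf /usr/hdp/2.5.3.0-37/hadoop/conf.bak # the hook expects a symlink here, not a real dir
ln -s /etc/hadoop/2.5.3.0-37/0 /usr/hdp/2.5.3.0-37/hadoop/conf         # recreate the expected symlink
ambari-python-wrap /usr/bin/conf-select set-conf-dir --package hadoop --stack-version 2.5.3.0-37 --conf-version 0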
