Support Questions


Problem moving an input text file from the local file system to HDFS with the following command

Explorer

1) [cloudera@localhost ~]$ sudo -u hdfs hdfs dfs -put /home/cloudera/ipf.txt /inputnew/
put: `/home/cloudera/ipf.txt': No such file or directory
2) [cloudera@localhost ~]$ hdfs dfs -put /home/cloudera/ipf.txt /inputnew/
put: Permission denied: user=cloudera, access=WRITE, inode="/inputnew":hdfs:supergroup:drwxr-xr-x

3) [cloudera@localhost ~]$ sudo -u cloudera hdfs dfs -put /home/cloudera/ipf.txt /inputnew/
put: Permission denied: user=cloudera, access=WRITE, inode="/inputnew":hdfs:supergroup:drwxr-xr-x
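Two different things fail above. In attempt 1, sudo -u hdfs runs the command as the hdfs Unix user, which typically cannot read inside /home/cloudera (home directories are usually mode 700), hence "No such file or directory" for the local source path. In attempts 2 and 3 the cloudera user can read the file, but /inputnew is owned by hdfs:supergroup with mode drwxr-xr-x, so only hdfs may write there. A hedged fix sketch, assuming the Cloudera quickstart VM (the guarded block runs only where an hdfs client is installed):

```shell
src=/home/cloudera/ipf.txt
dest=/inputnew
echo "plan: chown $dest to cloudera, then put $src"
# The guarded block only runs on a machine with an HDFS client (e.g. the VM).
if command -v hdfs >/dev/null 2>&1; then
  sudo -u hdfs hdfs dfs -chown cloudera "$dest"   # let cloudera write there
  hdfs dfs -put "$src" "$dest/"                   # plain put, no sudo needed
fi
```

Chown-ing just the target directory is narrower than changing the ownership or permissions of all of /.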

6 ACCEPTED SOLUTIONS

Explorer
Thank you, it's working.


Master Collaborator

@subbu You are making the same mistake each time.

To read from or write to HDFS, the user running the command needs the appropriate permissions.

You can solve this by always running as the HDFS superuser "hdfs": prefix the command with sudo -u hdfs, so it becomes:

sudo -u hdfs hadoop jar /home/cloudera/WordCount.jar WordCount /inputnew/inputfile.txt /outputnew

Alternatively, since you always run as the cloudera user, you can change the owner of the HDFS directories to cloudera, or relax the permissions on the root folder.

 

So:

1- Use the superuser:

sudo -u hdfs hadoop jar /home/cloudera/WordCount.jar WordCount /inputnew/inputfile.txt /outputnew

2- Change the owner of the HDFS root directory to cloudera:

sudo -u hdfs hdfs dfs -chown -R cloudera /

then run:

hadoop jar /home/cloudera/WordCount.jar WordCount /inputnew/inputfile.txt /outputnew

 

 

 

3- Relax the permissions (this makes everything in HDFS world-writable, so only do it on a disposable sandbox VM):

sudo -u hdfs hdfs dfs -chmod -R 777 /

then run:

hadoop jar /home/cloudera/WordCount.jar WordCount /inputnew/inputfile.txt /outputnew
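For context on the 777 in option 3: a mode has one octal digit each for owner, group, and other, built from read=4, write=2, execute=1. A runnable sketch converting the rwxr-xr-x reported in the Permission denied error into octal:

```shell
mode='rwxr-xr-x'   # /inputnew's mode from the error (the leading d means directory)
octal=''
for off in 0 3 6; do            # owner, group, other triplets
  t=${mode:$off:3}
  d=0
  [ "${t:0:1}" = r ] && d=$((d+4))
  [ "${t:1:1}" = w ] && d=$((d+2))
  [ "${t:2:1}" = x ] && d=$((d+1))
  octal="$octal$d"
done
echo "$octal"   # 755: group and other have no write bit, hence the error
```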

 

 

 


Explorer

Sir, while I am trying to execute this I am getting an error:

cloudera@localhost ~]$ sudo -u hdfs hdfs -chown -R cloudera /
Unrecognized option: -chown
Could not create the Java virtual machine.


Master Collaborator
sudo -u hdfs hdfs dfs -chown -R cloudera /
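The correction above adds the missing dfs subcommand. The hdfs launcher script expects a subcommand (dfs, fsck, namenode, ...) as its first argument; without one, -chown ends up being handed to the JVM, which is why the earlier error came from Java ("Could not create the Java virtual machine") rather than from HDFS. A small sketch of the general shape (the guarded line runs only where an hdfs client is installed):

```shell
# FsShell verbs (-chown, -chmod, -put, -ls, ...) must follow `hdfs dfs`.
cmd="hdfs dfs -chown -R cloudera /"
set -- $cmd
subcommand=$2               # "dfs" -- the part that was missing before
echo "subcommand: $subcommand"
if command -v hdfs >/dev/null 2>&1; then
  sudo -u hdfs $cmd
fi
```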


Explorer

Thank you very much, sir. It's working, and I was able to run the job and get the output.

 


Explorer
While executing this command I am getting an error:
[cloudera@localhost ~]$ hadoop jar /home/cloudera/WordCount.jar WordCount /inputnew/inputfile.txt /op
18/04/22 07:40:40 INFO ipc.Client: Retrying connect to server: localhost.localdomain/127.0.0.1:8021. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)


16 REPLIES


Master Collaborator

Where did you get this command from?

Do you have the hadoop-examples jar?

Check the jars under /home/cloudera/ by running ll /home/cloudera/
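Note that ll is only a shell alias for ls -l on some distributions and is not guaranteed to exist in scripts. A portable sketch for checking whether the jar is actually there (the /home/cloudera path is taken from the thread; it falls back to the current directory on other machines):

```shell
dir=/home/cloudera
# Fall back to the current directory when the quickstart-VM path is absent.
[ -d "$dir" ] || dir=.
# Count jar files so a missing WordCount.jar is obvious at a glance.
jar_count=$(ls "$dir" 2>/dev/null | grep -c '\.jar$')
echo "jar files in $dir: $jar_count"
```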

 

 

Explorer

While running a WordCount program I am getting the following error.

cloudera@localhost ~]$ hadoop jar WordCount.jar WordCount /inputnew2/inputfile.txt /output_new

18/06/09 00:29:06 INFO ipc.Client: Retrying connect to server: localhost.localdomain/127.0.0.1:8021. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)

18/06/09 00:29:07 INFO ipc.Client: Retrying connect to server: localhost.localdomain/127.0.0.1:8021. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)

18/06/09 00:29:08 INFO ipc.Client: Retrying connect to server: localhost.localdomain/127.0.0.1:8021. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)

18/06/09 00:29:09 INFO ipc.Client: Retrying connect to server: localhost.localdomain/127.0.0.1:8021. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)

18/06/09 00:29:10 INFO ipc.Client: Retrying connect to server: localhost.localdomain/127.0.0.1:8021. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)

18/06/09 00:29:11 INFO ipc.Client: Retrying connect to server: localhost.localdomain/127.0.0.1:8021. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)

18/06/09 00:29:12 INFO ipc.Client: Retrying connect to server: localhost.localdomain/127.0.0.1:8021. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)

18/06/09 00:29:13 INFO ipc.Client: Retrying connect to server: localhost.localdomain/127.0.0.1:8021. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)

18/06/09 00:29:14 INFO ipc.Client: Retrying connect to server: localhost.localdomain/127.0.0.1:8021. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)

18/06/09 00:29:15 INFO ipc.Client: Retrying connect to server: localhost.localdomain/127.0.0.1:8021. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)

18/06/09 00:29:15 ERROR security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:java.net.ConnectException: Call From localhost.localdomain/127.0.0.1 to localhost.localdomain:8021 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Exception in thread "main" java.net.ConnectException: Call From localhost.localdomain/127.0.0.1 to localhost.localdomain:8021 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

          at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

          at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)

          at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)

          at java.lang.reflect.Constructor.newInstance(Constructor.java:513)

          at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:782)

          at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:729)

          at org.apache.hadoop.ipc.Client.call(Client.java:1241)

          at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:225)

          at org.apache.hadoop.mapred.$Proxy10.getStagingAreaDir(Unknown Source)

          at org.apache.hadoop.mapred.JobClient.getStagingAreaDir(JobClient.java:1324)

          at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:102)

          at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)

          at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:945)

          at java.security.AccessController.doPrivileged(Native Method)

          at javax.security.auth.Subject.doAs(Subject.java:396)

          at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)

          at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:945)

          at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)

          at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)

          at WordCount.main(WordCount.java:132)

          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

          at java.lang.reflect.Method.invoke(Method.java:597)

          at org.apache.hadoop.util.RunJar.main(RunJar.java:208)

Caused by: java.net.ConnectException: Connection refused

          at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)

          at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:599)

          at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:207)

          at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:528)

          at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:492)

          at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:509)

          at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:603)

          at org.apache.hadoop.ipc.Client$Connection.access$2100(Client.java:252)

          at org.apache.hadoop.ipc.Client.getConnection(Client.java:1290)

          at org.apache.hadoop.ipc.Client.call(Client.java:1208)

          ... 18 more
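This is a different problem from the permissions above: "Connection refused" on localhost.localdomain:8021 means no process is listening on that port. On a CDH quickstart VM, 8021 is the MRv1 JobTracker RPC port, so the JobTracker (or the whole MapReduce service) is most likely not running. A hedged diagnostic sketch; service names vary by CDH version, and the guarded lines run only where the tools exist:

```shell
port=8021   # MRv1 JobTracker RPC port, taken from the stack trace
# 1) Is anything listening on the port?
if command -v netstat >/dev/null 2>&1; then
  netstat -tln 2>/dev/null | grep ":$port " || echo "nothing listening on $port"
fi
# 2) Which Hadoop daemons are up? A JobTracker process should appear for MRv1.
if command -v jps >/dev/null 2>&1; then
  jps
fi
echo "checked port $port"
```

If the JobTracker is down, restarting the MapReduce service (via Cloudera Manager, or something like sudo service hadoop-0.20-mapreduce-jobtracker start on an MRv1 quickstart VM; the exact service name depends on the install) should clear the retry loop.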