Created 04-14-2018 08:03 AM
1) [cloudera@localhost ~]$ sudo -u hdfs hdfs dfs -put /home/cloudera/ipf.txt /inputnew/
put: `/home/cloudera/ipf.txt': No such file or directory
2) [cloudera@localhost ~]$ hdfs dfs -put /home/cloudera/ipf.txt /inputnew/
put: Permission denied: user=cloudera, access=WRITE, inode="/inputnew":hdfs:supergroup:drwxr-xr-x
3) [cloudera@localhost ~]$ sudo -u cloudera hdfs dfs -put /home/cloudera/ipf.txt /inputnew/
put: Permission denied: user=cloudera, access=WRITE, inode="/inputnew":hdfs:supergroup:drwxr-xr-x
Created 04-18-2018 09:40 AM
@subbu You are making the same mistake each time.
In order to read from or write to HDFS, the user running the command needs the right permissions.
You can solve this by always running as the superuser "hdfs"; to do that, prefix your command with sudo -u hdfs, so your command should be:
sudo -u hdfs hadoop jar /home/cloudera/WordCount.jar WordCount /inputnew/inputfile.txt /outputnew
Or, since I see you always run as the cloudera user, you can change the owner of / to cloudera, or change the permissions on the root folder.
so:
1- use the super user:
sudo -u hdfs hadoop jar /home/cloudera/WordCount.jar WordCount /inputnew/inputfile.txt /outputnew
2- change the owner of the HDFS root directory to cloudera:
sudo -u hdfs hdfs dfs -chown -R cloudera /
then run:
hadoop jar /home/cloudera/WordCount.jar WordCount /inputnew/inputfile.txt /outputnew
3- change the permissions:
sudo -u hdfs hdfs dfs -chmod -R 777 /
then run:
hadoop jar /home/cloudera/WordCount.jar WordCount /inputnew/inputfile.txt /outputnew
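Whichever option you choose, a quick sanity check (a minimal sketch, assuming the same /inputnew and /outputnew paths from your commands) is to list the ownership and mode of the directories involved before and after the change:
sudo -u hdfs hdfs dfs -ls /            # shows the owner and permissions of /inputnew and of / itself
sudo -u hdfs hdfs dfs -ls /inputnew    # confirms inputfile.txt is present and readable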
Created 04-19-2018 02:17 AM
Sir, while I am trying to execute this I am getting an error:
cloudera@localhost ~]$ sudo -u hdfs hdfs -chown -R cloudera /
Unrecognized option: -chown
Could not create the Java virtual machine.
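(Note: -chown is a subcommand of hdfs dfs, not of the bare hdfs command, which is why the launcher rejects it. A corrected form, assuming the same goal of handing the HDFS root to the cloudera user, would be:)
sudo -u hdfs hdfs dfs -chown -R cloudera /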
Created 04-19-2018 11:50 PM
Thank you very much sir, it's working and I was able to run it and got the output as well.
Created on 04-14-2018 02:19 PM - edited 04-14-2018 02:21 PM
1) [cloudera@localhost ~]$ sudo -u hdfs hdfs dfs -put /home/cloudera/ipf.txt /inputnew/
put: `/home/cloudera/ipf.txt': No such file or directory
The file /home/cloudera/ipf.txt doesn't exist on your local host; you can check with ll /home/cloudera/
Below, you are not using sudo -u hdfs as you did in the command above.
** you faced the same issue in another post.
Please use sudo -u hdfs hdfs dfs -put /home/cloudera/ipf.txt /inputnew/
2) [cloudera@localhost ~]$ hdfs dfs -put /home/cloudera/ipf.txt /inputnew/
put: Permission denied: user=cloudera, access=WRITE, inode="/inputnew":hdfs:supergroup:drwxr-xr-x
3) [cloudera@localhost ~]$ sudo -u cloudera hdfs dfs -put /home/cloudera/ipf.txt /inputnew/
put: Permission denied: user=cloudera, access=WRITE, inode="/inputnew":hdfs:supergroup:drwxr-xr-x
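For cases 2) and 3), a minimal sequence that usually resolves the WRITE denial (a sketch, assuming you want the cloudera user to own the /inputnew directory) is:
sudo -u hdfs hdfs dfs -mkdir -p /inputnew        # create the target directory as the HDFS superuser
sudo -u hdfs hdfs dfs -chown cloudera /inputnew  # hand it to the cloudera user
hdfs dfs -put /home/cloudera/ipf.txt /inputnew/  # now the upload works as cloudera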
Created 04-15-2018 04:23 AM
Actually, the txt file is already there in the folder and I am issuing the correct command, but I am still getting the error. Can you suggest a solution?
Created 04-17-2018 09:30 AM
put: Permission denied: user=cloudera, access=WRITE, inode="/inputnew":hdfs:supergroup:drwxr-xr-x
Created 04-17-2018 01:36 AM
1) User hdfs does not have access to the /home/cloudera directory
2) and 3) are actually the same, because in both cases you try to upload the file as user cloudera.
You have two options:
1) grant read permission to the hdfs user on /home/cloudera and all its contents (directory access also requires execute permission)
2) grant write permission on the "/inputnew/" directory in HDFS to the "cloudera" user.
example: sudo -u hdfs hdfs dfs -chown cloudera /inputnew
There are multiple ways to grant permissions (e.g. using ACLs), but keep it simple.
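Option 2) is the example above; as a rough sketch of option 1) and of the ACL route (the ACL command assumes dfs.namenode.acls.enabled is true on the cluster):
chmod o+x /home/cloudera                                        # let other users traverse the directory
chmod o+r /home/cloudera/ipf.txt                                # let the hdfs user read the file
sudo -u hdfs hdfs dfs -setfacl -m user:cloudera:rwx /inputnew   # ACL alternative: grant cloudera write on /inputnew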
Created 04-18-2018 08:15 AM
hadoop jar /home/cloudera/WordCount.jar WordCount /inputnew/inputfile.txt /outputnew
error message is:
[cloudera@localhost ~]$ hadoop jar /home/cloudera/WordCount.jar WordCount /inputnew/inputfile.txt /outputnew
18/04/18 07:57:24 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
18/04/18 07:57:25 INFO input.FileInputFormat: Total input paths to process : 1
18/04/18 07:57:25 INFO mapred.JobClient: Running job: job_201804180742_0002
18/04/18 07:57:26 INFO mapred.JobClient: map 0% reduce 0%
18/04/18 07:57:36 INFO mapred.JobClient: Task Id : attempt_201804180742_0002_m_000002_0, Status : FAILED
org.apache.hadoop.security.AccessControlException: Permission denied: user=cloudera, access=WRITE, inode="/":hdfs:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:224)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:204)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:149)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4705)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4687)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4661)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:3032)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:2996)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2977)