<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Permission denied as I am unable to delete a directory in HDFS - Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Permission-denied-as-I-am-unable-to-delete-a-directory-in/m-p/322426#M228778</link>
    <description>Support Questions thread: deleting /tmp/root/testdirectory in HDFS as root fails with "Permission denied: user=root, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x". The full question and replies follow in the items below.</description>
    <pubDate>Wed, 11 Aug 2021 17:15:01 GMT</pubDate>
    <dc:creator>ryu</dc:creator>
    <dc:date>2021-08-11T17:15:01Z</dc:date>
    <item>
      <title>Permission denied as I am unable to delete a directory in HDFS</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Permission-denied-as-I-am-unable-to-delete-a-directory-in/m-p/322426#M228778</link>
      <description>&lt;P&gt;Hi experts,&lt;/P&gt;&lt;P&gt;As the root user, I am trying to delete a directory in HDFS that was created by root.&lt;BR /&gt;However, when I try to delete it, it says "Permission denied: user=root, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x"&lt;BR /&gt;&lt;BR /&gt;Why does it say permission denied on "/user" when I am trying to delete the directory "/tmp/root/testdirectory"?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The error message is below.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;[root@test02 ~]# hdfs dfs -ls /tmp/root/&lt;BR /&gt;Picked up _JAVA_OPTIONS: -Xmx2048m -XX:MaxPermSize=512m -Djava.awt.headless=true&lt;BR /&gt;Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0&lt;BR /&gt;Found 2 items&lt;BR /&gt;drwxrwxrwx - root hdfs 0 2021-08-09 20:35 /tmp/root/testdirectory&lt;BR /&gt;-rw-r--r-- 3 root hdfs 0 2021-08-10 13:54 /tmp/root/test&lt;BR /&gt;[root@test02 ~]# hdfs dfs -rmr /tmp/root/testdirectory&lt;BR /&gt;Picked up _JAVA_OPTIONS: -Xmx2048m -XX:MaxPermSize=512m -Djava.awt.headless=true&lt;BR /&gt;Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0&lt;BR /&gt;rmr: DEPRECATED: Please use '-rm -r' instead.&lt;BR /&gt;21/08/11 12:08:30 WARN fs.TrashPolicyDefault: Can't create trash directory: hdfs://test/user/root/.Trash/Current/tmp/root&lt;BR /&gt;org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:351)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:251)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:189)&lt;BR /&gt;at 
org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1756)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1740)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1699)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:60)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3007)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1132)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:659)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:507)&lt;BR /&gt;at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1034)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1003)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:931)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1926)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2854)&lt;/P&gt;&lt;P&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:423)&lt;BR /&gt;at 
org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:121)&lt;BR /&gt;at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:88)&lt;BR /&gt;at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2498)&lt;BR /&gt;at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2471)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1243)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1240)&lt;BR /&gt;at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1257)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1232)&lt;BR /&gt;at org.apache.hadoop.fs.TrashPolicyDefault.moveToTrash(TrashPolicyDefault.java:147)&lt;BR /&gt;at org.apache.hadoop.fs.Trash.moveToTrash(Trash.java:109)&lt;BR /&gt;at org.apache.hadoop.fs.Trash.moveToAppropriateTrash(Trash.java:95)&lt;BR /&gt;at org.apache.hadoop.fs.shell.Delete$Rm.moveToTrash(Delete.java:153)&lt;BR /&gt;at org.apache.hadoop.fs.shell.Delete$Rm.processPath(Delete.java:118)&lt;BR /&gt;at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:327)&lt;BR /&gt;at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:299)&lt;BR /&gt;at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:281)&lt;BR /&gt;at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:265)&lt;BR /&gt;at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:119)&lt;BR /&gt;at org.apache.hadoop.fs.shell.Command.run(Command.java:175)&lt;BR /&gt;at org.apache.hadoop.fs.FsShell.run(FsShell.java:317)&lt;BR /&gt;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)&lt;BR /&gt;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)&lt;BR /&gt;at 
org.apache.hadoop.fs.FsShell.main(FsShell.java:380)&lt;BR /&gt;Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:351)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:251)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:189)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1756)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1740)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1699)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:60)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3007)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1132)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:659)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:507)&lt;BR /&gt;at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1034)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1003)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:931)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.security.auth.Subject.doAs(Subject.java:422)&lt;BR /&gt;at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1926)&lt;BR /&gt;at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2854)&lt;/P&gt;&lt;P&gt;at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1549)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.call(Client.java:1495)&lt;BR /&gt;at org.apache.hadoop.ipc.Client.call(Client.java:1394)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)&lt;BR /&gt;at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)&lt;BR /&gt;at com.sun.proxy.$Proxy10.mkdirs(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:587)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)&lt;BR /&gt;at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)&lt;BR /&gt;at com.sun.proxy.$Proxy11.mkdirs(Unknown Source)&lt;BR /&gt;at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2496)&lt;BR /&gt;... 
21 more&lt;BR /&gt;rmr: Failed to move to trash: hdfs://test/tmp/root/testdirectory: Permission denied: user=root, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x&lt;BR /&gt;[root@test02 ~]#&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any help is much appreciated.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;</description>
      <pubDate>Wed, 11 Aug 2021 17:15:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Permission-denied-as-I-am-unable-to-delete-a-directory-in/m-p/322426#M228778</guid>
      <dc:creator>ryu</dc:creator>
      <dc:date>2021-08-11T17:15:01Z</dc:date>
    </item>
    <item>
      <title>Re: Permission denied as I am unable to delete a directory in HDFS</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Permission-denied-as-I-am-unable-to-delete-a-directory-in/m-p/322449#M228787</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/84981"&gt;@ryu&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;[root@test02 ~]# hdfs dfs -rmr /tmp/root/testdirectory&lt;BR /&gt;...&lt;BR /&gt;...&lt;BR /&gt;21/08/11 12:08:30 WARN fs.TrashPolicyDefault: Can't create trash directory: hdfs://test/user/root/.Trash/Current/tmp/root&lt;BR /&gt;...&lt;BR /&gt;...&lt;BR /&gt;rmr: Failed to move to trash: hdfs://test/tmp/root/testdirectory: Permission denied: user=root, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;As the log shows, because trash is enabled, deleting testdirectory first tries to move it into your trash directory under "/user/root/.Trash".&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The directory /user ("inode="/user":hdfs:hdfs:drwxr-xr-x") is owned by user hdfs and group hdfs, so the root user falls under "others" (the third permission triplet, r-x). Others have no write permission, so root cannot create the trash directory under /user. That is why the denial is reported on "/user" rather than on the directory you are deleting.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;To resolve this, either grant the root user write permission on that folder, or delete the directory as the hdfs user.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If you are happy with the reply, mark it Accept as Solution.&lt;/P&gt;</description>
      <pubDate>Thu, 12 Aug 2021 08:44:41 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Permission-denied-as-I-am-unable-to-delete-a-directory-in/m-p/322449#M228787</guid>
      <dc:creator>Shifu</dc:creator>
      <dc:date>2021-08-12T08:44:41Z</dc:date>
    </item>
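    <!--
    The reply above suggests two remedies. A minimal command-line sketch of them, assuming an unkerberized cluster where the hdfs superuser can be impersonated via sudo (paths taken from the thread):

    ```shell
    # Option 1: delete as the hdfs superuser, which is not subject to the
    # permission check on /user when the trash directory is created.
    sudo -u hdfs hdfs dfs -rm -r /tmp/root/testdirectory

    # Option 2: skip the trash entirely, so nothing is written under /user.
    # Note: -skipTrash deletes immediately and is not recoverable.
    hdfs dfs -rm -r -skipTrash /tmp/root/testdirectory

    # A common longer-term fix (an assumption beyond the reply): give root
    # a home directory so future moves to trash succeed.
    sudo -u hdfs hdfs dfs -mkdir -p /user/root
    sudo -u hdfs hdfs dfs -chown root:root /user/root
    ```

    All three variants require a running HDFS cluster, so they are shown as a sketch rather than a verified transcript.
    -->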
    <item>
      <title>Re: Permission denied as I am unable to delete a directory in HDFS</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Permission-denied-as-I-am-unable-to-delete-a-directory-in/m-p/322469#M228801</link>
      <description>&lt;P&gt;Thanks, it worked.&lt;/P&gt;</description>
      <pubDate>Thu, 12 Aug 2021 13:23:47 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Permission-denied-as-I-am-unable-to-delete-a-directory-in/m-p/322469#M228801</guid>
      <dc:creator>ryu</dc:creator>
      <dc:date>2021-08-12T13:23:47Z</dc:date>
    </item>
  </channel>
</rss>

