Created 09-26-2017 04:26 PM
Hi, I encountered this 'permission denied - 500' error when executing a DDL query.
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/apps":hdfs:hdfs:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:219)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1955)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1939)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1913)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8751)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2089)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1454)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)
)
I am on an admin account. I need help, as this is my first time dealing with HDCloud.
Thank you very much.
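The key part of the error above is `inode="/apps":hdfs:hdfs:drwxr-xr-x`: the directory is owned by user hdfs, group hdfs, with mode 755, so any other user (including hive) only gets the "other" triple `r-x` and has no write bit. A minimal Python sketch of that check (the helper functions are hypothetical, not part of Hadoop, and hive's group membership here is an assumed example):

```python
# Hypothetical helpers to decode the permission string from the error,
# "/apps":hdfs:hdfs:drwxr-xr-x, and show why user=hive is denied WRITE.

def mode_to_octal(sym):
    """Convert a symbolic mode like 'rwxr-xr-x' to octal, e.g. '755'."""
    bits = {'r': 4, 'w': 2, 'x': 1, '-': 0}
    return ''.join(str(bits[sym[i]] + bits[sym[i + 1]] + bits[sym[i + 2]])
                   for i in (0, 3, 6))

def can_write(user, groups, owner, group, sym_mode):
    """POSIX-style check: pick the owner/group/other triple, look for 'w'."""
    if user == owner:
        triple = sym_mode[0:3]      # owner bits
    elif group in groups:
        triple = sym_mode[3:6]      # group bits
    else:
        triple = sym_mode[6:9]      # "other" bits
    return 'w' in triple

# /apps is hdfs:hdfs drwxr-xr-x -> drop the leading 'd' to get the mode bits
mode = "rwxr-xr-x"
print(mode_to_octal(mode))  # 755

# 'hive' is neither the owner nor (assumed) in group 'hdfs' -> "other" -> r-x
print(can_write("hive", {"hadoop"}, "hdfs", "hdfs", mode))  # False
# the owner 'hdfs' gets rwx
print(can_write("hdfs", {"hdfs"}, "hdfs", "hdfs", mode))    # True
```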
Created 09-26-2017 06:58 PM
Hi @Mohammad Shazreen Bin Haini,
Your query is trying to write to the /apps location, to which the "hive" user doesn't have write permission.
Change the permissions on the folder and try running the query again:
hdfs dfs -chmod 777 /apps
Thanks,
Aditya
Created 09-27-2017 02:48 PM
Connect to the HDP VM server as root, then switch to the hdfs user and check the directory ownership and permissions:
[root@fqdn]# su hdfs
[hdfs@fqdn]$ hdfs dfs -ls /apps
[hdfs@fqdn]$ hdfs dfs -ls /apps/hive
[hdfs@fqdn]$ hdfs dfs -chmod 755 /apps/hive
[hdfs@fqdn]$ hdfs dfs -chmod 777 /apps/hive/warehouse
If the ownership is incorrect, fix it with:
[hdfs@fqdn]$ hdfs dfs -chown hdfs:hdfs /apps/hive
[hdfs@fqdn]$ hdfs dfs -chown hive:hdfs /apps/hive/warehouse
Created 09-27-2017 02:05 PM
@Aditya Sirna I tried SSHing into the node, but now when I run
hdfs dfs -chmod 777 /apps
it says
Permission denied: user=cloudbreak is not the owner of inode=apps
Created 09-27-2017 03:09 PM
You have to run it as the hdfs user, since only a directory's owner or the HDFS superuser (hdfs) can change its permissions.
Run su hdfs and then run the chmod command.
Thanks,
Aditya
Created 09-29-2017 03:54 PM
Thank you so much. The query works now.
-----------------------
SSH to your node, run su hdfs, run the chmod command, and then execute the query.
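Put together, the accepted fix can be sketched as the shell session below (it assumes you can SSH to the node and switch to the hdfs user as in the thread; chmod 777 is the quick fix used here, and you may want a tighter mode on a production cluster):

```shell
# Become the hdfs superuser, who is allowed to change ownership and permissions
su hdfs

# Inspect current ownership and permissions first
hdfs dfs -ls /apps
hdfs dfs -ls /apps/hive

# Open up the Hive warehouse so the query's write succeeds
hdfs dfs -chmod 755 /apps/hive
hdfs dfs -chmod 777 /apps/hive/warehouse

# If ownership is wrong, restore the usual owners
hdfs dfs -chown hdfs:hdfs /apps/hive
hdfs dfs -chown hive:hdfs /apps/hive/warehouse
```

After this, re-run the DDL query as before.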