Created 09-26-2017 04:26 PM
Hi, I encountered this 'permission denied - 500' error when executing the DDL query.
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/apps":hdfs:hdfs:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:219)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1955)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1939)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1913)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8751)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2089)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1454)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)
)
I am on the admin account. I need help, as this is my first time dealing with HDCloud.
Thank you very much.
Created 09-26-2017 06:58 PM
Hi @Mohammad Shazreen Bin Haini,
Your query is trying to write to the /apps location, to which the "hive" user doesn't have write permission.
Change the permissions on the folder and try running the query again.
hdfs dfs -chmod 777 /apps
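Note that /apps is owned by hdfs (the error shows inode="/apps":hdfs:hdfs:drwxr-xr-x), so the chmod will be refused unless you run it as the hdfs user. A minimal sketch, assuming you have sudo on the node:
sudo -u hdfs hdfs dfs -ls -d /apps        # check the current permissions first
sudo -u hdfs hdfs dfs -chmod 777 /apps    # then open up the directory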
Thanks,
Aditya
Created 09-27-2017 07:15 AM
Thank you for the response...but where should I enter this command?
Created 09-27-2017 08:59 AM
@Mohammad Shazreen Bin Haini, Based on the question tag, I believe that you're using the Sandbox. You will need to SSH into the Sandbox and type the command there. You can learn more here: https://hortonworks.com/hadoop-tutorial/learning-the-ropes-of-the-hortonworks-sandbox/
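For the HDP Sandbox specifically, the SSH step usually looks like the sketch below (assuming the Sandbox's default forwarded SSH port 2222; the root password is set on first login as described in the tutorial):
ssh root@localhost -p 2222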
Created 09-27-2017 10:30 AM
No, I don't have the Sandbox installed; my PC is too weak for it.
Is there any way to enter that command using Ambari Web or HDCloud CLI?
I'm following tutorial-410 and I believe Sandbox is not a prerequisite for this.
Created 09-27-2017 10:51 AM
SSH to the node where Ambari / the HDFS client is installed and you can run the command there.
As @Peter Kim mentioned, check the permissions of those directories before changing them.
Thanks,
Aditya
Created 09-30-2017 06:34 AM
@Mohammad Shazreen Bin Haini To change the permissions from Ambari, you can use the Files View under the Views section.
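If you'd rather script it than click through a UI, WebHDFS exposes the same permission change over REST. A sketch, assuming an unsecured cluster, the default NameNode HTTP port 50070, and <namenode-host> standing in for your NameNode's address:
curl -X PUT "http://<namenode-host>:50070/webhdfs/v1/apps?op=SETPERMISSION&permission=777&user.name=hdfs"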
Created 09-27-2017 01:06 AM
If you are using Ranger to manage permissions, there should be 2 default policies: 1) an HDFS policy that gives the "hive" user full permission to read/write to the /apps/hive/warehouse directory, and 2) a Hive policy that gives the "hive" user full permission to create and drop databases and tables.
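Whichever layer is enforcing access (Ranger or plain HDFS permissions), you can test whether the hive user can actually write to the warehouse by creating and deleting a throwaway file. A sketch, assuming sudo access on a cluster node; _perm_test is just a hypothetical file name:
sudo -u hive hdfs dfs -touchz /apps/hive/warehouse/_perm_test
sudo -u hive hdfs dfs -rm /apps/hive/warehouse/_perm_test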
Created 09-27-2017 01:13 AM
If you're querying a default Hive table in the Hive database on HDFS, then do not change the access permissions.
By default, the Hive warehouse directory in HDFS is "/apps/hive/warehouse".
So you should check the permissions of these directories before changing anything:
/apps : 755 hdfs:hdfs
/apps/hive : 755 hdfs:hdfs
/apps/hive/warehouse : 777 hive:hdfs
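A quick way to check all three at once (the -d flag lists the directories themselves rather than their contents):
hdfs dfs -ls -d /apps /apps/hive /apps/hive/warehouse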
Created 09-27-2017 07:16 AM
Thank you for the response...but where should I enter this command?
Created 09-27-2017 02:48 PM
Connect to the HDP VM server as root, then switch to the hdfs user and check the directory ownership & permissions.
[root@fqdn]# su hdfs
[hdfs@fqdn]# hdfs dfs -ls /apps
[hdfs@fqdn]# hdfs dfs -ls /apps/hive
[hdfs@fqdn]# hdfs dfs -chmod 755 /apps/hive
[hdfs@fqdn]# hdfs dfs -chmod 777 /apps/hive/warehouse
If the ownership is incorrect, then fix it:
[hdfs@fqdn]# hdfs dfs -chown hdfs:hdfs /apps/hive
[hdfs@fqdn]# hdfs dfs -chown hive:hdfs /apps/hive/warehouse
Created 09-27-2017 02:05 PM
@Aditya Sirna I tried SSHing to the node, but now when I try
hdfs dfs -chmod 777 /apps
it says
Permission denied: user=cloudbreak is not the owner of inode=apps
Created 09-27-2017 03:09 PM
You have to run it as the hdfs user.
Run su hdfs and then run the chmod command.
Thanks,
Aditya
Created 09-29-2017 03:54 PM
Thank you so much. The query works now.
-----------------------
SSH to your node, run su hdfs, run the chmod command, and then execute the query.
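Putting the whole fix together as one sketch (assuming you can reach the node over SSH and can switch to the hdfs user, e.g. as root or via sudo):
ssh <your-node>
su hdfs
hdfs dfs -chmod 777 /apps
exit
Then rerun the DDL query.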