Support Questions


Can't execute DDL query - Permission denied

Explorer

Hi, I encountered this 'permission denied - 500' error when executing the DDL query.

java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/apps":hdfs:hdfs:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:219)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1955)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1939)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1913)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8751)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2089)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1454)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)
)


I am on an admin account. I need help, as this is my first time dealing with HDCloud.
Thank you very much.

1 ACCEPTED SOLUTION

Super Guru

Hi @Mohammad Shazreen Bin Haini,

Your query is trying to write to the /apps location, to which the "hive" user doesn't have write permission.

Change the permissions of the folder and try running the query again.

hdfs dfs -chmod 777 /apps
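A minimal sketch of the full check-then-change sequence, assuming you have shell access to a node with an HDFS client (note that 777 opens the directory to all users):

```shell
# Inspect the current ownership and permissions on /apps
hdfs dfs -ls -d /apps

# Open up the directory; run as the hdfs superuser if your own
# account lacks the rights to chmod it
sudo -u hdfs hdfs dfs -chmod 777 /apps

# Verify the change took effect
hdfs dfs -ls -d /apps
```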

Thanks,

Aditya


13 REPLIES


Explorer

Thank you for the response...but where should I enter this command?

Contributor

@Mohammad Shazreen Bin Haini, based on the question tag, I believe you're using the Sandbox. You will need to SSH into the Sandbox and type the command there. You can learn more here: https://hortonworks.com/hadoop-tutorial/learning-the-ropes-of-the-hortonworks-sandbox/
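For reference, the Sandbox SSH step looks roughly like this, assuming the HDP Sandbox defaults (port 2222, user root; adjust if your install differs):

```shell
# SSH into the HDP Sandbox shell (default forwarded SSH port is 2222)
ssh root@localhost -p 2222

# Once inside, the hdfs command is on the PATH
hdfs dfs -ls -d /apps
```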

Explorer

No, I don't have the Sandbox installed; my PC is too weak to run it.
Is there any way to enter that command using the Ambari web UI or the HDCloud CLI?

I'm following tutorial-410 and I believe Sandbox is not a prerequisite for this.

Super Guru

@Mohammad Shazreen Bin Haini,

SSH to the node where Ambari or the HDFS client is installed; you can run the command there.

As @Peter Kim mentioned, check the permissions of those directories before changing them.
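A rough sketch of that sequence; the hostname, key path, and SSH user here are placeholders for illustration, not HDCloud specifics:

```shell
# From your local machine, SSH into a cluster node that has an
# HDFS client (substitute your own key and node address)
ssh -i ~/.ssh/my-cluster-key.pem <ssh-user>@<cluster-node-address>

# On the node, check the current permissions before changing anything
hdfs dfs -ls -d /apps

# Only then apply the permission change, as the hdfs superuser
sudo -u hdfs hdfs dfs -chmod 777 /apps
```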

Thanks,

Aditya

Expert Contributor

@Mohammad Shazreen Bin Haini To change the permissions from Ambari, you can use the Files View under the Views section.

Super Collaborator

@Mohammad Shazreen Bin Haini

If you are using Ranger to manage permissions, there should be 2 default policies: 1) an HDFS policy that gives the "hive" user full permission to read/write to the /apps/hive/warehouse directory, and 2) a Hive policy that gives the "hive" user full permission to create and drop databases and tables.
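If you want to confirm those policies exist, one way is Ranger's public REST API; this is a sketch assuming Ranger's default port (6080) and default admin credentials, which you should replace with your own:

```shell
# List all Ranger policies; the two defaults for HDFS and Hive
# should appear in the output (host and credentials are placeholders)
curl -u admin:admin "http://<ranger-host>:6080/service/public/v2/api/policy"
```

You can also simply browse the same policies in the Ranger Admin web UI.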

Rising Star

If you're querying a default Hive table in the Hive warehouse in HDFS, then do not change the access permissions.

By default, the Hive warehouse directory in HDFS is "/apps/hive/warehouse".

So you should check the permissions of these directories before changing anything:

/apps : 755 hdfs:hdfs

/apps/hive : 755 hdfs:hdfs

/apps/hive/warehouse : 777 hive:hdfs
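A sketch of how you might verify that layout, and restore the defaults listed above if they have drifted (run the chmod/chown steps as the hdfs superuser):

```shell
# Verify the expected ownership and permissions in one listing
hdfs dfs -ls -d /apps /apps/hive /apps/hive/warehouse

# Restore the defaults only if the listing above shows they differ
sudo -u hdfs hdfs dfs -chmod 755 /apps /apps/hive
sudo -u hdfs hdfs dfs -chown hdfs:hdfs /apps /apps/hive
sudo -u hdfs hdfs dfs -chmod 777 /apps/hive/warehouse
sudo -u hdfs hdfs dfs -chown hive:hdfs /apps/hive/warehouse
```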
