Support Questions

Find answers, ask questions, and share your expertise

Can't execute DDL query - Permission denied?

Explorer

Hi, I encountered this 'permission denied - 500' error when executing the DDL query.

java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/apps":hdfs:hdfs:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:219)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1955)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1939)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1913)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8751)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2089)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1454)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)



I am on an admin account. I need help, as this is my first time dealing with HDCloud.
Thank you very much.

1 ACCEPTED SOLUTION

Hi @Mohammad Shazreen Bin Haini,

Your query is trying to write to the /apps location, to which the "hive" user doesn't have write permission.

Change the permissions on the folder and try running the query again.

hdfs dfs -chmod 777 /apps
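Before loosening anything, you can confirm the current owner and mode (this assumes an HDFS client is on your PATH; -d makes ls show the directory entry itself rather than its contents):

hdfs dfs -ls -d /apps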

Thanks,

Aditya


13 REPLIES


Explorer

Thank you for the response...but where should I enter this command?

Explorer

@Mohammad Shazreen Bin Haini, Based on the question tag, I believe that you're using the Sandbox. You will need to SSH into the Sandbox and type the command there. You can learn more here: https://hortonworks.com/hadoop-tutorial/learning-the-ropes-of-the-hortonworks-sandbox/
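For example, with the default VirtualBox port forwarding, the Sandbox is usually reachable like this (the host and port are the standard defaults; adjust them if you changed the mapping):

ssh root@127.0.0.1 -p 2222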

Explorer

No, I don't have the Sandbox installed; my PC is too weak for it.
Is there any way to enter that command using the Ambari web UI or the HDCloud CLI?

I'm following tutorial-410 and I believe Sandbox is not a prerequisite for this.

@Mohammad Shazreen Bin Haini,

SSH to the node where Ambari or an HDFS client is installed, and you can run the command there.
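A typical login for an HDCloud cluster looks something like this (the key file and host below are placeholders; cloudbreak is the usual login user on HDCloud nodes):

ssh -i <your-keypair.pem> cloudbreak@<master-node-public-ip>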

As @Peter Kim mentioned, check the permissions of those directories before changing them.

Thanks,

Aditya

Contributor

@Mohammad Shazreen Bin Haini To change the permissions from Ambari, you can use the Files View under the Views section.

Expert Contributor

@Mohammad Shazreen Bin Haini

If you are using Ranger to manage permissions, there should be two default policies: 1) an HDFS policy that gives the "hive" user full read/write access to the /apps/hive/warehouse directory, and 2) a Hive policy that gives the "hive" user permission to create and drop databases and tables.
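If you'd rather check those policies from the command line than the Ranger UI, Ranger's public REST API can list them (the host, port, and admin credentials below are placeholder assumptions; 6080 is Ranger's default port):

curl -u admin:admin 'http://<ranger-host>:6080/service/public/v2/api/policy'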

Explorer

If you're querying a default Hive table in a Hive database on HDFS, then do not change the access permissions.

By default, Hive's warehouse directory in HDFS is "/apps/hive/warehouse".

So you should check the permissions of these directories before changing anything. They should be:

/apps : 755 hdfs:hdfs

/apps/hive : 755 hdfs:hdfs

/apps/hive/warehouse : 777 hive:hdfs
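You can check all three directories in one command:

hdfs dfs -ls -d /apps /apps/hive /apps/hive/warehouse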


Explorer

Connect to the HDP VM server as root, switch to the hdfs user, and then check the directory ownership and permissions.

[root@fqdn]# su hdfs

[hdfs@fqdn]$ hdfs dfs -ls /apps

[hdfs@fqdn]$ hdfs dfs -ls /apps/hive

[hdfs@fqdn]$ hdfs dfs -chmod 755 /apps/hive

[hdfs@fqdn]$ hdfs dfs -chmod 777 /apps/hive/warehouse

If the ownership is incorrect, then fix it with:

[hdfs@fqdn]$ hdfs dfs -chown hdfs:hdfs /apps/hive

[hdfs@fqdn]$ hdfs dfs -chown hive:hdfs /apps/hive/warehouse

Explorer

@Aditya Sirna I tried SSHing to the node, but now when I run
hdfs dfs -chmod 777 /apps
it says:

Permission denied: user=cloudbreak is not the owner of inode=apps

You have to run it as the hdfs user.

Run su hdfs and then run the chmod command.
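If you're logged in as cloudbreak rather than root, a plain su hdfs will prompt for a password, so sudo is the usual route (assuming cloudbreak has sudo rights, which is typical on HDCloud nodes):

sudo su - hdfs
hdfs dfs -chmod 777 /apps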

Thanks,

Aditya

Explorer

Thank you so much. The query works now.

-----------------------
SSH to your node, run su hdfs, run the chmod command, and then execute the query.