
Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [admin] does not have [USE] privilege on [null] [ERROR_STATUS]

Contributor

When I am trying to access my Hive databases from Hive View, I am getting the above error.

Does anyone have any idea why we get such an error from Hive View?

7 REPLIES


@revan wabale

I think you have logged in as admin and are trying to run operations as that user. Refer to the link and create an HDFS directory for the admin user with the appropriate permissions, following the steps below (a consolidated sketch follows the list). It will work.

  1. Connect to a host in the cluster that includes the HDFS client.
  2. Switch to the hdfs system account user.
    su - hdfs
  3. Using the HDFS client, make an HDFS directory for the user. For example, if your username is admin, you would create the following directory.
    hadoop fs -mkdir /user/admin
  4. Set the ownership on the newly created directory. For example, if your username is admin, you would make that user the directory owner.
    hadoop fs -chown admin:hadoop /user/admin
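
Put together, the steps look roughly like this (a minimal sketch, assuming the user is literally admin and the default hadoop group; adjust the names to your environment):

    # Run on a cluster host that has the HDFS client installed
    su - hdfs
    # Create a home directory for the admin user and hand ownership to that user
    hadoop fs -mkdir /user/admin
    hadoop fs -chown admin:hadoop /user/admin
    # Optional (an assumption, not part of the original steps): limit access to owner and group
    hadoop fs -chmod 750 /user/admin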


Create a Hive policy in Ranger that allows the admin user to access Hive.
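
This is normally done in the Ranger Admin UI (Access Manager > Hive > Add New Policy). If you prefer to script it, Ranger's public REST API can create the policy; the sketch below is only an assumption about the v2 policy API shape, the default port 6080, and the service name <cluster>_hive, so verify the field names against your Ranger version:

    # Create a Hive policy granting the admin user select/update/create on all databases
    curl -u admin:admin -H "Content-Type: application/json" \
         -X POST "http://<ranger-host>:6080/service/public/v2/api/policy" \
         -d '{
               "service": "<cluster>_hive",
               "name": "admin_hive_access",
               "resources": {
                 "database": {"values": ["*"]},
                 "table":    {"values": ["*"]},
                 "column":   {"values": ["*"]}
               },
               "policyItems": [{
                 "users": ["admin"],
                 "accesses": [
                   {"type": "select", "isAllowed": true},
                   {"type": "update", "isAllowed": true},
                   {"type": "create", "isAllowed": true}
                 ],
                 "delegateAdmin": false
               }]
             }'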


@revan wabale

Also verify and ensure that hadoop.proxyuser.hive.users/groups are set to * in the HDFS configs (core-site).
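
A quick way to check this on a cluster node (a sketch, assuming the default HDP client configuration path /etc/hadoop/conf):

    # Print the hadoop.proxyuser.hive.* property names and their values
    grep -A1 "hadoop.proxyuser.hive" /etc/hadoop/conf/core-site.xml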

Guru

Hi @revan wabale ,

To answer your question properly, we need to know what type of authorization you are using for Hive.

Do you have Ranger in place? If not, maybe the following link sheds some light on authentication in HiveServer2: link
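
One quick way to see which authorizer HiveServer2 is actually using is to check the relevant settings from a Beeline session (a sketch; the JDBC URL and user are placeholders, and with Ranger in place the manager setting usually points at the Ranger Hive authorizer factory):

    beeline -u "jdbc:hive2://<hiveserver2-host>:10000" -n admin
    -- then, inside the Beeline session:
    set hive.security.authorization.enabled;
    set hive.security.authorization.manager;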

Regards...


Hi all,

I have the same problem. I've done all the steps mentioned above and I still get permission denied. Can someone help me, please?

Contributor

For me, the following fixed it. Set the following configs on Hive (webhcat-site):

webhcat.proxyuser.root.groups *

webhcat.proxyuser.root.hosts *
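
To confirm the values actually landed on the node running WebHCat, something like this works (a sketch, assuming the usual HDP 2.x config path; adjust if your layout differs):

    # Print the webhcat.proxyuser.root.* property names and their values
    grep -A1 "webhcat.proxyuser.root" /etc/hive-webhcat/conf/webhcat-site.xml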

INSERT OVERWRITE DIRECTORY "hdfs://sandbox-hdp.hortonworks.com/tmp/hey2" SELECT * FROM mytable;

(i.e. you need to specify the full cluster name) works fine when you have the correct configuration in Ranger, but INSERT OVERWRITE DIRECTORY "/tmp/hey2" SELECT * FROM mytable; will fail. At least this is true for HDP 2.6.3.