While creating table Error:permission denied: user=hive,access=WRITE,inode=''(kerberos+sentry)

Contributor

While creating a table through beeline with a kinit'ed user (not hive), the operation defaults to the hive user (HiveServer2 high availability + Kerberos + SSL + Sentry).

 

I have enabled HS2 high availability. After enabling it, I tried to create a table with another user, but by default it runs as USER=hive.

 

Granted all permissions on URI 'hdfs://nameservice1/user/john/test.db/ha' to the user

added the user to the allowed Sentry users

current role shows john in beeline
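For reference, the grant setup above can be sketched roughly as follows. Sentry assigns privileges to roles mapped to groups rather than directly to users, so the role and group names here are illustrative assumptions, not taken from the original setup:

```sh
# Hypothetical recap of the Sentry grants, run in beeline as a Sentry admin.
# BEELINE_URL is the JDBC URL from this post; role/group names are made up.
beeline -u "$BEELINE_URL" -e "
CREATE ROLE john_role;
GRANT ROLE john_role TO GROUP john;
GRANT ALL ON URI 'hdfs://nameservice1/user/john/test.db/ha' TO ROLE john_role;
GRANT ALL ON DATABASE test TO ROLE john_role;
"
```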

 

beeline URL:

jdbc:hive2://<ZK1>:2181,<ZK2>:2181,<ZK3>:2181/test;principal=hive/<HOST>@<REALM>;ssl=true;sslTrustStore=/opt/cloudera/security/tls-ssl/hive-custom.truststore;trustStorePassword=cloudera;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;

>create table ha(id int, name string) stored as textfile location 'hdfs://nameservice1/user/john/test.db/ha'; 

 

I am getting the error below:

Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.SentryFilterDDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=hive, access=WRITE, inode="/user/john/test.db":john:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:281)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:262)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:242)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:169)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6590)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6572)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6524)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4322)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4292)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4265)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:867)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:322)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:603)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)
) (state=08S01,code=1)

 

 

Please help me out.

Thanks

 

 

4 Replies
Re: While creating table Error:permission denied: user=hive,access=WRITE,inode=''(kerberos+sentry)

Explorer

Hi,

  if you use Sentry, you need to add /user/john/test.db to the paths managed by the Sentry Plugin for HDFS.

 

In the HDFS configuration you will find that the "Sentry Synchronization Path Prefixes" property contains the default Hive warehouse path (/user/hive/warehouse).

 

You need to add your path to this property (it is a multi-valued property).

 

Sentry will control only the paths that contain a Hive DB, i.e. if you add the /user/john/ path, only the paths under /user/john/test.db will be synchronized by Sentry.
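As a sketch, the property change described above would look like this in the HDFS service configuration (the second entry is the addition; the exact UI path may differ between Cloudera Manager versions):

```
# HDFS service -> Configuration -> "Sentry Synchronization Path Prefixes"
# (multi-valued: one prefix per entry)
/user/hive/warehouse
/user/john
```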

Re: While creating table Error:permission denied: user=hive,access=WRITE,inode=''(kerberos+sentry)

Contributor
Thanks MicheleM for your response.

I didn't enable HDFS Sentry synchronization. Before the HiveServer2 high-availability configuration it used to work perfectly; Sentry is enabled only for Hive now. The problem here is that the CREATE TABLE/query runs as the default user hive. It should be the end user.

Re: While creating table Error:permission denied: user=hive,access=WRITE,inode=''(kerberos+sentry)

Explorer

Hi Rakesh,

  as far as I know, when you use Sentry only the hive user runs DDL and DML statements, for all of the databases, on behalf of the end user.

 

The Sentry synchronization keeps the HDFS ACLs aligned so the end user can see, via HDFS, their grants on the database files.

 

More precisely, HDFS asks Sentry for the actual ACLs of the paths that it manages.
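If HDFS synchronization is enabled, you can verify what ACLs HDFS is exposing for a managed path with a standard HDFS command (path taken from this thread; the output will reflect the Sentry grants):

```sh
# Show the ACLs HDFS reports for the Sentry-managed path
hdfs dfs -getfacl /user/john/test.db
```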

 

If you add your path, /user/john/test.db, to the Sentry synchronization, then Sentry will do all the work for you.

 

Regards

Michele

Re: While creating table Error:permission denied: user=hive,access=WRITE,inode=''(kerberos+sentry)

Explorer

Also, if you use Sentry Synchronization and Sentry Synchronization Path Prefixes, you need to change the owner and group of all Hive directories and files in HDFS to hive:hive. Once you enable this feature you cannot change the Hive warehouse ACLs manually, because they are controlled by Sentry (through grants).
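A minimal sketch of that ownership change, assuming you run it as an HDFS superuser and using the path from this thread:

```sh
# Recursively hand ownership of the external DB path to hive:hive
sudo -u hdfs hdfs dfs -chown -R hive:hive /user/john/test.db
```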

 

You can assign these ACLs manually, but it is not recommended.

 

Marc.
