Kerberos permission issue

Rising Star

I am trying to execute the query below in a Kerberized cluster from a Java API, using the Hive principal and keytab. The query creates a table in Hive as well as in HBase.

CREATE TABLE hive_aps (key struct<f1:string, f2:string>, value string) ROW FORMAT DELIMITED COLLECTION ITEMS TERMINATED BY '~' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping"=":key,f:c1") TBLPROPERTIES ("hbase.table.name" = "hbase_aps", "hbase.mapred.output.outputtable" = "hbase_aps")

I am getting the error below:

java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:org.apache.hadoop.hbase.security.AccessDeniedException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=hive, scope=default, params=[namespace=default,table=default:hbase_aps,family=f],action=CREATE)
at org.apache.hadoop.hbase.security.access.AccessController.requireNamespacePermission(AccessController.java:622)
at org.apache.hadoop.hbase.security.access.AccessController.preCreateTable(AccessController.java:991)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost$11.call(MasterCoprocessorHost.java:216)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost.execOperation(MasterCoprocessorHost.java:1140)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost.preCreateTable(MasterCoprocessorHost.java:212)
at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1533)
at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:454)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:55401)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:745)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:226)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:240)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:140)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4083)
at org.apache.hadoop.hbase.client.HBaseAdmin.createTableAsyncV2(HBaseAdmin.java:723)
at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:644)
at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:577)
at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:217)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:670)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:663)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
at com.sun.proxy.$Proxy9.createTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:717)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4271)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:311)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1728)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1485)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1262)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1126)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1121)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:154)
at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:183)
at org.apache.hive.service.cli.operation.Operation.run(Operation.java:257)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:419)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:400)
at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
at com.sun.proxy.$Proxy20.executeStatement(Unknown Source)
at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:263)
at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:486)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1317)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1302)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:562)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException): org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=hive, scope=default, params=[namespace=default,table=default:hbase_aps,family=f],action=CREATE)
at org.apache.hadoop.hbase.security.access.AccessController.requireNamespacePermission(AccessController.java:622)
at org.apache.hadoop.hbase.security.access.AccessController.preCreateTable(AccessController.java:991)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost$11.call(MasterCoprocessorHost.java:216)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost.execOperation(MasterCoprocessorHost.java:1140)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost.preCreateTable(MasterCoprocessorHost.java:212)
at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1533)
at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:454)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:55401)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:745)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1226)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.createTable(MasterProtos.java:58320)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$4.createTable(ConnectionManager.java:1821)
at org.apache.hadoop.hbase.client.HBaseAdmin$5.call(HBaseAdmin.java:728)
at org.apache.hadoop.hbase.client.HBaseAdmin$5.call(HBaseAdmin.java:724)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
... 50 more
)
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:167)
at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:155)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:210)
at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:333)
at com.bigframe.hive.api.HiveClient_Kerberos.doCreateHiveTable(HiveClient_Kerberos.java:114)
at com.bigframe.hive.api.HiveClient_Kerberos.main(HiveClient_Kerberos.java:220)

The query is able to create the table in Hive but fails on the HBase side with a permission error, because the Hive principal does not have permission to create tables in HBase. What should I do? Please help; this is urgent.
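For reference, this is roughly how the statement is submitted: a minimal sketch assuming the standard Hive JDBC driver and a keytab login via UserGroupInformation. The class name, host names, principal, and keytab path below are placeholders, not my real values.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.UserGroupInformation;

    public class HiveKerberosSketch {
        public static void main(String[] args) throws Exception {
            // Log in with the Hive principal and keytab before opening the JDBC connection.
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);
            UserGroupInformation.loginUserFromKeytab(
                    "hive/host1.example.com@EXAMPLE.COM",           // placeholder principal
                    "/etc/security/keytabs/hive.service.keytab");   // placeholder keytab path

            // HiveServer2 JDBC URL carries the server principal for Kerberos authentication.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            String url = "jdbc:hive2://host1.example.com:10000/default;"
                    + "principal=hive/_HOST@EXAMPLE.COM";
            try (Connection conn = DriverManager.getConnection(url);
                 Statement stmt = conn.createStatement()) {
                stmt.execute("CREATE TABLE hive_aps (key struct<f1:string, f2:string>, value string) "
                        + "ROW FORMAT DELIMITED COLLECTION ITEMS TERMINATED BY '~' "
                        + "STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' "
                        + "WITH SERDEPROPERTIES (\"hbase.columns.mapping\"=\":key,f:c1\") "
                        + "TBLPROPERTIES (\"hbase.table.name\" = \"hbase_aps\", "
                        + "\"hbase.mapred.output.outputtable\" = \"hbase_aps\")");
            }
        }
    }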

2 REPLIES

Re: Kerberos permission issue

Super Guru
@Arkaprova Saha

You need to enable authorization for the hive user in HBase. Check out the following link on how to enable authorization in HBase:

http://hbase.apache.org/0.94/book/hbase.accesscontrol.configuration.html

  grant 'hive', 'RWC', 'hive_aps'

<permissions> is zero or more letters from the set "RWCA": READ ('R'), WRITE ('W'), CREATE ('C'), ADMIN ('A'). In this example I have given hive Read, Write, and Create access, but no Admin access.
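If you would rather issue the grant from Java than from the HBase shell, something along these lines should work with the HBase 1.x client, assuming the AccessController coprocessor is enabled and you authenticate as an HBase admin principal. The class name, principal, and keytab path here are placeholders.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.security.access.AccessControlClient;
    import org.apache.hadoop.hbase.security.access.Permission;
    import org.apache.hadoop.security.UserGroupInformation;

    public class GrantHivePermsSketch {
        public static void main(String[] args) throws Throwable {
            Configuration conf = HBaseConfiguration.create();
            UserGroupInformation.setConfiguration(conf);
            // Authenticate as an HBase admin principal (placeholder principal/keytab).
            UserGroupInformation.loginUserFromKeytab(
                    "hbase/host1.example.com@EXAMPLE.COM",
                    "/etc/security/keytabs/hbase.service.keytab");

            try (Connection connection = ConnectionFactory.createConnection(conf)) {
                // Grant Read, Write and Create on table hbase_aps to user "hive",
                // the Java equivalent of the shell grant above.
                AccessControlClient.grant(connection, TableName.valueOf("hbase_aps"), "hive",
                        null, null,  // null family/qualifier = whole table
                        Permission.Action.READ, Permission.Action.WRITE, Permission.Action.CREATE);
            }
        }
    }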


Re: Kerberos permission issue

Super Mentor

@Arkaprova Saha

Have you tried giving the following permissions to user "hive"?

  grant 'hive', 'RWCA', 'hive_aps'
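After the grant, you can double-check which ACLs are actually defined on hbase_aps. A minimal sketch using the HBase 1.x AccessControlClient; the class name is just for illustration:

    import java.util.List;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.security.access.AccessControlClient;
    import org.apache.hadoop.hbase.security.access.UserPermission;

    public class CheckHivePermsSketch {
        public static void main(String[] args) throws Throwable {
            Configuration conf = HBaseConfiguration.create();
            try (Connection connection = ConnectionFactory.createConnection(conf)) {
                // List the ACL entries currently defined for the hbase_aps table.
                List<UserPermission> perms =
                        AccessControlClient.getUserPermissions(connection, "hbase_aps");
                for (UserPermission p : perms) {
                    System.out.println(p);
                }
            }
        }
    }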