10-05-2016
07:46 PM
2 Kudos
We tried 3 scenarios and all failed.

1) Creating a Hive table on HBase (with proper Hive/HBase policies for the user):

hive> CREATE TABLE hbase_table_1(key int, value string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
TBLPROPERTIES ("hbase.table.name" = "xyz",
"hbase.mapred.output.outputtable" = "xyz");
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:org.apache.hadoop.hbase.security.AccessDeniedException:
org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions for user 'abc@ABC.COM' (action=create)
at org.apache.ranger.authorization.hbase.AuthorizationSession.publishResults(AuthorizationSession.java:254)
at org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor.authorizeAccess(RangerAuthorizationCoprocessor.java:595)
at org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor.requirePermission(RangerAuthorizationCoprocessor.java:664)
at org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor.preCreateTable(RangerAuthorizationCoprocessor.java:769)
at org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor.preCreateTable(RangerAuthorizationCoprocessor.java:496)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost$11.call(MasterCoprocessorHost.java:216)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost.execOperation(MasterCoprocessorHost.java:1140)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost.preCreateTable(MasterCoprocessorHost.java:212)
at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1533)
at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:454)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:55401)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:745)
2) Creating a table directly in HBase:

hbase(main):001:0> create 'emp', 'personal', 'professional'
ERROR: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions for user 'abc@ABC.COM' (action=create)

3) Using grant from the HBase shell:
hbase(main):001:0> grant 'abc', 'RWCA'
ERROR: org.apache.hadoop.hbase.coprocessor.CoprocessorException: SSLContext must not be null
at org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor.grant(RangerAuthorizationCoprocessor.java:1171)
at org.apache.hadoop.hbase.protobuf.generated.AccessControlProtos$AccessControlService$1.grant(AccessControlProtos.java:9933)
at org.apache.hadoop.hbase.protobuf.generated.AccessControlProtos$AccessControlService.callMethod(AccessControlProtos.java:10097)
at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7553)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1878)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1860)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:745)

The easy solution we tried was to disable SSL, and it worked.
We also tried creating different policies; a policy on the namespace 'default:*' worked, but users were ONLY able to create the table.
They were NOT able to scan the table, and issue (3) still failed the same way:

ERROR: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions for user 'abc@ABC.COM', action: scannerOpen, tableName:hello1, family:col1.

The RegionServers were playing an important role here with the Ranger policies.
While the HBase Master was able to write to its policy cache, the cache on the RegionServers was 0 bytes (communication was blocked). We distributed the keystore and truststore from the HBase Master to all the worker nodes running RegionServers, restarted HBase, and this solved the issue.

How to do it: in Ambari, go to HBase > Configs and filter for "SSL":

1) xasecure.policymgr.clientssl.keystore - find the /path/keystore.file.name and distribute it to all the machines (keep the path/file name the same).
2) xasecure.policymgr.clientssl.truststore - find the /path/truststore.file.name and distribute it to all the machines (keep the path/file name the same).

Hope this will help. Thanks
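The two distribution steps above can be sketched as a small script. The hostnames and keystore/truststore paths below are placeholders (take the real values from the two Ambari properties above); the echo makes it a dry run, so remove it to actually copy:

```shell
#!/bin/sh
# Placeholder values: read the real ones from Ambari (HBase > Configs, filter "SSL"):
#   xasecure.policymgr.clientssl.keystore
#   xasecure.policymgr.clientssl.truststore
KEYSTORE="/etc/hbase/conf/ranger-plugin-keystore.jks"
TRUSTSTORE="/etc/hbase/conf/ranger-plugin-truststore.jks"
REGION_SERVERS="worker1 worker2 worker3"   # nodes running RegionServers

for host in $REGION_SERVERS; do
  # 'echo' prints the commands instead of running them; drop it to copy for real.
  # Keep the same path and file name on every node.
  echo scp "$KEYSTORE" "root@$host:$KEYSTORE"
  echo scp "$TRUSTSTORE" "root@$host:$TRUSTSTORE"
done
# Finally, restart HBase so the RegionServers can pull the Ranger policies.
```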
Mayank
01-22-2016
05:28 PM
3 Kudos
Hi Everyone, I came across a Kerberos cache issue and wanted to share it and possibly gather more ideas. I understand a few of us have had issues with this in the past, and I hope this article helps.

One of our clients has Red Hat IdM (the supported version of FreeIPA), and when IdM installs sssd along with krb5, the default credential cache setting is 'KEYRING' rather than 'FILE'. You will still be able to get tickets, but you will hit GSSException errors on the cluster. The KEYRING persistent cache works with many applications, but not with a Hadoop cluster. I'm sure the HDP engineering team must be looking into a solution, since KEYRING is the future of Kerberos caching, given the things you can do with it (keylist etc.).

To solve the errors, comment out the "default_ccache_name = KEYRING....." line in krb5.conf, or change it to "default_ccache_name = FILE:/tmp/krb5cc_%{uid}".

Log out and log in again, destroy the previous tickets, and you should see something like "Ticket cache: FILE:/tmp/krb5cc_" in your klist output. If you still see KEYRING PERSISTENT, kill all running sessions of the user having the problem and restart the SSSD service.

Thanks
Mayank
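For reference, here is what the change looks like in /etc/krb5.conf. The commented-out KEYRING line below shows the typical IdM/RHEL default and may differ on your system:

```
[libdefaults]
  # default_ccache_name = KEYRING:persistent:%{uid}
  default_ccache_name = FILE:/tmp/krb5cc_%{uid}
```

After the change, run kdestroy, log out and back in, and klist should report a FILE: ticket cache.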