Exception while executing insert query on a Kerberos + encryption enabled cluster
Labels: Cloudera Navigator Encrypt
Created ‎04-18-2016 01:55 PM
Hi,
I have a cluster with Kerberos and HDFS Transparent Encryption enabled. While executing a query from beeline, I am getting the following error:
0: > insert into sample_test_src values(100);
INFO : Number of reduce tasks is set to 0 since there's no reduce operator
INFO : Cleaning up the staging area /user/adpqa/.staging/job_1460636656326_0016
ERROR : Job Submission failed with exception 'java.io.IOException(java.lang.reflect.UndeclaredThrowableException)'
java.io.IOException: java.lang.reflect.UndeclaredThrowableException
  at org.apache.hadoop.crypto.key.kms.KMSClientProvider.addDelegationTokens(KMSClientProvider.java:888)
  at org.apache.hadoop.crypto.key.KeyProviderDelegationTokenExtension.addDelegationTokens(KeyProviderDelegationTokenExtension.java:86)
  at org.apache.hadoop.hdfs.DistributedFileSystem.addDelegationTokens(DistributedFileSystem.java:2243)
  at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:121)
  at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:100)
  at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodes(TokenCache.java:80)
  at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:166)
  at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
  at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:415)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
  at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
  at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:575)
  at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:570)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:415)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
  at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:570)
  at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:561)
  at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:431)
  at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
  at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
  at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
  at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1703)
  at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1460)
  at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1237)
  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1101)
  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1096)
  at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:154)
  at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:71)
  at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:206)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:415)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
  at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:218)
  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
  at java.util.concurrent.FutureTask.run(FutureTask.java:262)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
  at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.UndeclaredThrowableException
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1672)
  at org.apache.hadoop.crypto.key.kms.KMSClientProvider.addDelegationTokens(KMSClientProvider.java:870)
  ... 40 more
Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
  at org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:274)
  at org.apache.hadoop.security.authentication.client.PseudoAuthenticator.authenticate(PseudoAuthenticator.java:77)
  at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.authenticate(DelegationTokenAuthenticator.java:128)
  at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:214)
  at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.authenticate(DelegationTokenAuthenticator.java:128)
  at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:215)
  at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.doDelegationTokenOperation(DelegationTokenAuthenticator.java:285)
  at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.getDelegationToken(DelegationTokenAuthenticator.java:166)
  at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticatedURL.getDelegationToken(DelegationTokenAuthenticatedURL.java:371)
  at org.apache.hadoop.crypto.key.kms.KMSClientProvider$2.run(KMSClientProvider.java:875)
  at org.apache.hadoop.crypto.key.kms.KMSClientProvider$2.run(KMSClientProvider.java:870)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:415)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
  ... 41 more
Any idea on what could be wrong or missing here? A similar query works fine from the Hive CLI.
Created ‎04-18-2016 03:59 PM
In the beeline connection string, please check that the Hive principal name is set correctly and matches the cluster settings.
Also ensure that the Kerberos ticket is still valid.
!connect jdbc:hive2://sandbox.hortonworks.com:10000/default;principal=hive/_HOST@REALM.COM
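To verify both points from the shell, something like the following can be used (a sketch only; the principal, user, and host below are examples taken from this thread, not your actual values):

```shell
# Check the current Kerberos ticket cache and its expiry time:
klist

# If the ticket is missing or expired, obtain a fresh one
# (the principal here is an example placeholder):
kinit user1@REALM.COM

# Then connect, passing the HiveServer2 service principal from the cluster config:
beeline -u "jdbc:hive2://sandbox.hortonworks.com:10000/default;principal=hive/_HOST@REALM.COM"
```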
Created ‎04-19-2016 09:37 AM
The service principal name is specified correctly, and the Kerberos ticket is also available.
Created ‎04-19-2016 10:31 PM
In the below, I am assuming that:
a) the Hive warehouse directories were moved to an encryption zone, and
b) as recommended in our docs when Ranger is installed, hive.server2.enable.doAs is set to false in the Hive configs (i.e., queries run as the 'hive' user).
Probable root cause: the 403 error usually means there is an authorization issue (Ranger is blocking access). The best way to confirm this is to check the Ranger audits.
1. Check whether the user is being denied at the Hive or HDFS level: log in to Ranger as admin, navigate to the Audits tab, and filter for Result = Denied.
2. Check whether the 'hive' user is being denied access to the encryption zone containing the Hive warehouse tables. To do this:
a) First expose the Audits view to the keyadmin user:
- Log in to Ranger as admin and click Settings > Permissions.
- Click 'Audit' (second row from the bottom) to change which users have access to the Audit screen.
- Under 'Select User', add the 'keyadmin' user.
b) Log off as admin and log back in to Ranger as the keyadmin user. Then navigate to the Audits tab and filter for Result = Denied.
Most likely you will see requests being denied by Ranger.
Resolution:
Once you confirm it's an authorization issue, follow the steps below to resolve it:
1. Check that the user (whom you kinit'ed as before launching beeline) has a Ranger Hive policy allowing access to the table.
To check this, log in to Ranger as admin and review the Hive policies.
2. Check that there is a KMS policy allowing the 'hive' and 'nn' users access to the key used to encrypt the Hive warehouse directory in HDFS (you may need to create these users in Ranger, or sync them from 'unix' once, before you can do this):
- The 'nn' user needs at least the GetMetaData and GenerateEEK privileges.
- The 'hive' user needs at least the GetMetaData and DecryptEEK privileges.
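The key access described above can also be sanity-checked from the command line. A rough sketch follows; the provider URI, host, and port are placeholders for this example and must be adjusted to your cluster:

```shell
# As the 'hive' user (kinit first), confirm the KMS will serve key metadata
# to you at all (GetMetaData is exercised by the -metadata flag):
hadoop key list -provider kms://http@kms-host:16000/kms -metadata

# Find out which key protects the Hive warehouse encryption zone,
# so you know which key the Ranger KMS policy must cover:
hdfs crypto -listZones
```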
Created ‎04-20-2016 06:18 AM
Thanks, Ali, for the details.
Later yesterday we added the following configurations to our setup:
hadoop.kms.proxyuser.user1.users = *
hadoop.kms.proxyuser.user1.hosts = *
We were impersonating user1 and had a ticket generated for that user. Push-down queries are working fine now.
Thanks for the help.
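For anyone finding this later: the two properties above are KMS proxyuser settings. A hedged sketch of how they would look in kms-site.xml follows; 'user1' is the impersonated user from this thread, and the exact file location depends on your distribution and whether configs are managed through Cloudera Manager or Ambari:

```xml
<!-- kms-site.xml fragment (sketch; restart the KMS after changing it) -->
<property>
  <name>hadoop.kms.proxyuser.user1.users</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.kms.proxyuser.user1.hosts</name>
  <value>*</value>
</property>
```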
Created ‎07-19-2016 01:58 PM
Hey Vishal, where did you add these properties?
