
Hive Metastore is unable to impersonate hiveserver2 principal


I have a CDH 5.13.3 Kerberized cluster with Hive installed and the Sentry Service enabled. The Hive setup is as follows (with a sketch of the metastore URI after the list):

  • node-1: running the Hive Metastore
  • node-2: running HiveServer2
  • node-3: previously ran the Hive Metastore
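
With the metastore role now on node-1, HiveServer2 on node-2 reaches it through hive.metastore.uris. A minimal sketch of how I expect that property to look after the migration (9083 is the default metastore Thrift port and is an assumption on my part; Cloudera Manager generates the actual value):

<!-- hive-site.xml as seen by HiveServer2 on node-2 (sketch only) -->
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://node-1.<DOMAIN>:9083</value>
</property>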

I migrated the Hive Metastore from node-3 to node-1, and it seems that the metastore now has problems accepting connections from HiveServer2 (/var/log/hive/hadoop-cmf-hive-HIVEMETASTORE-node-1<DOMAIN>.log.out):

2020-04-10 18:47:51,084 DEBUG org.apache.hadoop.security.UserGroupInformation: [pool-5-thread-118]: PrivilegedAction as:hive/node-1.<DOMAIN>(auth:KERBEROS) from:org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:807)
2020-04-10 18:47:51,084 DEBUG org.apache.thrift.transport.TSaslServerTransport: [pool-5-thread-118]: transport map does not contain key
2020-04-10 18:47:51,084 DEBUG org.apache.thrift.transport.TSaslTransport: [pool-5-thread-118]: opening transport org.apache.thrift.transport.TSaslServerTransport@6083dddd
2020-04-10 18:47:51,085 DEBUG org.apache.thrift.transport.TSaslTransport: [pool-5-thread-118]: SERVER: Received message with status START and payload length 6
2020-04-10 18:47:51,085 DEBUG org.apache.thrift.transport.TSaslServerTransport: [pool-5-thread-118]: Received start message with status START
2020-04-10 18:47:51,085 DEBUG org.apache.thrift.transport.TSaslServerTransport: [pool-5-thread-118]: Received mechanism name 'GSSAPI'
2020-04-10 18:47:51,086 DEBUG org.apache.thrift.transport.TSaslTransport: [pool-5-thread-118]: SERVER: Start message handled
2020-04-10 18:47:51,086 DEBUG org.apache.thrift.transport.TSaslTransport: [pool-5-thread-118]: SERVER: Received message with status OK and payload length 665
2020-04-10 18:47:51,087 DEBUG org.apache.thrift.transport.TSaslTransport: [pool-5-thread-118]: SERVER: Writing message with status OK and payload length 110
2020-04-10 18:47:51,088 DEBUG org.apache.thrift.transport.TSaslTransport: [pool-5-thread-118]: SERVER: Received message with status OK and payload length 0
2020-04-10 18:47:51,088 DEBUG org.apache.thrift.transport.TSaslTransport: [pool-5-thread-118]: SERVER: Writing message with status OK and payload length 65
2020-04-10 18:47:51,089 DEBUG org.apache.thrift.transport.TSaslTransport: [pool-5-thread-118]: SERVER: Received message with status COMPLETE and payload length 65
2020-04-10 18:47:51,089 DEBUG org.apache.hadoop.security.SaslRpcServer: [pool-5-thread-118]: SASL server GSSAPI callback: setting canonicalized client ID: hive/node-2.<DOMAIN>
2020-04-10 18:47:51,089 DEBUG org.apache.thrift.transport.TSaslTransport: [pool-5-thread-118]: SERVER: Writing message with status COMPLETE and payload length 0
2020-04-10 18:47:51,089 DEBUG org.apache.thrift.transport.TSaslTransport: [pool-5-thread-118]: SERVER: Main negotiation loop complete
2020-04-10 18:47:51,089 DEBUG org.apache.hadoop.security.UserGroupInformation: [pool-5-thread-118]: PrivilegedAction as:hive/node-1.<DOMAIN>(auth:KERBEROS) from:org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:807)
2020-04-10 18:47:51,089 DEBUG org.apache.thrift.transport.TSaslServerTransport: [pool-5-thread-118]: transport map does contain key org.apache.thrift.transport.TSocket@6e443e65
2020-04-10 18:47:51,089 DEBUG org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge: [pool-5-thread-118]: AUTH ID ======>hive/node-2.<DOMAIN>
2020-04-10 18:47:51,089 ERROR org.apache.thrift.server.TThreadPoolServer: [pool-5-thread-118]: Error occurred during processing of message.
java.lang.RuntimeException: org.apache.hadoop.security.authorize.AuthorizationException: User: hive/node-1.<DOMAIN> is not allowed to impersonate hive/node-2.<DOMAIN>
	at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:773)
	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.security.authorize.AuthorizationException: User: hive/node-1.<DOMAIN> is not allowed to impersonate hive/node-2.<DOMAIN>
	at org.apache.hadoop.security.authorize.DefaultImpersonationProvider.authorize(DefaultImpersonationProvider.java:123)
	at org.apache.hadoop.security.authorize.ProxyUsers.authorize(ProxyUsers.java:102)
	at org.apache.hadoop.security.authorize.ProxyUsers.authorize(ProxyUsers.java:116)
	at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:743)
	... 4 more

On the HiveServer2 side (/var/log/hive/hadoop-cmf-hive-HIVESERVER2-node-2<DOMAIN>.log.out):

2020-04-10 18:47:51,088 INFO  hive.metastore: [main]: Opened a connection to metastore, current connections: 29
2020-04-10 18:47:51,088 INFO  hive.metastore: [main]: Connected to metastore.
2020-04-10 18:47:51,088 DEBUG org.apache.thrift.transport.TSaslTransport: [main]: writing data length: 30
2020-04-10 18:47:51,089 WARN  hive.ql.metadata.Hive: [main]: Failed to register all functions.
org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
	at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3646)
	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:231)
	at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:215)
	at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:338)
	at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:299)
	at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:274)
	at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:256)
	at org.apache.hadoop.hive.ql.security.authorization.DefaultHiveAuthorizationProvider.init(DefaultHiveAuthorizationProvider.java:29)
	at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.setConf(HiveAuthorizationProviderBase.java:112)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
	at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:388)
	at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:810)
	at org.apache.hadoop.hive.ql.session.SessionState.getAuthorizationMode(SessionState.java:1679)
	at org.apache.hadoop.hive.ql.session.SessionState.isAuthorizationModeV2(SessionState.java:1690)
	at org.apache.hadoop.hive.ql.session.SessionState.applyAuthorizationPolicy(SessionState.java:1738)
	at org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:125)
	at org.apache.hive.service.cli.CLIService.init(CLIService.java:111)
	at org.apache.hive.service.CompositeService.init(CompositeService.java:59)
	at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:125)
	at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:542)
	at org.apache.hive.service.server.HiveServer2.access$700(HiveServer2.java:89)
	at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:793)
	at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:666)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

I have the following properties configured:

<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>

<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>hive,hue,sentry</value>
</property>
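
Restating those two properties with comments on how I read them (my interpretation, not an authoritative description of the proxy-user check):

<!-- Proxy-user (impersonation) settings for the superuser "hive":
     hosts  - hosts from which the hive user may submit requests on behalf of
              other users; "*" allows any host.
     groups - a proxied user is accepted only if it belongs to one of these
              groups. -->
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>hive,hue,sentry</value>
</property>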

hive.server2.enable.impersonation and hive.server2.enable.doAs are disabled because I am using the Sentry Service.
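
For completeness, my understanding is that this switch ends up in hive-site.xml roughly as below (a sketch; as far as I know hive.server2.enable.impersonation is the older CDH name for the same setting):

<!-- hive-site.xml (HiveServer2): impersonation disabled because Sentry
     runs all queries as the hive service user -->
<property>
  <name>hive.server2.enable.doAs</name>
  <value>false</value>
</property>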

Can someone explain why the Hive Metastore is trying to impersonate the HiveServer2 principal when hive.server2.enable.impersonation is disabled, and why this stopped working after the migration?
