
Ambari Hive View Problem

Expert Contributor

Dear All,

Whenever I launch the Hive View, I get the following error:

----------------
H060 Unable to open Hive session: org.apache.thrift.protocol.TProtocolException: Required field 'serverProtocolVersion' is unset! Struct:TOpenSessionResp(status:TStatus(statusCode:ERROR_STATUS, infoMessages:[*org.apache.hive.service.cli.HiveSQLException:Failed to open new session: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): Unauthorized connection for super-user: hive from IP 10.0.202.157:13:12, org.apache.hive.service.cli.session.SessionManager:openSession:SessionManager.java:266, org.apache.hive.service.cli.CLIService:openSessionWithImpersonation:CLIService.java:202, org.apache.hive.service.cli.thrift.ThriftCLIService:getSessionHandle:ThriftCLIService.java:402, org.apache.hive.service.cli.thrift.ThriftCLIService:OpenSession:ThriftCLIService.java:297, org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession:getResult:TCLIService.java:1253, org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession:getResult:TCLIService.java:1238, org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39, org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39, org.apache.hive.service.auth.TSetIpAddressProcessor:process:TSetIpAddressProcessor.java:56, org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:285, java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1142, java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:617, java.lang.Thread:run:Thread.java:745, *java.lang.RuntimeException:java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): Unauthorized connection for super-user: hive from IP 10.0.202.157:21:8, org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:83, org.apache.hive.service.cli.session.HiveSessionProxy:access$000:HiveSessionProxy.java:36, org.apache.hive.service.cli.session.HiveSessionProxy$1:run:HiveSessionProxy.java:63, java.security.AccessController:doPrivileged:AccessController.java:-2, javax.security.auth.Subject:doAs:Subject.java:422, org.apache.hadoop.security.UserGroupInformation:doAs:UserGroupInformation.java:1657, org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:59, com.sun.proxy.$Proxy20:open::-1, org.apache.hive.service.cli.session.SessionManager:openSession:SessionManager.java:258, *java.lang.RuntimeException:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): Unauthorized connection for super-user: hive from IP 10.0.202.157:26:5, org.apache.hadoop.hive.ql.session.SessionState:start:SessionState.java:494, org.apache.hive.service.cli.session.HiveSessionImpl:open:HiveSessionImpl.java:137, sun.reflect.GeneratedMethodAccessor11:invoke::-1, sun.reflect.DelegatingMethodAccessorImpl:invoke:DelegatingMethodAccessorImpl.java:43, java.lang.reflect.Method:invoke:Method.java:497, org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:78, *org.apache.hadoop.ipc.RemoteException:Unauthorized connection for super-user: hive from IP 10.0.202.157:45:19, org.apache.hadoop.ipc.Client:call:Client.java:1427, org.apache.hadoop.ipc.Client:call:Client.java:1358, org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker:invoke:ProtobufRpcEngine.java:229, com.sun.proxy.$Proxy15:getFileInfo::-1, 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB:getFileInfo:ClientNamenodeProtocolTranslatorPB.java:771, sun.reflect.GeneratedMethodAccessor7:invoke::-1, sun.reflect.DelegatingMethodAccessorImpl:invoke:DelegatingMethodAccessorImpl.java:43, java.lang.reflect.Method:invoke:Method.java:497, org.apache.hadoop.io.retry.RetryInvocationHandler:invokeMethod:RetryInvocationHandler.java:252, org.apache.hadoop.io.retry.RetryInvocationHandler:invoke:RetryInvocationHandler.java:104, com.sun.proxy.$Proxy16:getFileInfo::-1, org.apache.hadoop.hdfs.DFSClient:getFileInfo:DFSClient.java:2116, org.apache.hadoop.hdfs.DistributedFileSystem$22:doCall:DistributedFileSystem.java:1315, org.apache.hadoop.hdfs.DistributedFileSystem$22:doCall:DistributedFileSystem.java:1311, org.apache.hadoop.fs.FileSystemLinkResolver:resolve:FileSystemLinkResolver.java:81, org.apache.hadoop.hdfs.DistributedFileSystem:getFileStatus:DistributedFileSystem.java:1311, org.apache.hadoop.fs.FileSystem:exists:FileSystem.java:1424, org.apache.hadoop.hive.ql.session.SessionState:createRootHDFSDir:SessionState.java:568, org.apache.hadoop.hive.ql.session.SessionState:createSessionDirs:SessionState.java:526, org.apache.hadoop.hive.ql.session.SessionState:start:SessionState.java:480], errorCode:0, errorMessage:Failed to open new session: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): Unauthorized connection for super-user: hive from IP 10.0.202.157), serverProtocolVersion:null)

-----------------------

I have the following proxy users in custom core-site:

hadoop.proxyuser.hcat.groups = *

hadoop.proxyuser.hcat.hosts = *

hadoop.proxyuser.hdfs.groups = *

hadoop.proxyuser.hdfs.hosts = *

hadoop.proxyuser.hive.groups = *

hadoop.proxyuser.hive.hosts = *

hadoop.proxyuser.root.groups = *

hadoop.proxyuser.root.hosts = *

I also granted the "admin" user permission to use the Hive View in Ambari. What am I missing? Any support is appreciated!

br,
Rainer
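PS: for completeness, one way to double-check what the deployed core-site.xml actually contains for these keys (run on the HiveServer2 or NameNode host; the key names are the ones listed above):

# Print the proxyuser values from the locally deployed client configuration
hdfs getconf -confKey hadoop.proxyuser.hive.hosts
hdfs getconf -confKey hadoop.proxyuser.hive.groups
# Note: this reads the core-site.xml on the host it runs on; the running
# NameNode may still hold older values until it is restarted or refreshed.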
1 ACCEPTED SOLUTION

Master Mentor
@Rainer Geissendoerfer

Please see this guide http://docs.hortonworks.com/HDPDocuments/Ambari-2.2.0.0/bk_ambari_views_guide/content/_setup_HDFS_pr...

The problem is the "Unauthorized connection for super-user: hive from IP 10.0.202.157" part of the trace: the NameNode is rejecting the hive service user's attempt to impersonate your login user.

You have to recheck the hadoop.proxyuser.* settings in core-site and make sure HDFS has actually picked them up.
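If you changed those proxyuser entries recently, note that the NameNode (and ResourceManager) only pick them up after a restart or an explicit refresh. A minimal sketch, assuming you can run these as a user allowed to administer HDFS/YARN on a cluster node:

# Push the updated hadoop.proxyuser.* settings to the running daemons
# without a full restart
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration
# Restarting HDFS (and then HiveServer2) from Ambari achieves the same thing.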



Contributor

To make it clearer: I am getting Hive-related issues in two different situations:

1) With the Hive View in Ambari

2) With Beeline on the command line
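For the Beeline case, the same proxyuser error usually shows up when connecting through HiveServer2 as a different end user. A minimal test connection, where the host and user are placeholders for your environment and HiveServer2 is assumed to run in binary mode on the default port 10000:

# Connect through HiveServer2 so the hive service user has to impersonate
# the end user; if the proxyuser settings are wrong, this fails with the
# same AuthorizationException as the Hive View
beeline -u "jdbc:hive2://<hiveserver2-host>:10000/default" -n <your-user>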

Contributor

@Neeraj Sabharwal

Can you suggest a fix for the above two cases? Thanks.

New Contributor

Hey, I got it working this way:

1- Connect to Ambari

2- HDFS service > Advanced config > Custom core-site, and set the following:

hadoop.proxyuser.hive.groups = *

hadoop.proxyuser.hive.hosts = *

hadoop.proxyuser.hcat.groups = *

hadoop.proxyuser.hcat.hosts = *

(Screenshot of the Custom core-site settings attached.)
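If you prefer the command line over the UI, the same properties can also be set with the configs.sh helper that ships with Ambari. This is only a sketch: the script path, credentials and argument order may differ between Ambari versions, and MyCluster is a placeholder for your cluster name.

# Set the proxyuser properties in core-site via Ambari's bundled helper
# script (run on the Ambari server host)
/var/lib/ambari-server/resources/scripts/configs.sh -u admin -p admin \
  set localhost MyCluster core-site "hadoop.proxyuser.hive.hosts" "*"
/var/lib/ambari-server/resources/scripts/configs.sh -u admin -p admin \
  set localhost MyCluster core-site "hadoop.proxyuser.hive.groups" "*"
# Either way, restart the affected services (HDFS, then Hive) afterwards
# so the new settings take effect.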


Thanks, I had the same issue after the HDP 2.6 upgrade. The install silently changed the settings.

1- Connect to Ambari

2- HDFS service > Advanced config > Custom core-site, and set the following:

hadoop.proxyuser.hive.groups = *

hadoop.proxyuser.hive.hosts = *

hadoop.proxyuser.hcat.groups = *

hadoop.proxyuser.hcat.hosts = *

This solved my issue as well.
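If you want to see what an upgrade actually left behind in the deployed config, one quick check is to grep the live core-site.xml on a cluster node (assuming the usual HDP config location):

# Show the proxyuser entries the cluster nodes are actually running with
grep -A1 "hadoop.proxyuser" /etc/hadoop/conf/core-site.xml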