Created 11-27-2015 11:06 PM
I am getting the error "Required field 'serverProtocolVersion' is unset" when trying to run Hive queries from a recent build of Zeppelin on HDP 2.3.2/Spark 1.4.1. I tried both the Zeppelin service and compiling Zeppelin from source. Spark and Phoenix jobs run fine, and Hive used to work before.
A Google search suggests the issue may be caused by the below. Has anyone else run into this?
https://issues.apache.org/jira/browse/HIVE-6050
http://stackoverflow.com/questions/24694415/required-field-client-protocol-is-unset
Using the default proxy settings for Hive:
hadoop.proxyuser.hive.hosts = *
hadoop.proxyuser.hive.groups = users
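For reference, those two properties live in core-site.xml; in XML form the settings above are:

<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>users</value>
</property>

The "User: hive is not allowed to impersonate hive" message in the log below fits this: with groups = users, impersonation is only allowed for target users in the users group, and hive is presumably not a member.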
Full error:
ERROR [2015-11-27 04:41:26,630] ({pool-2-thread-2} HiveConnection.java[openSession]:466) - Error opening session org.apache.thrift.protocol.TProtocolException: Required field 'serverProtocolVersion' is unset! Struct:TOpenSessionResp(status:TStatus(statusCode:ERROR_STATUS, infoMessages:[*org.apache.hive.service.cli.HiveSQLException:Failed to open new session: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hive is not allowed to impersonate hive:13:12, org.apache.hive.service.cli.session.SessionManager:openSession:SessionManager.java:266, org.apache.hive.service.cli.CLIService:openSessionWithImpersonation:CLIService.java:202, org.apache.hive.service.cli.thrift.ThriftCLIService:getSessionHandle:ThriftCLIService.java:402, org.apache.hive.service.cli.thrift.ThriftCLIService:OpenSession:ThriftCLIService.java:297, org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession:getResult:TCLIService.java:1253, org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession:getResult:TCLIService.java:1238, org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39, org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39, org.apache.hive.service.auth.TSetIpAddressProcessor:process:TSetIpAddressProcessor.java:56, org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:285, java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1145, java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:615, java.lang.Thread:run:Thread.java:745, *java.lang.RuntimeException:java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hive is not allowed to impersonate hive:21:8, org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:83, org.apache.hive.service.cli.session.HiveSessionProxy:access$000:HiveSessionProxy.java:36, org.apache.hive.service.cli.session.HiveSessionProxy$1:run:HiveSessionProxy.java:63, java.security.AccessController:doPrivileged:AccessController.java:-2, javax.security.auth.Subject:doAs:Subject.java:415, org.apache.hadoop.security.UserGroupInformation:doAs:UserGroupInformation.java:1657, org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:59, com.sun.proxy.$Proxy19:open::-1, org.apache.hive.service.cli.session.SessionManager:openSession:SessionManager.java:258, *java.lang.RuntimeException:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hive is not allowed to impersonate hive:26:5, org.apache.hadoop.hive.ql.session.SessionState:start:SessionState.java:522, org.apache.hive.service.cli.session.HiveSessionImpl:open:HiveSessionImpl.java:137, sun.reflect.GeneratedMethodAccessor10:invoke::-1, sun.reflect.DelegatingMethodAccessorImpl:invoke:DelegatingMethodAccessorImpl.java:43, java.lang.reflect.Method:invoke:Method.java:606, org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:78, *org.apache.hadoop.ipc.RemoteException:User: hive is not allowed to impersonate hive:45:19, org.apache.hadoop.ipc.Client:call:Client.java:1427, org.apache.hadoop.ipc.Client:call:Client.java:1358, org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker:invoke:ProtobufRpcEngine.java:229, com.sun.proxy.$Proxy14:getFileInfo::-1, 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB:getFileInfo:ClientNamenodeProtocolTranslatorPB.java:771, sun.reflect.GeneratedMethodAccessor7:invoke::-1, sun.reflect.DelegatingMethodAccessorImpl:invoke:DelegatingMethodAccessorImpl.java:43, java.lang.reflect.Method:invoke:Method.java:606, org.apache.hadoop.io.retry.RetryInvocationHandler:invokeMethod:RetryInvocationHandler.java:187, org.apache.hadoop.io.retry.RetryInvocationHandler:invoke:RetryInvocationHandler.java:102, com.sun.proxy.$Proxy15:getFileInfo::-1, org.apache.hadoop.hdfs.DFSClient:getFileInfo:DFSClient.java:2116, org.apache.hadoop.hdfs.DistributedFileSystem$22:doCall:DistributedFileSystem.java:1305, org.apache.hadoop.hdfs.DistributedFileSystem$22:doCall:DistributedFileSystem.java:1301, org.apache.hadoop.fs.FileSystemLinkResolver:resolve:FileSystemLinkResolver.java:81, org.apache.hadoop.hdfs.DistributedFileSystem:getFileStatus:DistributedFileSystem.java:1301, org.apache.hadoop.fs.FileSystem:exists:FileSystem.java:1424, org.apache.hadoop.hive.ql.session.SessionState:createRootHDFSDir:SessionState.java:596, org.apache.hadoop.hive.ql.session.SessionState:createSessionDirs:SessionState.java:554, org.apache.hadoop.hive.ql.session.SessionState:start:SessionState.java:508], errorCode:0, errorMessage:Failed to open new session: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hive is not allowed to impersonate hive), serverProtocolVersion:null)
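Since the error is thrown from HiveServer2's openSession, it should be reproducible outside Zeppelin with a bare JDBC connection. A minimal sketch (localhost:10000 is an assumed HiveServer2 host/port; requires hive-jdbc and its dependencies on the classpath):

// HiveConnCheck.java - minimal JDBC connectivity check against HiveServer2
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveConnCheck {
    public static void main(String[] args) throws Exception {
        // Explicit driver load; harmless on JDBC 4+, where it auto-registers.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // Connecting as "hive" exercises the same openSession call as in the
        // log above; with impersonation misconfigured, getConnection() throws
        // before any query runs.
        String url = "jdbc:hive2://localhost:10000/default";
        try (Connection conn = DriverManager.getConnection(url, "hive", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW DATABASES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}

If this fails with the same AuthorizationException, the problem is in the HiveServer2/HDFS proxyuser configuration rather than in Zeppelin.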
Created 11-29-2015 02:18 AM
(Adding this as an answer since it is too big for a comment.) Ran some tests and here are the findings. Changing the proxyuser settings helps, but queries that launch Tez jobs are still failing for some reason (a sketch of the config change follows the test list below). On the 2.3.0 sandbox (Spark 1.3.1), all Hive queries seem to run fine.
Test 1:
Test 2:
Test 3:
Test 4:
Test 5:
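For anyone hitting the same error, the usual proxyuser change is to widen the groups the hive user may impersonate. A sketch of the resulting core-site.xml entries (the exact values used in the tests above aren't reproduced here, and * can be narrowed to the group your notebook users actually belong to):

<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>

The NameNode has to pick up the change, e.g. by restarting HDFS or running hdfs dfsadmin -refreshSuperUserGroupsConfiguration.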
Created 02-10-2016 09:25 PM
@Ali Bajwa, is this still an issue with the latest Spark and Sandbox? I have a user with the exact same issue and no way to fix it yet. He says he already tried the fix described in this article.