java.sql.SQLException: Could not establish connection to jdbc:hive2://localhost:10000: Required field 'serverProtocolVersion' is unset!
Labels: Apache Hive, Apache Zeppelin
Created 12-05-2017 02:14 PM
I am using Zeppelin and Hive on a single-node Hadoop cluster (pseudo-distributed mode). I have edited the jdbc interpreter in Zeppelin and added the Hive settings as suggested in https://zeppelin.apache.org/docs/0.7.3/interpreter/hive.html
Now I am executing the following command in Zeppelin:
%jdbc(hive) show databases;
But I am getting the following error:
java.sql.SQLException: Could not establish connection to jdbc:hive2://localhost:10000: Required field 'serverProtocolVersion' is unset! Struct:TOpenSessionResp(status:TStatus(statusCode:ERROR_STATUS, infoMessages:[*org.apache.hive.service.cli.HiveSQLException:Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate anonymous:14:13, org.apache.hive.service.cli.session.SessionManager:createSession:SessionManager.java:336, org.apache.hive.service.cli.session.SessionManager:openSession:SessionManager.java:279, org.apache.hive.service.cli.CLIService:openSessionWithImpersonation:CLIService.java:189, org.apache.hive.service.cli.thrift.ThriftCLIService:getSessionHandle:ThriftCLIService.java:414, org.apache.hive.service.cli.thrift.ThriftCLIService:OpenSession:ThriftCLIService.java:310, org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession:getResult:TCLIService.java:1377, org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession:getResult:TCLIService.java:1362, org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39, org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39, org.apache.hive.service.auth.TSetIpAddressProcessor:process:TSetIpAddressProcessor.java:56, org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:286, java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1152, java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:622, java.lang.Thread:run:Thread.java:748, *java.lang.RuntimeException:java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate anonymous:22:8, org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:89, org.apache.hive.service.cli.session.HiveSessionProxy:access$000:HiveSessionProxy.java:36, org.apache.hive.service.cli.session.HiveSessionProxy$1:run:HiveSessionProxy.java:63, java.security.AccessController:doPrivileged:AccessController.java:-2, javax.security.auth.Subject:doAs:Subject.java:421, org.apache.hadoop.security.UserGroupInformation:doAs:UserGroupInformation.java:1807, org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:59, com.sun.proxy.$Proxy34:open::-1, org.apache.hive.service.cli.session.SessionManager:createSession:SessionManager.java:327, *java.lang.RuntimeException:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate anonymous:29:7, org.apache.hadoop.hive.ql.session.SessionState:start:SessionState.java:578, org.apache.hadoop.hive.ql.session.SessionState:start:SessionState.java:513, org.apache.hive.service.cli.session.HiveSessionImpl:open:HiveSessionImpl.java:165, sun.reflect.NativeMethodAccessorImpl:invoke0:NativeMethodAccessorImpl.java:-2, sun.reflect.NativeMethodAccessorImpl:invoke:NativeMethodAccessorImpl.java:57, sun.reflect.DelegatingMethodAccessorImpl:invoke:DelegatingMethodAccessorImpl.java:43, java.lang.reflect.Method:invoke:Method.java:606, org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:78, *org.apache.hadoop.ipc.RemoteException:User: root is not allowed to impersonate anonymous:54:25, org.apache.hadoop.ipc.Client:getRpcResponse:Client.java:1481, org.apache.hadoop.ipc.Client:call:Client.java:1427, 
org.apache.hadoop.ipc.Client:call:Client.java:1337, org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker:invoke:ProtobufRpcEngine.java:227, org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker:invoke:ProtobufRpcEngine.java:116, com.sun.proxy.$Proxy30:getFileInfo::-1, org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB:getFileInfo:ClientNamenodeProtocolTranslatorPB.java:787, sun.reflect.NativeMethodAccessorImpl:invoke0:NativeMethodAccessorImpl.java:-2, sun.reflect.NativeMethodAccessorImpl:invoke:NativeMethodAccessorImpl.java:57, sun.reflect.DelegatingMethodAccessorImpl:invoke:DelegatingMethodAccessorImpl.java:43, java.lang.reflect.Method:invoke:Method.java:606, org.apache.hadoop.io.retry.RetryInvocationHandler:invokeMethod:RetryInvocationHandler.java:398, org.apache.hadoop.io.retry.RetryInvocationHandler$Call:invokeMethod:RetryInvocationHandler.java:163, org.apache.hadoop.io.retry.RetryInvocationHandler$Call:invoke:RetryInvocationHandler.java:155, org.apache.hadoop.io.retry.RetryInvocationHandler$Call:invokeOnce:RetryInvocationHandler.java:95, org.apache.hadoop.io.retry.RetryInvocationHandler:invoke:RetryInvocationHandler.java:335, com.sun.proxy.$Proxy31:getFileInfo::-1, org.apache.hadoop.hdfs.DFSClient:getFileInfo:DFSClient.java:1700, org.apache.hadoop.hdfs.DistributedFileSystem$27:doCall:DistributedFileSystem.java:1436, org.apache.hadoop.hdfs.DistributedFileSystem$27:doCall:DistributedFileSystem.java:1433, org.apache.hadoop.fs.FileSystemLinkResolver:resolve:FileSystemLinkResolver.java:81, org.apache.hadoop.hdfs.DistributedFileSystem:getFileStatus:DistributedFileSystem.java:1433, org.apache.hadoop.fs.FileSystem:exists:FileSystem.java:1436, org.apache.hadoop.hive.ql.session.SessionState:createRootHDFSDir:SessionState.java:674, org.apache.hadoop.hive.ql.session.SessionState:createSessionDirs:SessionState.java:622, org.apache.hadoop.hive.ql.session.SessionState:start:SessionState.java:550], errorCode:0, errorMessage:Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate anonymous), serverProtocolVersion:null) at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:467) at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:178) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at java.sql.DriverManager.getConnection(DriverManager.java:571) at java.sql.DriverManager.getConnection(DriverManager.java:187) at org.apache.commons.dbcp2.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:79) at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:205) at org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:861) at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:435) at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363) at org.apache.commons.dbcp2.PoolingDriver.connect(PoolingDriver.java:129) at java.sql.DriverManager.getConnection(DriverManager.java:571) at java.sql.DriverManager.getConnection(DriverManager.java:233) at org.apache.zeppelin.jdbc.JDBCInterpreter.getConnectionFromPool(JDBCInterpreter.java:360) at org.apache.zeppelin.jdbc.JDBCInterpreter.getConnection(JDBCInterpreter.java:378) at org.apache.zeppelin.jdbc.JDBCInterpreter.executeSql(JDBCInterpreter.java:570) at 
org.apache.zeppelin.jdbc.JDBCInterpreter.interpret(JDBCInterpreter.java:709) at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:97) at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:498) at org.apache.zeppelin.scheduler.Job.run(Job.java:175) at org.apache.zeppelin.scheduler.ParallelScheduler$JobRunner.run(ParallelScheduler.java:162) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:473) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1152) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:622) at java.lang.Thread.run(Thread.java:748) Caused by: org.apache.thrift.protocol.TProtocolException: Required field 'serverProtocolVersion' is unset! Struct:TOpenSessionResp(status:TStatus(statusCode:ERROR_STATUS, infoMessages:[*org.apache.hive.service.cli.HiveSQLException:Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate anonymous:14:13, org.apache.hive.service.cli.session.SessionManager:createSession:SessionManager.java:336, org.apache.hive.service.cli.session.SessionManager:openSession:SessionManager.java:279, org.apache.hive.service.cli.CLIService:openSessionWithImpersonation:CLIService.java:189, org.apache.hive.service.cli.thrift.ThriftCLIService:getSessionHandle:ThriftCLIService.java:414, org.apache.hive.service.cli.thrift.ThriftCLIService:OpenSession:ThriftCLIService.java:310, org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession:getResult:TCLIService.java:1377, org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession:getResult:TCLIService.java:1362, org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39, org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39, org.apache.hive.service.auth.TSetIpAddressProcessor:process:TSetIpAddressProcessor.java:56, org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:286, java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1152, java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:622, java.lang.Thread:run:Thread.java:748, *java.lang.RuntimeException:java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate anonymous:22:8, org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:89, org.apache.hive.service.cli.session.HiveSessionProxy:access$000:HiveSessionProxy.java:36, org.apache.hive.service.cli.session.HiveSessionProxy$1:run:HiveSessionProxy.java:63, java.security.AccessController:doPrivileged:AccessController.java:-2, javax.security.auth.Subject:doAs:Subject.java:421, org.apache.hadoop.security.UserGroupInformation:doAs:UserGroupInformation.java:1807, org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:59, com.sun.proxy.$Proxy34:open::-1, org.apache.hive.service.cli.session.SessionManager:createSession:SessionManager.java:327, 
*java.lang.RuntimeException:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate anonymous:29:7, org.apache.hadoop.hive.ql.session.SessionState:start:SessionState.java:578, org.apache.hadoop.hive.ql.session.SessionState:start:SessionState.java:513, org.apache.hive.service.cli.session.HiveSessionImpl:open:HiveSessionImpl.java:165, sun.reflect.NativeMethodAccessorImpl:invoke0:NativeMethodAccessorImpl.java:-2, sun.reflect.NativeMethodAccessorImpl:invoke:NativeMethodAccessorImpl.java:57, sun.reflect.DelegatingMethodAccessorImpl:invoke:DelegatingMethodAccessorImpl.java:43, java.lang.reflect.Method:invoke:Method.java:606, org.apache.hive.service.cli.session.HiveSessionProxy:invoke:HiveSessionProxy.java:78, *org.apache.hadoop.ipc.RemoteException:User: root is not allowed to impersonate anonymous:54:25, org.apache.hadoop.ipc.Client:getRpcResponse:Client.java:1481, org.apache.hadoop.ipc.Client:call:Client.java:1427, org.apache.hadoop.ipc.Client:call:Client.java:1337, org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker:invoke:ProtobufRpcEngine.java:227, org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker:invoke:ProtobufRpcEngine.java:116, com.sun.proxy.$Proxy30:getFileInfo::-1, org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB:getFileInfo:ClientNamenodeProtocolTranslatorPB.java:787, sun.reflect.NativeMethodAccessorImpl:invoke0:NativeMethodAccessorImpl.java:-2, sun.reflect.NativeMethodAccessorImpl:invoke:NativeMethodAccessorImpl.java:57, sun.reflect.DelegatingMethodAccessorImpl:invoke:DelegatingMethodAccessorImpl.java:43, java.lang.reflect.Method:invoke:Method.java:606, org.apache.hadoop.io.retry.RetryInvocationHandler:invokeMethod:RetryInvocationHandler.java:398, org.apache.hadoop.io.retry.RetryInvocationHandler$Call:invokeMethod:RetryInvocationHandler.java:163, org.apache.hadoop.io.retry.RetryInvocationHandler$Call:invoke:RetryInvocationHandler.java:155, org.apache.hadoop.io.retry.RetryInvocationHandler$Call:invokeOnce:RetryInvocationHandler.java:95, org.apache.hadoop.io.retry.RetryInvocationHandler:invoke:RetryInvocationHandler.java:335, com.sun.proxy.$Proxy31:getFileInfo::-1, org.apache.hadoop.hdfs.DFSClient:getFileInfo:DFSClient.java:1700, org.apache.hadoop.hdfs.DistributedFileSystem$27:doCall:DistributedFileSystem.java:1436, org.apache.hadoop.hdfs.DistributedFileSystem$27:doCall:DistributedFileSystem.java:1433, org.apache.hadoop.fs.FileSystemLinkResolver:resolve:FileSystemLinkResolver.java:81, org.apache.hadoop.hdfs.DistributedFileSystem:getFileStatus:DistributedFileSystem.java:1433, org.apache.hadoop.fs.FileSystem:exists:FileSystem.java:1436, org.apache.hadoop.hive.ql.session.SessionState:createRootHDFSDir:SessionState.java:674, org.apache.hadoop.hive.ql.session.SessionState:createSessionDirs:SessionState.java:622, org.apache.hadoop.hive.ql.session.SessionState:start:SessionState.java:550], errorCode:0, errorMessage:Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate anonymous), serverProtocolVersion:null) at org.apache.hive.service.cli.thrift.TOpenSessionResp.validate(TOpenSessionResp.java:578) at org.apache.hive.service.cli.thrift.TOpenSessionResp$TOpenSessionRespStandardScheme.read(TOpenSessionResp.java:676) at org.apache.hive.service.cli.thrift.TOpenSessionResp$TOpenSessionRespStandardScheme.read(TOpenSessionResp.java:612) at 
org.apache.hive.service.cli.thrift.TOpenSessionResp.read(TOpenSessionResp.java:520) at org.apache.hive.service.cli.thrift.TCLIService$OpenSession_result$OpenSession_resultStandardScheme.read(TCLIService.java:2281) at org.apache.hive.service.cli.thrift.TCLIService$OpenSession_result$OpenSession_resultStandardScheme.read(TCLIService.java:2266) at org.apache.hive.service.cli.thrift.TCLIService$OpenSession_result.read(TCLIService.java:2213) at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78) at org.apache.hive.service.cli.thrift.TCLIService$Client.recv_OpenSession(TCLIService.java:156) at org.apache.hive.service.cli.thrift.TCLIService$Client.OpenSession(TCLIService.java:143) at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:456) ... 27 more
Please guide me in this matter.
Created 12-05-2017 02:52 PM
This error can occur because of a JDBC driver version mismatch. Replace the Zeppelin Hive JDBC jar with the one provided by your Hive server. Hopefully that should work.
Thanks,
Aditya
Created 12-05-2017 03:33 PM
Dear @Aditya Sirna,
Could you please give me more hints? I do not know exactly where both of them are located, I mean the Zeppelin Hive JDBC jar and the Hive server JDBC jar. Thanks.
Created 12-05-2017 03:52 PM
I'm not sure where the jars are located in a standalone distribution; you should find them in the libs folder. In HDP, the jars are in the following locations:
1) Zeppelin Hive jar: /usr/hdp/current/zeppelin-server/interpreter/jdbc/hive-jdbc-1.2.1000.2.6.3.0-235.jar
2) Hive server JDBC jar: /usr/hdp/current/hive-server2/jdbc/hive-jdbc-1.2.1000.2.6.3.0-235-standalone.jar
Replace the jar in #1 with the one in #2, then restart Zeppelin and try running the query again (a sketch of the commands is below).
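On HDP the swap would look something like this (a sketch using the paths above; adjust the version numbers to your cluster, and note I'm assuming the stock zeppelin-daemon.sh script for the restart):
# Back up the Zeppelin copy of the jar, then drop in the HiveServer2 standalone jar
mv /usr/hdp/current/zeppelin-server/interpreter/jdbc/hive-jdbc-1.2.1000.2.6.3.0-235.jar \
   /usr/hdp/current/zeppelin-server/interpreter/jdbc/hive-jdbc-1.2.1000.2.6.3.0-235.jar.bak
cp /usr/hdp/current/hive-server2/jdbc/hive-jdbc-1.2.1000.2.6.3.0-235-standalone.jar \
   /usr/hdp/current/zeppelin-server/interpreter/jdbc/
# Restart Zeppelin (via Ambari, or with the daemon script)
/usr/hdp/current/zeppelin-server/bin/zeppelin-daemon.sh restart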
Created 12-05-2017 04:15 PM
In the standalone cluster:
1) Zeppelin Hive JDBC jar: /usr/local/zeppelin/interpreter/jdbc
2) Hive JDBC jar: /usr/local/hive/jdbc/hive-jdbc-2.1.0-standalone.jar
I copied the Hive JDBC jar into the Zeppelin path (the exact copy command is below). Should it be HiveServer2 rather than Hive? I ask because I can run HiveServer2, which is an executable in the Hive folder.
3) Shall I change the dependencies in the jdbc interpreter setting as well? The default is: org.apache.hive:hive-jdbc:0.14.0
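For reference, the copy I mention above was simply this (using the two standalone paths from #1 and #2; adjust if your layout differs):
cp /usr/local/hive/jdbc/hive-jdbc-2.1.0-standalone.jar /usr/local/zeppelin/interpreter/jdbc/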
Created 12-05-2017 04:34 PM
For #2, yes, you can use HiveServer2.
For #3, I'm not 100% sure about that, but you can give it a try.
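If you do try it, my guess (untested) would be to point the artifact at the Hive version you are actually running, e.g. something like:
org.apache.hive:hive-jdbc:2.1.0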
Created 12-05-2017 04:48 PM
I thought that you were already using HiveServer2; 10000 is usually the port for HS2. You can continue to use that. Just change the jar in Zeppelin and rerun.
Created 12-05-2017 04:37 PM
Do you mean install HiveServer2?
Created 12-05-2017 08:01 PM
I see the following in the "Caused by" section of the error:
User: root is not allowed to impersonate anonymous), serverProtocolVersion:null)
This suggests that your Hadoop proxy user configuration is not set up correctly. Please check the proxy user settings under:
Ambari UI --> Services --> HDFS --> Configs --> Advanced tab --> Custom core-site
If they are not there, click "Add Property…" to add the following custom properties:
hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=*
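If you are not running Ambari, the same settings go straight into Hadoop's core-site.xml; a minimal sketch (assuming HiveServer2 runs as root, per the error above) would be:
<!-- core-site.xml: allow the root user to impersonate other users -->
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
After changing this, restart HDFS and HiveServer2 (or run hdfs dfsadmin -refreshSuperUserGroupsConfiguration to reload the proxy user settings) so the change takes effect.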
Also, please double-check whether "localhost" is the correct address for your HiveServer2 and whether that port is actually listening; try changing it to the proper hostname instead of "localhost". You can run the following commands on the HS2 host to confirm the hostname and check that port 10000 is listening (you should see a LISTEN entry for it):
# netstat -tnlpa | grep 10000
# hostname -f
Created 12-05-2017 09:49 PM
Thanks for the comments; that is really helpful. Since I am using standalone Hadoop, not Ambari, I configured core-site.xml with the same hadoop.proxyuser properties you specified.
Now Zeppelin is working with the Hive JDBC interpreter.
