
Hive connection error

Super Collaborator

Hi:

I have restarted the Hive service and I can't use Hive now. Here is the strange error, any suggestions? I can't connect now to the MySQL Hive database.

Connection failed on host lnxbig05.cajarural.gcr:10000 (Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/alerts/alert_hive_thrift_port.py", line 200, in execute
    check_command_timeout=int(check_command_timeout))
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/hive_check.py", line 68, in check_thrift_port_sasl
    timeout=check_command_timeout
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 285, in _call
    raise ExecuteTimeoutException(err_msg)
ExecuteTimeoutException: Execution of 'ambari-sudo.sh su ambari-qa -l -s /bin/bash -c 'export PATH='"'"'/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/dmexpress/bin:/home/bigotes/bin:/usr/dmexpress/bin:/var/lib/ambari-agent:/bin/:/usr/bin/:/usr/lib/hive/bin/:/usr/sbin/'"'"' ; ! beeline -u '"'"'jdbc:hive2://HOSTNAME:10000/;transportMode=binary'"'"' -e '"'"''"'"' 2>&1| awk '"'"'{print}'"'"'|grep -i -e '"'"'Connection refused'"'"' -e '"'"'Invalid URL'"'"''' was killed due timeout after 60 seconds )

thanks

1 ACCEPTED SOLUTION


Hi @Roberto Sancho

Can you please run this command and share the output?

grep -iR org.apache.hadoop.hive.shims.HadoopShims /usr/hdp/current/hive-server2/


14 REPLIES


@Roberto Sancho

Can you re-start the HiveServer2 and Metastore services and give it a try again?
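If you prefer to restart them outside the Ambari UI, the Ambari REST API can stop and start the components. A minimal sketch, assuming Ambari listens on port 8080 with admin:admin credentials and a cluster named CLUSTER_NAME (all placeholders to replace):

# Stop HiveServer2 on the affected host (PUT the same body with "STARTED" to start it again;
# use the HIVE_METASTORE component for the Metastore)
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop HiveServer2"},"Body":{"HostRoles":{"state":"INSTALLED"}}}' \
  http://AMBARI_HOST:8080/api/v1/clusters/CLUSTER_NAME/hosts/lnxbig05.cajarural.gcr/host_components/HIVE_SERVER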

Expert Contributor

Connection failed on host lnxbig05.cajarural.gcr:10000... This says it is unable to connect to the HiveServer2 process. Please check whether your HiveServer2 service is up and running.
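A quick way to check both, reusing the same probe the Ambari alert runs (a sketch: the host and port come from the error above, and 'show databases;' is just an arbitrary test statement):

# Is the HiveServer2 process up, and is the Thrift port listening?
ps -ef | grep -i [h]iveserver2
netstat -tlnp | grep 10000

# The alert's connectivity probe, pointed at the real host
beeline -u 'jdbc:hive2://lnxbig05.cajarural.gcr:10000/;transportMode=binary' -e 'show databases;'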

Super Collaborator

Hi: after restarting it, I get the same error:

 H110 Unable to submit statement. java.lang.NoSuchMethodError: org.apache.hadoop.hive.shims.HadoopShims.setHadoopSessionContext(Ljava/lang/String;)V [ERROR_STATUS]

I think I have lost some path or jar, but I only restarted the process.

Super Collaborator

Hi:

HiveServer2 is running:

hive      9704  3.3  0.7 2379760 376784 ?      Sl   18:10   0:33 /usr/jdk64/jdk1.8.0_40/bin/java -Xmx8192m -Dhdp.version=2.4.0.0-169 -Djava.net.preferIPv4Stack=true -Dhdp.version=2.4.0.0-169 -Dhadoop.log.dir=/var/log/hadoop/hive -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.4.0.0-169/hadoop -Dhadoop.id.str=hive -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/hdp/2.4.0.0-169/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx8192m -Xmx512m -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/hdp/2.4.0.0-169/hive/lib/hive-service-1.2.1000.2.4.0.0-169.jar org.apache.hive.service.server.HiveServer2 --hiveconf hive.aux.jars.path=file:///usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar -hiveconf hive.metastore.uris=  -hiveconf hive.log.file=hiveserver2.log -hiveconf hive.log.dir=/var/log/hive


Super Collaborator

Hi:

I think I need this jar, but I don't know why:

hive-shims-0.11.0.jar


Can you run the two commands below on the server where HS2 and the Metastore are running and paste the output?

ps -ef | grep hiveserver2

ps -ef | grep metastore


Hi @Roberto Sancho

Can you please run this command and share the output?

grep -iR org.apache.hadoop.hive.shims.HadoopShims /usr/hdp/current/hive-server2/


@Roberto Sancho

Did you add any external jar recently in the cluster, or specifically in Hive?
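One hypothetical way to spot a recent manual addition, assuming the stock HDP 2.4.0.0-169 layout from the ps output above (jars shipped by the HDP packages usually share the package build date, so anything copied in by hand tends to stand out by timestamp):

# Most recently modified jars under the HiveServer2 lib directory
ls -lt /usr/hdp/current/hive-server2/lib/*.jar | head -20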


@Roberto Sancho

Great! :) Please accept the answer which helped you, to close this thread.

Super Collaborator
[root@lnxbig05 bin]# grep -iR org.apache.hadoop.hive.shims.HadoopShims /usr/hdp/current/hive-server2/
Binary file /usr/hdp/current/hive-server2/lib/hive-jdbc-1.2.1000.2.4.0.0-169-standalone.jar matches
Binary file /usr/hdp/current/hive-server2/lib/hive-exec-1.2.1000.2.4.0.0-169.jar matches
Binary file /usr/hdp/current/hive-server2/lib/drill-hive-exec-shaded-1.6.0.jar matches
Binary file /usr/hdp/current/hive-server2/lib/hive-exec.jar matches
Binary file /usr/hdp/current/hive-server2/lib/hive-shims-common-1.2.1000.2.4.0.0-169.jar matches
grep: /usr/hdp/current/hive-server2/lib/ojdbc6.jar: No such file or directory
Binary file /usr/hdp/current/hive-server2/lib/hive-shims-common.jar matches
Binary file /usr/hdp/current/hive-server2/lib/hive-jdbc.jar matches
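To narrow that list down to the files themselves, a sketch using grep -l instead of the default output (hedged reading of the matches above: the only one that does not carry the stock 1.2.1000.2.4.0.0-169 Hive version, or a plain-name symlink to it, is drill-hive-exec-shaded-1.6.0.jar, which suggests a manually added jar):

# Print just the jar files that bundle HadoopShims, with their sizes and timestamps
grep -ilR org.apache.hadoop.hive.shims.HadoopShims /usr/hdp/current/hive-server2/lib/ | xargs -r ls -l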

Super Collaborator

Hi:

I was deleting external and internal tables from the Job Browser, from this path in HDFS:

/apps/hive/warehouse
and from the external path.


Looking at this exception:

java.lang.NoSuchMethodError: org.apache.hadoop.hive.shims.HadoopShims.setHadoopSessionContext(Ljava/lang/String;)V

it seems that a wrong version of the HadoopShims jar is available on your classpath, one which does not have a setHadoopSessionContext implementation in it or has a different method signature.

To troubleshoot this problem:

# List every jar the HiveServer2 process has open
lsof -p <HS2 process id> | grep -i jar | awk '{ print $9 }' > class-jar.txt
# Show which of those jars contain the HadoopShims class (the dots in the pattern also match the '/' separators in the jar listing)
for jar in `cat class-jar.txt` ; do echo "$jar" ; jar -tvf "$jar" | grep --color 'org.apache.hadoop.hive.shims.HadoopShims' ; done

Look for the jars which contain this class (there could be multiple shim jars available) and then extract the class from each jar.

For each jar which contains HadoopShims, run

jar xvf <jar> org/apache/hadoop/hive/shims/HadoopShims.class

then run

javap org.apache.hadoop.hive.shims.HadoopShims

to verify that the setHadoopSessionContext method is available and check its signature.
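Putting those steps together, a minimal sketch (assuming the class-jar.txt produced by the lsof step above, and that the current directory is writable so the extracted class lands under ./org/...):

# For every jar that bundles HadoopShims, extract the class and print any setHadoopSessionContext signature
for j in $(cat class-jar.txt); do
  if jar tf "$j" | grep -q 'org/apache/hadoop/hive/shims/HadoopShims.class'; then
    echo "== $j"
    jar xf "$j" org/apache/hadoop/hive/shims/HadoopShims.class
    javap -cp . org.apache.hadoop.hive.shims.HadoopShims | grep -i setHadoopSessionContext
  fi
done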

Super Collaborator

Hi:

I don't have lsof installed, so tomorrow I'll ask my OS team to install it and I'll try it.

It is curious that the hadoop classpath is this; is it correct? I don't see Hive:

hadoop classpath
/usr/hdp/2.3.2.0-2950/hadoop/conf:/usr/hdp/2.3.2.0-2950/hadoop/lib/*:/usr/hdp/2.3.2.0-2950/hadoop/.//*:/usr/hdp/2.3.2.0-2950/hadoop-hdfs/./:/usr/hdp/2.3.2.0-2950/hadoop-hdfs/lib/*:/usr/hdp/2.3.2.0-2950/hadoop-hdfs/.//*:/usr/hdp/2.3.2.0-2950/hadoop-yarn/lib/*:/usr/hdp/2.3.2.0-2950/hadoop-yarn/.//*:/usr/hdp/2.3.2.0-2950/hadoop-mapreduce/lib/*:/usr/hdp/2.3.2.0-2950/hadoop-mapreduce/.//*:::/usr/share/java/mysql-connector-java-5.1.17.jar:/usr/share/java/mysql-connector-java.jar:/usr/share/java/ojdbc.jar:/usr/hdp/2.3.2.0-2950/tez/*:/usr/hdp/2.3.2.0-2950/tez/lib/*:/usr/hdp/2.3.2.0-2950/tez/conf
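Not seeing Hive there should be normal: as far as I know, hadoop classpath only prints the Hadoop/Tez side, and the Hive launcher adds /usr/hdp/current/hive-server2/lib on top of it when HiveServer2 starts (as a side note, that output shows 2.3.2.0-2950 paths while HiveServer2 above runs from 2.4.0.0-169, which may be worth double-checking). Until lsof is installed, a possible stand-in is to read the jars the running JVM has mapped from /proc (a sketch, assuming the HiveServer2 process from the earlier ps output; it may not catch every jar):

# List jar files mapped by the HiveServer2 JVM and reuse the list as class-jar.txt
HS2_PID=$(pgrep -f org.apache.hive.service.server.HiveServer2)
grep -o '/[^ ]*\.jar' /proc/$HS2_PID/maps | sort -u > class-jar.txt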

Super Collaborator

Hi:

Finally it worked after deleting some jars that I had put on the Hive classpath, like the Apache Drill one :(, so I deleted them and restarted the service.

Many thanks to all of you.
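For anyone hitting the same problem, a quick post-fix check (a sketch reusing the accepted answer's grep and the alert's beeline probe): after the foreign jars are removed, only the stock Hive jars should still bundle HadoopShims, and the probe should connect.

# Only the stock hive-exec/hive-jdbc/hive-shims jars should match now
grep -ilR org.apache.hadoop.hive.shims.HadoopShims /usr/hdp/current/hive-server2/lib/
# The alert's connectivity probe should succeed again
beeline -u 'jdbc:hive2://lnxbig05.cajarural.gcr:10000/;transportMode=binary' -e 'show databases;'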
