Created 05-31-2016 03:50 PM
Hi:
I restarted the Hive service and now I can't use Hive. Here is the strange error, any suggestions? I also can't connect to the MySQL-backed Hive metastore database now.
Connection failed on host lnxbig05.cajarural.gcr:10000 (Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/alerts/alert_hive_thrift_port.py", line 200, in execute
    check_command_timeout=int(check_command_timeout))
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/hive_check.py", line 68, in check_thrift_port_sasl
    timeout=check_command_timeout
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 285, in _call
    raise ExecuteTimeoutException(err_msg)
ExecuteTimeoutException: Execution of 'ambari-sudo.sh su ambari-qa -l -s /bin/bash -c 'export PATH='"'"'/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/dmexpress/bin:/home/bigotes/bin:/usr/dmexpress/bin:/var/lib/ambari-agent:/bin/:/usr/bin/:/usr/lib/hive/bin/:/usr/sbin/'"'"' ; ! beeline -u '"'"'jdbc:hive2://HOSTNAME:10000/;transportMode=binary'"'"' -e '"'"''"'"' 2>&1| awk '"'"'{print}'"'"'|grep -i -e '"'"'Connection refused'"'"' -e '"'"'Invalid URL'"'"''' was killed due timeout after 60 seconds )
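For reference, the Ambari alert above is only running a beeline connection check against the thrift port and killing it after 60 seconds. A minimal sketch of the same check run by hand, assuming binary transport and no Kerberos as in the URL from the alert (the 'show databases;' query is my own stand-in; the alert itself sends an empty statement):
# Hedged sketch: reproduce the Ambari thrift-port check manually on the HS2 host.
timeout 60 beeline -u 'jdbc:hive2://lnxbig05.cajarural.gcr:10000/;transportMode=binary' -e 'show databases;'
# If this hangs for the full 60 seconds instead of failing fast, check the
# HiveServer2 log for the underlying startup error.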
thanks
Created 05-31-2016 04:34 PM
Can you please run this command and share the output?
grep -iR org.apache.hadoop.hive.shims.HadoopShims /usr/hdp/current/hive-server2/
Created 05-31-2016 04:34 PM
[root@lnxbig05 bin]# grep -iR org.apache.hadoop.hive.shims.HadoopShims /usr/hdp/current/hive-server2/
Binary file /usr/hdp/current/hive-server2/lib/hive-jdbc-1.2.1000.2.4.0.0-169-standalone.jar matches
Binary file /usr/hdp/current/hive-server2/lib/hive-exec-1.2.1000.2.4.0.0-169.jar matches
Binary file /usr/hdp/current/hive-server2/lib/drill-hive-exec-shaded-1.6.0.jar matches
Binary file /usr/hdp/current/hive-server2/lib/hive-exec.jar matches
Binary file /usr/hdp/current/hive-server2/lib/hive-shims-common-1.2.1000.2.4.0.0-169.jar matches
grep: /usr/hdp/current/hive-server2/lib/ojdbc6.jar: No such file or directory
Binary file /usr/hdp/current/hive-server2/lib/hive-shims-common.jar matches
Binary file /usr/hdp/current/hive-server2/lib/hive-jdbc.jar matches
Created 05-31-2016 04:54 PM
Hi:
I had been deleting external and internal tables from the job browser, under this path in HDFS:
/apps/hive/warehouse, and also under the external tables' path.
Created 05-31-2016 05:01 PM
Looking at this exception:
java.lang.NoSuchMethodError: org.apache.hadoop.hive.shims.HadoopShims.setHadoopSessionContext(Ljava/lang/String;)V
it seems that a wrong version of the HadoopShims jar is on your classpath, one that either does not implement setHadoopSessionContext or implements it with a different method signature.
To troubleshoot this problem:
lsof -p <HS2 process id> | grep -i jar | awk '{ print $9 }' > class-jar.txt
for jar in `cat class-jar.txt` ; do echo "$jar" ; jar -tvf "$jar" | grep --color 'org.apache.hadoop.hive.shims.HadoopShims' ; done
Look for the jars that contain this class (there may be multiple shims jars available), then extract the class from each such jar:
jar xvf <jar> org/apache/hadoop/hive/shims/HadoopShims.class
and run
javap org.apache.hadoop.hive.shims.HadoopShims
to verify that the setHadoopSessionContext method is available and has the expected signature. A combined sketch of these steps follows below.
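Putting those steps together, something like the following should do it in one go (a sketch only: it assumes lsof and a JDK's jar/javap tools are present on the host, and <HS2 process id> is still a placeholder to fill in):
HS2_PID=<HS2 process id>    # placeholder, e.g. taken from: ps -ef | grep -i hiveserver2
lsof -p "$HS2_PID" | grep -i '\.jar' | awk '{ print $9 }' | sort -u > class-jar.txt
while read -r jarfile ; do
  # only inspect jars that actually bundle the HadoopShims class
  if jar -tf "$jarfile" | grep -q 'org/apache/hadoop/hive/shims/HadoopShims.class' ; then
    echo "== $jarfile"
    workdir=$(mktemp -d)
    ( cd "$workdir" &&
      jar xf "$jarfile" org/apache/hadoop/hive/shims/HadoopShims.class &&
      javap -classpath . org.apache.hadoop.hive.shims.HadoopShims | grep -i setHadoopSessionContext )
  fi
done < class-jar.txt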
Created 05-31-2016 05:02 PM
Hi:
I don't have lsof installed, so tomorrow I'll ask the OS team to install it and then I'll try it.
One curious thing: the output of hadoop classpath is the following. Is it correct? I don't see Hive in it:
hadoop classpath
/usr/hdp/2.3.2.0-2950/hadoop/conf:/usr/hdp/2.3.2.0-2950/hadoop/lib/*:/usr/hdp/2.3.2.0-2950/hadoop/.//*:/usr/hdp/2.3.2.0-2950/hadoop-hdfs/./:/usr/hdp/2.3.2.0-2950/hadoop-hdfs/lib/*:/usr/hdp/2.3.2.0-2950/hadoop-hdfs/.//*:/usr/hdp/2.3.2.0-2950/hadoop-yarn/lib/*:/usr/hdp/2.3.2.0-2950/hadoop-yarn/.//*:/usr/hdp/2.3.2.0-2950/hadoop-mapreduce/lib/*:/usr/hdp/2.3.2.0-2950/hadoop-mapreduce/.//*:::/usr/share/java/mysql-connector-java-5.1.17.jar:/usr/share/java/mysql-connector-java.jar:/usr/share/java/ojdbc.jar:/usr/hdp/2.3.2.0-2950/tez/*:/usr/hdp/2.3.2.0-2950/tez/lib/*:/usr/hdp/2.3.2.0-2950/tez/conf
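As far as I know, hadoop classpath only prints the Hadoop client classpath; the HiveServer2 launch scripts add the Hive lib jars on top of it at start time, so Hive not appearing there is expected. If lsof is not available, something like the following sketch can show what the running HiveServer2 JVM actually has on its classpath (the pgrep pattern and the use of /proc are assumptions on my side):
HS2_PID=$(pgrep -f 'org.apache.hive.service.server.HiveServer2' | head -1)
# classpath passed on the java command line, one entry per line
tr '\0' '\n' < /proc/"$HS2_PID"/cmdline | grep -A1 -e '^-classpath$' -e '^-cp$' | tr ':' '\n'
# classpath inherited from the environment, if it was set that way
tr '\0' '\n' < /proc/"$HS2_PID"/environ | grep '^CLASSPATH=' | tr ':' '\n'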
Created 06-01-2016 09:07 AM
Hi:
It finally works: the problem was some extra jars I had put on the Hive classpath, like the Apache Drill one :( . I deleted them and restarted the service.
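In case it helps someone else, the foreign jar in this thread was the drill-hive-exec-shaded-1.6.0.jar visible in the grep output above. A sketch of moving it aside rather than deleting it outright (the quarantine directory is just an example of mine):
# move the non-HDP Drill jar out of the HiveServer2 lib dir, then restart HiveServer2 from Ambari
mkdir -p /tmp/hive-lib-quarantine
mv /usr/hdp/current/hive-server2/lib/drill-hive-exec-shaded-1.6.0.jar /tmp/hive-lib-quarantine/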
Many thanks to all of you.