Created 06-08-2019 02:00 PM
Hi All,
When we start the Spark2 Thrift Server, it stays up for a short time (about 30 seconds) and then fails.
I have attached the Spark2 logs.
19/06/07 11:22:16 INFO HiveThriftServer2: HiveThriftServer2 started
19/06/07 11:22:16 INFO UserGroupInformation: Login successful for user hive/lhdcsi02v.production.local@production.local using keytab file /etc/security/keytabs/hive.service.keytab
19/06/07 11:22:16 ERROR ThriftCLIService: Error starting HiveServer2: could not start ThriftBinaryCLIService
java.lang.NoSuchMethodError: org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server.startDelegationTokenSecretManager(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/Object;Lorg/apache/hadoop/hive/thrift/HadoopThriftAuthBridge$Server$ServerMode;)V
	at org.apache.hive.service.auth.HiveAuthFactory.<init>(HiveAuthFactory.java:125)
	at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.run(ThriftBinaryCLIService.java:57)
	at java.lang.Thread.run(Thread.java:748)
19/06/07 11:22:16 INFO HiveServer2: Shutting down HiveServer2
19/06/07 11:22:16 INFO AbstractService: Service:ThriftBinaryCLIService is stopped.
19/06/07 11:22:16 INFO AbstractService: Service:OperationManager is stopped.
19/06/07 11:22:16 INFO AbstractService: Service:SessionManager is stopped.
19/06/07 11:22:16 INFO SparkUI: Stopped Spark web UI at http://lhdcsi02v.production.local:4041
19/06/07 11:22:26 WARN ShutdownHookManager: ShutdownHook '$anon$2' timeout, java.util.concurrent.TimeoutException
java.util.concurrent.TimeoutException
	at java.util.concurrent.FutureTask.get(FutureTask.java:205)
	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:67)
19/06/07 11:22:26 ERROR Utils: Uncaught exception in thread pool-1-thread-1
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.spark.scheduler.AsyncEventQueue.stop(AsyncEventQueue.scala:133)
	at org.apache.spark.scheduler.LiveListenerBus$$anonfun$stop$1.apply(LiveListenerBus.scala:219)
	at org.apache.spark.scheduler.LiveListenerBus$$anonfun$stop$1.apply(LiveListenerBus.scala:219)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
	at org.apache.spark.scheduler.LiveListenerBus.stop(LiveListenerBus.scala:219)
	at org.apache.spark.SparkContext$$anonfun$stop$6.apply$mcV$sp(SparkContext.scala:1922)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1357)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1921)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.stop(SparkSQLEnv.scala:66)
	at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$$anonfun$main$1.apply$mcV$sp(HiveThriftServer2.scala:82)
	at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1988)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
19/06/07 11:22:26 INFO AbstractService: Service:CLIService is stopped.
19/06/07 11:22:26 INFO AbstractService: Service:HiveServer2 is stopped.
@Jay Kumar SenSharma,@Geoffrey Shelton Okot,@Neeraj Sabharwal,@Akhil S Naik
Please help.
Thanks,
Vishal Bohra
Created 06-09-2019 12:59 AM
Have you recently placed any new hive-exec JAR in your file system, or upgraded HDP by any chance?
What is your HDP version? Can you please share the output of the following command from the Spark2 Thrift Server host?
# hdp-select | grep -e "hive\|spark"
.
We see the following error:
java.lang.NoSuchMethodError: org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server.startDelegationTokenSecretManager(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/Object;Lorg/apache/hadoop/hive/thrift/HadoopThriftAuthBridge$Server$ServerMode;)V
The above error indicates that you might have an incorrect version of a JAR on the classpath (which can happen when some of your JARs were not upgraded, or when JARs of the wrong version were mistakenly copied into the Spark2 Thrift Server's jars directory).
.
Based on the error, it looks like you have a conflicting version of a "hive-exec*.jar" on the host where you are running the Spark2 Thrift Server.
Can you please scan your file system to find every place this JAR exists and check the version of each copy? You can use the following approach to locate the "hive-exec" jars:
# yum install mlocate -y
# updatedb
# locate hive-exec | grep jar
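If mlocate is not available, a plain `find` sweep works too. Here is a minimal sketch (the search roots and the `hive_exec_version` helper are my own assumptions, not HDP tooling) that prints each hive-exec jar alongside the build string parsed from its filename, so a mismatched build stands out at a glance:

```shell
# Extract the version/build string from a hive-exec jar filename.
# Pure string processing; the helper name is illustrative, not an HDP tool.
hive_exec_version() {
  basename "$1" .jar | sed 's/^hive-exec-//'
}

# Sweep the usual install roots (paths assumed from this thread; adjust
# to your layout) and print "version<TAB>path" for every copy found.
find /usr/hdp /u01 -name 'hive-exec*.jar' 2>/dev/null | while read -r jar; do
  printf '%s\t%s\n' "$(hive_exec_version "$jar")" "$jar"
done
```

Any line whose version column does not match the cluster build (e.g. *2.6.5.0-292) is a candidate for the conflict.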
.
Once you find the JARs, check whether each one's version is correct (it should match your HDP version).
For example, if you are using HDP 2.6.5.0-292, the hive-exec jar should look like "hive-exec-1.21.2.2.6.5.0-292.jar".
You can run the following command to find out the signature of the method listed in the above error. For example, in HDP 2.6.5, checking the signature of the "startDelegationTokenSecretManager" method:
# /usr/jdk64/jdk1.8.0_112/bin/javap -cp /usr/hdp/current/spark2-thriftserver/jars/hive-exec-1.21.2.2.6.5.0-292.jar "org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge\$Server" | grep startDelegationTokenSecretManager
public void startDelegationTokenSecretManager(org.apache.hadoop.conf.Configuration, java.lang.Object, org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$ServerMode) throws java.io.IOException;
Similarly, check which "hive-exec-*.jar" in your filesystem has a different signature. Then remove the conflicting JAR from the classpath and try again.
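The per-jar javap check above can be automated. A hedged sketch (the JDK path is assumed from this thread, and `check_sig` is my own helper, not an HDP tool) that runs the same javap probe over every hive-exec jar that `locate` finds, flagging copies where the method is missing or its signature differs:

```shell
JAVAP=${JAVAP:-/usr/jdk64/jdk1.8.0_112/bin/javap}   # assumed JDK location

# Print the signature of startDelegationTokenSecretManager inside one jar;
# returns non-zero when the jar, the class, or the method is absent.
check_sig() {
  "$JAVAP" -cp "$1" 'org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server' 2>/dev/null \
    | grep startDelegationTokenSecretManager
}

# Probe every hive-exec jar on disk; a copy that prints nothing (or a
# different signature) is the conflicting one to pull off the classpath.
locate hive-exec 2>/dev/null | grep 'jar$' | while read -r jar; do
  echo "== $jar"
  check_sig "$jar" || echo '   (method missing or signature differs)'
done
```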
.
.
Created 06-10-2019 10:50 AM
@Jay Kumar SenSharma Thanks for the reply.
hive-exec-1.21.2.2.6.5.0-292.jar is present at the specified location.
Could you please suggest the next step?
lhdcsi02v spark2]# locate hive-exec | grep jar
/u01/tmp/hadoop-unjar9078871880138055540/META-INF/maven/org.apache.hive/hive-exec
/u01/tmp/hadoop-unjar9078871880138055540/META-INF/maven/org.apache.hive/hive-exec/pom.properties
/u01/tmp/hadoop-unjar9078871880138055540/META-INF/maven/org.apache.hive/hive-exec/pom.xml
/usr/hdp/2.6.5.0-292/hive/lib/hive-exec-1.2.1000.2.6.5.0-292.jar
/usr/hdp/2.6.5.0-292/hive/lib/hive-exec.jar
/usr/hdp/2.6.5.0-292/hive2/lib/hive-exec-2.1.0.2.6.5.0-292.jar
/usr/hdp/2.6.5.0-292/hive2/lib/hive-exec.jar
/usr/hdp/2.6.5.0-292/oozie/oozie-server/webapps/oozie/WEB-INF/lib/hive-exec-1.2.1000.2.6.5.0-292.jar
/usr/hdp/2.6.5.0-292/pig/lib/hive-exec-1.2.1000.2.6.5.0-292-core.jar
/usr/hdp/2.6.5.0-292/ranger-admin/ews/webapp/WEB-INF/classes/ranger-plugins/hive/hive-exec-1.2.1000.2.6.5.0-292.jar
/usr/hdp/2.6.5.0-292/spark2/jars/hive-exec-1.21.2.2.6.5.0-292.jar
Created 06-10-2019 01:33 PM
The Spark2 Thrift Server now starts.
But I am getting the errors below in the Spark2 logs.
Logs attached too.
java.lang.RuntimeException: Could not load shims in class org.apache.hadoop.hive.schshim.FairSchedulerShim
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.schshim.FairSchedulerShim
If I follow the below link, I will end up in the same situation as above again.
Please help.
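Rather than guessing which jar should provide that class (FairSchedulerShim normally ships in a hive-shims jar, e.g. hive-shims-scheduler-*.jar -- an assumption worth verifying), one approach is to scan the jars on the classpath for the class file itself. A sketch, with an illustrative helper name and an assumed jars directory:

```shell
# List every jar under a directory that contains a given class
# (FQN mapped to path/inside/jar.class). Requires `unzip`; the helper
# name is mine, not part of HDP.
find_class_in_jars() {
  cls="$(printf '%s' "$1" | tr . /).class"
  find "$2" -name '*.jar' 2>/dev/null | while read -r jar; do
    if unzip -l "$jar" 2>/dev/null | grep -qF "$cls"; then
      echo "$jar"
    fi
  done
}

# Assumed location of the thrift server's jars -- adjust to your cluster.
find_class_in_jars org.apache.hadoop.hive.schshim.FairSchedulerShim \
  /usr/hdp/current/spark2-thriftserver/jars
```

If the scan prints nothing, the class is genuinely absent from that directory, and the matching hive-shims jar for your HDP build would need to be put back on the classpath.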