Support Questions


java.io.InvalidClassException: org.apache.spark.sql.hive.execution.InsertIntoHiveTable

I am getting an invalid class serialization exception from spark-submit in client mode, but it works fine in local mode. Can anyone help me out?

Exception details

java.io.InvalidClassException: org.apache.spark.sql.hive.execution.InsertIntoHiveTable; local class incompatible: stream classdesc serialVersionUID = 439602153812587, local class serialVersionUID = 1424907733712746510

6 Replies

Super Mentor

@tulasi kumar guthurti

This usually happens when a conflicting version of a JAR containing the class "org.apache.spark.sql.hive.execution.InsertIntoHiveTable" is on the classpath.

- So can you please check the classpath of spark-submit in client mode to see which JAR that class is being loaded from?



Make sure that Spark 1.6 and 2.0 JARs are not mixed. For example, "spark-assembly-1.6*xxx.jar" (Spark) and "spark-hive_2.xxxx.jar" (Spark2) both contain this class, so please make sure the classpath does not include both JARs. You can check the "--jars" parameter values.


Do you mean mixing classpaths on the server side? Can you please explain?

Thank you for your quick response. On the server side it is using the assembly spark-assembly-1.6.2.2.5.3.44-1-hadoop2.7.3.2.5.3.44-1.jar, and on the driver side I am using the spark-hive_2.10-1.6.2 JAR file from the Hortonworks repository. I am completely blocked here and unable to move forward. I appreciate your help.

Super Mentor

@tulasi kumar guthurti

I would suggest using the same JAR version (spark-assembly-1.6xxxxx.jar) on the driver side that you are using on the server side, to see if that resolves it.

I didn't find the spark-assembly-1.6xxxxx.jar file in the Hortonworks repo. (Regarding "make sure that you are not mixing the classpath to include both the JARs") Do you know how to verify mixed classpaths on the server side? On my cluster we have Spark 1.6.2 and Spark 2 both installed.

The issue was fixed by passing the JARs via --conf "spark.driver.extraClassPath" and --conf "spark.executor.extraClassPath" in spark-submit. (Note the property names are case-sensitive: extraClassPath, not ExtraClassPath.)
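For reference, a sketch of what such a spark-submit invocation could look like. The application class, JAR names, and the /usr/hdp path are placeholders to adapt to your cluster; only the two extraClassPath properties are the actual fix:

```shell
# Pin driver and executors to the server-side assembly JAR so both sides
# deserialize InsertIntoHiveTable with the same serialVersionUID.
# Paths and class/JAR names below are hypothetical -- substitute your own.
ASSEMBLY=/usr/hdp/current/spark-client/lib/spark-assembly-1.6.2.2.5.3.44-1-hadoop2.7.3.2.5.3.44-1.jar

spark-submit \
  --class com.example.MyJob \
  --master yarn \
  --deploy-mode client \
  --conf "spark.driver.extraClassPath=$ASSEMBLY" \
  --conf "spark.executor.extraClassPath=$ASSEMBLY" \
  my-job.jar
```

Because extraClassPath entries are prepended to the classpath, this keeps the conflicting driver-side spark-hive JAR from winning the class lookup.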