<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26049#M22737</link>
    <description>&lt;P&gt;It sounds like a network config problem:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: hadoop02.mycompany.local/192.168.209.172:37271&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Tue, 31 Mar 2015 11:03:59 GMT</pubDate>
    <dc:creator>srowen</dc:creator>
    <dc:date>2015-03-31T11:03:59Z</dc:date>
    <item>
      <title>Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25909#M22728</link>
      <description>&lt;P&gt;I'm trying to use Spark (Standalone) to load data into Hive tables. The Avro schema loads successfully, and I can see (on the Spark UI page) that my applications finish running; however, the applications end up in the Killed state.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;THIS IS THE STDERR.LOG ON THE SPARK WEB UI PAGE VIA CLOUDERA MANAGER:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;15/03/25 06:15:58 ERROR Executor: Exception in task 1.3 in stage 2.0 (TID 10)&lt;BR /&gt;java.io.InvalidClassException: org.apache.spark.rdd.PairRDDFunctions; local class incompatible: stream classdesc serialVersionUID = 8789839749593513237, local class serialVersionUID = -4145741279224749316&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)&lt;BR 
/&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:57)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.spark.scheduler.Task.run(Task.scala:56)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)&lt;BR /&gt;&amp;nbsp;&amp;nbsp; &amp;nbsp;at java.lang.Thread.run(Thread.java:745)&lt;BR /&gt;15/03/25 06:15:59 ERROR CoarseGrainedExecutorBackend: Driver Disassociated [akka.tcp://sparkExecutor@HadoopNode01.local:48707] -&amp;gt; [akka.tcp://sparkDriver@HadoopNode02.local:54550] disassociated! Shutting down.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any help will be greatly appreciated.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Thanks&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 09:25:25 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25909#M22728</guid>
      <dc:creator>Dataminer</dc:creator>
      <dc:date>2022-09-16T09:25:25Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25910#M22729</link>
      <description>&lt;P&gt;&amp;lt;&amp;lt;Sorry duplicate post&amp;gt;&amp;gt;&lt;/P&gt;</description>
      <pubDate>Wed, 25 Mar 2015 21:38:41 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25910#M22729</guid>
      <dc:creator>Dataminer</dc:creator>
      <dc:date>2015-03-25T21:38:41Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25913#M22730</link>
      <description>&lt;P&gt;This generally means you're mixing two versions of Spark somehow. Are you sure your app isn't also trying to bundle Spark? Are you using the CDH Spark, and not your own compiled version?&lt;/P&gt;</description>
      <pubDate>Wed, 25 Mar 2015 21:55:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25913#M22730</guid>
      <dc:creator>srowen</dc:creator>
      <dc:date>2015-03-25T21:55:01Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25914#M22731</link>
      <description>&lt;P&gt;I am using Spark (Standalone) from the latest Cloudera CDH version. Once my cluster is up and running, via Cloudera Manager I select the "Add a Service" option and add the Spark (Standalone) service. Could you please clarify what you mean by "my app trying to bundle Spark"? My application depends on Spark as installed by Cloudera CDH. The application does not come with Spark.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Wed, 25 Mar 2015 21:58:58 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25914#M22731</guid>
      <dc:creator>Dataminer</dc:creator>
      <dc:date>2015-03-25T21:58:58Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25915#M22732</link>
      <description>&lt;P&gt;I mean, do you build your app with a dependency on Spark, and if so what version? And have you marked it as 'provided' so that it is not included in the JAR you submit?&lt;/P&gt;</description>
      <pubDate>Wed, 25 Mar 2015 22:09:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25915#M22732</guid>
      <dc:creator>srowen</dc:creator>
      <dc:date>2015-03-25T22:09:02Z</dc:date>
    </item>
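The 'provided' scope mentioned above lives in the application's build file. A minimal, hypothetical Maven sketch (the artifact and version shown match the CDH 5.3 Spark discussed later in this thread and are assumptions, not taken from the poster's actual build):

```xml
<!-- Hypothetical pom.xml fragment: compile against the cluster's Spark, but
     mark it "provided" so the Spark classes are NOT bundled into the JAR you
     submit and cannot clash with the version already on the executors. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.0-cdh5.3.0</version>
  <scope>provided</scope>
</dependency>
```

With a CDH-versioned artifact the Cloudera Maven repository must also be declared; in SBT the equivalent is appending `% "provided"` to the dependency.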
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25924#M22733</link>
      <description>&lt;P&gt;Yes, my application is installed with a dependency on Spark; if Spark (Standalone) is not present, then my app fails to install. I do not specify any Spark version; it takes whatever version is available from Cloudera Manager.&lt;/P&gt;&lt;P&gt;Where do I mark it 'provided'?&lt;/P&gt;&lt;P&gt;How do I check the Spark version on Cloudera Manager?&lt;/P&gt;&lt;P&gt;I do not submit any jars for Spark through my application.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The application I'm trying to install on CDH is Oracle Big Data Discovery; it is tightly coupled with Cloudera CDH and depends on Spark for data processing.&lt;/P&gt;</description>
      <pubDate>Thu, 26 Mar 2015 14:31:19 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25924#M22733</guid>
      <dc:creator>Dataminer</dc:creator>
      <dc:date>2015-03-26T14:31:19Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25926#M22734</link>
      <description>&lt;P&gt;Hm, how do you compile your app? Usually you create a Maven or SBT project to declare its dependencies, which should include a "provided" dependency on the same version of Spark as is on your cluster. How do you submit your application? spark-submit? You are submitting a JAR to run your app, right?&lt;/P&gt;</description>
      <pubDate>Thu, 26 Mar 2015 14:37:19 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25926#M22734</guid>
      <dc:creator>srowen</dc:creator>
      <dc:date>2015-03-26T14:37:19Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25950#M22735</link>
      <description>&lt;P&gt;My application comes prepackaged from Oracle. I don't find any 'provided' dependency, though I'm still checking. Where can I find the Spark version that is installed via Cloudera? Is there a way to make it upward/downward compatible with other versions? My application uses not just Spark; it also uses Oozie, HDFS, Hive and YARN.&lt;/P&gt;</description>
      <pubDate>Thu, 26 Mar 2015 21:19:40 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/25950#M22735</guid>
      <dc:creator>Dataminer</dc:creator>
      <dc:date>2015-03-26T21:19:40Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26014#M22736</link>
      <description>&lt;P&gt;OK, so I deleted my entire cluster, Hadoop and my application, and reinstalled everything. Now I don't see the version-mismatch error; I have a different Spark-related error. I have one Spark master node and one Spark worker node. Please find the errors below.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;MASTER NODE ERROR(hadoop01.mycompany.local)&lt;BR /&gt;&lt;BR /&gt;2015-03-30 04:22:52,919 INFO org.apache.spark.deploy.master.Master: akka.tcp://sparkDriver@hadoop02.mycompany.local:55921 got disassociated, removing it.&lt;BR /&gt;2015-03-30 04:22:52,922 INFO org.apache.spark.deploy.master.Master: akka.tcp://sparkDriver@hadoop02.mycompany.local:55921 got disassociated, removing it.&lt;BR /&gt;2015-03-30 04:22:52,926 ERROR akka.remote.EndpointWriter: AssociationError [akka.tcp://sparkMaster@hadoop01.mycompany.local:7077] -&amp;gt; [akka.tcp://sparkDriver@hadoop02.mycompany.local:55921]: Error [Association failed with [akka.tcp://sparkDriver@hadoop02.mycompany.local:55921]] [&lt;BR /&gt;akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkDriver@hadoop02.mycompany.local:55921]&lt;BR /&gt;Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: hadoop02.mycompany.local/192.168.209.172:55921&lt;BR /&gt;]&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;*******************************************************************************************************************&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;WORKER NODE ERROR(hadoop02.mycompany.local)&lt;BR /&gt;&lt;BR /&gt;2015-03-30 04:22:42,840 INFO org.apache.spark.deploy.worker.Worker: Asked to launch executor app-20150330042242-0000/0 for EDP&lt;BR /&gt;2015-03-30 04:22:42,892 INFO org.apache.spark.deploy.worker.ExecutorRunner: Launch command: "/usr/java/jdk1.7.0_67-cloudera/bin/java" "-cp" 
"::/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/spark/conf:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/spark/lib/spark-assembly.jar:/var/run/cloudera-scm-agent/process/76-spark-SPARK_WORKER/hadoop-conf:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/hadoop/client/*:/var/run/cloudera-scm-agent/process/76-spark-SPARK_WORKER/hadoop-conf:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/hadoop/libexec/../../hadoop/lib/*:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/hadoop/libexec/../../hadoop/.//*:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/hadoop/../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/hadoop/../hadoop-hdfs/lib/*:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/hadoop/../hadoop-hdfs/.//*:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/hadoop/../hadoop-yarn/lib/*:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/hadoop/../hadoop-yarn/.//*:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/hadoop/../hadoop-mapreduce/lib/*:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/hadoop/../hadoop-mapreduce/.//*:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/spark/lib/scala-library.jar:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/spark/lib/scala-compiler.jar:/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p0.10/lib/spark/lib/jline.jar" "-XX:MaxPermSize=128m" "-Dspark.driver.port=55921" "-Xms20480M" "-Xmx20480M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://sparkDriver@hadoop02.mycompany.local:55921/user/CoarseGrainedScheduler" "0" "hadoop02.mycompany.local" "1" "app-20150330042242-0000" "akka.tcp://sparkWorker@hadoop02.mycompany.local:7078/user/Worker"&lt;BR /&gt;2015-03-30 04:22:53,338 INFO org.apache.spark.deploy.worker.Worker: Asked to kill executor app-20150330042242-0000/0&lt;BR /&gt;2015-03-30 04:22:53,338 INFO org.apache.spark.deploy.worker.ExecutorRunner: Runner thread for executor app-20150330042242-0000/0 interrupted&lt;BR /&gt;2015-03-30 04:22:53,339 
INFO org.apache.spark.deploy.worker.ExecutorRunner: Killing process!&lt;BR /&gt;2015-03-30 04:22:53,596 INFO org.apache.spark.deploy.worker.Worker: Executor app-20150330042242-0000/0 finished with state KILLED exitStatus 1&lt;BR /&gt;2015-03-30 04:22:53,603 INFO akka.actor.LocalActorRef: Message [akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from Actor[akka://sparkWorker/deadLetters] to Actor[akka://sparkWorker/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkWorker%40192.168.209.172%3A54963-2#1273102661] was not delivered. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.&lt;BR /&gt;2015-03-30 04:22:53,612 ERROR akka.remote.EndpointWriter: AssociationError [akka.tcp://sparkWorker@hadoop02.mycompany.local:7078] -&amp;gt; [akka.tcp://sparkExecutor@hadoop02.mycompany.local:37271]: Error [Association failed with [akka.tcp://sparkExecutor@hadoop02.mycompany.local:37271]] [&lt;BR /&gt;akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkExecutor@hadoop02.mycompany.local:37271]&lt;BR /&gt;Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: hadoop02.mycompany.local/192.168.209.172:37271&lt;BR /&gt;]&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 30 Mar 2015 16:08:08 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26014#M22736</guid>
      <dc:creator>Dataminer</dc:creator>
      <dc:date>2015-03-30T16:08:08Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26049#M22737</link>
      <description>&lt;P&gt;It sounds like a network config problem:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: hadoop02.mycompany.local/192.168.209.172:37271&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 31 Mar 2015 11:03:59 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26049#M22737</guid>
      <dc:creator>srowen</dc:creator>
      <dc:date>2015-03-31T11:03:59Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26120#M22738</link>
      <description>&lt;P&gt;I checked my network config, and everything seems to be all right. Every node can communicate with every other node in my cluster. My entire firewall has been disabled, so this may not be a 'port-not-open' issue. I was looking at this other post, which I think is discussing the same connectivity error: &lt;A href="http://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/Akka-Error-while-running-Spark-Jobs/td-p/18602." target="_blank"&gt;http://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/Akka-Error-while-running-Spark-Jobs/td-p/18602.&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could you please let me know if there are any Spark config files or any Spark-specific settings that I need to look into?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you&lt;/P&gt;</description>
      <pubDate>Wed, 01 Apr 2015 14:48:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26120#M22738</guid>
      <dc:creator>Dataminer</dc:creator>
      <dc:date>2015-04-01T14:48:09Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26134#M22739</link>
      <description>&lt;P&gt;Going back to the earlier versionUID conflict error (java.io.InvalidClassException: org.apache.spark.rdd.PairRDDFunctions; local class incompatible: stream classdesc serialVersionUID = 8789839749593513237, local class serialVersionUID = -4145741279224749316), I've found out that my application used the Spark jar file called &lt;STRONG&gt;spark-core_2.10-1.2.0-cdh5.3.0.jar&lt;/STRONG&gt;; this .jar file contains the class org.apache.spark.rdd.PairRDDFunctions shown in the error. How do I check the serialVersionUID in this jar? And could you please tell me which other Spark jar (from Cloudera Manager/CDH) this jar could possibly be conflicting with? Is it spark-assembly.jar?&lt;/P&gt;</description>
      <pubDate>Wed, 01 Apr 2015 20:23:00 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26134#M22739</guid>
      <dc:creator>Dataminer</dc:creator>
      <dc:date>2015-04-01T20:23:00Z</dc:date>
    </item>
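On the question above of checking the serialVersionUID inside a jar: the JDK computes that value through java.io.ObjectStreamClass, so a small helper (hypothetical, not from this thread) run once with each suspect jar first on the classpath will print the UID that serialization actually sees and show whether the two jars disagree:

```java
import java.io.ObjectStreamClass;

// Hypothetical helper: print the serialVersionUID of a class as seen on the
// current classpath. Example invocation (jar path is illustrative):
//   java -cp spark-core_2.10-1.2.0-cdh5.3.0.jar:. SerialUidCheck org.apache.spark.rdd.PairRDDFunctions
public class SerialUidCheck {
    public static void main(String[] args) throws Exception {
        // Default to a JDK class so the helper runs even without arguments.
        String name = args.length == 0 ? "java.lang.String" : args[0];
        Class cls = Class.forName(name);
        // lookup() returns null when the class is not serializable.
        ObjectStreamClass osc = ObjectStreamClass.lookup(cls);
        if (osc == null) {
            System.out.println(name + " is not serializable");
        } else {
            System.out.println(name + " serialVersionUID = " + osc.getSerialVersionUID());
        }
    }
}
```

The JDK's bundled `serialver` tool does the same job from the command line; either way, comparing the printed UID against the two values in the exception identifies which jar each side is loading.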
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26321#M22740</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have exactly the same issue and am trying to do a BDD install too. Did you find a solution to the problem?&lt;/P&gt;</description>
      <pubDate>Wed, 08 Apr 2015 12:17:17 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26321#M22740</guid>
      <dc:creator>richie78</dc:creator>
      <dc:date>2015-04-08T12:17:17Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26328#M22741</link>
      <description>&lt;P&gt;Actually yes, I figured out what the problem was. The CDH jars that are shipped with BDD are of version 5.3.0, whereas the Cloudera CDH that I had installed on my cluster was version 5.3.2. Due to this version mismatch I was getting this error. I removed CDH 5.3.2 and replaced it with Cloudera parcels of version 5.3.0 (basically a fresh installation of CM and the other Hadoop components), and now this error doesn't appear.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;However, once that error was cleared I faced another issue, which I have posted here: &lt;A target="_blank" href="http://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/Spark-ERROR-CoarseGrainedExecutorBackend-Driver-Disassociated/m-p/26269#U26269"&gt;http://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/Spark-ERROR-CoarseGrainedExecutorBackend-Driver-Disassociated/m-p/26269#U26269&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could you please let me know (after you implement the version change) if you're getting this same error or if BDD works all the way? Thanks&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;Bob&lt;/P&gt;</description>
      <pubDate>Wed, 08 Apr 2015 13:49:05 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26328#M22741</guid>
      <dc:creator>Dataminer</dc:creator>
      <dc:date>2015-04-08T13:49:05Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26339#M22742</link>
      <description>&lt;P&gt;Will do for sure. I have had my head buried in log files this afternoon, and I think I am going crazy!&lt;/P&gt;</description>
      <pubDate>Wed, 08 Apr 2015 17:04:47 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26339#M22742</guid>
      <dc:creator>richie78</dc:creator>
      <dc:date>2015-04-08T17:04:47Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26340#M22743</link>
      <description>&lt;P&gt;There are very few people out there working on BDD; please do keep me posted on how things work once you make this fix.&lt;/P&gt;</description>
      <pubDate>Wed, 08 Apr 2015 17:11:29 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26340#M22743</guid>
      <dc:creator>Dataminer</dc:creator>
      <dc:date>2015-04-08T17:11:29Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26693#M22744</link>
      <description>Hi, I am also facing a similar error/issue. Could you please let me know whether you were able to fix it? I am using a pseudo cluster with the latest Cloudera software and Oracle Big Data 1.0.</description>
      <pubDate>Wed, 22 Apr 2015 22:13:47 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26693#M22744</guid>
      <dc:creator>ramsuk</dc:creator>
      <dc:date>2015-04-22T22:13:47Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26694#M22745</link>
      <description>&lt;P&gt;The CDH jars that are shipped with BDD are of version 5.3.0, whereas the Cloudera CDH that I had installed on my cluster was version 5.3.2. Due to this version mismatch I was getting this error. I removed CDH 5.3.2 and replaced it with Cloudera parcels of version 5.3.0 (basically a fresh installation of CM and the other Hadoop components), and now this error doesn't appear.&lt;/P&gt;</description>
      <pubDate>Wed, 22 Apr 2015 22:15:15 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26694#M22745</guid>
      <dc:creator>Dataminer</dc:creator>
      <dc:date>2015-04-22T22:15:15Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26696#M22746</link>
      <description>Hi Dataminer, many thanks for the quick post. I have got two exceptions: 1. java.io.InvalidClassException: org.apache.spark.rdd.PairRDDFunctions; local class incompatible: stream classdesc serialVersionUID = 8789839749593513237, local class serialVersionUID = -4145741279224749316, and 2. ERROR CoarseGrainedExecutorBackend: Driver Disassociated. Were you able to fix both of these by installing CDH 5.3.0? How do I find what version of CDH jars was shipped with BDD? I downloaded the BDD software last week and am not sure whether BDD still ships CDH 5.3.0 jars.</description>
      <pubDate>Wed, 22 Apr 2015 22:29:14 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26696#M22746</guid>
      <dc:creator>ramsuk</dc:creator>
      <dc:date>2015-04-22T22:29:14Z</dc:date>
    </item>
    <item>
      <title>Re: Spark (Standalone) error local class incompatible: stream classdesc serialVersionUID</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26697#M22747</link>
      <description>&lt;P&gt;For me, both these issues were resolved once I installed CDH 5.3.0. The CDH version of the BDD jars can be found under the folder ${oracle_home}/Middleware/BDD1.0/dataprocessing/edp_cli/libs&lt;/P&gt;</description>
      <pubDate>Wed, 22 Apr 2015 22:36:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Standalone-error-local-class-incompatible-stream/m-p/26697#M22747</guid>
      <dc:creator>Dataminer</dc:creator>
      <dc:date>2015-04-22T22:36:01Z</dc:date>
    </item>
  </channel>
</rss>

