<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Question: Local Spark Development against a remote cluster (Support Questions)</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Local-Spark-Development-against-a-remote-cluster/m-p/104083#M66980</link>
    <description>&lt;P&gt;What is the best way to develop Spark applications on your local computer? I'm using IntelliJ and, just for debugging purposes, trying to set the master to my remote HDP cluster so I can test code against Hive and other resources on the cluster. I'm on HDP 2.5.3 and have added the Spark libraries for Scala 2.10 and Spark 1.6.2 from the Maven repository. I've set scalaVersion to 2.10.5 in my build.sbt and added the library dependencies. As far as I can tell, my project uses exactly the same versions that are running in HDP 2.5.3, but when I run the application with SparkConf pointed at my remote Spark master, it fails with the following incompatible-class error:&lt;/P&gt;&lt;P&gt;java.io.InvalidClassException: org.apache.spark.rdd.RDD; local class incompatible: stream classdesc serialVersionUID = 5009924811397974881, local class serialVersionUID = 7185378471520864965&lt;/P&gt;&lt;P&gt;Is there something I'm missing, or is there a better way to develop and test against the remote cluster?&lt;/P&gt;</description>
    <pubDate>Tue, 14 Feb 2017 04:18:51 GMT</pubDate>
    <dc:creator>ehanson</dc:creator>
    <dc:date>2017-02-14T04:18:51Z</dc:date>
  </channel>
</rss>