<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Spark 2.0 App not working on cluster in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-0-App-not-working-on-cluster/m-p/53379#M57331</link>
    <description>&lt;P&gt;What is the solution? (I do not have an enterprise account and we may not be able to upgrade the cluster soon enough).&lt;/P&gt;</description>
    <pubDate>Sat, 08 Apr 2017 19:07:46 GMT</pubDate>
    <dc:creator>sprash</dc:creator>
    <dc:date>2017-04-08T19:07:46Z</dc:date>
    <item>
      <title>Spark 2.0 App not working on cluster</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-0-App-not-working-on-cluster/m-p/52305#M57328</link>
      <description>&lt;P&gt;Hi all&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;We have Spark 2.0 (*) installed from the Cloudera parcel on our cluster (CDH 5.9.0).&lt;/P&gt;&lt;P&gt;When running a fairly simple app that just reads in some CSV files and does a groupBy, I always receive errors.&lt;/P&gt;&lt;P&gt;The app is submitted with:&lt;/P&gt;&lt;PRE&gt;spark2-submit --class my_class myapp-1.0-SNAPSHOT.jar&lt;/PRE&gt;&lt;P&gt;And I receive the following error message:&lt;/P&gt;&lt;PRE&gt;java.io.InvalidClassException: org.apache.commons.lang3.time.FastDateFormat; local class incompatible: stream classdesc serialVersionUID = 2, local class serialVersionUID = 1&lt;/PRE&gt;&lt;P&gt;I figured out that there are multiple versions of lang3 installed with the Cloudera release and modified the spark2-submit to:&lt;/P&gt;&lt;PRE&gt;spark2-submit --conf spark.driver.userClassPathFirst=true --conf spark.executor.userClassPathFirst=true --jars /var/opt/teradata/cloudera/parcels/CDH/jars/commons-lang3-3.3.2.jar --class my_class myapp-1.0-SNAPSHOT.jar&lt;/PRE&gt;&lt;P&gt;This way I could get rid of the first error message, but now I get:&lt;/P&gt;&lt;PRE&gt;java.lang.ClassCastException: cannot assign instance of org.apache.commons.lang3.time.FastDateFormat to field org.apache.spark.sql.execution.datasources.csv.CSVOptions.dateFormat of type org.apache.commons.lang3.time.FastDateFormat in instance of org.apache.spark.sql.execution.datasources.csv.CSVOptions&lt;/PRE&gt;&lt;P&gt;The app was written in Scala and compiled using Maven. The source code (**) and the Maven POM file (***) are attached at the bottom of this post.&lt;/P&gt;&lt;P&gt;Does anybody have an idea how to solve this issue?&lt;/P&gt;&lt;P&gt;Any help is highly appreciated!&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks a lot in advance!&lt;/P&gt;&lt;P&gt;Kind Regards&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;(*)&lt;/P&gt;&lt;PRE&gt;$spark2-submit --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0.cloudera1
      /_/

Branch HEAD
Compiled by user jenkins on 2016-12-06T18:34:13Z
Revision 2389f44e0185f33969d782ed09b41ae45fe30324&lt;/PRE&gt;&lt;P&gt;(**)&lt;/P&gt;&lt;PRE&gt;import org.apache.spark.sql.SparkSession

object my_class {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder
      .appName("myapp")
      .getOrCreate()

    val csv = spark.read.option("header", value = false).csv("/path/to/folder/with/some/csv/files/")

    val pivot = csv.groupBy("_c0").count()

    csv.take(10).foreach(println)
    pivot.take(10).foreach(println)
    spark.stop()
  }
}&lt;/PRE&gt;&lt;P&gt;(***)&lt;/P&gt;&lt;PRE&gt;&amp;lt;?xml version="1.0" encoding="UTF-8"?&amp;gt;
&amp;lt;project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"&amp;gt;
    &amp;lt;modelVersion&amp;gt;4.0.0&amp;lt;/modelVersion&amp;gt;

    &amp;lt;groupId&amp;gt;de.lht.datalab.ingestion&amp;lt;/groupId&amp;gt;
    &amp;lt;artifactId&amp;gt;myapp&amp;lt;/artifactId&amp;gt;
    &amp;lt;version&amp;gt;1.0-SNAPSHOT&amp;lt;/version&amp;gt;

    &amp;lt;properties&amp;gt;
        &amp;lt;scala.version.base&amp;gt;2.11&amp;lt;/scala.version.base&amp;gt;
        &amp;lt;scala.version&amp;gt;${scala.version.base}.8&amp;lt;/scala.version&amp;gt;
        &amp;lt;spark.version&amp;gt;2.0.0.cloudera1&amp;lt;/spark.version&amp;gt;
    &amp;lt;/properties&amp;gt;

    &amp;lt;repositories&amp;gt;
        &amp;lt;repository&amp;gt;
            &amp;lt;id&amp;gt;cloudera&amp;lt;/id&amp;gt;
            &amp;lt;url&amp;gt;https://repository.cloudera.com/artifactory/cloudera-repos/&amp;lt;/url&amp;gt;
        &amp;lt;/repository&amp;gt;
    &amp;lt;/repositories&amp;gt;

    &amp;lt;dependencies&amp;gt;
        &amp;lt;dependency&amp;gt;
            &amp;lt;groupId&amp;gt;org.apache.spark&amp;lt;/groupId&amp;gt;
            &amp;lt;artifactId&amp;gt;spark-sql_${scala.version.base}&amp;lt;/artifactId&amp;gt;
            &amp;lt;version&amp;gt;${spark.version}&amp;lt;/version&amp;gt;
        &amp;lt;/dependency&amp;gt;
    &amp;lt;/dependencies&amp;gt;


    &amp;lt;build&amp;gt;
        &amp;lt;plugins&amp;gt;
            &amp;lt;plugin&amp;gt;
                &amp;lt;groupId&amp;gt;org.scala-tools&amp;lt;/groupId&amp;gt;
                &amp;lt;artifactId&amp;gt;maven-scala-plugin&amp;lt;/artifactId&amp;gt;
                &amp;lt;version&amp;gt;2.15.2&amp;lt;/version&amp;gt;
                &amp;lt;executions&amp;gt;
                    &amp;lt;execution&amp;gt;
                        &amp;lt;goals&amp;gt;
                            &amp;lt;goal&amp;gt;compile&amp;lt;/goal&amp;gt;
                            &amp;lt;goal&amp;gt;testCompile&amp;lt;/goal&amp;gt;
                        &amp;lt;/goals&amp;gt;
                    &amp;lt;/execution&amp;gt;
                &amp;lt;/executions&amp;gt;
            &amp;lt;/plugin&amp;gt;
        &amp;lt;/plugins&amp;gt;
    &amp;lt;/build&amp;gt;

&amp;lt;/project&amp;gt;&lt;/PRE&gt;</description>
      <pubDate>Fri, 16 Sep 2022 11:16:20 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-0-App-not-working-on-cluster/m-p/52305#M57328</guid>
      <dc:creator>Uque</dc:creator>
      <dc:date>2022-09-16T11:16:20Z</dc:date>
    </item>
    <item>
      <title>Re: Spark 2.0 App not working on cluster</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-0-App-not-working-on-cluster/m-p/52353#M57329</link>
      <description>&lt;P&gt;This is generally caused by a mismatch between the version of commons-lang3 your application uses and the one Spark uses. See&amp;nbsp;&lt;A href="https://issues.apache.org/jira/browse/ZEPPELIN-1977" target="_blank"&gt;https://issues.apache.org/jira/browse/ZEPPELIN-1977&lt;/A&gt; for example.&lt;/P&gt;&lt;P&gt;I believe you'll find that it's resolved in the latest Spark 2 release for CDH.&lt;/P&gt;&lt;P&gt;&lt;A href="http://community.cloudera.com/t5/Community-News-Release/ANNOUNCE-Spark-2-0-Release-2/m-p/51464#M161" target="_blank"&gt;http://community.cloudera.com/t5/Community-News-Release/ANNOUNCE-Spark-2-0-Release-2/m-p/51464#M161&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Sat, 18 Mar 2017 10:23:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-0-App-not-working-on-cluster/m-p/52353#M57329</guid>
      <dc:creator>srowen</dc:creator>
      <dc:date>2017-03-18T10:23:09Z</dc:date>
    </item>
    <item>
      <title>Re: Spark 2.0 App not working on cluster</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-0-App-not-working-on-cluster/m-p/52381#M57330</link>
      <description>&lt;P&gt;Thanks a lot.&lt;/P&gt;&lt;P&gt;With the workaround given at the end of the Zeppelin issue, it works for me now.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 20 Mar 2017 13:33:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-0-App-not-working-on-cluster/m-p/52381#M57330</guid>
      <dc:creator>Uque</dc:creator>
      <dc:date>2017-03-20T13:33:21Z</dc:date>
    </item>
    <item>
      <title>Re: Spark 2.0 App not working on cluster</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-0-App-not-working-on-cluster/m-p/53379#M57331</link>
      <description>&lt;P&gt;What is the solution? (I do not have an enterprise account and we may not be able to upgrade the cluster soon enough).&lt;/P&gt;</description>
      <pubDate>Sat, 08 Apr 2017 19:07:46 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-0-App-not-working-on-cluster/m-p/53379#M57331</guid>
      <dc:creator>sprash</dc:creator>
      <dc:date>2017-04-08T19:07:46Z</dc:date>
    </item>
    <item>
      <title>Re: Spark 2.0 App not working on cluster</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-0-App-not-working-on-cluster/m-p/323032#M57332</link>
      <description>&lt;P&gt;I am using Spark 2.4.0 CDH 6.3.4. I got the issue&amp;nbsp;of &lt;EM&gt;java.lang.ClassCastException: cannot assign instance of org.apache.commons.lang3.time.FastDateFormat to field org.apache.spark.sql.catalyst.csv.CSVOptions.dateFormat of type org.apache.commons.lang3.time.FastDateFormat in instance of org.apache.spark.sql.catalyst.csv.CSVOptions&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;Caused by: java.lang.ClassCastException: cannot assign instance of org.apache.commons.lang3.time.FastDateFormat to field org.apache.spark.sql.catalyst.csv.CSVOptions.dateFormat of type org.apache.commons.lang3.time.FastDateFormat in instance of org.apache.spark.sql.catalyst.csv.CSVOptions&lt;BR /&gt;at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2301)&lt;BR /&gt;at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1431)&lt;BR /&gt;at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2371)&lt;BR /&gt;at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2289)&lt;BR /&gt;at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2147)&lt;BR /&gt;at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1646)&lt;BR /&gt;at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2365)&lt;BR /&gt;at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2289)&lt;BR /&gt;at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2147)&lt;BR /&gt;at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1646)&lt;BR /&gt;at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2365)&lt;BR /&gt;at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2289)&lt;BR /&gt;at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2147)&lt;BR /&gt;at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1646)&lt;BR /&gt;at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2365)&lt;BR /&gt;at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2289)&lt;BR /&gt;at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2147)&lt;BR /&gt;at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1646)&lt;BR /&gt;at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2365)&lt;BR /&gt;at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2289)&lt;BR /&gt;at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2147)&lt;BR /&gt;at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1646)&lt;BR /&gt;at java.io.ObjectInputStream.readObject(ObjectInputStream.java:482)&lt;BR /&gt;at java.io.ObjectInputStream.readObject(ObjectInputStream.java:440)&lt;BR /&gt;at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)&lt;BR /&gt;at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)&lt;BR /&gt;at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)&lt;BR /&gt;at org.apache.spark.scheduler.Task.run(Task.scala:121)&lt;BR /&gt;at org.apache.spark.executor.Executor$TaskRunner$$anonfun$11.apply(Executor.scala:407)&lt;BR /&gt;at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1408)&lt;BR /&gt;at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:413)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)&lt;BR /&gt;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)&lt;BR /&gt;at java.lang.Thread.run(Thread.java:748)&lt;/PRE&gt;&lt;P&gt;&lt;FONT color="#3366FF"&gt;&lt;STRONG&gt;Finally I was able to resolve the issue&lt;/STRONG&gt;&lt;/FONT&gt;. I was using org.apache.spark:spark-core_2.11:jar:2.4.0-cdh6.3.4:provided. Even though it is declared as provided, it pulls in some of its transitive dependencies with compile scope. org.apache.commons:commons-lang3:jar:3.7 is one of those. If you also provide commons-lang3 from outside, it causes this problem because it gets packaged inside your fat jar.&lt;/P&gt;&lt;P&gt;Therefore I explicitly forced the scope of a few of the jars to provided, as listed below.&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;org.apache.commons:commons-lang3:3.7&lt;/LI&gt;&lt;LI&gt;org.apache.zookeeper:zookeeper:3.4.5-cdh6.3.4&lt;/LI&gt;&lt;LI&gt;io.dropwizard.metrics:metrics-core:3.1.5&lt;/LI&gt;&lt;LI&gt;com.fasterxml.jackson.core:jackson-databind:2.9.10.6&lt;/LI&gt;&lt;LI&gt;org.apache.commons:commons-crypto:1.0.0&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;By doing this, the application is forced to use the commons-lang3 jar provided by the platform.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT color="#3366FF"&gt;&lt;STRONG&gt;POM snippet to solve the issue&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;&lt;PRE&gt;&amp;lt;dependency&amp;gt;
    &amp;lt;groupId&amp;gt;org.apache.spark&amp;lt;/groupId&amp;gt;
    &amp;lt;artifactId&amp;gt;spark-core_${scala.binary.version}&amp;lt;/artifactId&amp;gt;
    &amp;lt;version&amp;gt;${spark.core.version}&amp;lt;/version&amp;gt;
    &amp;lt;scope&amp;gt;provided&amp;lt;/scope&amp;gt;
&amp;lt;/dependency&amp;gt;
&amp;lt;!-- Declaring the following dependencies explicitly as provided, since they are not declared as provided as part of spark-core --&amp;gt;
&amp;lt;!-- Start --&amp;gt;
&amp;lt;dependency&amp;gt;
    &amp;lt;groupId&amp;gt;org.apache.commons&amp;lt;/groupId&amp;gt;
    &amp;lt;artifactId&amp;gt;commons-lang3&amp;lt;/artifactId&amp;gt;
    &amp;lt;version&amp;gt;3.7&amp;lt;/version&amp;gt;
    &amp;lt;scope&amp;gt;provided&amp;lt;/scope&amp;gt;
&amp;lt;/dependency&amp;gt;
&amp;lt;dependency&amp;gt;
    &amp;lt;groupId&amp;gt;org.apache.zookeeper&amp;lt;/groupId&amp;gt;
    &amp;lt;artifactId&amp;gt;zookeeper&amp;lt;/artifactId&amp;gt;
    &amp;lt;version&amp;gt;3.4.5-cdh6.3.4&amp;lt;/version&amp;gt;
    &amp;lt;scope&amp;gt;provided&amp;lt;/scope&amp;gt;
&amp;lt;/dependency&amp;gt;
&amp;lt;dependency&amp;gt;
    &amp;lt;groupId&amp;gt;io.dropwizard.metrics&amp;lt;/groupId&amp;gt;
    &amp;lt;artifactId&amp;gt;metrics-core&amp;lt;/artifactId&amp;gt;
    &amp;lt;version&amp;gt;3.1.5&amp;lt;/version&amp;gt;
    &amp;lt;scope&amp;gt;provided&amp;lt;/scope&amp;gt;
&amp;lt;/dependency&amp;gt;
&amp;lt;dependency&amp;gt;
    &amp;lt;groupId&amp;gt;com.fasterxml.jackson.core&amp;lt;/groupId&amp;gt;
    &amp;lt;artifactId&amp;gt;jackson-databind&amp;lt;/artifactId&amp;gt;
    &amp;lt;version&amp;gt;2.9.10.6&amp;lt;/version&amp;gt;
    &amp;lt;scope&amp;gt;provided&amp;lt;/scope&amp;gt;
&amp;lt;/dependency&amp;gt;
&amp;lt;dependency&amp;gt;
    &amp;lt;groupId&amp;gt;org.apache.commons&amp;lt;/groupId&amp;gt;
    &amp;lt;artifactId&amp;gt;commons-crypto&amp;lt;/artifactId&amp;gt;
    &amp;lt;version&amp;gt;1.0.0&amp;lt;/version&amp;gt;
    &amp;lt;scope&amp;gt;provided&amp;lt;/scope&amp;gt;
&amp;lt;/dependency&amp;gt;
&amp;lt;!-- End --&amp;gt;&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 24 Aug 2021 01:31:58 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-2-0-App-not-working-on-cluster/m-p/323032#M57332</guid>
      <dc:creator>manasdas</dc:creator>
      <dc:date>2021-08-24T01:31:58Z</dc:date>
    </item>
  </channel>
</rss>

