<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Spark-submit Options --jar, --spark-driver-classpath and spark.executor.extraClasspath in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-submit-Options-jar-spark-driver-classpath-and-spark/m-p/197985#M76440</link>
    <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;1- I am confused about the difference between --driver-class-path and --driver-library-path. Please help me understand the difference between these two.&lt;/P&gt;&lt;P&gt;2- I am a bit new to Scala. Can you please help me understand the difference between a class path and a library path? In the end, both require a jar path to be set.&lt;/P&gt;&lt;P&gt;3- If I add extra dependencies with the --jars option, do I still need to separately provide the jar path with --driver-class-path and spark.executor.extraClassPath?&lt;/P&gt;</description>
    <pubDate>Mon, 26 Mar 2018 14:46:18 GMT</pubDate>
    <dc:creator>Vinitkumar</dc:creator>
    <dc:date>2018-03-26T14:46:18Z</dc:date>
    <item>
      <title>Spark-submit Options --jar, --spark-driver-classpath and spark.executor.extraClasspath</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-submit-Options-jar-spark-driver-classpath-and-spark/m-p/197985#M76440</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;1- I am confused about the difference between --driver-class-path and --driver-library-path. Please help me understand the difference between these two.&lt;/P&gt;&lt;P&gt;2- I am a bit new to Scala. Can you please help me understand the difference between a class path and a library path? In the end, both require a jar path to be set.&lt;/P&gt;&lt;P&gt;3- If I add extra dependencies with the --jars option, do I still need to separately provide the jar path with --driver-class-path and spark.executor.extraClassPath?&lt;/P&gt;</description>
      <pubDate>Mon, 26 Mar 2018 14:46:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-submit-Options-jar-spark-driver-classpath-and-spark/m-p/197985#M76440</guid>
      <dc:creator>Vinitkumar</dc:creator>
      <dc:date>2018-03-26T14:46:18Z</dc:date>
    </item>
    <item>
      <title>Re: Spark-submit Options --jar, --spark-driver-classpath and spark.executor.extraClasspath</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-submit-Options-jar-spark-driver-classpath-and-spark/m-p/197986#M76441</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/14226/vinitpandey8623.html" nodeid="14226"&gt;@Vinitkumar Pandey&lt;/A&gt;&lt;P&gt;&lt;STRONG&gt;--driver-class-path&lt;/STRONG&gt; is used to mention "extra" jars to add to the "driver" of the spark job&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;--driver-library-path&lt;/STRONG&gt; is used to "change" the default library path for the jars needed for the spark driver&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;--&lt;/STRONG&gt;&lt;STRONG&gt;driver-class-path&lt;/STRONG&gt; will only push the jars to the driver machine. If you want to send the jars to "executors", you need to use &lt;STRONG&gt;--jar&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;Hope that helps!&lt;/P&gt;</description>
      <pubDate>Tue, 27 Mar 2018 06:40:34 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-submit-Options-jar-spark-driver-classpath-and-spark/m-p/197986#M76441</guid>
      <dc:creator>RahulSoni</dc:creator>
      <dc:date>2018-03-27T06:40:34Z</dc:date>
    </item>
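The distinction drawn in the reply above can be illustrated with a minimal spark-submit sketch. All jar names, paths, and class names here are hypothetical placeholders, not anything from the original thread:

```shell
# Hypothetical jars/paths for illustration only.

# Ship mylib.jar to the driver AND the executors, and add it to their classpaths:
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --jars /opt/libs/mylib.jar \
  --class com.example.Main \
  app.jar

# Add a jar that already exists on the driver machine to the driver's classpath only;
# executors never see it:
spark-submit \
  --driver-class-path /opt/libs/driver-only.jar \
  --class com.example.Main \
  app.jar

# Point the driver's native library path (java.library.path) at a directory of
# native libraries (.so/.dll) -- this is for native code, not jars:
spark-submit \
  --driver-library-path /opt/native/lib \
  --class com.example.Main \
  app.jar
```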
    <item>
      <title>Re: Spark-submit Options --jar, --spark-driver-classpath and spark.executor.extraClasspath</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-submit-Options-jar-spark-driver-classpath-and-spark/m-p/197987#M76442</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/14226/vinitpandey8623.html" nodeid="14226"&gt;@Vinitkumar Pandey&lt;/A&gt;&lt;P&gt;Did the answer help in the resolution of your query? Please close the thread by marking the answer as Accepted!&lt;/P&gt;</description>
      <pubDate>Sun, 01 Apr 2018 23:19:46 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-submit-Options-jar-spark-driver-classpath-and-spark/m-p/197987#M76442</guid>
      <dc:creator>RahulSoni</dc:creator>
      <dc:date>2018-04-01T23:19:46Z</dc:date>
    </item>
    <item>
      <title>Re: Spark-submit Options --jar, --spark-driver-classpath and spark.executor.extraClasspath</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-submit-Options-jar-spark-driver-classpath-and-spark/m-p/197988#M76443</link>
      <description>&lt;P&gt;I used --driver-class-path and tried two options:&lt;/P&gt;&lt;P&gt;1. passing an hdfs:// path - it did not work.&lt;/P&gt;&lt;P&gt;2. passing a local path, expecting Spark to copy it to the worker nodes - this also does not seem to work.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;The executors are working perfectly where I mentioned the same HDFS path using the --jars option. However, the driver is not getting a reference to this path.&lt;/P&gt;&lt;P&gt;This path is a directory where external, customizable configuration can be kept by a user who wishes to override the default settings shipped in our jars.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;For #2 I am planning to copy the directory to all worker nodes of the cluster and see if that does the trick. Shall update here.&lt;/P&gt;</description>
      <pubDate>Sat, 21 Jul 2018 01:20:10 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-submit-Options-jar-spark-driver-classpath-and-spark/m-p/197988#M76443</guid>
      <dc:creator>arun_n</dc:creator>
      <dc:date>2018-07-21T01:20:10Z</dc:date>
    </item>
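One commonly used workaround for the situation described in the post above is to let YARN do the distribution: in cluster mode, files shipped with --files (or --jars) are localized into each container's working directory, so the classpath can reference them relatively instead of via an hdfs:// entry. This is a hedged sketch; the file names are hypothetical:

```shell
# Hypothetical example: ship an override config file with the job and put the
# container working directory on the driver and executor classpaths, instead of
# passing an hdfs:// path to --driver-class-path (which the JVM cannot open).
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files /local/conf/app.properties \
  --conf spark.driver.extraClassPath=./ \
  --conf spark.executor.extraClassPath=./ \
  --class com.example.Main \
  app.jar
```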
    <item>
      <title>Re: Spark-submit Options --jar, --spark-driver-classpath and spark.executor.extraClasspath</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-submit-Options-jar-spark-driver-classpath-and-spark/m-p/197989#M76444</link>
      <description>&lt;P&gt;UPDATE:&lt;/P&gt;&lt;P&gt;=======&lt;/P&gt;&lt;P&gt;--driver-class-path worked when I passed a local path to it; however, it only worked once I had copied the path so it was available on all the nodes.&lt;/P&gt;&lt;P&gt;I wish Spark would fix this - or if there is any other alternative way, please do share. I wish it either accepted an HDFS path or at least did the copy automatically, as it does for the --jars option.&lt;/P&gt;</description>
      <pubDate>Sat, 21 Jul 2018 01:20:11 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-submit-Options-jar-spark-driver-classpath-and-spark/m-p/197989#M76444</guid>
      <dc:creator>arun_n</dc:creator>
      <dc:date>2018-07-21T01:20:11Z</dc:date>
    </item>
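The behaviour described in the update above comes down to who opens the path: --driver-class-path entries are handed straight to the driver JVM and must already exist wherever the driver starts, while --jars entries are fetched and distributed by Spark itself. A hedged sketch of the two cases, with hypothetical paths:

```shell
# --driver-class-path goes directly onto the driver JVM's classpath; in cluster
# mode the driver may start on any node, so /opt/libs/common.jar must already
# exist at that path on every node (hence the manual copy in the update above):
spark-submit \
  --deploy-mode cluster \
  --driver-class-path /opt/libs/common.jar \
  --class com.example.Main \
  app.jar

# --jars entries (including hdfs:// URIs) are resolved and shipped by Spark,
# so no manual copying is needed:
spark-submit \
  --deploy-mode cluster \
  --jars hdfs:///libs/common.jar \
  --class com.example.Main \
  app.jar
```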
    <item>
      <title>Re: Spark-submit Options --jar, --spark-driver-classpath and spark.executor.extraClasspath</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-submit-Options-jar-spark-driver-classpath-and-spark/m-p/197990#M76445</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/66220/rsoni.html"&gt;@Rahul Soni&lt;/A&gt; Hi, currently I am using a particular jar as follows: spark-shell --jars abc.jar.&lt;/P&gt;&lt;P&gt;Now I am trying to build a jar out of my code; what is the way to add this jar (abc.jar) to it?&lt;/P&gt;</description>
      <pubDate>Wed, 03 Apr 2019 12:16:27 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Spark-submit-Options-jar-spark-driver-classpath-and-spark/m-p/197990#M76445</guid>
      <dc:creator>bybirth9</dc:creator>
      <dc:date>2019-04-03T12:16:27Z</dc:date>
    </item>
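For the question in the last post, the usual pattern is: the self-built application jar is passed as the final positional argument to spark-submit, and abc.jar stays on --jars as an extra dependency. A hedged sketch; the build tool output path and class name are hypothetical:

```shell
# After building the project jar (e.g. with sbt package), submit it as the
# application and keep abc.jar as a dependency on --jars:
spark-submit \
  --class com.example.MyApp \
  --jars abc.jar \
  target/scala-2.11/myapp_2.11-0.1.jar

# For interactive use, spark-shell takes a comma-separated --jars list:
spark-shell --jars abc.jar,target/scala-2.11/myapp_2.11-0.1.jar
```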
  </channel>
</rss>

