Created 03-26-2018 07:46 AM
Hi,
1- I am confused about the difference between --driver-class-path and --driver-library-path. Please help me understand the difference between these two.
2- I am a bit new to Scala. Can you please help me understand the difference between a class path and a library path? In the end, both require a jar path to be set.
3- If I add extra dependencies with the --jars option, do I still need to separately pass the jar path with --driver-class-path and spark.executor.extraClassPath?
Created 03-26-2018 11:40 PM
--driver-class-path is used to add "extra" jars to the classpath of the "driver" of the Spark job.
--driver-library-path is used to "change" the native library search path (java.library.path) for the Spark driver; it is for native libraries, not jars.
--driver-class-path only affects the driver's classpath; it does not ship any files anywhere. If you want the jars distributed to the "executors" as well, you need to use --jars.
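For example, a submit command combining the three options might look like the sketch below (the application jar, class name, and paths are made up for illustration):

# --jars ships the listed jars to both the driver and the executors;
# --driver-class-path prepends entries to the driver's classpath only (no copying);
# --driver-library-path sets the native library search path for the driver.
spark-submit \
  --master yarn \
  --class com.example.MyApp \
  --jars /opt/libs/abc.jar \
  --driver-class-path /opt/libs/abc.jar \
  --driver-library-path /opt/native/lib \
  my-app.jar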
Hope that helps!
Created 04-01-2018 04:19 PM
Did the answer help resolve your query? Please close the thread by marking the answer as Accepted!
Created 07-20-2018 06:20 PM
I had used --driver-class-path and tried two options:
1. Passing an hdfs:// path, but it did not work.
2. Passing a local path, expecting Spark to copy it to the slave nodes; this also does not seem to work.
The executors are working perfectly where I mentioned the same HDFS path using the --jars option.
However, the driver is not picking up this path.
The path is a directory where a user who wishes to override the default settings shipped in our jars can keep external, customizable configuration files.
For #2, I am planning to copy the directory to all slave nodes of the cluster and see if that does the trick. Shall update here.
Created 07-20-2018 06:20 PM
UPDATE:
=======
--driver-class-path worked when I passed a local path to it; however, it only worked after I copied the directory to the same path on all the nodes.
I wish Spark would fix this, or if there is another alternative, please do share. It should either accept an HDFS path or at least do the copy automatically, as it does for the --jars option.
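In case it helps others, the combination that ended up working for me looks roughly like this (a sketch; the paths and jar name are illustrative):

# --jars accepts an hdfs:// path and Spark distributes the jar itself;
# --driver-class-path does no copying, so /etc/myapp/conf must already
# exist locally on whichever node ends up running the driver.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --jars hdfs:///libs/abc.jar \
  --driver-class-path /etc/myapp/conf \
  my-app.jar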
Created 04-03-2019 05:16 AM
@Rahul Soni Hi, currently I am using a particular jar as follows: spark-shell --jars abc.jar.
Now I am trying to build a jar out of my code. What is the way to add this jar (abc.jar)?
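If you submit the jar you build with spark-submit, you can keep passing abc.jar the same way as with spark-shell. A minimal sketch, assuming a hypothetical main class and application jar name:

# abc.jar is distributed to the driver and executors alongside your application jar
spark-submit \
  --class com.example.Main \
  --jars abc.jar \
  my-app.jar

Alternatively, you can bundle abc.jar into a single fat/uber jar (for example with the sbt-assembly plugin or the Maven shade plugin), in which case no --jars flag is needed.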