07-20-2018 06:20 PM
UPDATE: --driver-class-path worked once I passed it a local path. However, it only worked after I had copied that path to every node in the cluster. I wish Spark would fix this; if anyone knows an alternative, please share. Ideally it would either accept an HDFS path or copy the directory automatically, the way the --jars option does.
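For reference, a minimal sketch of the setup described above, assuming YARN cluster mode; the paths, jar name, and main class are placeholders, not taken from the post:

```
# The config directory must already exist at the same local path on every
# node, because Spark does not distribute --driver-class-path entries.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.Main \
  --jars hdfs:///apps/myapp/extra.jar \
  --driver-class-path /opt/myapp/conf \
  myapp.jar
```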
07-20-2018 06:20 PM
I tried --driver-class-path with two options:
1. An hdfs:// path, but that did not work.
2. A local path, expecting Spark to copy it to the worker nodes, but that also did not work.
The executors pick up the same HDFS path perfectly when it is passed via the --jars option; the driver, however, never gets a reference to it. The path is a directory of external, customizable configuration files that a user can place there to override the default settings shipped in our jars. For #2, I plan to copy the directory to all worker nodes of the cluster and see whether that does the trick; I will post an update here.
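A hedged sketch of the two attempts described above, again assuming YARN cluster mode; every path, jar name, and class name is a placeholder:

```
# Attempt 1: HDFS path on the driver classpath. This fails because
# --driver-class-path is handed straight to the driver JVM's classpath,
# which only understands local filesystem paths.
spark-submit --master yarn --deploy-mode cluster \
  --class com.example.Main \
  --jars hdfs:///apps/myapp/extra.jar \
  --driver-class-path hdfs:///apps/myapp/conf \
  myapp.jar

# Attempt 2: local path on the driver classpath. This fails in cluster mode
# unless the directory already exists on whichever node ends up running the
# driver, since Spark does not copy --driver-class-path entries around the
# way it distributes --jars.
spark-submit --master yarn --deploy-mode cluster \
  --class com.example.Main \
  --jars hdfs:///apps/myapp/extra.jar \
  --driver-class-path /opt/myapp/conf \
  myapp.jar
```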