
Adding jars to spark lib


New Contributor

I have a few external jars, such as elasticsearch-spark_2.10-2.4.0.jar. Currently I use the --jars option to load them for spark-shell. Is there a way to have these and other jars load with Spark across my cluster? I see there is spark-defaults through Ambari, but I was wondering whether I could just copy X.jar to /usr/hdp/<version>/spark/lib and have it picked up.
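A minimal sketch of the spark-defaults route, assuming an HDP-style layout (the path below is an assumption; point it at wherever the jar actually lives):

    # spark-defaults.conf (editable in Ambari under Custom spark-defaults)
    # Assumed path -- adjust to your installation
    spark.driver.extraClassPath    /usr/hdp/<version>/spark/lib/elasticsearch-spark_2.10-2.4.0.jar
    spark.executor.extraClassPath  /usr/hdp/<version>/spark/lib/elasticsearch-spark_2.10-2.4.0.jar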

A somewhat related side question, from the same command line: I also pass "--packages com.databricks:spark-avro_2.10:2.0.1". I notice that the first time this runs, Spark goes out and grabs the jars the way Maven would, but I could not find them afterwards. Do I need this argument every time, or can I get the Databricks libraries installed permanently, as with elasticsearch?
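On the side question: --packages resolves artifacts through Ivy, and by default they are cached under ~/.ivy2 (look in ~/.ivy2/jars), which may be why they were hard to find. If your Spark version supports it, a hedged sketch of making the dependency permanent via spark-defaults instead of repeating the flag:

    # spark.jars.packages is the config equivalent of --packages;
    # coordinates are resolved and cached via Ivy on first use
    spark.jars.packages  com.databricks:spark-avro_2.10:2.0.1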

Thanks, Mike

2 REPLIES

Re: Adding jars to spark lib

New Contributor

@Mike Krauss Did you try running spark-shell --master yarn --conf "spark.executor.extraClassPath=/usr/hdp/current/spark/lib/test.jar" --conf "spark.driver.extraClassPath=/usr/hdp/current/spark/lib/test.jar"? If this works, you can add these properties from Ambari under Custom spark-defaults.
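For readability, the same suggestion as a command block; test.jar and the path are placeholders (on some HDP versions the directory is spark-client rather than spark):

    spark-shell --master yarn \
      --conf "spark.executor.extraClassPath=/usr/hdp/current/spark/lib/test.jar" \
      --conf "spark.driver.extraClassPath=/usr/hdp/current/spark/lib/test.jar"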

Re: Adding jars to spark lib

New Contributor

Hi,

Currently I am using a particular jar as follows: spark-shell --jars abc.jar. Now I am building a jar out of my own code; what is the way to add this jar (abc.jar)?
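A minimal sketch, assuming the built jar ends up at target/abc.jar (a hypothetical path) and that the main class is com.example.Main (also hypothetical):

    # Interactive shell with the locally built jar on the classpath
    spark-shell --jars target/abc.jar

    # Or submit the application jar directly
    spark-submit --class com.example.Main --master yarn target/abc.jar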