Created on 05-15-2017 06:41 AM - edited 08-17-2019 05:48 PM
When using spark-shell with the --packages option (for example, to pull in Databricks packages), Spark downloads the package libraries from a Maven repository on the internet.
But in an offline environment, that does not work.
How can I change or add a Spark package repository?
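For what it's worth, a minimal sketch of one option: spark-shell accepts a --repositories flag with a comma-separated list of extra repositories that --packages resolves against, so an internal mirror can stand in for the public internet. The Nexus URL and package coordinates below are hypothetical examples, not something from this thread:

# Resolve --packages against an internal Maven mirror instead of the public internet
# (http://nexus.internal:8081/repository/maven-public/ is a hypothetical URL)
SPARK_MAJOR_VERSION=2 bin/spark-shell \
  --packages com.databricks:spark-csv_2.11:1.5.0 \
  --repositories http://nexus.internal:8081/repository/maven-public/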
Created 05-15-2017 07:01 AM
Hm, I just fixed this issue by passing multiple jars with the --jars option.
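For reference, a minimal sketch of that approach (the paths and jar names below are hypothetical): --jars takes a comma-separated list of local jar files, so the package jar and its dependencies can be passed explicitly without any repository access.

# Pass the package jar and its dependencies directly; no repository needed
# (/opt/jars and the jar names are hypothetical examples)
SPARK_MAJOR_VERSION=2 bin/spark-shell \
  --jars /opt/jars/spark-csv_2.11-1.5.0.jar,/opt/jars/commons-csv-1.1.jar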
Created 05-17-2017 08:28 PM
Did you try the multiple jars option?
Created 05-17-2017 11:12 PM
Yes I did.
Created 08-11-2017 01:12 AM
In my case, I navigated to the folder /data/user/flamingo/.ivy2/jars, which spark-shell reports when it resolves packages:
...
Ivy Default Cache set to: /data/user/flamingo/.ivy2/cache
The jars for the packages stored in: /data/user/flamingo/.ivy2/jars
...
Then copy all the jars from there to the directory where you want to store them, and execute the Spark command like:
SPARK_MAJOR_VERSION=2 bin/spark-shell --jars="/path/to/jars"
Then the result seems to work!
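One note on that command: --jars expects a comma-separated list of jar files rather than a bare directory, so one way to pass everything copied into that directory is to let the shell build the list (a sketch; /path/to/jars is the placeholder from the command above):

# Build a comma-separated list from every jar in the directory
SPARK_MAJOR_VERSION=2 bin/spark-shell \
  --jars "$(echo /path/to/jars/*.jar | tr ' ' ',')"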