When using spark-shell with the --packages option (e.g. for Databricks packages), Spark downloads the package libraries from a Maven repository over the internet.
In offline mode, that is not possible.
How can I change or add the Spark package repository?
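If the machine can reach an internal Maven mirror (just not the public internet), spark-shell also accepts a --repositories flag that points --packages at additional repositories. A sketch, where the mirror URL and the package coordinate are only examples:

```shell
# Resolve --packages against an internal mirror instead of Maven Central.
# The repository URL and package coordinate below are hypothetical examples.
bin/spark-shell \
  --repositories http://nexus.example.com/repository/maven-public \
  --packages com.databricks:spark-csv_2.11:1.5.0
```

For a fully offline machine, pre-fetching the jars and passing them with --jars (as in the answer below) avoids resolution entirely.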
In my case, when spark-shell runs with --packages (online), it prints where Ivy caches the downloaded jars:

```
Ivy Default Cache set to: /data/user/flamingo/.ivy2/cache
The jars for the packages stored in: /data/user/flamingo/.ivy2/jars
```

I navigated to /data/user/flamingo/.ivy2/jars, copied all the jars there into the directory where I wanted to keep them, and then launched Spark with --jars instead of --packages:

```
SPARK_MAJOR_VERSION=2 bin/spark-shell --jars="/path/to/jars"
```

This worked for me.
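One caveat: --jars expects a comma-separated list of jar files rather than a bare directory, so it is safer to build that list from the directory contents. A minimal sketch (the directory and jar names below are hypothetical, created only for demonstration):

```shell
#!/bin/sh
# Demonstrate building a comma-separated jar list for spark-shell --jars.
# JAR_DIR and the jar files are stand-ins for your copied .ivy2 jars.
JAR_DIR=$(mktemp -d)
touch "$JAR_DIR/spark-csv_2.11-1.5.0.jar" "$JAR_DIR/commons-csv-1.1.jar"

# Join all jar paths in the directory with commas.
JARS=$(ls "$JAR_DIR"/*.jar | paste -sd, -)
echo "$JARS"

# The list can then be passed to spark-shell, e.g.:
# SPARK_MAJOR_VERSION=2 bin/spark-shell --jars="$JARS"
```

This keeps the invocation correct even as jars are added to or removed from the directory.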