Created on 01-27-2021 02:19 AM - edited 01-27-2021 04:29 AM
Hello,
I entered the following two additional libraries in livy.spark.jars.packages (Livy interpreter in Zeppelin 0.8.2):
com.exasol:exasol-jdbc:7.0.4,com.exasol:spark-connector_2.11:0.3.2
While com.exasol:spark-connector_2.11:0.3.2 is found, Livy/Zeppelin cannot resolve com.exasol:exasol-jdbc:7.0.4.
The reason is that Zeppelin/Livy only searches the default repositories (Maven Central at https://repo1.maven.org/maven2/ plus spark-packages, as the log below shows).
However, com.exasol:exasol-jdbc:7.0.4 is only published at https://maven.exasol.com/artifactory/exasol-releases/
We already tried adding the Exasol repository via the zeppelin.interpreter.dep.mvnRepo property on the server and via "Repository" on the interpreter settings page, but neither worked.
How can I tell Livy/Zeppelin to search multiple Maven repositories?
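For reference, this is the behaviour I am after at the spark-submit level. This is only a sketch of my understanding, not a command from our setup: plain spark-submit accepts a --repositories flag with a comma-separated list of extra Maven repositories to use when resolving --packages, and presumably Livy would need to pass the equivalent for its sessions (the application file name below is just a placeholder):

```shell
# Sketch: spark-submit resolves --packages against Maven Central plus any
# repositories listed here, which is exactly what fails through Livy.
# "my_app.py" is a placeholder; a Livy session has no such file.
spark-submit \
  --packages com.exasol:exasol-jdbc:7.0.4,com.exasol:spark-connector_2.11:0.3.2 \
  --repositories https://maven.exasol.com/artifactory/exasol-releases/ \
  my_app.py
```

The open question is which Livy or Zeppelin property, if any, maps onto that flag.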
stderr:
WARNING: User-defined SPARK_HOME (/opt/cloudera/parcels/SPARK2-2.4.0.cloudera2-1.cdh5.13.3.p0.1041012/lib/spark2) overrides detected (/opt/cloudera/parcels/SPARK2/lib/spark2).
WARNING: Running spark-class from user-defined location.
Ivy Default Cache set to: /home/hue0s01u/.ivy2/cache
The jars for the packages stored in: /home/hue0s01u/.ivy2/jars
:: loading settings :: url = jar:file:/opt/cloudera/parcels/SPARK2-2.4.0.cloudera2-1.cdh5.13.3.p0.1041012/lib/spark2/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.ibm.informix#jdbc added as a dependency
com.exasol#exasol-jdbc added as a dependency
com.exasol#spark-connector_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-68cee486-037e-4ec9-a90a-42513e89986b;1.0
confs: [default]
found com.ibm.informix#jdbc;4.50.4.1 in central
found org.mongodb#bson;3.8.0 in central
found com.exasol#spark-connector_2.11;0.3.2 in central
:: resolution report :: resolve 820ms :: artifacts dl 5ms
:: modules in use:
com.exasol#spark-connector_2.11;0.3.2 from central in [default]
com.ibm.informix#jdbc;4.50.4.1 from central in [default]
org.mongodb#bson;3.8.0 from central in [default]
:: evicted modules:
com.exasol#exasol-jdbc;7.0.0 by [com.exasol#exasol-jdbc;7.0.4] in [default]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 5 | 0 | 0 | 1 || 3 | 0 |
---------------------------------------------------------------------
:: problems summary ::
:::: WARNINGS
module not found: com.exasol#exasol-jdbc;7.0.4
==== local-m2-cache: tried
file:/home/hue0s01u/.m2/repository/com/exasol/exasol-jdbc/7.0.4/exasol-jdbc-7.0.4.pom
-- artifact com.exasol#exasol-jdbc;7.0.4!exasol-jdbc.jar:
file:/home/hue0s01u/.m2/repository/com/exasol/exasol-jdbc/7.0.4/exasol-jdbc-7.0.4.jar
==== local-ivy-cache: tried
/home/hue0s01u/.ivy2/local/com.exasol/exasol-jdbc/7.0.4/ivys/ivy.xml
-- artifact com.exasol#exasol-jdbc;7.0.4!exasol-jdbc.jar:
/home/hue0s01u/.ivy2/local/com.exasol/exasol-jdbc/7.0.4/jars/exasol-jdbc.jar
==== central: tried
https://repo1.maven.org/maven2/com/exasol/exasol-jdbc/7.0.4/exasol-jdbc-7.0.4.pom
-- artifact com.exasol#exasol-jdbc;7.0.4!exasol-jdbc.jar:
https://repo1.maven.org/maven2/com/exasol/exasol-jdbc/7.0.4/exasol-jdbc-7.0.4.jar
==== spark-packages: tried
https://dl.bintray.com/spark-packages/maven/com/exasol/exasol-jdbc/7.0.4/exasol-jdbc-7.0.4.pom
-- artifact com.exasol#exasol-jdbc;7.0.4!exasol-jdbc.jar:
https://dl.bintray.com/spark-packages/maven/com/exasol/exasol-jdbc/7.0.4/exasol-jdbc-7.0.4.jar
::::::::::::::::::::::::::::::::::::::::::::::
:: UNRESOLVED DEPENDENCIES ::
::::::::::::::::::::::::::::::::::::::::::::::
:: com.exasol#exasol-jdbc;7.0.4: not found
::::::::::::::::::::::::::::::::::::::::::::::
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.exasol#exasol-jdbc;7.0.4: not found]
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1306)
at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:54)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:315)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:143)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
YARN Diagnostics:
spark-submit start failed
at org.apache.zeppelin.livy.BaseLivyInterpreter.createSession(BaseLivyInterpreter.java:354)
at org.apache.zeppelin.livy.BaseLivyInterpreter.initLivySession(BaseLivyInterpreter.java:209)
at org.apache.zeppelin.livy.LivySharedInterpreter.open(LivySharedInterpreter.java:59)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
at org.apache.zeppelin.livy.BaseLivyInterpreter.getLivySharedInterpreter(BaseLivyInterpreter.java:190)
at org.apache.zeppelin.livy.BaseLivyInterpreter.open(BaseLivyInterpreter.java:163)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:616)
at org.apache.zeppelin.scheduler.Job.run(Job.java:188)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:140)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Regards,
Nicola