Support Questions


How to import external libraries for the Livy interpreter using Zeppelin (YARN cluster mode)?

Explorer

I don't have any problem importing external libraries for the Spark interpreter using SPARK_SUBMIT_OPTIONS.
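
For context, a minimal sketch of the Spark-interpreter approach referred to above, set in conf/zeppelin-env.sh; the jar path is only a placeholder:

# conf/zeppelin-env.sh -- pass local jars to the Spark interpreter via spark-submit
export SPARK_SUBMIT_OPTIONS="--jars /path/to/your-lib.jar"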

This method doesn't work with the Livy interpreter.

What is the best way to import an external library for the Livy interpreter in Zeppelin?

I would prefer to import local JARs without having to use remote repositories.

Thank you in advance.

1 ACCEPTED SOLUTION


You can load dynamic libraries into the Livy interpreter by setting the livy.spark.jars.packages property to a comma-separated list of Maven coordinates of jars to include on the driver and executor classpaths. The format for the coordinates should be groupId:artifactId:version.

Example:

Property: livy.spark.jars.packages
Example value: io.spray:spray-json_2.10:1.3.1
Description: Adding extra libraries to the Livy interpreter

https://zeppelin.apache.org/docs/0.7.0-SNAPSHOT/interpreter/livy.html#adding-external-libraries
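
Once the property is set and the Livy interpreter restarted, a quick sanity check is to run a paragraph against the new package. A minimal sketch, assuming the io.spray:spray-json_2.10:1.3.1 example coordinate from the table above:

%livy
// This only compiles and runs if the dynamically loaded spray-json package reached the classpath.
import spray.json._
val parsed = """{"interpreter": "livy"}""".parseJson
println(parsed)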


17 REPLIES

Expert Contributor

@Mickaël GERVAIS check to make sure the Livy interpreter is listed in the interpreter bindings for the notebook. Also, set DEBUG on the Livy server and check the livy out file produced on the server. Finally, make sure you have restarted Livy and Zeppelin to pick up the changes. I tested this and it worked for me.
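
If it helps, DEBUG logging is usually enabled by editing the Livy server's log4j.properties and restarting Livy. A sketch, assuming the stock file; its location depends on your install:

# log4j.properties on the Livy server
# raise the root category from INFO to DEBUG, then restart Livy
log4j.rootCategory=DEBUG, console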

Explorer

@Ian Roberts I'm sorry, but I cannot make it work.

Here is my error:

import org.apache.commons.codec.binary.Base64
import java.time.LocalDateTime.now
import com.mongodb.BasicDBObjectBuilder.start

import org.apache.commons.codec.binary.Base64
import java.time.LocalDateTime.now
<console>:35: error: object mongodb is not a member of package com
         import com.mongodb.BasicDBObjectBuilder.start

The jar mongo-java-driver-2.14.3.jar is present in:

  • livy-server/repl-jars
  • hdfs:///user/mgervais/.sparkStaging/application_1481647493263_0001

And the Spark UI shows these properties:

  • spark.yarn.secondary.jars : ...mongo-java-driver-2.14.3.jar...
  • spark.jars : ...file:/usr/hdp/current/livy-server/repl-jars/mongo-java-driver-2.14.3.jar...

I have the Livy interpreter enabled in the Zeppelin notebook (it works, I can see the Livy sessions...).

I've restarted the full cluster with Ambari...

Thanks...

Expert Contributor

This does not look like an issue with the jar being included, but rather an issue with the import statement. I briefly looked on Google and saw similar descriptions suggesting to try org.mongodb. I would focus on the import statement more than on the inclusion of the jar for Livy.
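
One quick way to separate the two possibilities is to check whether the driver class is visible on the session's classpath at all, before looking at the import syntax. A minimal sketch, using the class name from the error above:

%livy
// Throws ClassNotFoundException if the jar never reached the classpath;
// succeeds if the problem is only with the import statement.
Class.forName("com.mongodb.BasicDBObjectBuilder")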

Explorer

Okay, but I have the same issue with other imports that are not part of the native libraries.

My own jar cannot be included either...

Is this a problem with the Zeppelin notebook?

Explorer

Hi,

I'm using this version of HDP: /usr/hdp/2.5.0.0-1245/

Is this the reason? Should I upgrade my stack?

New Contributor

Hi all,

Has anyone succeeded in using Livy with a custom Maven repository?

New Contributor

Hello @Laurence Da Luz,

I am trying to use CRAN packages in a notebook. I downloaded the packages and installed them on the server, but somehow the Zeppelin notebook does not pick up the package. For instance:

%livy.sparkr

library(data.table)

returns an error in the notebook.

I hope you can assist with that.

Rachna

Super Collaborator

@Rachna Dhand I know you must be way past this issue, but you have to install the packages on all NodeManager nodes as root so they are available to all users. Maybe this will help someone else in the future.