@Mickaël GERVAIS check to make sure livy interpreter is listed in the interpreter bindings for the notebook. Also, set DEBUG on the livy server and check in the livy out file produced on the server. Finally, make sure you have restarted livy and zeppelin to pick up the changes. I tested and it did work for me.
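For anyone following along, the "set DEBUG on the livy server" step usually means raising the log level in Livy's log4j configuration. A minimal sketch, assuming a stock log4j.properties (the conf directory and appender name may differ on your install, so keep whatever appender your existing file already uses and only change the level):

```properties
# log4j.properties in Livy's conf directory (often /etc/livy/conf on HDP).
# Raise the root level from INFO to DEBUG; keep the existing appender name.
log4j.rootCategory=DEBUG, console
```

After editing, restart Livy and watch the livy .out/log file on the server for the session's classpath and spark-submit arguments.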
@Ian Roberts I'm sorry, but I cannot make it work.
Here is my error:
import org.apache.commons.codec.binary.Base64
import java.time.LocalDateTime.now
import com.mongodb.BasicDBObjectBuilder.start
import org.apache.commons.codec.binary.Base64
import java.time.LocalDateTime.now
<console>:35: error: object mongodb is not a member of package com
       import com.mongodb.BasicDBObjectBuilder.start
The jar mongo-java-driver-2.14.3.jar is present in:
And the Spark UI shows these properties:
I have the livy interpreter enabled in the Zeppelin notebook (it works; I can see the livy sessions...).
I've restarted the full cluster with Ambari...
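For reference, Zeppelin's livy interpreter forwards any property prefixed with `livy.spark.` into the Spark session it creates, so the jar can be shipped from the interpreter settings rather than relying on it being on the server's classpath. A sketch (the jar path is a placeholder, substitute the actual location on your cluster nodes):

```properties
# Zeppelin > Interpreter > livy settings.
# livy.spark.* properties are passed through to the session's Spark conf:
livy.spark.jars = /path/to/mongo-java-driver-2.14.3.jar
```

Restart the livy interpreter (or the Zeppelin notebook's session) after changing this, then confirm the jar appears under the session's Environment tab in the Spark UI.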
This does not look like an issue with the jar being included, but rather an issue with the import statement itself. I briefly searched and saw similar reports suggesting you try org.mongodb. I would focus on the import statement more than on the inclusion of the jar for livy.
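One quick way to tell the two failure modes apart (jar not shipped to the session vs. a wrong package name in the import) is to probe the classpath directly from a notebook paragraph. A minimal sketch using only the JDK; the class names below are just illustrations:

```scala
// Check whether a class is visible on the current classpath.
// If the Mongo driver class resolves, the jar is shipped and the
// problem is elsewhere; if it does not, the jar never reached the session.
import scala.util.Try

def onClasspath(className: String): Boolean =
  Try(Class.forName(className)).isSuccess

// A JDK class is always present:
println(onClasspath("java.util.Base64"))                  // true
// This is only true once the driver jar is on the session's classpath:
println(onClasspath("com.mongodb.BasicDBObjectBuilder"))
```

Running this in the livy paragraph that fails makes the diagnosis unambiguous, since it bypasses the import syntax entirely.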
Okay, but I have the same issue with other imports that are not part of native libraries.
My own jar cannot be included either...
Is this a problem with Zeppelin notebook?
I'm using this version of HDP: /usr/hdp/188.8.131.52-1245/
Is this the reason? Should I upgrade my stack?
Has anyone succeeded in using Livy with a custom maven repository?
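I have not verified this end to end, but since `livy.spark.*` properties map onto the session's Spark conf, a custom Maven repository should in principle be reachable through Spark's own dependency-resolution settings. A sketch with hypothetical coordinates and repository URL (behavior depends on your Spark version):

```properties
# Zeppelin livy interpreter settings -- resolve the driver from Maven
# instead of pointing at a local jar:
livy.spark.jars.packages = org.mongodb:mongo-java-driver:2.14.3
# Additional remote repository to search (hypothetical URL):
livy.spark.jars.repositories = https://repo.example.com/maven
```

The Livy server host needs network access to the repository for the resolution to succeed.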
Hello @Laurence Da Luz,
I am trying to run CRAN packages in a notebook. I downloaded the packages and installed them on the server, but somehow the Zeppelin notebook does not pick up the package; loading it, for instance,
returns an error in the notebook.
Hope you can assist in that.
@Rachna Dhand I know you must be way past this issue, but -- You have to install the packages on all NodeManager nodes as root so they are available to all users. Maybe this will help someone else in the future.
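To make the advice above concrete, here is a rough sketch of installing a CRAN package on every NodeManager node as root. The hostnames and package name are placeholders; the loop prints each command (dry run) rather than executing it, so you can review before running them:

```shell
# Hypothetical NodeManager hosts -- substitute your actual node list.
NODES="nm1.example.com nm2.example.com"
# Package the notebook needs; installing as root puts it in the
# site-wide library path, visible to all users.
PKG="data.table"
for host in $NODES; do
  CMD="ssh root@$host Rscript -e 'install.packages(\"$PKG\", repos=\"https://cloud.r-project.org\")'"
  echo "$CMD"   # dry run: print each command instead of executing it
done
```

Drop the `echo` (or pipe the output to `sh`) once the commands look right, then restart the Zeppelin interpreter so the notebook picks up the new library.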