
external jars not getting picked up in zeppelin or cli in spark


I am trying to use the MySQL JDBC jar in Zeppelin and in the Spark CLI, and I am getting errors.

%dep

z.load("/var/lib/ambari-server/resources/mysql-connector-java-5.1.17.jar")

val url="jdbc:mysql://localhost:3306/hive"

val prop = new java.util.Properties

prop.setProperty("user","root")

prop.setProperty("password","****")

val people = sqlContext.read.jdbc(url,"version",prop)

But I get an exception:

java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost:3306/hive

at java.sql.DriverManager.getConnection(DriverManager.java:596)

at java.sql.DriverManager.getConnection(DriverManager.java:187)
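This exception means `java.sql.DriverManager` has no registered driver that accepts a `jdbc:mysql` URL, i.e. the connector class was never loaded on the driver side. A minimal sketch reproducing the error outside Spark/Zeppelin (plain JVM code, assuming the MySQL connector jar is absent from the classpath):

```scala
import java.sql.{DriverManager, SQLException}

// With no MySQL connector jar on the classpath, DriverManager finds no
// registered driver that accepts a jdbc:mysql URL and throws the same
// SQLException that Zeppelin reports.
object NoSuitableDriverDemo {
  def main(args: Array[String]): Unit = {
    try {
      DriverManager.getConnection("jdbc:mysql://localhost:3306/hive")
      println("connected: a MySQL driver was on the classpath")
    } catch {
      case e: SQLException => println(e.getMessage)
    }
  }
}
```

So the question is not whether the jar exists on disk, but whether it is visible to the classloader of the JVM that opens the connection.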

I also tried the CLI approach, registering the jar as described in this blog: http://hortonworks.com/hadoop/zeppelin/#section_3

When you have a jar on the node where Zeppelin is running, the following approach can be useful: add the spark.files property in SPARK_HOME/conf/spark-defaults.conf, for example:

spark.files /path/to/my.jar

This is my spark-defaults.conf, which I modified using Ambari:

spark.driver.extraJavaOptions -Dhdp.version=2.3.2.0-2950

spark.files /var/lib/ambari-server/resources/mysql-connector-java-5.1.17.jar

spark.history.kerberos.keytab none

When I run the same code I get the same error as above: java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost:3306/hive

The jar file does exist at that path: /var/lib/ambari-server/resources/mysql-connector-java-5.1.17.jar

[root@sandbox conf]# find / -iname "mysql-connector-java*"

/usr/hdp/2.3.2.0-2950/sqoop/lib/mysql-connector-java.jar

/usr/hdp/2.3.2.0-2950/hive/lib/mysql-connector-java.jar

/usr/hdp/2.3.2.0-2950/hbase/lib/mysql-connector-java.jar

/usr/hdp/2.3.2.0-2950/knox/ext/mysql-connector-java.jar

/usr/hdp/2.3.2.0-2950/hadoop/lib/mysql-connector-java.jar

/usr/hdp/2.3.2.0-2950/hadoop-yarn/lib/mysql-connector-java.jar

/usr/hdp/2.3.2.0-2950/ranger-admin/ews/lib/mysql-connector-java.jar

/usr/share/java/mysql-connector-java-5.1.17.jar

/usr/share/java/mysql-connector-java-5.1.31-bin.jar

/usr/share/java/mysql-connector-java.jar

/var/lib/ambari-server/resources/mysql-connector-java-5.1.17.jar

/var/lib/ambari-agent/tmp/mysql-connector-java.jar

/etc/maven/fragments/mysql-connector-java
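One possible explanation for why the spark-defaults.conf change did not help (an assumption based on Spark's documented configuration semantics, not something confirmed in this thread): spark.files only ships a file to each executor's working directory; it does not put the jar on the driver's classpath, which is where java.sql.DriverManager looks for driver classes. spark.jars (or the --jars flag) adds the jar to both the driver and executor classpaths, and spark.driver.extraClassPath is the most direct way to prepend it to the driver's classpath. A sketch of the alternative settings:

```
# spark-defaults.conf: put the connector on the classpath, not just the file list
spark.jars                   /var/lib/ambari-server/resources/mysql-connector-java-5.1.17.jar
spark.driver.extraClassPath  /var/lib/ambari-server/resources/mysql-connector-java-5.1.17.jar

# or, when launching the shell directly:
# spark-shell --jars /var/lib/ambari-server/resources/mysql-connector-java-5.1.17.jar
```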

1 ACCEPTED SOLUTION

Master Mentor

@azeltov@hortonworks.com This is good information.

Apparently this is not a well-documented feature of Spark (and not an issue with Zeppelin itself). Here is the code that works for me and solves a similar issue:

%dep
z.load("mysql:mysql-connector-java:5.1.35")

and then

val driver = "com.mysql.jdbc.Driver"
val url = "jdbc:mysql://address=(protocol=tcp)(host=localhost)(port=3306)(user=...)(password=...)/dbname"

val jdbcDF = sqlc.load("jdbc", Map(
  "url" -> url,
  "driver" -> driver,
  "dbtable" -> "table1"))

jdbcDF.registerTempTable("table1")

Let me know if this helps!
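The sqlc.load("jdbc", Map(...)) call above is the Spark 1.3-era API. On Spark 1.4+ the same read can be written with the DataFrameReader API; this is a sketch under that assumption (it needs a running Spark context and a reachable MySQL server, and reuses the url/driver values and the table1 name from the snippet above):

```scala
// Equivalent read with the Spark 1.4+ DataFrameReader API.
// "sqlContext" is the SQLContext Zeppelin provides; url and driver as above.
val jdbcDF = sqlContext.read
  .format("jdbc")
  .option("url", url)
  .option("driver", driver)
  .option("dbtable", "table1")
  .load()

jdbcDF.registerTempTable("table1")
```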


3 REPLIES


There might be more needed here. See this PR for a JDBC interpreter for MySQL: https://github.com/apache/incubator-zeppelin/pull/60



Also see the "Import external library" section of the Zeppelin Tech Preview: http://hortonworks.com/hadoop-tutorial/apache-zeppelin/