Created on 05-26-2017 03:41 PM - edited 08-17-2019 12:45 PM
I. Environment details
- HDP 2.5.x
- HDP 2.6.x
- Kerberos enabled
II. Steps to follow
a) JDBC interpreter
In the Zeppelin UI, configure the JDBC interpreter with the following properties:
- phoenix.driver = org.apache.phoenix.jdbc.PhoenixDriver
- phoenix.hbase.client.retries.number = 1
- phoenix.password = (empty)
- phoenix.url = jdbc:phoenix:dkhdp262.openstacklocal,dkhdp261.openstacklocal,dkhdp263.openstacklocal:/hbase-secure
- phoenix.user = phoenixuser
- zeppelin.jdbc.auth.type = KERBEROS
- zeppelin.jdbc.keytab.location = /etc/security/keytabs/zeppelin.server.kerberos.keytab
- zeppelin.jdbc.principal = zeppelin-dkhdp26@SUPPORT.COM

Artifacts (interpreter dependency):
- /usr/hdp/current/phoenix-client/phoenix-client.jar
b) Zeppelin notebook
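Once the properties above are saved and the interpreter is restarted, a notebook paragraph can query Phoenix through the prefixed form of the JDBC interpreter. The sketch below is only illustrative; WEB_STAT is a hypothetical table name, not something defined in this article.

```sql
%jdbc(phoenix)
-- WEB_STAT is a hypothetical table; replace it with an existing Phoenix table
SELECT * FROM WEB_STAT LIMIT 10
```

Because zeppelin.jdbc.auth.type is set to KERBEROS, the interpreter logs in with the configured keytab and principal before opening the Phoenix connection, so no manual kinit should be needed inside the note.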
Created on 07-12-2017 06:35 PM
On HDP 2.6, appending $CLASSPATH seems to break the Spark2 interpreter with:
"org.apache.zeppelin.interpreter.InterpreterException: Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;"
Is the included Phoenix-Spark driver (phoenix-spark-4.7.0.2.6.1.0-129.jar) certified to work with Spark2? I thought it was the preferred way rather than going through JDBC.
Thanks!
Created on 07-13-2017 04:02 AM
Good catch!!! Just updated.
The Phoenix jar here is meant to work with the JDBC interpreter rather than Spark.