Created 07-06-2017 07:45 PM
I believe I'm running HDP 2.5, and I'm getting the following errors when running these commands in Zeppelin as part of the Spark Risk Factor tutorial. When they are run I get "prefix not found". Both Spark and Spark2 show as running (not in maintenance mode) in the Ambari dashboard. Any ideas?
%spark2
val hiveContext = new org.apache.spark.sql.SparkSession.Builder().getOrCreate()
%spark2
hiveContext.sql("show tables").show()
Created 07-07-2017 11:15 AM
I did try changing from %spark2, which is what the tutorial (apparently written for HDP 2.6) uses, to %spark:
%spark
val hiveContext = new org.apache.spark.sql.SparkSession.Builder().getOrCreate()
This is the error I received.
<console>:27: error: object SparkSession is not a member of package org.apache.spark.sql
       val hiveContext = new org.apache.spark.sql.SparkSession.Builder().getOrCreate()
I get similar errors when I try to run the tutorial code in Scala while logged in as 'root'.
Created 07-13-2017 06:58 AM
You should check the "Interpreters" tab in Zeppelin to see which "prefixes" are available; the prefixes are actually interpreter names.
In any case, when you run with `%spark`, the error you get is because you are trying to use Spark 2 syntax (SparkSession) on Spark 1.6. With `%spark`, you just need to use the `sqlc` variable, which is a HiveContext instance that Zeppelin creates for you. So what you can do is:
%spark
sqlc.sql("show tables").show()
and this should run fine.
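For reference, here is a sketch of the two syntaxes side by side as Zeppelin paragraphs, assuming the usual HDP interpreter bindings (`%spark` bound to Spark 1.6, `%spark2` bound to Spark 2.x) and the variables Zeppelin typically pre-creates (`sc` for the SparkContext, `spark` for the Spark 2 SparkSession). This needs a live Spark cluster, so it is only illustrative:

```scala
// Spark 1.6 paragraph (%spark): SparkSession does not exist yet.
// Either use the HiveContext Zeppelin provides, or build one
// explicitly from the pre-created SparkContext `sc`:
%spark
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
hiveContext.sql("show tables").show()

// Spark 2.x paragraph (%spark2): SparkSession replaces
// SQLContext/HiveContext, and Zeppelin pre-creates it as `spark`:
%spark2
spark.sql("show tables").show()
```

The key point is that `org.apache.spark.sql.SparkSession` only exists in Spark 2.x, so the tutorial's `%spark2` paragraphs cannot be re-run under `%spark` without rewriting them against the 1.6 API.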