Spark Risk Factor Tutorial HDP - Prefix Not Found in Zeppelin

Explorer

I believe I'm running HDP 2.5 and I'm getting the following errors when running these commands in Zeppelin as part of the Spark Risk Factor Tutorial. When these are run I get a "Prefix not found" error. Both Spark and Spark2 show as running (not in maintenance mode) in the Ambari dashboard. Any ideas?

%spark2 

val hiveContext = new org.apache.spark.sql.SparkSession.Builder().getOrCreate()

%spark2 

hiveContext.sql("show tables").show()

2 REPLIES

Re: Spark Risk Factor Tutorial HDP - Prefix Not Found in Zeppelin

Explorer

I did try changing from %spark2, which is what the tutorial uses (apparently written for HDP 2.6), to %spark:

%spark val hiveContext = new org.apache.spark.sql.SparkSession.Builder().getOrCreate()

This is the error I received.

<console>:27: error: object SparkSession is not a member of package org.apache.spark.sql
       val hiveContext = new org.apache.spark.sql.SparkSession.Builder().getOrCreate()

I get similar errors when I try to run the tutorial code in Scala while logged in as 'root'.


Re: Spark Risk Factor Tutorial HDP - Prefix Not Found in Zeppelin

Rising Star

You should check the "Interpreters" tab in Zeppelin to see which "prefixes" are available; those prefixes are actually interpreters.
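As a quick sanity check, here is a minimal sketch, assuming the interpreters are bound to your note and that `sc` is the SparkContext Zeppelin pre-creates for you; it prints the Spark version behind each prefix:

%spark
// SparkContext created by Zeppelin; on HDP 2.5 this should report a 1.6.x version
sc.version

%spark2
// Only works once the spark2 interpreter is bound to the note; should report a 2.x version
sc.version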

Anyway, when you run the `%spark` interpreter, the error you get is because you are trying to use Spark2 syntax on Spark 1.6. Indeed, with `%spark`, you just need to use the `sqlc` variable, which is a HiveContext instance created by Zeppelin for you. Thus, what you can do is:

%spark
sqlc.sql("show tables").show()

and this should run fine.
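If you would rather stick with the tutorial's %spark2 prefix, here is a sketch under two assumptions: that the spark2 interpreter is bound to the note (which the "Prefix not found" error suggests it currently is not), and that Zeppelin has pre-created a SparkSession named `spark`, as its Spark2 interpreter typically does:

%spark2
// `spark` is the SparkSession Zeppelin typically pre-creates for the spark2 interpreter,
// so there is no need to build one by hand with SparkSession.Builder()
spark.sql("show tables").show()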