Support Questions

Are jar files missing for the Spark interpreter?


Hi,

I am facing strange behaviour in Spark.

I am following the tutorial (Lab 4 - Spark Analysis), but when I execute the command, the script itself is returned, as you can see in the picture (red rectangle).

I have tables in my default Hive database.

I checked the jar files in my Spark interpreter and found the jar files shown in the picture sparkjarfiles.png. Are some jar files missing?

Any suggestions?

hivedisplay.png hicecontextdisplay.png

1 ACCEPTED SOLUTION

Re: Are jar files missing for the Spark interpreter?

@Oriane

Can you provide the following:

1. As @Bernhard Walter already asked, can you attach a screenshot of your Spark interpreter config from the Zeppelin UI?

2. Create a new notebook, run the following, and send the output:

%sh
whoami

3. Can you attach the output of:

$ ls -lrt /usr/hdp/current/zeppelin-server/local-repo

4. Is your cluster Kerberized?
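The diagnostic steps above can be collected into a single %sh paragraph (a sketch; the HDP path is an assumption based on a default install, so adjust it to your own layout):

```shell
# Sketch of the diagnostics above, run from a Zeppelin %sh paragraph or a
# terminal on the Zeppelin host. The local-repo path assumes a default HDP
# layout; adjust it if your install differs.
whoami
ls -lrt /usr/hdp/current/zeppelin-server/local-repo 2>/dev/null \
  || echo "local-repo not found (check the Zeppelin install path)"
# A ticket listing hints at whether the cluster is Kerberized.
klist 2>/dev/null || echo "no Kerberos ticket (cluster may not be Kerberized)"
```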


23 REPLIES

Re: Are jar files missing for the Spark interpreter?

Super Mentor

@Oriane

By any chance, is there an extra line before the "%spark", or any special character (it might have crept in while copying and pasting)? Can you manually retype those %spark script lines and test again?

Re: Are jar files missing for the Spark interpreter?

Super Mentor

12818-zeppelin.png

Ideally it should work.

Re: Are jar files missing for the Spark interpreter?

Hi @Jay,

I checked, but there is no extra line before "%spark".

I retyped it manually but am still facing the problem.

hicecontextdisplay-2.png

Re: Are jar files missing for the Spark interpreter?

Side note: in HDP Zeppelin, sqlContext defaults to a hiveContext, so something like

%spark 
sqlContext.sql("show tables").collect.foreach(println)

should work.

Alternatively:

%sql
show tables

As Jay mentioned, the % needs to be on the first line.

Re: Are jar files missing for the Spark interpreter?

Hi @Bernhard, I have tried both but am facing the same problem.

Maybe jar files are missing from my Spark interpreter?

sqlnotwork.png sqlcontexttest.png

Re: Are jar files missing for the Spark interpreter?

Does Zeppelin send anything to the Spark interpreter at all?

What does

%spark 
print(sc.version)

print? No hiveContext necessary.
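If even that fails, it can help to verify that the Spark client itself works outside Zeppelin. A sketch, assuming a default HDP client path (adjust to your layout):

```shell
# Hedged sketch: check the Spark client installation directly, bypassing
# Zeppelin. The HDP path below is an assumption; change it to match your
# own install.
SPARK_HOME=/usr/hdp/current/spark-client
if [ -x "$SPARK_HOME/bin/spark-submit" ]; then
  # Prints the Spark version banner if the client is intact.
  "$SPARK_HOME/bin/spark-submit" --version
else
  echo "spark-submit not found under $SPARK_HOME"
fi
```

If this prints a version while the %spark paragraph still fails, the problem is on the Zeppelin side (interpreter config or binding), not the Spark install.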

Re: Are jar files missing for the Spark interpreter?

Hi @Bernhard, apologies for not following what you said earlier about the first line.

The %jdbc(hive) and %sql interpreters work well: "show tables" displays the expected result.

The problem I have is with %spark (with the directive on the first line); I get the errors you can see in the attached pictures.

In the interpreter bindings, I have selected spark and saved.

sparkproblem.png

sparkproblem1.png

sparkconfiguration.png

I have also attached the Spark interpreter configuration.

Re: Are jar files missing for the Spark interpreter?

And what do your interpreter settings say? Here are mine.

12827-interpreter-settings.png

Note: a simple "python" in zeppelin.pyspark.python is also OK.

... by the way, I have the same libs in my Zeppelin Spark libs folder.

Have you tried to restart the interpreter?
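If restarting just the interpreter from the UI does not help, you can also bounce the whole Zeppelin daemon from the shell. A sketch; the path assumes a default HDP layout:

```shell
# Hedged sketch: restart the whole Zeppelin daemon, not just one interpreter.
# The path below assumes a default HDP layout; adjust it for your install.
ZEPPELIN_BIN=/usr/hdp/current/zeppelin-server/bin/zeppelin-daemon.sh
if [ -x "$ZEPPELIN_BIN" ]; then
  "$ZEPPELIN_BIN" restart
else
  echo "zeppelin-daemon.sh not found at $ZEPPELIN_BIN"
fi
```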
