
Are jar files missing for the Spark interpreter?

Contributor

Hi,

I am facing some strange behaviour in Spark.

I am following the tutorial (Lab 4 - Spark analysis), but when I execute the command, the script itself is returned, as you can see in the picture (red rectangle).

I have tables in my Hive database default.

I checked the jar files in my Spark interpreter and found the jar files shown in the picture sparkjarfiles.png. Are some jar files missing?

Any suggestions? hivedisplay.png hicecontextdisplay.png

1 ACCEPTED SOLUTION


@Oriane

Can you provide the following:

1. As @Bernhard Walter already asked, can you attach a screenshot of your Spark interpreter config from the Zeppelin UI?

2. Create a new notebook, run the following, and send the output:

%sh
whoami

3. Can you attach the output of:

$ ls -lrt /usr/hdp/current/zeppelin-server/local-repo

4. Is your cluster Kerberized?


23 REPLIES

Contributor

Hi @Daniel,

The

%sh
whoami

works well; it returns zeppelin. The problem I have is with %spark, as I said to @Bernhard.

I don't know what is happening this morning, but I am not able to see the hdp directory, as you can see. Yesterday I could 😞

That problem is now resolved: the ssh port was not correct. Please see the result of the command in the picture "localrepo". localrepo.png

hdpdirectorydisappear.png

Master Mentor

@Oriane

I see that you are trying to ssh on port 2122, which is not right; many basic commands will not work that way.

So please ssh as follows:

ssh -p 2222 root@127.0.0.1


Contributor

Thanks @jay. I had already noticed my error.

Contributor

Thanks a lot @Jay! The problem was resolved thanks to a small note from @Daniel Kozlowski concerning interpreter binding. The Spark interpreter was "white"; I should have kept it "blue".

Thanks a lot !!

Contributor

I am also facing the same problem with

sc.version or print(sc.version)

scversion.png

I have just installed the sandbox on my laptop to do the "hello world" case.

How can I know if it is Kerberized?
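One quick way to check (a sketch, assuming the standard HDP config layout under /etc/hadoop/conf) is to look at the hadoop.security.authentication property in core-site.xml: it is "kerberos" on a Kerberized cluster and "simple" otherwise. The snippet below uses a hypothetical local copy of the file so it is self-contained; on the sandbox you would grep the real file instead.

```shell
# Hypothetical sample core-site.xml; on a real cluster, point the grep at
# /etc/hadoop/conf/core-site.xml instead.
cat > /tmp/core-site-sample.xml <<'EOF'
<configuration>
  <property>
    <name>hadoop.security.authentication</name>
    <value>simple</value>
  </property>
</configuration>
EOF

# Extract the value: "simple" means no Kerberos; "kerberos" means Kerberized.
grep -A1 'hadoop.security.authentication' /tmp/core-site-sample.xml |
  sed -n 's:.*<value>\(.*\)</value>.*:\1:p'
# → simple
```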


@Oriane

Do exactly this:

- in a new paragraph, type: %spark

- press the <Enter> key

- type: sc.version

- press the <Enter> key

Now, run it
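Typed out, the paragraph from the steps above should contain exactly these two lines (the interpreter directive on its own line, then the statement):

```
%spark
sc.version
```

If the Spark interpreter is bound to the note (shown "blue" in the interpreter binding screen), running this returns the Spark version string.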

Does this help?

I am asking because I noticed that copied code was causing issues.

Contributor

Thanks @Daniel, I have already taken @jay's notice into account. I have not pasted code since yesterday; I have been typing it by hand.

What I have done now is create a new notebook, type %spark, press Enter, type print(sc.version), press Enter, and run it, and I get a "prefix not found" message.

I have tried to open the log file /var/log/zeppelin/zeppelin-interpreter-sh-zeppelin-sandbox.hortonworks.com.log (as you can see in the pictures). scversion2.png scversion1.png logfile.png

I have also restarted the Spark interpreter again, but I still have the same "prefix not found" error.


@Oriane

For the "prefix not found" error, double-check that you have the Spark interpreter bound in that notebook.

See my screenshot: Spark needs to be "blue".

12884-dk.png

Contributor

Oh, @Daniel! You are right about this "blue" detail!

I thought that for the interpreter binding, Spark should be "white"!

Please see this fantastic screen!

Thanks a lot! And keep helping newbies of newbies like me! scversion3.png


@Oriane

I am glad you have this working now.

If you believe I helped, please vote up my answer and select it as the best one 🙂