
Spark Master UI

Contributor

I have Sandbox 2.4 on VirtualBox.

When I try to access the Spark Master UI from a browser on my Windows machine, I get Access Denied, as you can see in the attached picture.

Can you tell me what's wrong with it? access-denide.png

1 ACCEPTED SOLUTION

Super Guru
@omar harb

I think HDP comes with Spark on YARN, not standalone (unless you installed standalone mode manually), so you won't find a Spark Master UI inside the sandbox.

Please start with the doc below.

http://hortonworks.com/hadoop-tutorial/a-lap-around-apache-spark/


7 REPLIES

Expert Contributor

Are you able to ping sandbox.hortonworks.com from your Windows machine? If not, try using the IP address instead of the hostname.
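A quick sketch of that check from the host OS; the IP below is an assumption, so substitute whatever address your VirtualBox NAT or host-only adapter gives the sandbox:

```shell
# Check whether the sandbox hostname responds; fall back to its IP if not.
# 127.0.0.1 is only a placeholder for a port-forwarded NAT setup (assumption).
HOST=sandbox.hortonworks.com
IP=127.0.0.1
if ping -c 1 -W 1 "$HOST" >/dev/null 2>&1; then
  TARGET="$HOST"
else
  TARGET="$IP"
fi
echo "Try the UI at: http://$TARGET:8088/cluster"
```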


Contributor

@Jitendra Yadav

So how can I see info about the workers?

Super Guru

@omar harb

There are no separate workers when Spark runs on YARN; it runs as a normal application within YARN, in the form of YARN containers. Please refer to the docs below for the Spark-on-YARN architecture.

https://spark-summit.org/2014/wp-content/uploads/2014/07/Spark-on-YARN-A-Deep-Dive-Sandy-Ryza.pdf

http://spark.apache.org/docs/latest/running-on-yarn.html
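In other words, the places to look for executor info on YARN are the ResourceManager UI and the YARN CLI rather than a Master/Worker UI. A small sketch of where to check, assuming the sandbox keeps YARN's default ResourceManager UI port of 8088:

```shell
# On YARN there is no standalone Master/Worker UI; executor containers show
# up under the application in the ResourceManager instead.
RM_UI="http://localhost:8088/cluster"
echo "ResourceManager UI (running apps): $RM_UI"
echo "List running apps from the sandbox shell: yarn application -list -appStates RUNNING"
echo "Per-app page: $RM_UI/app/<applicationId>"
```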

Super Guru

Hi @omar harb, please let me know if you need more info on this, or kindly accept the answer to close this thread.

Contributor

@Jitendra Yadav

I just need to know why, when I opened the URL http://localhost:8088/cluster/app/application_1464857740650_0009 and clicked on the log link to see the logs, I got Access Denied.

Please see the attached photos clusterurl.png logurlpng.png.

Thank you.
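As a workaround while the UI link is blocked, the same logs can usually be pulled from the sandbox shell with the YARN CLI, which sidesteps the web UI entirely (this assumes YARN log aggregation is enabled, which it is by default on the sandbox):

```shell
# Extract the applicationId from the RM URL and print the CLI command that
# fetches its aggregated logs. Run the printed command inside the sandbox.
URL="http://localhost:8088/cluster/app/application_1464857740650_0009"
APP_ID="${URL##*/}"   # strip everything up to the last '/'
echo "Run on the sandbox: yarn logs -applicationId $APP_ID"
```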

Super Guru

@omar harb

If the RM UI shows enough available resources, then I would suggest stopping the Spark application and running it again with the options below.

spark-submit --class com.Spark.MainClass --master yarn-client --executor-cores 2 --executor-memory 2g --num-executors 2 /home/Test-0.0.1-SNAPSHOT.jar

If you still see the same issue, then kindly share the Spark AM logs from this screen for that job. clusterurl.png
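One detail worth double-checking in that command: spark-submit treats everything after the application jar as arguments to the main class, so the executor options must come before the jar path. A sketch using the class name and jar path from this thread:

```shell
# Spark options go BEFORE the application jar; anything after it is passed
# to com.Spark.MainClass as program arguments and silently ignored by Spark.
OPTS="--master yarn-client --executor-cores 2 --executor-memory 2g --num-executors 2"
JAR=/home/Test-0.0.1-SNAPSHOT.jar
CMD="spark-submit --class com.Spark.MainClass $OPTS $JAR"
echo "$CMD"
```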