
Cannot get Apache Spark to start

Contributor

Hello again friends -

I am working through the tutorial "A Lap Around Apache Spark" and running into an issue. I am executing the following command:

./bin/spark-shell --master yarn-client --driver-memory 512m --executor-memory 512m

It seems to start OK - but apparently not, because it gets stuck and repeats the following message every second:

[Screenshot: 3615-virtualbox-hortonworks-sandbox-with-hdp-232-22-04.png]

It keeps repeating the same message over and over until I hit CTRL+C.

I am wondering about the $JAVA_HOME variable. I have tried remedying the condition above with several different variations of it, none of which seem to have any effect. The current value of $JAVA_HOME is as follows:

[Screenshot: 3616-gowbd.png]
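For context, this is roughly how I have been checking and setting the variable; the JDK path in the export line is just an example, not necessarily the right one for the sandbox:

echo $JAVA_HOME                                # print the current value
ls "$JAVA_HOME/bin/java"                       # confirm the path actually contains a java binary
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk   # example path only - adjust to your JDK
export PATH=$JAVA_HOME/bin:$PATH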

Any thoughts? I hate to give up on this module before I understand what is causing this error.

Thanking all of you in advance - your responses to my past inquiries have been spectacular, and I am very grateful.

Thanks,

Mike

1 ACCEPTED SOLUTION

Master Guru

Check whether you have enough YARN memory and what your yarn.scheduler.minimum-allocation-mb is set to. Even with driver/executor memory set to 512m, another 384m is needed for memory overhead, meaning 896m for the driver and for each executor. Also try using only one executor: "--num-executors 1".
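For example, something along these lines (the yarn-site.xml path is the usual HDP location - adjust if yours differs):

grep -A 1 "yarn.scheduler.minimum-allocation-mb" /etc/hadoop/conf/yarn-site.xml   # check the minimum container size
./bin/spark-shell --master yarn-client --driver-memory 512m --executor-memory 512m --num-executors 1   # retry with a single executor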


10 REPLIES

Master Guru

Hi @Mike Vogt, thanks, and glad to hear it worked. Could you kindly accept the answer to help us manage answered questions? Thanks!