Unable to complete Spark Pi example tutorial

Explorer

Hi,

I am new to HDP. I have downloaded the HDP Sandbox 2.4 and am trying out this tutorial:

http://hortonworks.com/hadoop-tutorial/a-lap-around-apache-spark/

However, I am not able to complete the Spark Pi example: the output keeps looping on the message "INFO Client: Application report for application_1457875462527_0006 (state: ACCEPTED)" and never shows the expected result.

I have tried following the tutorial both with and without installing spark_2_3_4_1_10-master and spark_2_3_4_1_10-python, as I see that HDP 2.4 already comes with Spark (build 2.4.0.0-169).

Attached is the output from the command (output.txt). Hope someone can help.
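
For reference, the command is the tutorial's Spark Pi example, which on the sandbox looks roughly like this (paths and flag values may differ slightly from the tutorial):

    cd /usr/hdp/current/spark-client
    ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
        --master yarn-client --num-executors 1 \
        --driver-memory 512m --executor-memory 512m --executor-cores 1 \
        lib/spark-examples*.jar 10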

Thanks,

1 ACCEPTED SOLUTION

Explorer

Hi Guys,

Thanks a lot for your help. Just to summarize the issue so that someone else knows what to do:

If you are using the HDP Sandbox 2.4 for the tutorial "A Lap Around Apache Spark", there is no need to install Spark separately; everything is already included.
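
If you want to double-check, the Spark client should already be under /usr/hdp/current/spark-client on the sandbox (a quick check, assuming the default layout):

    ls /usr/hdp/current/spark-client
    /usr/hdp/current/spark-client/bin/spark-submit --version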

When I had a problem starting Spark, it was because another Spark instance was already running; it had been started while I was going through the "Spark on Zeppelin" tutorial.

To stop the extra session, use the "yarn application -list" and "yarn application -kill" commands to kill the other Spark-on-YARN sessions.
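
For example (the application ID below is just the one from my output; substitute the ID of the stray session reported by the list command):

    yarn application -list
    yarn application -kill application_1457875462527_0006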

All is well now. Thanks again for your help.

11 REPLIES

Master Mentor

You can list the applications with the YARN CLI and then kill that particular job: https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/YarnCommands.html
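
Something along these lines should work on the sandbox (the application ID is a placeholder for the one shown by the list command):

    yarn application -list -appStates RUNNING
    yarn application -kill <application_id>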
