Member since: 04-03-2015
Posts: 4
Kudos Received: 0
Solutions: 1
My Accepted Solutions

Title | Views | Posted
---|---|---
| 4808 | 04-06-2015 06:57 AM
04-06-2015 06:57 AM
I got the answer: "Add export STANDALONE_SPARK_MASTER_HOST=10.0.2.15 to your spark-env.sh so both master and worker agree on the same host address".
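For reference, the fix amounts to pointing both daemons at the master's plain host address instead of a worker ID string. A minimal sketch of the relevant spark-env.sh lines, assuming the VM address 10.0.2.15 quoted in the answer above (your VM's address may differ):

```shell
# spark-env.sh -- master and worker must agree on the same host address
export STANDALONE_SPARK_MASTER_HOST=10.0.2.15
export SPARK_MASTER_IP=$STANDALONE_SPARK_MASTER_HOST
```

After editing, restart the Spark master and worker services so both pick up the new value.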
04-03-2015 12:25 PM
Is there any way to do that on a single machine running the Cloudera quickstart? Or maybe it's called adding executors on the same machine; I'm new to Spark, so I'm not sure what it's actually called. Something like this tutorial, which adds workers on the same machine, but on CentOS running the CDH 5.3 quickstart: http://mbonaci.github.io/mbo-spark/ Thank you, Amr
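For what it's worth, standalone Spark can also run multiple workers on a single machine via SPARK_WORKER_INSTANCES in spark-env.sh. A sketch under that assumption; the instance count and per-worker resource values below are illustrative, not taken from this thread:

```shell
# spark-env.sh -- run two standalone worker instances on the same host
export SPARK_WORKER_INSTANCES=2
# split the machine's resources between the workers (illustrative values)
export SPARK_WORKER_CORES=1
export SPARK_WORKER_MEMORY=1g
```

With this in place, restarting the standalone worker service should start both instances, and each registers with the same master.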
04-03-2015 07:10 AM
Yes, I am using the Cloudera quickstart, which runs on CentOS, to run Spark in standalone mode.
04-03-2015 06:47 AM
Hi All, I have a Cloudera CDH 5.3 quickstart running on a VM. I am having problems with running Spark. I have gone through the steps at http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/cdh_ig_spark_configure.html and run the word-count example, and it worked. But when I go to the master (quickstart.cloudera:18080) it has no workers there: cores=0, memory=0... When I go to (quickstart.cloudera:18081) there is a worker. My question is: how do I add workers? And what should I enter in export STANDALONE_SPARK_MASTER_HOST? This is the spark-env.sh:

### Change the following to specify a real cluster's Master host
export STANDALONE_SPARK_MASTER_HOST=worker-20150402201049-10.0.2.15-7078
export SPARK_MASTER_IP=$STANDALONE_SPARK_MASTER_HOST

### Let's run everything with JVM runtime, instead of Scala
export SPARK_LAUNCH_WITH_SCALA=0
export SPARK_LIBRARY_PATH=${SPARK_HOME}/lib
export SCALA_LIBRARY_PATH=${SPARK_HOME}/lib
export SPARK_MASTER_WEBUI_PORT=18080
export SPARK_MASTER_PORT=7077
export SPARK_WORKER_PORT=7078
export SPARK_WORKER_WEBUI_PORT=18081
export SPARK_WORKER_DIR=/var/run/spark/work
export SPARK_LOG_DIR=/var/log/spark
export SPARK_PID_DIR='/var/run/spark/'

if [ -n "$HADOOP_HOME" ]; then
  export LD_LIBRARY_PATH=:/lib/native
fi

export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-etc/hadoop/conf}

### Comment above 2 lines and uncomment the following if
### you want to run with scala version, that is included with the package
#export SCALA_HOME=${SCALA_HOME:-/usr/lib/spark/scala}
#export PATH=$PATH:$SCALA_HOME/bin

Thank you, Amr
Labels:
- Apache Spark