Member since: 12-09-2015
Posts: 61
Kudos Received: 43
Solutions: 4

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3295 | 08-05-2016 12:33 PM |
| | 3899 | 02-27-2016 09:19 AM |
| | 8914 | 02-19-2016 06:28 PM |
| | 1521 | 02-02-2016 09:09 PM |
02-16-2016
05:03 PM
1 Kudo
@Neeraj Sabharwal Hmmm... so this is the part where the show ends for me:

    Launching java with spark-submit command /usr/hdp/2.3.4.0-3485/spark/bin/spark-submit "sparkr-shell" /tmp/Rtmp69Q264/backend_portae4c24444ac20

So I checked whether spark-submit works by running the following example:

    cd $SPARK_HOME
    sudo -u spark ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-client --num-executors 3 --driver-memory 512m --executor-memory 512m --executor-cores 1 lib/spark-examples*.jar 10

The result is lots of these:

    INFO Client: Application report for application_1455610402042_0021 (state: ACCEPTED)

followed by:

    SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.

The whole error can be found in the attached file spark-submit-error.txt. Am I missing something in the Spark setup?
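To dig further into why the ApplicationMaster died, the YARN side of the failure can be inspected after the application has ended. Assuming log aggregation is enabled, something like:

    yarn logs -applicationId application_1455610402042_0021 | less

shows the AM's own diagnostics, which is usually where the real reason for the kill is recorded.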
02-16-2016
04:32 PM
1 Kudo
@Neeraj Sabharwal I killed the process and also restarted Spark from Ambari. If I run sudo netstat -anp | grep 9001 I don't see anything. I also have this in my .bashrc on the node where I'm running sparkR:

    export EXISTING_SPARKR_BACKEND_PORT=9001

Funny thing: if I run sparkR with my CentOS user I get the error mentioned in the original post. If I run sudo -u spark sparkR then I get:

    Error in socketConnection(port = monitorPort) :
    cannot open the connection
    In addition: Warning message:
    In socketConnection(port = monitorPort) : localhost:53654 cannot be opened
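One experiment worth trying here, assuming the pinned port is part of the problem: drop the pin and let the backend pick a free ephemeral port on its own:

    unset EXISTING_SPARKR_BACKEND_PORT
    sparkR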
02-16-2016
04:20 PM
1 Kudo
@Artem Ervits I've seen that one as well; I don't see a big difference between it and 1.5.2. I have SPARK_HOME and JAVA_HOME defined, and my hive-site.xml is also in place, so I went straight to the SparkR part.
R is installed on all the nodes. By the way, when I run sparkR I don't get the nice Spark graphic (logo); it seems as if I'm starting plain R.
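If plain R were on the PATH ahead of Spark's wrapper, it would look exactly like this, so a quick sanity check (not a confirmed diagnosis) of which binary actually launches:

    type sparkR                    # which sparkR is resolved on the PATH?
    ls -l $SPARK_HOME/bin/sparkR   # the wrapper that should be launched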
02-16-2016
04:16 PM
1 Kudo
@Neeraj Sabharwal Running sudo netstat -anp | grep 9001 returns:

    unix 2 [ ACC ] STREAM LISTENING 9001 1202/master private/proxywrite
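Note that this match is not a TCP listener at all: for Unix domain sockets, the number in that netstat column is the socket's inode (here 9001, belonging to the Postfix master process and its private/proxywrite socket), not a port. A check restricted to TCP listening ports would be:

    sudo netstat -tlnp | grep ':9001'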
02-16-2016
04:13 PM
1 Kudo
I'm looking at the code: https://github.com/amplab-extras/SparkR-pkg/blob/master/pkg/src/src/main/scala/edu/berkeley/cs/amplab/sparkr/SparkRRunner.scala
The env variable EXISTING_SPARKR_BACKEND_PORT can be defined through .bashrc. The tryCatch that returns my error is the following:

    tryCatch({
      connectBackend("localhost", backendPort)
    }, error = function(err) {
      stop("Failed to connect JVM\n")
    })

Isn't it interesting that localhost is hardcoded in it this way? Or is there an explanation for it?
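A quick bash-level sanity check of that hardcoded localhost/port combination, assuming the backend port is still pinned to 9001 as earlier in the thread (this uses bash's built-in /dev/tcp, so no extra tools are needed):

    (exec 3<>/dev/tcp/localhost/9001) && echo "9001 reachable" || echo "9001 not reachable"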
02-16-2016
04:04 PM
2 Kudos
I have CentOS 7.1. On my multinode Hadoop cluster (HDP 2.3.4) I have installed Spark 1.5.2 through Ambari.
I am trying to connect to SparkR from the CLI, and after I run sparkR I get the following error:

    Error in value[[3L]](cond) : Failed to connect JVM
    In addition: Warning message:
    In socketConnection(host = hostname, port = port, server = FALSE, :
    localhost:9001 cannot be opened

The port (9001) is open on the namenode (where I'm running sparkR).
Do you have any ideas what I'm doing wrong?
I've seen this link: http://hortonworks.com/hadoop-tutorial/apache-spark-1-5-1-technical-preview-with-hdp-2-3/
and I also followed this link to install R on all datanodes: http://www.jason-french.com/blog/2013/03/11/installing-r-in-linux/
I appreciate your contribution.
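For anyone reproducing this on CentOS 7, the "port is open" assumption can be double-checked directly on the namenode (standard CentOS 7 tooling, assuming firewalld is in use):

    sudo firewall-cmd --list-ports   # ports explicitly opened in firewalld
    sudo ss -tlnp | grep ':9001'     # is anything actually listening on 9001?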
Labels:
- Apache Spark
02-15-2016
07:58 AM
What I had to do was restart ambari-server to get rid of this error.
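For reference, on the Ambari host that restart is just:

    sudo ambari-server restart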
02-10-2016
10:07 PM
1 Kudo
@Artem Ervits Thank you for the link, educative reading 🙂 The clients are installed (HDFS, YARN and Hive are, I believe, required).
I've been doing some research and I've also checked the link you proposed. Adding various variables to my .bashrc didn't work. I also read this document: https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html. If I run hadoop checknative -a I get, among other things, the following:

    hadoop: true /usr/hdp/2.3.4.0-3485/hadoop/lib/native/libhadoop.so.1.0.0
    snappy: true /usr/hdp/2.3.4.0-3485/hadoop/lib/native/libsnappy.so.1

It seems as if Hadoop sees the native libraries, but Spark does not?
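A sketch of what sometimes bridges exactly this gap (not a confirmed fix for this cluster, just pointing Spark at the same native directory that checknative reported):

    export LD_LIBRARY_PATH=/usr/hdp/2.3.4.0-3485/hadoop/lib/native:$LD_LIBRARY_PATH
    # or per invocation:
    spark-shell --driver-java-options "-Djava.library.path=/usr/hdp/2.3.4.0-3485/hadoop/lib/native"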
02-10-2016
04:24 PM
1 Kudo
I've installed Spark 1.6 on my Hortonworks cluster (2.3.4.0-3485) by following this website: http://hortonworks.com/hadoop-tutorial/apache-spark-1-6-technical-preview-with-hdp-2-3/
When I run spark-shell or pyspark from my command line, the first two lines are these:

    ls: cannot access /usr/hdp/None/hadoop/lib: No such file or directory
    16/02/10 17:07:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Am I missing anything in my Spark installation?
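The None in /usr/hdp/None usually means the hdp.version property is not being resolved. A sketch of the common workaround, assuming the build id 2.3.4.0-3485 from above, is to pin the version in spark-defaults.conf:

    cat >> $SPARK_HOME/conf/spark-defaults.conf <<'EOF'
    spark.driver.extraJavaOptions -Dhdp.version=2.3.4.0-3485
    spark.yarn.am.extraJavaOptions -Dhdp.version=2.3.4.0-3485
    EOF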
Labels:
- Apache Spark
02-04-2016
01:36 PM
1 Kudo
More info than an answer:
the proposal to join the Apache Incubator: http://googlecloudplatform.blogspot.ch/2016/01/Dataflow-and-open-source-proposal-to-join-the-Apache-Incubator.html
and the acceptance: https://wiki.apache.org/incubator/BeamProposal