Created 02-16-2016 04:04 PM
I have CentOS 7.1.
On my multi-node Hadoop cluster (2.3.4) I have installed Spark 1.5.2 through Ambari. I am trying to connect to SparkR from the CLI, and after I run sparkR I get the following error:
Error in value[[3L]](cond) : Failed to connect JVM In addition: Warning message: In socketConnection(host = hostname, port = port, server = FALSE, : localhost:9001 cannot be opened
The port (9001) is open on the namenode (where I'm running sparkR). Do you have any ideas what I'm doing wrong? I've seen this link: http://hortonworks.com/hadoop-tutorial/apache-spark-1-5-1-technical-preview-with-hdp-2-3/
and I also followed this link:
http://www.jason-french.com/blog/2013/03/11/installing-r-in-linux/
to install R on all datanodes. I appreciate your contribution.
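Since the warning says localhost:9001 cannot be opened, a first check is whether anything is actually listening on that port on the node running sparkR. A minimal sketch, assuming bash (which provides the /dev/tcp pseudo-device); the host and port come straight from the error message, and `check_port` is a hypothetical helper, not part of Spark:

```shell
# Hypothetical helper: prints "open" if HOST:PORT accepts a TCP connection,
# "closed" otherwise. Relies on bash's /dev/tcp redirection (bash-only).
check_port() {
  if (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null; then
    echo "open"
  else
    echo "closed"
  fi
}

# The warning named localhost:9001, so probe exactly that:
check_port localhost 9001
```

If this prints "closed" while sparkR is failing, the backend JVM likely never started; the port being unblocked in the firewall is not enough, a process must actually be listening.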
Created 02-19-2016 07:46 AM
@Neeraj Sabharwal, @Artem Ervits
I'm going to give it a try on Ubuntu 14.04 today.
Neeraj, you mentioned that some other people are having the same problem. Can you give more information? How did they solve it?
Created 02-19-2016 06:39 PM
I am interested in knowing more about this.
Created 02-22-2016 09:45 PM
I needed some more time to test it on Spark 1.4.1 and 1.5.2, and to put it in a nice form. Here is the process I've taken to get SparkR running:
https://markobigdata.wordpress.com/2016/02/22/installing-r-on-hadoop-cluster-to-run-sparkr/
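For readers who don't follow the link, the core of the approach is installing R on every node of the cluster so that SparkR workers can start. A dry-run sketch under stated assumptions: CentOS 7 nodes with EPEL reachable, passwordless ssh and sudo, and placeholder host names you must replace with your own:

```shell
# Placeholder host list -- replace with the actual nodes of your cluster.
hosts="namenode datanode1 datanode2"

# Dry run: print the install command for each node instead of executing it.
# Removing "echo" would install R (via the EPEL repository) over ssh.
for h in $hosts; do
  echo ssh "$h" "sudo yum install -y epel-release && sudo yum install -y R"
done
```

Installing epel-release first matters: the R package is resolved from EPEL, so a single combined `yum install` on a node without EPEL would fail.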