Created 02-25-2016 12:30 PM
Hi:
I am testing SparkR from HDP and I got this error from the command line:
./sparkR
> Sys.setenv(SPARK_HOME = "/usr/hdp/2.3.2.0-2950/spark")
> .libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
> sc <- sparkR.init()
Error: could not find function "sparkR.init"
Do I need to do something more?
Created 02-25-2016 12:33 PM
Did you follow these steps? http://hortonworks.com/hadoop-tutorial/apache-spark-1-6-technical-preview-with-hdp-2-3/
SparkR is new to HDP, and this is the first preview where it is offered.
Created 02-25-2016 01:00 PM
I only see Spark 1.4.1 libraries in your directory listing. SparkR ships with Spark 1.6.0, so you need to point your environment to the proper release.
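The mismatch is visible in the jar names themselves; here is a quick sanity check in shell (the jar name is copied from the listing later in this thread, and the parsing is just an illustration of where the version sits in the name):

```shell
# The Spark version is embedded in the assembly jar name:
# spark-assembly-<spark-version>.<hdp-build>-hadoop<...>.jar
jar="spark-assembly-1.4.1.2.3.2.0-2950-hadoop2.7.1.2.3.2.0-2950.jar"

# Strip the "spark-assembly-" prefix, then everything from the
# HDP build number (.2.3.2...) onward, leaving the Spark version.
version="${jar#spark-assembly-}"
version="${version%%.2.3.2*}"
echo "$version"   # -> 1.4.1, which predates SparkR (needs 1.6.0+)
```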
Created 03-07-2016 10:39 AM
Hi: after I updated HDP to the new version it works, thanks.
Created 02-25-2016 12:38 PM
You need to have this https://gist.github.com/nsabharwal/9163e0adfc66af080145
Created 02-25-2016 05:37 PM
You don't have the correct version. See the R entry in the listing below (marked with an arrow):
[root@phdns01 spark]# ls -l
total 92
drwxr-xr-x. 2 root root 4096 Feb 14 20:29 bin
lrwxrwxrwx. 1 root root 25 Feb 14 20:29 conf -> /etc/spark/2.3.4.0-3485/0
drwxr-xr-x. 3 root root 4096 Feb 14 20:29 data
drwxr-xr-x. 2 root root 4096 Dec 15 23:18 doc
drwxr-xr-x. 3 root root 4096 Feb 14 20:29 ec2
drwxr-xr-x. 3 root root 4096 Feb 14 20:29 examples
drwxr-xr-x. 2 root root 4096 Feb 14 20:29 lib
-rw-r--r--. 1 root root 17356 Dec 15 22:17 LICENSE
-rw-r--r--. 1 root root 23158 Dec 15 22:17 NOTICE
drwxr-xr-x. 6 root root 4096 Feb 14 20:29 python
drwxr-xr-x. 3 root root 4096 Feb 14 20:29 R ---> SparkR package
-rw-r--r--. 1 root root 507 Dec 15 22:17 README
-rw-r--r--. 1 root root 3593 Dec 15 22:17 README.md
-rw-r--r--. 1 root root 61 Dec 15 23:18 RELEASE
drwxr-xr-x. 2 root root 4096 Feb 14 20:29 sbin
lrwxrwxrwx. 1 root root 19 Feb 14 20:29 work -> /var/run/spark/work
Created 02-25-2016 05:46 PM
@Roberto Sancho You have to follow the tutorial step by step to make it work: http://hortonworks.com/hadoop-tutorial/apache-spark-1-6-technical-preview-with-hdp-2-3
Also, you can follow https://gist.github.com/nsabharwal/9163e0adfc66af080145, but I suggest sticking with the tutorial.
Please see the output that I posted. In your case, there is no R package within Spark.
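A minimal way to sketch that check in shell (the `R/lib` layout is how Spark 1.6 packages SparkR, as in the listing above; the helper name and demo directories here are made up for illustration):

```shell
# check_sparkr: report whether a Spark install ships the SparkR package,
# i.e. whether <spark-home>/R/lib exists (layout as in the listing above).
check_sparkr() {
    if [ -d "$1/R/lib" ]; then
        echo "SparkR present"
    else
        echo "SparkR missing"
    fi
}

# Demo against throwaway directories standing in for the two releases.
tmp=$(mktemp -d)
mkdir -p "$tmp/spark-1.6/R/lib"    # 1.6-style install, ships SparkR
mkdir -p "$tmp/spark-1.4"          # 1.4-style install, no R directory
present=$(check_sparkr "$tmp/spark-1.6")
missing=$(check_sparkr "$tmp/spark-1.4")
echo "$present"    # SparkR present
echo "$missing"    # SparkR missing
rm -rf "$tmp"
```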
Created 02-25-2016 12:55 PM
Hi:
Yes, that is the tutorial I followed, but I think the SparkR lib doesn't exist on my server. I installed R correctly a long time ago.
[root@lnxbig05 spark]# pwd
/usr/hdp/2.3.2.0-2950/spark
[root@lnxbig05 spark]# ls -lrt
total 120
-rwxr-xr-x 1 root root 3624 Sep 30 23:32 README.md
-rwxr-xr-x 1 root root 507 Sep 30 23:32 README
-rwxr-xr-x 1 root root 22559 Sep 30 23:32 NOTICE
-rwxr-xr-x 1 root root 50971 Sep 30 23:32 LICENSE
drwxr-xr-x 2 root root 4096 Oct 1 00:33 doc
-rwxr-xr-x 1 root root 61 Oct 1 00:33 RELEASE
drwxr-xr-x 3 root root 4096 Jan 27 18:10 data
drwxr-xr-x 3 root root 4096 Jan 27 18:10 examples
drwxr-xr-x 3 root root 4096 Jan 27 18:10 ec2
drwxr-xr-x 2 root root 4096 Jan 27 18:11 lib
lrwxrwxrwx 1 root root 19 Jan 27 18:11 work -> /var/run/spark/work
drwxr-xr-x 2 root root 4096 Jan 27 18:11 sbin
drwxr-xr-x 6 root root 4096 Jan 27 18:11 python
lrwxrwxrwx 1 root root 25 Jan 27 18:11 conf -> /etc/spark/2.3.2.0-2950/0
drwxr-xr-x 2 root root 4096 Feb 25 13:07 bin
and my libs:
-rwxr-xr-x 1 root root 1809447 Oct 1 00:06 datanucleus-rdbms-3.2.9.jar
-rwxr-xr-x 1 root root 1890075 Oct 1 00:06 datanucleus-core-3.2.10.jar
-rwxr-xr-x 1 root root 339666 Oct 1 00:06 datanucleus-api-jdo-3.2.6.jar
-rwxr-xr-x 1 root root 93644684 Oct 1 00:18 spark-examples-1.4.1.2.3.2.0-2950-hadoop2.7.1.2.3.2.0-2950.jar
-rwxr-xr-x 1 root root 4154588 Oct 1 00:22 spark-1.4.1.2.3.2.0-2950-yarn-shuffle.jar
-rwxr-xr-x 1 root root 167557539 Oct 1 00:30 spark-assembly-1.4.1.2.3.2.0-2950-hadoop2.7.1.2.3.2.0-2950.jar
So, I don't know if I need any more libs.
Thanks