Support Questions

Find answers, ask questions, and share your expertise

How to use R with Spark

Master Collaborator

Hi,

My data scientist wants to use Spark, so he needs an IDE or something similar, and he wants to use his R libraries. Any ideas?

Thanks

1 ACCEPTED SOLUTION


You can either run Spark natively and create a SparkR context via sparkR.init(), or use RStudio for IDE access. Instructions for both are included here:

https://spark.apache.org/docs/latest/sparkr.html
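
As a rough illustration of the first option (assuming Spark 1.x, where sparkR.init() is the entry point), a minimal SparkR session looks something like this; the master URL and app name below are only placeholders you would adapt to your cluster:

# Load SparkR; when started via bin/sparkR it is already on the library path,
# otherwise point lib.loc at $SPARK_HOME/R/lib as in the RStudio instructions.
library(SparkR)

# Placeholder master and app name; use your YARN or standalone master instead.
sc <- sparkR.init(master = "local[*]", appName = "SparkR-example")
sqlContext <- sparkRSQL.init(sc)

# Create a Spark DataFrame from a built-in R dataset and run a simple query.
df <- createDataFrame(sqlContext, faithful)
head(filter(df, df$waiting > 70))

# Shut the context down when finished.
sparkR.stop()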


4 REPLIES


Master Collaborator

Hi,

After installing the H2O cluster I ran some algorithms, and I can say it works very well 🙂

Many thanks.

Super Collaborator

I would recommend using RStudio, which is currently the best IDE for R users. Check the instructions here:

https://spark.apache.org/docs/latest/sparkr.html#starting-up-from-rstudio
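
The start-up from RStudio follows the pattern in that link; roughly, you point SparkR at your Spark installation and then initialize the context from within the IDE. The SPARK_HOME path below is an assumption you must replace with your actual installation directory:

# Tell SparkR where Spark lives; this path is an example, not your real one.
if (nchar(Sys.getenv("SPARK_HOME")) < 1) {
  Sys.setenv(SPARK_HOME = "/usr/local/spark")
}

# Load the SparkR package bundled with that Spark installation.
library(SparkR, lib.loc = c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib")))

# Start a local SparkR context from inside RStudio (Spark 1.x style API).
sc <- sparkR.init(master = "local[*]", sparkEnvir = list(spark.driver.memory = "2g"))
sqlContext <- sparkRSQL.init(sc)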