
Spark-sklearn integration

Expert Contributor

Hi,

We have an on-premise Hadoop cluster and are planning to integrate Spark with scikit-learn using the spark-sklearn package. Could you please let me know whether we need to install the scikit-learn and spark-sklearn packages on all nodes, or only on the node where the Spark2 History Server is installed? We will be using YARN for resource allocation.

Thanks,

Chandra

1 ACCEPTED SOLUTION


@chandramouli muthukumaran

You'll want to install scikit-learn (pip install -U scikit-learn) and spark-sklearn on all DataNodes of the cluster, along with the other Python packages they depend on, such as numpy and scipy. Because YARN can schedule Spark executors on any worker node, the Python dependencies must be present on every node that can run an executor; installing them only on the Spark2 History Server node is not enough. Using YARN as the resource manager is also the right call, so you are on the right path there. Hope this helps!
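For reference, here is a minimal sketch of what a spark-sklearn job looks like once the packages are installed on every node. It distributes a scikit-learn grid search across the cluster via YARN; the app name, dataset, and parameter grid below are placeholders for illustration, not details from your cluster:

    from sklearn import datasets, svm
    from spark_sklearn import GridSearchCV
    from pyspark.sql import SparkSession

    # Run on YARN so executors (and the grid-search tasks) land on worker nodes.
    spark = SparkSession.builder.appName("spark-sklearn-demo").master("yarn").getOrCreate()
    sc = spark.sparkContext

    # Small example dataset and parameter grid (placeholders).
    digits = datasets.load_digits()
    param_grid = {"C": [0.1, 1.0, 10.0], "gamma": [0.001, 0.01]}

    # spark_sklearn.GridSearchCV mirrors sklearn's GridSearchCV but takes a
    # SparkContext and fits each parameter combination as a separate Spark
    # task, which is why scikit-learn must be importable on every executor node.
    clf = GridSearchCV(sc, svm.SVC(), param_grid)
    clf.fit(digits.data, digits.target)
    print(clf.best_params_)

If scikit-learn is missing from even one DataNode, tasks scheduled there will fail with an ImportError, which is the most common symptom of a partial install.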



Expert Contributor

Thanks much for your response.