
Configure IDE (on local Windows) to run PySpark Jobs on Remote HDP Cluster

Has anyone configured an IDE to run PySpark code on a remote HDP cluster? Here is where I stand with each IDE I have tried:

1. Eclipse IDE

I could not figure out how to point Eclipse / PyDev at the remote HDP cluster. I would like step-by-step instructions.

2. PyCharm Community Edition

Has anyone tried connecting to a remote HDP cluster from PyCharm Community Edition? I could not find a way to connect to remote interpreters.

3. PyCharm Enterprise Edition

I was able to connect to the cluster, but it gives me issues now and then, and I can no longer use it because my trial period has expired.
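
For context, here is a minimal sketch of the kind of job I want to launch from the IDE. The app name is just a placeholder; it assumes pyspark is importable locally and that the cluster's YARN/HDFS configs are visible to the machine:

```python
# Minimal placeholder job; assumes pyspark is importable locally and
# HADOOP_CONF_DIR points at the cluster's YARN/HDFS site configs.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("ide-remote-test")   # placeholder app name
    .master("yarn")               # submit to the remote HDP cluster via YARN
    .getOrCreate()
)

# Trivial sanity check that the cluster's executors do the work
print(spark.range(1000).selectExpr("sum(id)").collect())
spark.stop()
```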

1 REPLY

Re: Configure IDE (on local Windows) to run PySpark Jobs on Remote HDP Cluster

New Contributor

Hi there,

It might help you to configure PyCharm with the HDP sandbox first. The tutorials here cover that setup:

https://hortonworks.com/tutorials/?tab=product-hdp&filters=apache-spark

On the same page you will also find links for configuring Eclipse for Java or PyCharm for Scala.
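
In case it helps, below is a rough sketch of the interpreter wiring those tutorials describe: pointing a local Python interpreter at a copy of the Spark client and the cluster's site configs. All paths and the py4j version below are assumptions for a typical HDP layout on Windows, not fixed values; adjust them to your install.

```python
# A minimal sketch of wiring a local interpreter to a remote HDP cluster.
# Assumes the HDP Spark client and the cluster's *-site.xml configs have
# been copied locally; every path below is an example, not a fixed value.
import os
import sys

os.environ["SPARK_HOME"] = r"C:\hdp\spark2-client"   # local copy of the Spark client
os.environ["HADOOP_CONF_DIR"] = r"C:\hdp\conf"       # core-site.xml, yarn-site.xml, hdfs-site.xml
os.environ["PYSPARK_PYTHON"] = "python"              # Python used on the executors

# Make the bundled PySpark importable (the py4j zip version varies by release)
sys.path.append(os.path.join(os.environ["SPARK_HOME"], "python"))
sys.path.append(os.path.join(os.environ["SPARK_HOME"], "python", "lib", "py4j-0.10.7-src.zip"))

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("pycharm-remote-test")
    .master("yarn")                                  # submit to the cluster's YARN from the IDE
    .config("spark.submit.deployMode", "client")
    .getOrCreate()
)

print(spark.range(100).count())                      # quick check that executors respond
spark.stop()
```

With this in place, running the script from the IDE's normal run configuration should submit the job to the remote cluster, so no remote interpreter (and no Professional/Enterprise license) is strictly required.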
