Using SparkLauncher from backend service machine and execute spark jobs in HDinsight cluster

Hi,

We are testing an HDInsight Spark cluster (YARN) and trying to use SparkLauncher (setDeployMode: cluster) to submit jobs to the cluster from another machine outside the cluster but on the same network (a backend service application without Hadoop or any related components installed).

Testing with a PC and a separate virtual machine running the Hortonworks standalone version works correctly:

.setMaster("spark://192.168.10.183:7077")

In Azure HDInsight we have not seen this port open; it appears to be used only in standalone installations. We are now testing this approach against the HDInsight cluster, so far unsuccessfully.
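Since HDInsight runs Spark on YARN rather than in standalone mode, a possible approach is to set the master to "yarn" instead of a spark:// URL. This is a minimal sketch, assuming the backend machine has the Spark client libraries on its classpath and the cluster's Hadoop/YARN configuration files (core-site.xml, yarn-site.xml) reachable via HADOOP_CONF_DIR; the jar path and main class below are placeholders, not real artifacts:

```java
import org.apache.spark.launcher.SparkLauncher;

public class LaunchOnYarn {
    public static void main(String[] args) throws Exception {
        // On a YARN-based cluster such as HDInsight, the master is "yarn";
        // port 7077 only exists for a standalone Spark master.
        // HADOOP_CONF_DIR must point at the cluster's config files so the
        // launcher can locate the YARN ResourceManager.
        Process spark = new SparkLauncher()
                .setMaster("yarn")
                .setDeployMode("cluster")
                .setAppResource("/path/to/your-app.jar")   // placeholder path
                .setMainClass("com.example.YourSparkJob")  // placeholder class
                .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
                .launch();

        int exitCode = spark.waitFor();
        System.out.println("spark-submit exited with code " + exitCode);
    }
}
```

Whether this works from outside the cluster also depends on the ResourceManager port being reachable from the backend machine, which may not be the case on a default HDInsight network configuration.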

Has anyone used this configuration?

P.S.

We have tested Livy successfully, but we would like to analyze this alternative approach.

Thanks