Hortonworks HDP, Run Java Spark Application on Yarn in Cluster mode from within Intellij Idea

New Contributor
I am new to Hortonworks Data Platform (HDP). I have a fully functional HDP cluster (not the Hortonworks sandbox) running separately, and I have root access to all machines in the cluster. I followed this guide to set up a Spark development environment with Java in IntelliJ IDEA. I develop the application on my local desktop computer. To run it, I have to do the following:
  1. Write the Java Program for Spark
  2. Build using mvn package from within Intellij Idea
  3. Upload the jar file to the Spark master node via SFTP
  4. SSH to Spark master node and submit the spark job using ./bin/spark-submit --class "MyClass" --master yarn /path/to/MyJar.jar
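Steps 2–4 above can be collapsed into a single script. This is only a sketch: the host alias `hdp-master`, the remote directory `/tmp`, and the jar/class names are assumptions to be replaced with your own values (it uses `scp` in place of interactive SFTP). `DRY_RUN` defaults to `1` so the script just prints the submit command until you flip it to `0`.

```shell
#!/usr/bin/env bash
# One-shot build/upload/submit loop (hypothetical names; adjust for your cluster).
set -u

MASTER_HOST="${MASTER_HOST:-hdp-master}"   # assumed SSH alias for the Spark master node
JAR="${JAR:-target/MyJar.jar}"             # Maven's default output location
MAIN_CLASS="${MAIN_CLASS:-MyClass}"        # your Spark driver class
REMOTE_DIR="${REMOTE_DIR:-/tmp}"           # assumed upload directory on the master

# Same submit command as the manual step, assembled from the variables above.
SUBMIT_CMD="./bin/spark-submit --class ${MAIN_CLASS} --master yarn ${REMOTE_DIR}/$(basename "$JAR")"

if [ "${DRY_RUN:-1}" = "0" ]; then
  mvn -q package                               # step 2: build the jar locally
  scp "$JAR" "${MASTER_HOST}:${REMOTE_DIR}/"   # step 3: upload (scp instead of SFTP)
  ssh "$MASTER_HOST" "$SUBMIT_CMD"             # step 4: submit the job on YARN
else
  printf 'would run on %s: %s\n' "$MASTER_HOST" "$SUBMIT_CMD"
fi
```

If key-based SSH login to the master is set up, a script like this can also be wired into IntelliJ IDEA as an External Tool or run configuration, which gets close to the one-click workflow asked about below.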

It runs perfectly, and I can check the job in Ambari UI → YARN → ResourceManager UI → Applications. The problem is that I have to repeat this entire process every time I change my code.

Is there any way to automate this process? Is it possible to configure IntelliJ IDEA to build, deploy, and run the Spark application on the cluster with a single click, so that I can run and debug my application without going through all these steps manually? I have studied this and this, but no joy.
