I am new to Hortonworks Data Platform (HDP). I have a fully functional HDP cluster (not the Hortonworks Sandbox) running separately, and I have root access to all machines in the cluster. I followed this guide to set up a Spark development environment with Java in IntelliJ IDEA. I develop the application on my local desktop computer. To run it, I have to do the following:
Write the Java program for Spark
Build it with mvn package from within IntelliJ IDEA
Upload the jar file to the Spark master node via SFTP
SSH to the Spark master node and submit the job with ./bin/spark-submit --class "MyClass" --master yarn /path/to/MyJar.jar
It runs perfectly, and I can check the job under Ambari UI→Yarn→ResourceManager UI→Applications. The problem is that I have to repeat this entire process every time I change my code.
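For reference, my manual loop condenses into a single script like the one below. The hostname, remote directory, and class name are placeholders for my real values; scp replaces the interactive SFTP step, and spark-submit is assumed to be on the master's PATH:

```shell
#!/bin/sh
# deploy-and-run.sh -- build, upload, and submit in one step.
# spark-master.example.com, /home/hdfs/jobs, and MyClass are placeholders
# for the real hostname, upload directory, and main class.
set -e

MASTER_HOST="${MASTER_HOST:-spark-master.example.com}"
REMOTE_DIR="${REMOTE_DIR:-/home/hdfs/jobs}"
MAIN_CLASS="${MAIN_CLASS:-MyClass}"
JAR="target/MyJar.jar"

# DRY_RUN=1 (the default here) just prints each command; set DRY_RUN=0 to execute.
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "$@"; else "$@"; fi; }

run mvn package                                  # build the jar
run scp "$JAR" "$MASTER_HOST:$REMOTE_DIR/"       # upload (scp in place of interactive SFTP)
run ssh "$MASTER_HOST" spark-submit --class "$MAIN_CLASS" --master yarn "$REMOTE_DIR/MyJar.jar"
```

Running this by hand is already less tedious, but it is still outside the IDE.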
Is there any way to automate this process? Is it possible to configure IntelliJ IDEA to build, deploy, and run the Spark application on the cluster with a single click, so that I can run and debug my application without going through all these steps? I have studied this and this, but no joy.
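One direction I considered is driving the upload and submit from Maven itself with exec-maven-plugin, so that a single mvn package exec:exec run configuration in IntelliJ IDEA would do everything. This is only a sketch; spark-master and /home/hdfs/jobs are placeholders for my real hostname and upload directory:

```xml
<!-- pom.xml sketch: hostname and paths below are placeholders -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>3.1.0</version>
  <configuration>
    <executable>sh</executable>
    <arguments>
      <argument>-c</argument>
      <argument>scp target/MyJar.jar spark-master:/home/hdfs/jobs/ &amp;&amp; ssh spark-master spark-submit --class MyClass --master yarn /home/hdfs/jobs/MyJar.jar</argument>
    </arguments>
  </configuration>
</plugin>
```

Since the exec goal is not bound to a lifecycle phase here, it only runs when invoked explicitly, but that still would not give me in-IDE debugging of the job running on the cluster.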