Hi,
I am using HDP 2.4, Spark 1.6.2.
I've recently installed Falcon and I was able to deploy the primary and backup clusters. I've also successfully run a mirror job.
Now I'm working on scheduling a Spark application. When I create a process in the Falcon UI, I can only choose Oozie, Pig, or Hive as the workflow engine; Spark is not offered. When I try to add it by editing the XML directly, the spark-attributes element gets cleared.
I am using an XML definition like the one below:
<process xmlns='uri:falcon:process:0.1' name='spark-process'>
    <clusters>
        <cluster name='primaryCluster'>
            <validity start='2017-07-03T00:00Z' end='2017-07-05T00:00Z'/>
        </cluster>
    </clusters>
    <parallel>1</parallel>
    <order>LIFO</order>
    <frequency>minutes(5)</frequency>
    <timezone>UTC</timezone>
    <workflow engine="spark" path="/app/spark"/>
    <spark-attributes>
        <master>local</master>
        <name>Test Spark Wordcount</name>
        <class>org.apache.falcon.example.spark.SparkWordCount</class>
        <jar>/app/spark/word-count.jar</jar>
        <spark-opts>--num-executors 1 --driver-memory 512m --executor-memory 512m --executor-cores 1</spark-opts>
    </spark-attributes>
    <retry policy='periodic' delay='minutes(3)' attempts='3'/>
    <ACL owner='ambari-qa' group='users' permission='0755'/>
</process>
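For reference, the equivalent submission from the Falcon CLI would look like this, assuming the definition above is saved as spark-process.xml (the file name is just an example):

falcon entity -type process -submit -file spark-process.xml
falcon entity -type process -schedule -name spark-process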
Is there something I need to do before I can use Spark with Falcon, or is this functionality simply not supported with these component versions?
See the attached screenshots for an illustration of the issue.
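In case the Spark engine is not supported in this Falcon version, the fallback I'm considering is wrapping the job in an Oozie workflow with a spark action and pointing the process at it with engine="oozie". Below is a rough, untested sketch of such a workflow.xml, reusing the values from my spark-attributes (${jobTracker} and ${nameNode} would be supplied through the job properties):

<workflow-app xmlns='uri:oozie:workflow:0.4' name='spark-wordcount-wf'>
    <start to='spark-node'/>
    <action name='spark-node'>
        <spark xmlns='uri:oozie:spark-action:0.1'>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <!-- same master, class, jar and options as in the Falcon spark-attributes above -->
            <master>local</master>
            <name>Test Spark Wordcount</name>
            <class>org.apache.falcon.example.spark.SparkWordCount</class>
            <jar>/app/spark/word-count.jar</jar>
            <spark-opts>--num-executors 1 --driver-memory 512m --executor-memory 512m --executor-cores 1</spark-opts>
        </spark>
        <ok to='end'/>
        <error to='fail'/>
    </action>
    <kill name='fail'>
        <message>Spark action failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
    </kill>
    <end name='end'/>
</workflow-app>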

