Falcon: Refreshing/Updating Oozie Workflow in /apps/falcon/cluster/staging area

Rising Star

We have Falcon jobs that use Oozie workflows (workflow.xml) in HDFS. We've made some changes to the Oozie workflow, specifically argument values for actions, but we don't see them reflected in Falcon. I notice that the workflow.xml is present in the /apps/falcon/clusterName/staging area for the Falcon process, but it's the older version.

How can I get Falcon to refresh/rebuild this area and incorporate the new workflow it points to in HDFS? Will Falcon -update or -repair do it?

Edit: Resolved! Falcon -update does the trick.

falcon entity -type process -suspend -name hdp0106x-my-process
falcon entity -type process -update -name hdp0106x-my-process

That provided this output:

falcon/update/default/Updated successfully(process) hdp0106x-my-process/Effective Time: 2016-01-20T09:30Z. Old workflow id: 0001498-151208005457707-oozie-oozi-C. New workflow id: 0007639-151208005457707-oozie-oozi-B
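Since the process was suspended before the update, it typically needs to be resumed afterward so that new instances are scheduled. The line below is a sketch using the standard Falcon entity -resume operation with the same process name; confirm the behavior against your Falcon version:

falcon entity -type process -resume -name hdp0106x-my-process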

Explanation:

In a recent project, we hit a minor issue where a Falcon job's future instances did not pick up changes we had made to its Oozie workflow. In short, we made an enhancement, but Falcon didn't recognize the change simply because we modified the Oozie workflow in HDFS.

Cause: When you submit & schedule a process entity to Falcon, it:

  • creates a home for it under /apps/falcon/clusterName/staging/falcon/workflows/process/
  • copies the workflow.xml you specified in the process.xml into that home (along with a lib folder, if you have one)
  • uses that staged workflow.xml for all future generated instances

The catch is when you want to make changes to that Oozie workflow: Falcon doesn't re-ingest the workflow into its staging home on any schedule; it only copies it at submit/schedule time. So if you change an Oozie workflow, you need to tell Falcon that it has changed (Falcon technically runs its staged copies of the workflow and /lib folder, not the HDFS source versions themselves).
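As an illustration (the path follows the pattern above, the process name is the one from this thread, and the exact staging layout can vary by Falcon version), you can list the staged copy that Falcon actually runs and compare it with the source workflow.xml referenced in your process.xml:

hdfs dfs -ls -R /apps/falcon/clusterName/staging/falcon/workflows/process/hdp0106x-my-process
hdfs dfs -cat /apps/falcon/clusterName/staging/falcon/workflows/process/hdp0106x-my-process/workflow.xml

If the staged copy still shows the old argument values, Falcon has not picked up your change yet.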

However, the solution above (falcon entity -update) does the trick!

2 REPLIES

Guru (Accepted Solution)
@Landon Robinson

Have you tried the following:

Update operation allows an already submitted/scheduled entity to be updated. Cluster update is currently not allowed.

Usage: $FALCON_HOME/bin/falcon entity -type [feed|process] -name <<name>> -update -file <<path_to_file>>

The path to file should point to the entity's definition XML.
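As a concrete, hypothetical example using the process name from this thread, where /tmp/hdp0106x-my-process.xml is an assumed local copy of the updated process definition:

$FALCON_HOME/bin/falcon entity -type process -name hdp0106x-my-process -update -file /tmp/hdp0106x-my-process.xml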

Rising Star

I gave that a try after posting this and have updated my question with my findings accordingly. Thanks for answering!