New Contributor
Posts: 2
Registered: 11-13-2017

How to pass a parameter to an Oozie workflow from an HDFS file / local system file


Hi

I have a Hive query in which I have parameterized the limit as ${limit}. Passing the parameter this way, Oozie prompts me for a value when I run the workflow directly, and if I schedule it I have to supply the value in the coordinator. How can I pass this value from a file instead, which could be on my local system or in an HDFS location? Please help.

Posts: 388
Topics: 11
Kudos: 60
Solutions: 34
Registered: 09-02-2016

Re: How to pass a parameter to an Oozie workflow from an HDFS file / local system file

@Yashsrigupta

 

There could be a different approach, but my recommendation is this:

Instead of sending the Hive parameter from Oozie, you can control the parameter from within Hive using hiveconf, so Oozie only calls Hive and Hive resolves the parameter itself. In other words,

instead of "Oozie -> Hive" and "Oozie -> Hive parameter",

try "Oozie -> Hive" and "Hive -> Hive parameter".

The link below explains how to set variables in Hive scripts with hiveconf:

https://stackoverflow.com/questions/12464636/how-to-set-variables-in-hive-scripts
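
As a minimal sketch of that approach (the script name, file paths, and variable name below are placeholders, not from this thread): keep the query in an .hql file that references ${hiveconf:limit}, and have the wrapper read the value from a file before calling Hive.

# query.hql (hypothetical) contains something like:
#   SELECT * FROM my_table LIMIT ${hiveconf:limit};

# Read the limit from a local file ...
LIMIT=$(cat /home/user/limit.txt)
# ... or from a file in HDFS:
# LIMIT=$(hdfs dfs -cat /user/me/params/limit.txt)

# Pass it to Hive via hiveconf, so Oozie never needs to know the value
hive -hiveconf limit="${LIMIT}" -f query.hql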

New Contributor
Posts: 2
Registered: 11-13-2017

Re: How to pass a parameter to an Oozie workflow from an HDFS file / local system file

Thanks for the solution, but that only works for Hive.

 

What if I have to pass the parameter while calling a Sqoop or Pig command, or any of the other action types available in Oozie, from a file present either on the local system or in HDFS? In that case this solution will not work. Do you have any other way of doing this?

Explorer
Posts: 7
Registered: 04-07-2017

Re: How to pass a parameter to an Oozie workflow from an HDFS file / local system file

Use a shell action in your workflow and echo the required parameter from your shell executable file, then reference it as ${wf:actionData('shell_action_name')['shell_variable']} in the downstream actions, e.g. as a hiveconf value in a Hive action or as an argument to Sqoop. See the sketch below.
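
A minimal sketch of such a shell executable, assuming the shell action is named 'shell_action_name', has <capture-output/> enabled in workflow.xml, and the value sits in a one-line file (all names and paths here are placeholders):

#!/bin/bash
# get_params.sh (hypothetical), run by an Oozie shell action with <capture-output/>

# Read the value from a local file (the path must exist on whichever node runs
# the action), or use 'hdfs dfs -cat' for a file in HDFS:
LIMIT=$(cat /home/user/limit.txt)
# LIMIT=$(hdfs dfs -cat /user/me/params/limit.txt)

# Oozie captures stdout as key=value pairs, so echo the parameter:
echo "limit=${LIMIT}"

# Downstream actions can then reference it in workflow.xml as
#   ${wf:actionData('shell_action_name')['limit']}
# e.g. passed to a Hive action as a hiveconf value or to Sqoop as an argument.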
