
How to pass a parameter to an Oozie workflow from an HDFS file or local system file

New Contributor

Hi,

I have a Hive query where I have parameterized the limit as ${limit}. Passing the parameter this way, Oozie prompts me for the value when I run the workflow directly, and if I schedule it I have to supply the value in the coordinator. I want to pass this value from a file instead, which could be on my local system or at an HDFS location. How can I do that? Please help.

3 Replies

Re: How to pass a parameter to an Oozie workflow from an HDFS file or local system file

Champion

@Yashsrigupta

 

There may be other approaches, but my recommendation is:

 

Instead of sending the Hive parameter from Oozie, control it through hiveconf, so that Oozie only calls Hive and Hive itself reads the parameter via hiveconf. In other words,

 

instead of "Oozie -> Hive" and "Oozie -> Hive parameter",

try "Oozie -> Hive" and "Hive -> Hive parameter".

 

The link below explains how to pass a hiveconf variable to Hive:

https://stackoverflow.com/questions/12464636/how-to-set-variables-in-hive-scripts
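
For example, a minimal sketch of that approach (the file names, table name, and paths below are illustrative, not from the original post):

```
# limit.txt contains just the value, e.g. 100
# query.hql reads it as a hiveconf variable:
#   SELECT * FROM my_table LIMIT ${hiveconf:limit};

# pass the value from a local file when invoking Hive
hive -f query.hql --hiveconf limit=$(cat /path/to/limit.txt)

# or read it from an HDFS file instead
hive -f query.hql --hiveconf limit=$(hadoop fs -cat /user/yash/params/limit.txt)
```

This way the Oozie workflow and coordinator never see the parameter; only the Hive invocation does.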

Re: How to pass a parameter to an Oozie workflow from an HDFS file or local system file

New Contributor

Thanks for the solution, but this works only for Hive.

 

What if I have to pass the parameter to a Sqoop or Pig command, or to other actions available in Oozie, from a file present either on the local system or on HDFS? In that case this solution will not work. Do you have any other way of doing this?

Re: How to pass a parameter to an Oozie workflow from an HDFS file or local system file

Use a shell action in your workflow, echo the required parameter from your shell executable file, and then reference it as ${wf:actionData('shell_action_name')['shell_variable']} in the subsequent Hive action (as a hiveconf value) or in the Sqoop action.
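
For example, a minimal sketch of the shell executable for that action (the script name, file path, and variable name are illustrative, and the shell action must declare <capture-output/> for Oozie to pick up the echoed value):

```
#!/bin/bash
# get_params.sh -- executable used by the Oozie shell action (names are illustrative)
# Read the value from a file on HDFS (use plain cat for a local file) and print it
# as key=value so Oozie can capture it via <capture-output/>.
limit=$(hadoop fs -cat /user/yash/params/limit.txt)
echo "limit=${limit}"
```

A later Hive or Sqoop action can then reference the value as ${wf:actionData('get_params')['limit']}, for example as a Hive <param> or inside the Sqoop <command>.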