Created on 10-07-2021 12:21 PM - last edited on 10-07-2021 06:57 PM by ask_bill_brooks
Hi Team,
I have a requirement to schedule an Oozie workflow using a shell action. The shell action runs a shell script, which in turn calls an HQL file stored on an HDFS location.
I am not able to call that HQL file from the shell script that the shell action uses.
I know beeline's -f parameter can be used to run local files, but can you please guide me on how to run a file that is on HDFS?
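For reference, what I have working today for a local file looks roughly like this (the JDBC URL, principal and paths below are just placeholders for my environment):

#!/bin/bash
# Works: the HQL file is on the local filesystem of the node running the shell action
beeline -u "jdbc:hive2://hs2-host:10000/default;principal=hive/_HOST@EXAMPLE.COM" \
        -f /tmp/my_query.hql

# What I actually need: run an HQL file that sits on HDFS, for example
#   /user/wasim/scripts/my_query.hql
# beeline -f seems to expect a local path, so I am not sure how to point it at HDFS.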
Any help/suggestions will be highly appreciated.
Thanks,
Wasim
Created 10-15-2021 11:42 PM
Hi @wasimakram ,
Are you facing any issues or errors while calling the Hive script? Can you share your workflow.xml file so we can have a look?
Below is a simple example:
<workflow-app xmlns = "uri:oozie:workflow:0.4" name = "simple-Workflow">
<start to = "fork_node" />
<fork name = "fork_node">
<path start = "Create_External_Table"/>
<path start = "Create_orc_Table"/>
</fork>
<action name = "Create_External_Table">
<hive xmlns = "uri:oozie:hive-action:0.4">
<job-tracker>xyz.com:8088</job-tracker>
<name-node>hdfs://rootname</name-node>
<script>hdfs_path_of_script/external.hive</script>
</hive>
<ok to = "end" />
<error to = "kill_job" />
</action>
<kill name = "kill_job">
<message>Job failed</message>
</kill>
<end name = "end" />
</workflow-app>
Regards,
Chethan YM
Created 10-26-2021 03:57 AM
Thank you all for your responses. The issue was caused by a credential problem and it is resolved now.