Explorer
Posts: 31
Registered: 09-24-2015

Execute shell script through Oozie job on all nodes

I tried to execute a shell script through an Oozie job, but it seems it only ran on the jobTracker host, not on the other nodes. I expected the script to be executed on all the nodes. Do I need any other specific configuration, or did I miss something here?

 

workflow.xml

------------------------------

 

<workflow-app name="script_oozie_job" xmlns="uri:oozie:workflow:0.3">
    <start to="Test"/>
    <action name="Test">
        <shell xmlns="uri:oozie:shell-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <exec>CopyFiles.sh</exec>
            <argument>${argument1}</argument>
            <file>hdfs://nameNode-host:8020/user/oozie/script/script_oozie_job/CopyFiles.sh#CopyFiles.sh</file>
        </shell>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Script failed</message>
    </kill>
    <end name="end"/>
</workflow-app>

 

 

job.properties

---------------------------------

 

nameNode=hdfs://nameNode-host:8020
jobTracker=jobTracker-host:8032
queueName=default
argument1=""
oozie.wf.application.path=hdfs://nameNode-host:8020/user/oozie/script/script_oozie_job
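
For reference, I submit the job from the command line roughly like this (the Oozie server host below is a placeholder for my actual host):

oozie job -oozie http://oozie-server-host:11000/oozie -config job.properties -run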

 

 

 

Regards

-Khirod

Sue
Cloudera Employee
Posts: 44
Registered: 09-11-2015

Re: Execute shell script through Oozie job on all nodes

I recommend checking out this blog post: How To: Use Oozie Shell and Java Actions

The Oozie shell action runs as a Hadoop job with one map task and zero reduce tasks, so the script executes on a single arbitrary node in the cluster.

 

If you are trying to run a shell-based MapReduce job across the cluster, use Hadoop Streaming. See these blog posts:
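
As a rough sketch (not taken from those posts; the streaming jar location, input path, and output path below are placeholders you would need to adjust for your cluster), a streaming job that runs the script as the mapper with no reducers could look like this:

hadoop jar /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-streaming.jar \
    -D mapred.reduce.tasks=0 \
    -files hdfs://nameNode-host:8020/user/oozie/script/script_oozie_job/CopyFiles.sh \
    -input /user/oozie/streaming/input \
    -output /user/oozie/streaming/output \
    -mapper CopyFiles.sh

Keep in mind that the number of map tasks (and therefore the nodes the script touches) is determined by the input splits, not by the number of nodes in the cluster.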

 

Explorer
Posts: 31
Registered: 09-24-2015

Re: Execute shell script through Oozie job on all nodes

Thanks for the help, Sue.

 

I will try Hadoop Streaming and update here on how it goes.

 

-Khirod

 

 
