
OOZIE setup for Spark and hive jars


Expert Contributor

Does anyone have a sample workflow for the scenario below:

1. Sqoop job
2. Pig job
3. Spark job
4. Hive job

Any help will be greatly appreciated!

2 Replies

Re: OOZIE setup for Spark and hive jars

Contributor

Please find below the workflow.xml for a Spark job submitted through a shell action:


<workflow-app name="rchamaku.test-oozie-spark" xmlns="uri:oozie:workflow:0.5">
  <start to="shell-spark"/>
  <kill name="Kill">
    <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <action name="shell-spark">
    <shell xmlns="uri:oozie:shell-action:0.1">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <prepare>
        <delete path="/hdfs_path/job"/>
      </prepare>
      <configuration>
        <property>
          <name>oozie.launcher.mapred.job.queue.name</name>
          <value>MCE</value>
        </property>
        <property>
          <name>spark.yarn.queue</name>
          <value>MCE</value>
        </property>
      </configuration>
      <exec>spark-submit</exec>
      <argument>--queue</argument>
      <argument>quename</argument>
      <argument>--class</argument>
      <argument>org.emp.Employee</argument>
      <argument>--name</argument>
      <argument>Spark_rchamaku</argument>
      <argument>--master</argument>
      <argument>yarn-cluster</argument>
      <argument>--properties-file</argument>
      <argument>spark-submit.properties</argument>
      <argument>--keytab</argument>
      <argument>rambabu.keytab</argument>
      <argument>--files</argument>
      <argument>hive-site.xml</argument>
      <argument>employee_project.10-0.1-SNAPSHOT.jar</argument>
      <file>spark-submit.properties#spark-submit.properties</file>
      <file>rchamaku050613.keytab#rchamaku050613.keytab</file>
      <file>/hdfs_path/hive-site.xml#hive-site.xml</file>
      <file>/hdfs_path/employee_project.10-0.1-SNAPSHOT.jar#employee_project.10-0.1-SNAPSHOT.jar</file>
      <capture-output/>
    </shell>
    <ok to="End"/>
    <error to="Kill"/>
  </action>
  <end name="End"/>
</workflow-app>
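(Not from the original post.) If your Oozie version is 4.2 or later, the same submit can also be expressed with the built-in spark action instead of a shell wrapper. A minimal sketch follows, reusing the class and jar from the workflow above; the workflow name, queue, master, and paths are assumptions to adjust for your cluster:

<workflow-app name="spark-native-wf" xmlns="uri:oozie:workflow:0.5">
  <start to="spark-node"/>
  <action name="spark-node">
    <spark xmlns="uri:oozie:spark-action:0.1">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <master>yarn-cluster</master>
      <name>Spark_example</name>
      <class>org.emp.Employee</class>
      <jar>${nameNode}/hdfs_path/employee_project.10-0.1-SNAPSHOT.jar</jar>
      <spark-opts>--queue quename --files hive-site.xml</spark-opts>
    </spark>
    <ok to="End"/>
    <error to="Kill"/>
  </action>
  <kill name="Kill">
    <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="End"/>
</workflow-app>

The shell approach keeps full control over the spark-submit command line, while the spark action lets Oozie build the launcher for you; either works with the jobTracker/nameNode properties defined in job.properties.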



Re: OOZIE setup for Spark and hive jars

Explorer
@Amit Dass

Please find the sample workflows for the Sqoop, Pig, and Hive jobs below.

[Attachment: wf.jpg — Sqoop and Pig job workflows]
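Since the attachment is image-only, here is a rough sketch of how a Sqoop import chained into a Pig job is typically wired in Oozie; the JDBC URL, table, target directory, and transform.pig script are hypothetical placeholders, not values from the original attachment:

<workflow-app name="sqoop-pig-wf" xmlns="uri:oozie:workflow:0.5">
  <start to="sqoop-node"/>
  <action name="sqoop-node">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <command>import --connect jdbc:mysql://db_host/db_name --table emp --target-dir /hdfs_path/emp -m 1</command>
    </sqoop>
    <ok to="pig-node"/>
    <error to="Kill"/>
  </action>
  <action name="pig-node">
    <pig>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>transform.pig</script>
      <param>INPUT=/hdfs_path/emp</param>
    </pig>
    <ok to="End"/>
    <error to="Kill"/>
  </action>
  <kill name="Kill">
    <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="End"/>
</workflow-app>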

3. Hive job workflow

[Attachment: wf2.jpg — Hive job workflow]
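Likewise, since this attachment is image-only, a minimal sketch of a Hive action workflow; the load_emp.hql script, its parameter, and the assumption that hive-site.xml sits next to the workflow are hypothetical:

<workflow-app name="hive-wf" xmlns="uri:oozie:workflow:0.5">
  <start to="hive-node"/>
  <action name="hive-node">
    <hive xmlns="uri:oozie:hive-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <job-xml>hive-site.xml</job-xml>
      <script>load_emp.hql</script>
      <param>INPUT=/hdfs_path/emp</param>
    </hive>
    <ok to="End"/>
    <error to="Kill"/>
  </action>
  <kill name="Kill">
    <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="End"/>
</workflow-app>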
