Not able to run Spark jar action in Oozie

Contributor

I am trying to run a Spark action in Oozie with a Spark jar.

 

I provided the following values in the Hue Oozie Spark action:

Jar/py Name: solution.jar

Main Class: Module.final_solution

Files: /user/hadoop/solution/solution.jar

 

Below are the properties:

Spark Master: yarn-master

Mode: Cluster

App Name: Final Solution
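For reference, the Hue form fields above correspond roughly to a workflow.xml spark action like the sketch below. This is not the exact XML Hue generates; element names follow the Oozie spark-action schema, and the values are the ones from my form:

```xml
<action name="spark-solution">
    <spark xmlns="uri:oozie:spark-action:0.1">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <!-- values as entered in the Hue form -->
        <master>yarn-master</master>
        <mode>cluster</mode>
        <name>Final Solution</name>
        <class>Module.final_solution</class>
        <jar>solution.jar</jar>
        <file>/user/hadoop/solution/solution.jar</file>
    </spark>
    <ok to="end"/>
    <error to="fail"/>
</action>
```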

 

I get the error below when running the Oozie action with this configuration:

(screenshot attached: error.jpg)

 

The same jar works fine from the edge node with spark-submit.
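For comparison, the spark-submit invocation that works from the edge node looks roughly like this (a sketch, assuming the same class and jar as above; the exact path on the edge node may differ):

```shell
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --name "Final Solution" \
  --class Module.final_solution \
  /user/hadoop/solution/solution.jar
```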

Any help appreciated.

1 ACCEPTED SOLUTION

Contributor

Sorry, I could not focus on this; I was busy with production activities.

 

Finally, I was able to run it successfully with the configuration below:

 

Jar/py Name: ${nameNode}/user/solution.jar

Main Class: Module.final_solution

Options List: --conf spark.yarn.jar=local:/opt/cloudera/parcels/CDH/lib/spark/lib/spark-assembly.jar

 

Properties:

Spark Master: yarn

Mode: cluster

App Name: Final Solution
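In workflow.xml terms, the working configuration above corresponds roughly to the sketch below. The spark.yarn.jar path matches the standard CDH parcel layout on my cluster; note the master is plain yarn and the jar is given as a full HDFS path:

```xml
<spark xmlns="uri:oozie:spark-action:0.1">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <master>yarn</master>
    <mode>cluster</mode>
    <name>Final Solution</name>
    <class>Module.final_solution</class>
    <jar>${nameNode}/user/solution.jar</jar>
    <!-- point Spark at the local assembly jar shipped with the CDH parcel -->
    <spark-opts>--conf spark.yarn.jar=local:/opt/cloudera/parcels/CDH/lib/spark/lib/spark-assembly.jar</spark-opts>
</spark>
```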


4 REPLIES

Explorer

This looks like Hue might be having problems communicating with YARN. Have you tried submitting the job.properties from the edge node using

oozie job -submit

?

 

Is Hue able to submit a hello-world Oozie job and monitor it? Are you able to submit other jobs via Hue (Hive, Pig, etc.)?
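A minimal command-line submission for that test might look like this. The Oozie server hostname and the job ID are placeholders; 11000 is the default Oozie port:

```shell
# Submit and start the workflow using the job.properties on the edge node
oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run

# Check status with the job ID returned by the previous command (ID is a placeholder)
oozie job -oozie http://oozie-host:11000/oozie -info 0000001-200101000000000-oozie-oozi-W
```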

Contributor

Yes, other jobs are working fine.

I am on CDH 5.9, so I tried adding the jar path in the options list, which at least gets the job running now.

I am now encountering the error below while running the job:

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, requirement failed
java.lang.IllegalArgumentException: requirement failed

 

 

Rising Star

Can you post a full stack trace?

Contributor

(Accepted solution; see above.)