Created 07-17-2017 02:31 PM
I'm using HDP 2.6.0, where I have Oozie 4.2.0 and Spark 2.1.1 installed.
My question is: can I configure these two so that I can submit a Spark2 application job to Oozie, or do I need to wait for Oozie 5.0? Has anyone run into a problem like this, and how did you fix it?
Created 07-21-2017 02:13 PM
@Ivan Majnaric I think you can manage and submit Spark2 applications to Oozie.
Here are the steps I would suggest (see the sketch below):
Hope this will help you and get you going. 🙂
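For reference, a rough sketch of the usual spark2 sharelib setup on HDP; the timestamped lib directory, the spark2-client path, and the exact jar set are assumptions that vary per install:

# Create a spark2 directory next to the existing spark sharelib on HDFS
hdfs dfs -mkdir /user/oozie/share/lib/lib_<timestamp>/spark2

# Copy the Spark2 client jars plus the oozie-sharelib-spark jar into it
hdfs dfs -put /usr/hdp/current/spark2-client/jars/* /user/oozie/share/lib/lib_<timestamp>/spark2/
hdfs dfs -cp /user/oozie/share/lib/lib_<timestamp>/spark/oozie-sharelib-spark*.jar /user/oozie/share/lib/lib_<timestamp>/spark2/

# Refresh the sharelib so the Oozie server picks up the new directory
oozie admin -sharelibupdate

# Then point Spark actions at the new sharelib with oozie.action.sharelib.for.spark=spark2 (see the posts below)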
Created 09-21-2017 06:45 AM
Here are the steps to run spark2 jobs using oozie:
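A minimal Spark action workflow along those lines might look like the sketch below, assuming the spark2 sharelib is already in place; the workflow name, application class, and jar path are placeholders:

<workflow-app name="spark2-wf" xmlns="uri:oozie:workflow:0.5">
    <start to="spark2-node"/>
    <action name="spark2-node">
        <spark xmlns="uri:oozie:spark-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <master>yarn-cluster</master>
            <name>MySpark2Job</name>
            <class>com.example.MyApp</class>
            <jar>${nameNode}/user/${wf:user()}/apps/my-spark2-app.jar</jar>
        </spark>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <fail name="fail">
        <message>Spark2 action failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
    </fail>
    <end name="end"/>
</workflow-app>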
Created 10-05-2017 01:22 PM
I followed the aforementioned post word for word, but the result is the same: failed or killed 🙂
Created 10-25-2017 05:51 PM
I followed the instructions in Configuring Oozie Spark Action for Spark 2
and set oozie.action.sharelib.for.spark=spark2 in job.properties.
With only that job.properties setting, the job failed.
Once I added the setting to workflow.xml, the job succeeded:
<property>
    <name>oozie.action.sharelib.for.spark</name>
    <value>spark2</value>
</property>
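For context, one common placement of that property within workflow.xml is the Spark action's own <configuration> block, roughly as below (surrounding elements follow the standard spark-action layout and are abbreviated here):

<action name="spark2-node">
    <spark xmlns="uri:oozie:spark-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <configuration>
            <property>
                <name>oozie.action.sharelib.for.spark</name>
                <value>spark2</value>
            </property>
        </configuration>
        ...
    </spark>
    <ok to="end"/>
    <error to="fail"/>
</action>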
P.S.: The driver jar for the job was built with the Spark2 dependencies set to provided scope.
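For illustration, a provided-scope Spark2 dependency in the driver's pom.xml looks roughly like this; the spark-core artifact, Scala 2.11 suffix, and 2.1.1 version are assumptions matching the versions mentioned above:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.1</version>
    <!-- provided: the jar comes from the spark2 sharelib at runtime, so it is not bundled into the driver jar -->
    <scope>provided</scope>
</dependency>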
Created 12-18-2017 01:03 PM
1. Follow the instructions from this site:
2. Some of the libs you put in the spark2 sharelib directory on HDFS will probably conflict with libs already in the oozie sharelib directory. You will need to remove the conflicting libs from the spark2 sharelib directory.
3. After removing the conflicting libs, run oozie admin -sharelibupdate (a rough sketch of steps 2 and 3 follows below).
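A rough sketch of steps 2 and 3, assuming the default /user/oozie/share/lib layout; the timestamped lib directory and the conflicting jar name are illustrative only, since the exact conflicts vary by install:

# Compare the spark2 sharelib contents against the oozie sharelib to spot duplicate/conflicting jars
hdfs dfs -ls /user/oozie/share/lib/lib_<timestamp>/spark2
hdfs dfs -ls /user/oozie/share/lib/lib_<timestamp>/oozie

# Remove a conflicting jar from the spark2 sharelib (example name only)
hdfs dfs -rm /user/oozie/share/lib/lib_<timestamp>/spark2/<conflicting-jar>.jar

# Refresh the sharelib so the Oozie server picks up the change
oozie admin -sharelibupdate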