
Spark2-Submit through Oozie shell action hangs and restarts spark application

Contributor

I have an Oozie job that runs an Oozie shell action; the shell action launches a Spark application via spark2-submit. I am mostly running Spark SQL. The job runs for a while, suddenly hangs, and then the Spark application starts all over again.

 

I ran the same Spark application in CDSW and it completed without issues.

 

The same thing happens with another Oozie job. The only thing the two jobs have in common is that they run long, around 2 hours.

 

Any help would be appreciated.

1 ACCEPTED SOLUTION

Contributor

The Oozie launcher mapper was running out of its 4 GB of memory. I increased it to 8 GB, and the job now runs without restarts.

 

<configuration>
  <property>
    <name>oozie.launcher.mapreduce.map.memory.mb</name>
    <value>8000</value>
  </property>
  <property>
    <name>oozie.launcher.mapreduce.map.java.opts</name>
    <value>-Xmx1500m</value>
  </property>
  <property>
    <name>oozie.launcher.yarn.app.mapreduce.am.resource.mb</name>
    <value>1024</value>
  </property>
  <property>
    <name>oozie.launcher.yarn.app.mapreduce.am.command-opts</name>
    <value>-Xmx870m</value>
  </property>
</configuration>
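For context, a minimal sketch of where this configuration block sits in the workflow.xml shell action. The action name, the wrapper script name (run_spark.sh), and the ${jobTracker}/${nameNode} placeholders are assumptions; adjust them to your workflow.

```xml
<action name="spark-shell-action">
  <shell xmlns="uri:oozie:shell-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <!-- oozie.launcher.* properties are applied to the launcher job,
         not to the Spark application itself -->
    <configuration>
      <property>
        <name>oozie.launcher.mapreduce.map.memory.mb</name>
        <value>8000</value>
      </property>
    </configuration>
    <exec>run_spark.sh</exec>
    <file>run_spark.sh</file>
  </shell>
  <ok to="end"/>
  <error to="fail"/>
</action>
```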

 


14 REPLIES

Contributor

Thanks for the help.

Community Manager

Congratulations on resolving your issue, @Sunil. Please don't forget to mark the reply that helped resolve the issue as the answer. That way, when others have a similar issue, they will be more likely to find it.


Cy Jervis, Manager, Community Program
Was your question answered? Make sure to mark the answer as the accepted solution.
If you find a reply useful, say thanks by clicking on the thumbs up button.

Expert Contributor

Great, Sunil!

 

Regards

Bimal

Explorer

I get the following error:

 

 line 2: spark-submit: command not found
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.ShellMain], exit code [1]
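This error usually means the spark-submit binary is not on the PATH of the node where the Oozie launcher runs the shell action. A common workaround is to set the PATH explicitly at the top of the wrapper script. The install location below is an assumption (a typical Cloudera parcel path); adjust it to your cluster layout.

```shell
#!/bin/bash
# Assumed Spark 2 parcel location -- adjust for your cluster.
export SPARK_HOME=/opt/cloudera/parcels/SPARK2/lib/spark2
export PATH="$SPARK_HOME/bin:$PATH"

# Fail early with a clear message if the binary still cannot be found,
# instead of letting the action die with "command not found".
if ! command -v spark2-submit >/dev/null 2>&1; then
  echo "spark2-submit not found on PATH: $PATH" >&2
  exit 1
fi

spark2-submit sample.py
```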

Contributor

 

We submit the jobs something like this:

env -i spark2-submit --keytab svc.keytab --principal svc@CORP.COM sample.py