Contributor
Posts: 43
Registered: ‎03-04-2015

Oozie Spark Action: getting java.lang.OutOfMemoryError: PermGen space Error

Hi

 

I am trying to run my Spark application through Oozie on CDH 5.5.2, but after some time the Oozie launcher MapReduce application fails with the error below:

 

ERROR [sparkDriver-akka.actor.default-dispatcher-35] akka.actor.ActorSystemImpl: Uncaught fatal error from thread [sparkDriver-akka.actor.default-dispatcher-24] shutting down ActorSystem [sparkDriver]
java.lang.OutOfMemoryError: PermGen space
at sun.misc.Unsafe.defineClass(Native Method)
at sun.reflect.ClassDefiner.defineClass(ClassDefiner.java:63)
at sun.reflect.MethodAccessorGenerator$1.run(MethodAccessorGenerator.java:399)
at sun.reflect.MethodAccessorGenerator$1.run(MethodAccessorGenerator.java:396)
at java.security.AccessController.doPrivileged(Native Method)
at sun.reflect.MethodAccessorGenerator.generate(MethodAccessorGenerator.java:395)
at sun.reflect.MethodAccessorGenerator.generateSerializationConstructor(MethodAccessorGenerator.java:113)
at sun.reflect.ReflectionFactory.newConstructorForSerialization(ReflectionFactory.java:331)
at java.io.ObjectStreamClass.getSerializableConstructor(ObjectStreamClass.java:1376)
at java.io.ObjectStreamClass.access$1500(ObjectStreamClass.java:72)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:493)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:468)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:468)
at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:365)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:602)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
at scala.util.Try$.apply(Try.scala:161)
at akka.serialization.Serialization.deserialize(Serialization.scala:98)
at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)

 

 

 

My workflow.xml is:

 

<workflow-app xmlns='uri:oozie:workflow:0.5' name='FaceDetection'>
    <start to='FaceDetection'/>
    <action name="FaceDetection">
        <spark xmlns="uri:oozie:spark-action:0.1">
            <job-tracker>ivcp-m01:8032</job-tracker>
            <name-node>hdfs://ivcp-m01:8020</name-node>
            <master>yarn-client</master>
            <mode>client</mode>
            <name>Face Detection</name>
            <class>com.streaming.facedetection.streaming_facedetection</class>
            <jar>hdfs://ivcp-m01:8020/user/master/oozie/lib/face_detection_streaming-assembly-3.0.jar</jar>
            <spark-opts>--num-executors 6 --conf spark.executor.extraLibraryPath=/opt/cloudera/parcels/CDH-5.5.0-1.cdh5.5.0.p0.8/lib/hadoop/lib/native --conf spark.authenticate=false --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.executorIdleTimeout=60 --conf spark.dynamicAllocation.minExecutors=0 --conf spark.dynamicAllocation.schedulerBacklogTimeout=1 --conf spark.eventLog.dir=hdfs://ivcp-m01:8020/user/spark/applicationHistory --conf spark.eventLog.enabled=true --conf spark.serializer=org.apache.spark.serializer.KryoSerializer --conf spark.shuffle.service.enabled=true --conf spark.shuffle.service.port=7337 --conf spark.yarn.historyServer.address=http://ivcp-m01:18088 --conf spark.yarn.jar=local:/opt/cloudera/parcels/CDH-5.5.2-1.cdh5.5.2.p0.4/lib/spark/lib/spark-assembly.jar</spark-opts>
            <arg>ivcp-m01:9092</arg>
            <arg>a-0</arg>
            <arg>output</arg>
        </spark>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Workflow failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name='end'/>
</workflow-app>

 

 

My job.properties is:

 

jobOutput=/user/master/output
oozie.wf.application.path=/user/master/oozie/
oozie.use.system.libpath=true

 

 

How can I solve this?

 

Regards

Prateek

New Contributor
Posts: 7
Registered: ‎06-14-2017

Re: Oozie Spark Action: getting java.lang.OutOfMemoryError: PermGen space Error

Have you solved this?
Cloudera Employee
Posts: 22
Registered: ‎06-17-2016

Re: Oozie Spark Action: getting java.lang.OutOfMemoryError: PermGen space Error

In yarn-client mode, the Spark driver runs inside Oozie's LauncherMapper, so the PermGen limit being exhausted is the launcher JVM's. The way to increase the memory allocated to the LauncherMapper is described here.
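A minimal sketch of those overrides inside the spark action's <configuration> block (the values are illustrative only and should be sized to your application; -XX:MaxPermSize only has an effect on JDK 7 and earlier):

```xml
<configuration>
    <!-- container memory for the launcher mapper, which hosts the yarn-client driver -->
    <property>
        <name>oozie.launcher.mapreduce.map.memory.mb</name>
        <value>4096</value>
    </property>
    <!-- launcher JVM options: heap plus a larger PermGen for the driver's classes -->
    <property>
        <name>oozie.launcher.mapreduce.map.java.opts</name>
        <value>-Xmx3200m -XX:MaxPermSize=512m</value>
    </property>
</configuration>
```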

New Contributor
Posts: 7
Registered: ‎06-14-2017

Re: Oozie Spark Action: getting java.lang.OutOfMemoryError: PermGen space Error

Thanks for your pointer, that solved it.

I just edited the workflow.xml file and added the launcher memory properties:

```xml
<workflow-app name="simple-ONE-wf" xmlns="uri:oozie:workflow:0.1">
    <start to='ONE'/>
    <action name="ONE">
        <spark xmlns="uri:oozie:spark-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>oozie.launcher.mapreduce.map.memory.mb</name>
                    <value>4096</value>
                </property>
                <!-- heap and PermGen flags combined into a single java.opts value -->
                <property>
                    <name>oozie.launcher.mapreduce.map.java.opts</name>
                    <value>-Xmx3200m -XX:MaxPermSize=1g</value>
                </property>
                ...
            </configuration>
            ...
    </action>

    <kill name='kill'>
        <message>Something went wrong: ${wf:errorCode('ONE')}</message>
    </kill>
    <end name='end'/>
</workflow-app>
```
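Note that the heap (-Xmx) and PermGen (-XX:MaxPermSize) flags go into a single oozie.launcher.mapreduce.map.java.opts value: if the same property name is defined twice in the <configuration> block, only one definition takes effect. Also, -XX:MaxPermSize is only relevant on JDK 7 and earlier; JDK 8 removed PermGen entirely, so this particular error (and flag) no longer applies there.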