Member since: 03-12-2018
Posts: 10
Kudos Received: 1
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 5010 | 12-07-2018 02:38 PM |
01-02-2019
10:50 AM
We submit the jobs something like this: env -i spark2-submit --keytab svc.keytab --principal svc@CORP.COM sample.py
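For reference, a fuller version of that command with the memory settings mentioned later in this thread spelled out would look roughly like the sketch below; the --master and --deploy-mode values are assumptions rather than something confirmed from our setup, and the overhead is given in MB via spark.yarn.executor.memoryOverhead.

# Sketch only: the same submit with the memory settings from this thread made explicit.
# --master yarn and --deploy-mode cluster are assumptions, not confirmed from our setup.
env -i spark2-submit \
  --master yarn \
  --deploy-mode cluster \
  --keytab svc.keytab \
  --principal svc@CORP.COM \
  --executor-memory 14g \
  --driver-memory 10g \
  --conf spark.yarn.executor.memoryOverhead=3072 \
  sample.py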
12-07-2018
02:38 PM
Thanks for the help.
12-07-2018
02:38 PM
1 Kudo
The Oozie launcher mapper was running out of its 4 GB of memory. I changed that to 8 GB and now the job runs fine without restarts.

<configuration>
  <property>
    <name>oozie.launcher.mapreduce.map.memory.mb</name>
    <value>8000</value>
  </property>
  <property>
    <name>oozie.launcher.mapreduce.map.java.opts</name>
    <value>-Xmx1500m</value>
  </property>
  <property>
    <name>oozie.launcher.yarn.app.mapreduce.am.resource.mb</name>
    <value>1024</value>
  </property>
  <property>
    <name>oozie.launcher.mapreduce.map.java.opts</name>
    <value>-Xmx870m</value>
  </property>
</configuration>
12-07-2018
12:12 PM
Sorry, I forgot to mention: I have been using 14 GB of executor memory and 10 GB of driver memory, with a memory overhead of 3 GB. None of my tasks spill memory to disk, which is strange and is shaking my fundamental understanding of Spark. The same settings are used in CDSW and it never fails from there; it's only when I run the job through Oozie that it fails. It then restarts on its own and that attempt completes without any failures. When would Spark use physical memory versus virtual memory?
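In case it is relevant, my current understanding is that both numbers in that error come from the YARN NodeManager rather than from Spark itself: it kills a container that exceeds its physical memory allocation, and the virtual memory ceiling is that allocation multiplied by yarn.nodemanager.vmem-pmem-ratio (the 8.4 GB in the error looks like 4 GB times the default ratio of 2.1). This is what I plan to check on the NodeManager side; the yarn-site.xml path below is an assumption for this cluster.

# Sketch: look for the NodeManager memory-enforcement settings (defaults: pmem check on,
# vmem-pmem ratio 2.1). Settings left at their defaults may not appear in the file at all,
# and the config path is an assumption that may differ on your nodes.
grep -A1 -E 'pmem-check-enabled|vmem-check-enabled|vmem-pmem-ratio' \
  /etc/hadoop/conf/yarn-site.xml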
12-07-2018
08:24 AM
I applied this property and increased the limit to 6 GB. It still fails with the exact same error message.
12-05-2018
12:18 PM
Hi, we see the error below: "The container showed error ...Container [pid=68208,containerID=container_e59_1543621459332_7731_01_000002] is running beyond physical memory limits. Current usage: 4.0 GB of 4 GB physical memory used; 26.5 GB of 8.4 GB virtual memory used. Killing container..." Whether I run in CDSW or through Oozie, my Spark application has the same memory settings and configuration (executor memory, cores, driver memory, memory overhead, etc.). From CDSW it has never failed, but when I run it from an Oozie shell action (calling spark2-submit) it randomly fails. I am trying to understand what is different in Oozie and how I can set this memory limit.
12-04-2018
03:11 PM
Most of the logs in the Spark UI show "No logs available for container".
12-04-2018
03:07 PM
Thanks for the reply. I tried to see the logs from the Spark UI but it was blank. It would be great if you could guide me on checking the YARN logs. I tried the command below and it also came back blank: yarn logs -applicationId application_1543621459332_8094 > spark_app.log
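These are the variations I am planning to try next; the -appOwner value is a guess based on the service principal we submit with, and I believe log aggregation has to be enabled for yarn logs to return anything at all.

# Try fetching the logs as the service user that actually owns the application
# (assumption: the app runs as the svc principal used for spark2-submit).
yarn logs -applicationId application_1543621459332_8094 -appOwner svc > spark_app.log

# Logs for just the container named in the memory error (note the different application id,
# taken from the container name); some Hadoop versions also require -nodeAddress here.
yarn logs -applicationId application_1543621459332_7731 \
  -containerId container_e59_1543621459332_7731_01_000002 > container.log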
12-04-2018
02:49 PM
I have an Oozie job that starts an Oozie shell action, and the shell action starts a Spark application (via spark2-submit); I am mostly doing Spark SQL. The job runs for a while, suddenly hangs, and then starts the Spark application all over again. I ran the same Spark application in CDSW and it ran fine without issues. The same thing is happening with another Oozie job. The only thing these two jobs have in common is that they run longer, around 2 hours. Any help would be appreciated.
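For context, the shell action runs a small wrapper script along these lines (a simplified sketch; the script name and exact arguments are illustrative, not our real ones):

#!/bin/bash
# Simplified sketch of the script the Oozie shell action runs; names are illustrative.
set -euo pipefail

# Strip the launcher's environment and submit the PySpark job with the service keytab,
# as in the spark2-submit command shown earlier in this thread.
env -i spark2-submit \
  --keytab svc.keytab \
  --principal svc@CORP.COM \
  sample.py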
Labels:
- Apache Oozie
- Apache Spark