Zeppelin max open file limits

New Contributor

Running some queries, we get this error:

 

Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 11 in stage 102.0 failed 1 times, most recent failure: Lost task 11.0 in stage 102.0 (TID 4722, localhost, executor driver): java.io.FileNotFoundException: /tmp/blockmgr-95c88a5a-8d68-4b16-878d-158f40123999/1c/temp_shuffle_f82e49e7-a383-4803-82b3-030425703624 (Too many open files)
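
For reference, this exception comes from Spark's shuffle, which opens a temp file per shuffle partition per task, so a large stage can exhaust the per-process descriptor limit. A quick sanity check (24222 is the Zeppelin JVM's PID, shown further down; substitute your own) is to compare how many descriptors the process currently holds against its effective limit:

# number of file descriptors the process currently holds
ls /proc/24222/fd | wc -l
# its effective limit
grep "Max open files" /proc/24222/limits

Besides raising the limit, lowering the shuffle partition count (e.g. spark.sql.shuffle.partitions) also reduces the number of temp shuffle files open at once.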

 

We modified limits.conf, adding these values:

 

zeppelin soft nofile 32000
zeppelin hard nofile 32000
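
To confirm the new limits are actually applied (limits.conf is enforced by pam_limits, which runs only for login sessions), a quick check as root, on most distributions:

su - zeppelin -c 'ulimit -n'

If this prints 32000, limits.conf itself is fine and the problem is in how the Zeppelin process is launched.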

 

However, looking at the open file limits of the running Zeppelin process, the value is still 4096:

 

[zeppelin@ ~]$ cat /proc/24222/limits
Limit                Soft Limit   Hard Limit   Units
Max processes        510371       510371       processes
Max open files       4096         4096         files
Max locked memory    65536        65536        bytes
[zeppelin@ ~]$
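
A note on why this can happen: since pam_limits applies only at login, a Zeppelin daemon started by systemd or an init script never sees the limits.conf values, and the limit has to be raised at the service level instead. A sketch, assuming Zeppelin runs as a systemd service named zeppelin (adjust the unit name to your installation):

# /etc/systemd/system/zeppelin.service.d/limits.conf  (drop-in override)
[Service]
LimitNOFILE=32000

# then reload systemd, restart the service, and re-check the process limits
systemctl daemon-reload
systemctl restart zeppelin
cat /proc/$(pgrep -f ZeppelinServer)/limits

(pgrep -f ZeppelinServer matches Zeppelin's main class; substitute the actual PID if that pattern doesn't match on your system.)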

 

Is there some other configuration file we need to set, or a service-level setting we're missing?

 

Thanks in advance

 

1 REPLY

Super Collaborator

Hi @Paop 

 

We don't have enough information (how much data, the spark-submit command, etc.) to provide a solution. Please raise a support case for this issue.