Support Questions

Find answers, ask questions, and share your expertise

creating UNIX domain socket with SOCK_STREAM

avatar
Rising Star

Hi All,

I was running the command hadoop fs -get /user/centos/dist/a.txt

The command failed with: java.net.SocketException: error creating UNIX domain socket with SOCK_STREAM: Too many open files

The open file limit is still at its default; I haven't changed it.

Can someone suggest whether I should increase the open file limit in sysctl and limits.conf?

Thanks

vinay

1 ACCEPTED SOLUTION

avatar
Contributor

Hi @Vinay K

It depends. You might have an underlying issue, so first check which processes are holding too many open files, using the OS command "lsof".
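For example, a quick way to check the current limit and see which processes hold the most file descriptors (a sketch; assumes lsof is installed, and you may need to run it as root to see descriptors for all users):

```shell
# Soft limit on open files for the current shell session
ulimit -Sn

# Top 10 processes by open file descriptor count
# (requires lsof; run as root to see all users' processes)
if command -v lsof >/dev/null 2>&1; then
  lsof 2>/dev/null | awk 'NR>1 {print $2, $1}' | sort | uniq -c | sort -rn | head
fi
```

If one process dominates the list, investigate it for a descriptor leak before raising any limits.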

If everything looks OK and you have this issue only because of the workload, increase the open files limit in limits.conf. Be aware that if you are using Ambari and one of the Hadoop service users, its limit will be overridden by the value from the Ambari configuration.
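For reference, raising the soft and hard open-file limits in /etc/security/limits.conf might look like this (a sketch; the user name hdfs and the values 32768/65536 are assumptions to be tuned for your workload and user):

```
# /etc/security/limits.conf
# <domain>  <type>  <item>   <value>
hdfs        soft    nofile   32768
hdfs        hard    nofile   65536
```

The change takes effect on the next login session of that user; already-running processes keep the limit they started with.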

View solution in original post

2 REPLIES 2

avatar
Contributor

Hi @Vinay K

It depends. You might have an underlying issue, so first check which processes are holding too many open files, using the OS command "lsof".

If everything looks OK and you have this issue only because of the workload, increase the open files limit in limits.conf. Be aware that if you are using Ambari and one of the Hadoop service users, its limit will be overridden by the value from the Ambari configuration.

avatar
Rising Star

@tsokorai

I have increased the hard and soft values in limits.conf.

Thanks