creating UNIX domain socket with SOCK_STREAM
Labels: Apache Hadoop
Created ‎06-08-2018 09:51 AM
Hi All,
While running the command hadoop fs -get /user/centos/dist/a.txt, I got this error:
java.net.SocketException: error creating UNIX domain socket with SOCK_STREAM: Too many open files
The open file limit is still the default; I haven't changed it.
Can someone suggest whether I should increase the open file limit in sysctl and limits.conf?
Thanks,
Vinay
Created ‎06-08-2018 12:13 PM
Hi @Vinay K
It depends; you might have an underlying issue. First check who is holding too many open files, using the OS command "lsof".
If everything seems OK and you are hitting the limit simply because of the workload, increase the number of open files in limits.conf. But beware: if you are using Ambari and the process runs as one of the hadoop users, that user's limit will be overridden by the value from the Ambari configuration.
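A minimal sketch of those checks from the command line. The hdfs user and the nofile values in the limits.conf comment are placeholder examples for illustration, not recommendations for your cluster:

```shell
# Show the current per-process open-file limit for this shell
ulimit -n

# If lsof is available, count open files per process (command name + PID),
# highest counts first, to find which process is holding too many
command -v lsof >/dev/null && \
  lsof 2>/dev/null | awk '{print $1, $2}' | sort | uniq -c | sort -rn | head || true

# A permanent increase goes in /etc/security/limits.conf, one line per
# user/limit, e.g. (illustrative values only; Ambari may override these
# for the hadoop service users):
#   hdfs  soft  nofile  32768
#   hdfs  hard  nofile  65536
```

After editing limits.conf you need to log in again (and restart the affected services) for the new limit to take effect; verify with `ulimit -n` in a fresh session.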