java.io.IOException: Too many open files


Solutions on the web

Too many open files (via hadoop-hdfs-dev by Apache Hudson Server, 1 year ago)
Too many open files (via hadoop-hdfs-dev by Apache Hudson Server, 1 year ago)
Too many open files (via hadoop-hdfs-dev by Apache Hudson Server, 1 year ago)
Too many open files (via Google Groups by cracknut, 1 year ago)
Too many open files (via Google Groups by Vinod, 9 months ago)
Too many open files (via grokbase.com by Unknown author, 1 year ago)
Too many open files (via the web)
Too many open files (via hadoop-hdfs-dev by Apache Hudson Server, 1 year ago)
Too many open files (via synapse-user by Khaled Farj, 1 year ago)
Too many open files (via GitHub)

Stack trace

java.io.IOException: Too many open files
	at sun.nio.ch.IOUtil.initPipe(Native Method)
	at sun.nio.ch.EPollSelectorImpl.<init>(EPollSelectorImpl.java:49)
	at sun.nio.ch.EPollSelectorProvider.openSelector(EPollSelectorProvider.java:18)
	at java.nio.channels.Selector.open(Selector.java:209)
	at org.apache.hadoop.ipc.Server$Responder.<init>(Server.java:602)
	at org.apache.hadoop.ipc.Server.<init>(Server.java:1511)
	at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:408)
	at org.apache.hadoop.ipc.WritableRpcEngine$Server.<init>(WritableRpcEngine.java:332)
	at org.apache.hadoop.ipc.WritableRpcEngine.getServer(WritableRpcEngine.java:292)
	at org.apache.hadoop.ipc.WritableRpcEngine.getServer(WritableRpcEngine.java:47)
	at org.apache.hadoop.ipc.RPC.getServer(RPC.java:382)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initIpcServer(DataNode.java:421)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:512)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:282)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:264)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1575)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1518)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1485)
	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:630)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:464)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:186)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:71)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:178)
	at org.apache.hadoop.hdfs.TestFileConcurrentReader.init(TestFileConcurrentReader.java:88)
	at org.apache.hadoop.hdfs.TestFileConcurrentReader.setUp(TestFileConcurrentReader.java:73)
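
Every frame above sun.nio.ch.IOUtil.initPipe is ordinary MiniDFSCluster startup; the failure happens because Selector.open() needs fresh file descriptors (an epoll instance plus a wakeup pipe) for the DataNode's IPC Responder, and the JVM has already reached its per-process limit (on Linux, the soft limit shown by ulimit -n). A quick way to confirm descriptor exhaustion from inside the JVM is to watch the open and maximum counts exposed by the platform MXBean. The sketch below is a standalone diagnostic, not part of the Hadoop code in the trace; the class name is illustrative, and it assumes a HotSpot/OpenJDK JVM on a Unix-like system, where the bean implements com.sun.management.UnixOperatingSystemMXBean.

import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

import com.sun.management.UnixOperatingSystemMXBean;

public class FdUsage {
    public static void main(String[] args) {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        if (os instanceof UnixOperatingSystemMXBean) {
            UnixOperatingSystemMXBean unix = (UnixOperatingSystemMXBean) os;
            // Descriptors currently held by this process: files, sockets, pipes, selectors.
            long open = unix.getOpenFileDescriptorCount();
            // The per-process ceiling the JVM inherited from the shell or limits.conf.
            long max = unix.getMaxFileDescriptorCount();
            System.out.printf("open file descriptors: %d of %d%n", open, max);
            if (open > max * 0.9) {
                System.err.println("Close to the limit: look for unclosed streams, sockets, or selectors.");
            }
        } else {
            System.out.println("File-descriptor counts are not exposed on this platform.");
        }
    }
}

If the open count is already near the maximum by the time setUp runs, lsof -p <pid> on the test JVM will show which kind of descriptor is accumulating.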

Write tip

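A frequent cause of this error in long-running Hadoop test suites is a MiniDFSCluster that gets built in setUp but is never shut down, so each test leaks the NameNode and DataNode server sockets, pipes, and selectors until the process exhausts its limit. The sketch below shows the usual guard, assuming JUnit 4 and the MiniDFSCluster builder API that appears in the trace; the class name and test body are illustrative, not taken from the Hadoop source tree.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.HdfsConfiguration;
import org.apache.hadoop.hdfs.MiniDFSCluster;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

// Illustrative test class, not part of the Hadoop source tree.
public class MiniClusterLifecycleTest {
    private MiniDFSCluster cluster;

    @Before
    public void setUp() throws IOException {
        Configuration conf = new HdfsConfiguration();
        cluster = new MiniDFSCluster.Builder(conf).numDataNodes(1).build();
        cluster.waitActive();
    }

    @After
    public void tearDown() {
        // Releases the NameNode/DataNode IPC servers and the selectors they opened,
        // so descriptors do not accumulate across test methods.
        if (cluster != null) {
            cluster.shutdown();
            cluster = null;
        }
    }

    @Test
    public void clusterComesUp() {
        // Real tests would exercise HDFS through cluster.getFileSystem() here.
    }
}

If the limit is still too tight for a full test run, raising it before launching the JVM (for example, ulimit -n 8192 in the shell, or an entry in /etc/security/limits.conf) is the other half of the fix.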
