java.io.IOException: Too many open files

hadoop-hdfs-dev | Apache Hudson Server | 6 years ago
  1. Hadoop-Hdfs-trunk - Build # 542 - Still Failing
     hadoop-hdfs-dev | 6 years ago | Apache Hudson Server
     java.io.IOException: Too many open files
  2. Hadoop-Hdfs-trunk - Build # 547 - Still Failing
     hadoop-hdfs-dev | 6 years ago | Apache Hudson Server
     java.io.IOException: Too many open files
  3. Hadoop-Hdfs-trunk - Build # 540 - Still Failing
     hadoop-hdfs-dev | 6 years ago | Apache Hudson Server
     java.io.IOException: Too many open files
  4. Hadoop-Hdfs-trunk - Build # 549 - Still Failing
     hadoop-hdfs-dev | 6 years ago | Apache Hudson Server
     java.io.IOException: Too many open files
  5. Hadoop-Hdfs-trunk - Build # 561 - Still Failing
     hadoop-hdfs-dev | 6 years ago | Apache Hudson Server
     java.io.IOException: Too many open files
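
Every one of these builds fails the same way: the test run exhausts the build machine's per-process limit on open file descriptors, so the next descriptor allocation (a socket, pipe, or selector) throws "Too many open files". A quick way to confirm descriptor pressure from inside the JVM is to query the platform MXBean, as in the sketch below. This is a hedged diagnostic aid, not part of the original report; the class name FdPressure is made up for illustration.

    import java.lang.management.ManagementFactory;
    import java.lang.management.OperatingSystemMXBean;

    import com.sun.management.UnixOperatingSystemMXBean;

    // Hypothetical helper for illustration only; prints this JVM's descriptor usage.
    public class FdPressure {
        public static void main(String[] args) {
            OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
            if (os instanceof UnixOperatingSystemMXBean) {
                UnixOperatingSystemMXBean unix = (UnixOperatingSystemMXBean) os;
                // Open vs. maximum descriptors for this process (Unix-like platforms only).
                System.out.println("open fds: " + unix.getOpenFileDescriptorCount());
                System.out.println("max fds:  " + unix.getMaxFileDescriptorCount());
            } else {
                System.out.println("file descriptor counts not available on this platform");
            }
        }
    }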

Root Cause Analysis

  1. java.io.IOException: Too many open files
     at sun.nio.ch.IOUtil.initPipe()
  2. Java RT
    Selector.open
    1. sun.nio.ch.IOUtil.initPipe(Native Method)
    2. sun.nio.ch.EPollSelectorImpl.<init>(EPollSelectorImpl.java:49)
    3. sun.nio.ch.EPollSelectorProvider.openSelector(EPollSelectorProvider.java:18)
    4. java.nio.channels.Selector.open(Selector.java:209)
  3. Hadoop
    RPC.getServer
    1. org.apache.hadoop.ipc.Server$Listener.<init>(Server.java:318)
    2. org.apache.hadoop.ipc.Server.<init>(Server.java:1502)
    3. org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:408)
    4. org.apache.hadoop.ipc.WritableRpcEngine$Server.<init>(WritableRpcEngine.java:332)
    5. org.apache.hadoop.ipc.WritableRpcEngine.getServer(WritableRpcEngine.java:292)
    6. org.apache.hadoop.ipc.WritableRpcEngine.getServer(WritableRpcEngine.java:47)
    7. org.apache.hadoop.ipc.RPC.getServer(RPC.java:382)
  4. Apache Hadoop HDFS
    TestFileConcurrentReader.setUp
    1. org.apache.hadoop.hdfs.server.datanode.DataNode.initIpcServer(DataNode.java:416)
    2. org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:507)
    3. org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:281)
    4. org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:263)
    5. org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1570)
    6. org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1513)
    7. org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1480)
    8. org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:630)
    9. org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:464)
    10. org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:186)
    11. org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:71)
    12. org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:178)
    13. org.apache.hadoop.hdfs.TestFileConcurrentReader.init(TestFileConcurrentReader.java:88)
    14. org.apache.hadoop.hdfs.TestFileConcurrentReader.setUp(TestFileConcurrentReader.java:73)
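
Read bottom-up, the trace shows the failure path: TestFileConcurrentReader.setUp builds a MiniDFSCluster, each DataNode starts its IPC server, the server opens a java.nio Selector, and creating the selector's wakeup pipe (the sun.nio.ch.IOUtil.initPipe frame) is the allocation that finally hits the file descriptor limit. The sketch below reproduces only that last step in isolation; it is an illustration of the failure mode under an artificially low limit, not the failing test itself.

    import java.io.IOException;
    import java.nio.channels.Selector;
    import java.util.ArrayList;
    import java.util.List;

    // Illustration only: shows how un-closed selectors burn through the fd limit.
    public class SelectorLeak {
        public static void main(String[] args) {
            List<Selector> leaked = new ArrayList<Selector>();
            try {
                // Each Selector.open() costs file descriptors (on Linux an epoll fd plus
                // a wakeup pipe, created by sun.nio.ch.IOUtil.initPipe as in the trace).
                // Selectors that are never closed eventually exhaust the open-file limit.
                while (true) {
                    leaked.add(Selector.open());
                }
            } catch (IOException e) {
                // With a low "ulimit -n" this throws "Too many open files" quickly.
                System.err.println("failed after " + leaked.size() + " selectors: " + e);
            }
        }
    }

Whether the culprit is leaked selectors, unclosed streams, or simply too many DataNodes in one test JVM, the usual remedies apply: close descriptors promptly in the test, or raise the open-file limit on the build slave.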