org.apache.hadoop.util.DiskChecker$DiskErrorException: Invalid value for volsFailed : 1 , Volumes tolerated : 0

Super User | drjrm3 | 2 years ago
  1. How do I find which disk (volume) is failing? (see the diagnostic sketch after this list)

     Super User | 2 years ago | drjrm3
     org.apache.hadoop.util.DiskChecker$DiskErrorException: Invalid value for volsFailed : 1 , Volumes tolerated : 0
  2. Re: trouble adding volumes to DataNode Data Directory - Grokbase

     grokbase.com | 1 year ago
     org.apache.hadoop.util.DiskChecker$DiskErrorException: Too many failed volumes - current valid volumes: 1, volumes configured: 2, volumes failed: 1, volume failures tolerated: 0
  3. hadoop hdfs: error when adding a data directory - Programming

     myexception.cn | 1 year ago
     org.apache.hadoop.util.DiskChecker$DiskErrorException: Too many failed volumes - current valid volumes: 1, volumes configured: 2, volumes failed: 1, volume failures tolerated: 0
  4. [HDFS-1592] Datanode startup doesn't honor volumes.tolerated - ASF JIRA

     apache.org | 1 year ago
     org.apache.hadoop.util.DiskChecker$DiskErrorException: Invalid value for volumes required - validVolsRequired: 3, Current valid volumes: 2, volsConfigured: 4, volFailuresTolerated: 1
  5. [HDFS-1592] Datanode startup doesn't honor volumes.tolerated - ASF JIRA

     apache.org | 1 year ago
     org.apache.hadoop.util.DiskChecker$DiskErrorException: Invalid value for volumes required - validVolsRequired: 4, Current valid volumes: 3, volsConfigured: 4, volFailuresTolerated: 0
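    All of the results above point at the same condition: more configured data directories failed the DataNode's startup disk check than dfs.datanode.failed.volumes.tolerated allows (the default is 0, so a single bad volume aborts startup). To answer the original question of which volume is failing, the DataNode log normally names the directory that failed the check; if it doesn't, the sketch below (an assumption, not taken from any of the answers above) re-runs the same org.apache.hadoop.util.DiskChecker.checkDir() test against each data directory. It assumes hadoop-common is on the classpath and that the directories from dfs.data.dir / dfs.datanode.data.dir are passed as program arguments.

    import java.io.File;

    import org.apache.hadoop.util.DiskChecker;
    import org.apache.hadoop.util.DiskChecker.DiskErrorException;

    /**
     * Minimal sketch: run the same directory check the DataNode runs at
     * startup against each configured data directory and report which one
     * fails. Directories are passed as program arguments (copy them from
     * dfs.data.dir / dfs.datanode.data.dir in hdfs-site.xml).
     */
    public class FindFailedVolume {
        public static void main(String[] args) {
            for (String dir : args) {
                try {
                    // checkDir verifies the path exists, is a directory, and is
                    // readable/writable/executable for the current user.
                    DiskChecker.checkDir(new File(dir));
                    System.out.println("OK     : " + dir);
                } catch (DiskErrorException e) {
                    System.out.println("FAILED : " + dir + " -> " + e.getMessage());
                }
            }
        }
    }

    Example invocation (paths are placeholders): java -cp hadoop-common-<version>.jar:. FindFailedVolume /data/1/dfs/dn /data/2/dfs/dn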


    Root Cause Analysis

    1. org.apache.hadoop.util.DiskChecker$DiskErrorException

      Invalid value for volsFailed : 1 , Volumes tolerated : 0

      at org.apache.hadoop.hdfs.server.datanode.FSDataset.<init>()
    2. Apache Hadoop HDFS
      DataNode.main
      1. org.apache.hadoop.hdfs.server.datanode.FSDataset.<init>(FSDataset.java:974)
      2. org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:403)
      3. org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:309)
      4. org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1651)
      5. org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
      6. org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1608)
      7. org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1734)
      8. org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1751)
      8 frames
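
    For reference, the check at FSDataset.<init> that raises this exception compares the number of failed volumes against the tolerated count read from dfs.datanode.failed.volumes.tolerated. The snippet below is a paraphrased sketch of that logic, not the exact FSDataset code (field names are illustrative), reproducing the message seen in this trace. Raising the tolerated value in hdfs-site.xml lets the DataNode start despite a failed volume, but the failing disk still needs to be repaired or removed from the configuration.

    import org.apache.hadoop.util.DiskChecker.DiskErrorException;

    /**
     * Paraphrased sketch of the DataNode startup check behind this trace:
     * the number of data directories that failed validation is compared to
     * dfs.datanode.failed.volumes.tolerated (default 0).
     */
    public class VolumeToleranceCheck {
        static void checkVolumeFailures(int volsConfigured, int validVolumes,
                                        int volFailuresTolerated) throws DiskErrorException {
            int volsFailed = volsConfigured - validVolumes;
            if (volsFailed < 0 || volsFailed > volFailuresTolerated) {
                // This is the condition that produces
                // "Invalid value for volsFailed : 1 , Volumes tolerated : 0"
                throw new DiskErrorException("Invalid value for volsFailed : " + volsFailed
                    + " , Volumes tolerated : " + volFailuresTolerated);
            }
        }

        public static void main(String[] args) throws DiskErrorException {
            // Example matching the reports above: 2 dirs configured, 1 valid, 0 tolerated.
            checkVolumeFailures(2, 1, 0);
        }
    }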