java.io.IOException: replica.getGenerationStamp() < block.getGenerationStamp(), block=blk_1075552677_1829938, replica=ReplicaWaitingToBeRecovered, blk_1075552677_1828815, RWR getNumBytes() = 17814 getBytesOnDisk() = 17814 getVisibleLength()= -1 getVolume() = /var/data/hadoop/hdfs/dn/current getBlockFile() = /var/data/hadoop/hdfs/dn/current/BP-133353882-127.0.1.1-1438188921629/current/rbw/blk_1075552677 unlinked=false

Stack Overflow | sofia | 4 months ago
  1.

    Datanode error - Failed to obtain replica info for block

    Stack Overflow | 4 months ago | sofia
  2.

    After IP change on server java.io.IOException: replica.getGenerationStamp()

    Stack Overflow | 2 years ago | Raghuveer
    java.io.IOException: replica.getGenerationStamp() < block.getGenerationStamp(), block=blk_1073757987_17249, replica=ReplicaWaitingToBeRecovered, blk_1073757987_17179, RWR getNumBytes() = 81838954 getBytesOnDisk() = 81838954 getVisibleLength()= -1 getVolume() = /var/hadoop/data/current getBlockFile() = /var/hadoop/data/current/BP-967573188-192.168.XX.XX-1413284771002/current/rbw/blk_1073757987 unlinked=false
  3.

    HDInsight – Log storage attempt #1 – Soft Entropy

    softentropy.com | 8 months ago
    java.io.IOException: Corrupted block: ReplicaBeingWritten, blk_1073741859_7815, RBW getNumBytes() = 910951 getBytesOnDisk() = 910951 getVisibleLength()= 910951 getVolume() = /tmp/hadoop-root/dfs/data/current getBlockFile() = /tmp/hadoop-root/dfs/data/current/BP-953099033-10.0.0.17-1409838183920/current/rbw/blk_1073741859 bytesAcked=910951 bytesOnDisk=910951


    Root Cause Analysis

    1. java.io.IOException

      replica.getGenerationStamp() < block.getGenerationStamp(), block=blk_1075552677_1829938, replica=ReplicaWaitingToBeRecovered, blk_1075552677_1828815, RWR getNumBytes() = 17814 getBytesOnDisk() = 17814 getVisibleLength()= -1 getVolume() = /var/data/hadoop/hdfs/dn/current getBlockFile() = /var/data/hadoop/hdfs/dn/current/BP-133353882-127.0.1.1-1438188921629/current/rbw/blk_1075552677 unlinked=false

      at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.initReplicaRecovery()
    2. Apache Hadoop HDFS
      DataNode$5.run
      1. org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.initReplicaRecovery(FsDatasetImpl.java:2288)
      2. org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.initReplicaRecovery(FsDatasetImpl.java:2254)
      3. org.apache.hadoop.hdfs.server.datanode.DataNode.initReplicaRecovery(DataNode.java:2537)
      4. org.apache.hadoop.hdfs.server.datanode.DataNode.callInitReplicaRecovery(DataNode.java:2548)
      5. org.apache.hadoop.hdfs.server.datanode.DataNode.recoverBlock(DataNode.java:2620)
      6. org.apache.hadoop.hdfs.server.datanode.DataNode.access$400(DataNode.java:243)
      7. org.apache.hadoop.hdfs.server.datanode.DataNode$5.run(DataNode.java:2522)
      7 frames
    3. Java RT
      Thread.run
      1. java.lang.Thread.run(Thread.java:745)
      1 frame
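The failing check, per the exception message and the FsDatasetImpl.initReplicaRecovery frame above, is a generation-stamp comparison: the DataNode refuses to recover a replica whose generation stamp is older than the one the NameNode holds for the block, because an older stamp marks the on-disk copy as stale. A minimal standalone sketch of that comparison follows; the class and helper method are hypothetical illustrations, not Hadoop's actual code, using the stamps from the reported block blk_1075552677 (replica GS 1828815 vs. block GS 1829938).

```java
import java.io.IOException;

public class ReplicaRecoveryCheck {
    // Hypothetical helper mirroring the comparison named in the log:
    // a replica with an older generation stamp than the block is stale,
    // so block recovery is aborted with an IOException.
    static void checkGenerationStamp(long replicaGs, long blockGs) throws IOException {
        if (replicaGs < blockGs) {
            throw new IOException("replica.getGenerationStamp() < block.getGenerationStamp()"
                    + ", block GS=" + blockGs + ", replica GS=" + replicaGs);
        }
    }

    public static void main(String[] args) {
        // Stamps taken from the reported exception for blk_1075552677.
        try {
            checkGenerationStamp(1828815L, 1829938L);
            System.out.println("recovery proceeds");
        } catch (IOException e) {
            System.out.println("recovery aborted: " + e.getMessage());
        }
    }
}
```

With the values from this report the check fails, which is why every recovery attempt for the block re-throws the same IOException until the stale replica is removed or its generation stamp is reconciled.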