java.io.IOException: replica.getGenerationStamp() < block.getGenerationStamp()

Solutions on the web

via Stack Overflow by sofia, 1 year ago:
getVolume() = /var/data/hadoop/hdfs/dn/current getBlockFile() = /var/data/hadoop/hdfs/dn/current/BP-133353882-127.0.1.1-1438188921629/current/rbw/blk_1074806221 unlinked=false

via Stack Overflow by Raghuveer, 2 years ago:
replica.getGenerationStamp() < block.getGenerationStamp(), block=blk_1073757987_17249, replica=ReplicaWaitingToBeRecovered, blk_1073757987_17179, RWR getNumBytes() = 81838954 getBytesOnDisk() = 81838954 getVisibleLength()= -1

via apache.org by unknown author, 1 year ago:
getVisibleLength()= 794 getVolume() = /home/ec2-user/jenkins/workspace/HBase-0.95-Hadoop-2/hbase-server/target/test-data/f2763e32-fe49-4988-ac94-eeca82431821/dfscluster_643a635e-4e39-4aa5-974c-25e01db16ff7/dfs/data/data1/current getBlockFile
Stack trace:

java.io.IOException: replica.getGenerationStamp() < block.getGenerationStamp(), block=blk_1074806221_1829936, replica=ReplicaWaitingToBeRecovered, blk_1074806221_1305648, RWR getNumBytes() = 9259189 getBytesOnDisk() = 9259189 getVisibleLength()= -1 getVolume() = /var/data/hadoop/hdfs/dn/current getBlockFile() = /var/data/hadoop/hdfs/dn/current/BP-133353882-127.0.1.1-1438188921629/current/rbw/blk_1074806221 unlinked=false
    at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.initReplicaRecovery(FsDatasetImpl.java:2288)
    at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.initReplicaRecovery(FsDatasetImpl.java:2254)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.initReplicaRecovery(DataNode.java:2537)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.callInitReplicaRecovery(DataNode.java:2548)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.recoverBlock(DataNode.java:2620)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.access$400(DataNode.java:243)
    at java.lang.Thread.run(Thread.java:745)
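
What the trace means: during block recovery, each DataNode holding a replica of blk_1074806221 is asked to initialize recovery, and this DataNode refuses because its on-disk replica carries an older generation stamp (1305648) than the stamp the recovery expects (1829936). A replica in ReplicaWaitingToBeRecovered (RWR) state with a stale generation stamp typically means the DataNode was down or restarted while the block's write pipeline was updated, so its copy is obsolete and must not take part in recovery. Below is a minimal, self-contained Java sketch of that guard, modeled on the check in FsDatasetImpl.initReplicaRecovery whose message matches the exception above; the Replica and RecoveringBlock classes here are illustrative stand-ins, not Hadoop's real types.

// Simplified sketch of the generation-stamp guard in replica recovery.
// The types are trimmed stand-ins for illustration, not Hadoop's API.
import java.io.IOException;

class ReplicaRecoverySketch {
    static class Replica {
        final long blockId;
        final long generationStamp;  // stamp recorded when this copy was last written
        Replica(long blockId, long generationStamp) {
            this.blockId = blockId;
            this.generationStamp = generationStamp;
        }
    }

    static class RecoveringBlock {
        final long blockId;
        final long generationStamp;  // stamp the recovery coordinator expects
        RecoveringBlock(long blockId, long generationStamp) {
            this.blockId = blockId;
            this.generationStamp = generationStamp;
        }
    }

    // A replica whose generation stamp is older than the block's predates the
    // last pipeline update, so recovery is aborted with the IOException above.
    static void initReplicaRecovery(Replica replica, RecoveringBlock block)
            throws IOException {
        if (replica.generationStamp < block.generationStamp) {
            throw new IOException(
                "replica.getGenerationStamp() < block.getGenerationStamp()"
                + ", block=blk_" + block.blockId + "_" + block.generationStamp
                + ", replica=blk_" + replica.blockId + "_" + replica.generationStamp);
        }
        // ...otherwise the replica is current enough to join recovery...
    }

    public static void main(String[] args) throws IOException {
        // Values from the trace above: replica stamp 1305648 < block stamp 1829936,
        // so this call throws, reproducing the error message.
        initReplicaRecovery(new Replica(1074806221L, 1305648L),
                            new RecoveringBlock(1074806221L, 1829936L));
    }
}

In practice the stale RWR replica is usually left over from a DataNode crash or restart mid-write; once another replica with the newer generation stamp completes recovery, the NameNode schedules the stale copy for deletion. You can inspect block and replica placement for the affected file with, for example, hdfs fsck /path -files -blocks -locations.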
