java.io.IOException: Block blk_2440422069461309270_3925117 is not valid.

hbase-user | Stanley Xu | 6 years ago
  1. Re: How could I make sure the famous "xceiver" parameters works in the data node?
     (a configuration-check sketch follows this list)

     hbase-user | 6 years ago | Stanley Xu
     java.io.IOException: Block blk_2440422069461309270_3925117 is not valid.

  2. Cannot write to local HDFS datanode

     Stack Overflow | 3 years ago
     java.io.IOException: Version Mismatch (Expected: 28, Received: 26738 )

  3. [HDFS-issues] [jira] [Created] (HDFS-3436) Append to file is failing when one of the datanode where the block present is down. - Grokbase

     grokbase.com | 6 months ago
     java.io.IOException: BP-2001850558-xx.xx.xx.xx-1337249347060:blk_-8165642083860293107_1002 is neither a RBW nor a Finalized, r=ReplicaBeingWritten, blk_-8165642083860293107_1003, RBW getNumBytes() = 1024 getBytesOnDisk() = 1024 getVisibleLength()= 1024 getVolume() = E:\MyWorkSpace\branch-2\Test\build\test\data\dfs\data\data1\current getBlockFile() = E:\MyWorkSpace\branch-2\Test\build\test\data\dfs\data\data1\current\BP-2001850558-xx.xx.xx.xx-1337249347060\current\rbw\blk_-8165642083860293107 bytesAcked=1024 bytesOnDisk=102
  4. Soft Entropy – tending towards chaos

     softentropy.com | 8 months ago
     java.io.IOException: Corrupted block: ReplicaBeingWritten, blk_1073741859_7815, RBW getNumBytes() = 910951 getBytesOnDisk() = 910951 getVisibleLength()= 910951 getVolume() = /tmp/hadoop-root/dfs/data/current getBlockFile() = /tmp/hadoop-root/dfs/data/current/BP-953099033-10.0.0.17-1409838183920/current/rbw/blk_1073741859 bytesAcked=910951 bytesOnDisk=910951

  5. Problem syncing commit log: /hypertable/servers/rs18/log/user/3252: Error flushing DFS fd 73135

     Google Groups | 3 years ago | David
     java.io.IOException: Interrupted receiveBlock
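    The first result above concerns the DataNode xceiver limit. As a hedged illustration (not taken from the thread itself), the snippet below reads the historical dfs.datanode.max.xcievers property from an hdfs-site.xml on the classpath; the property names and the old default of 256 are standard Hadoop facts, while the class name and the check itself are purely illustrative.

    // Illustrative sketch: see which xceiver limit the DataNode configuration resolves to.
    // The class name XceiverConfigCheck is hypothetical; only the property names are real.
    import org.apache.hadoop.conf.Configuration;

    public class XceiverConfigCheck {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            conf.addResource("hdfs-site.xml");  // hdfs-site.xml must be on the classpath
            // Historical (intentionally misspelled) name used by older Hadoop releases:
            int legacy = conf.getInt("dfs.datanode.max.xcievers", 256);
            // Newer alias introduced in later releases:
            int current = conf.getInt("dfs.datanode.max.transfer.threads", legacy);
            System.out.println("dfs.datanode.max.xcievers = " + legacy);
            System.out.println("dfs.datanode.max.transfer.threads = " + current);
        }
    }

    If the DataNode is effectively running with the old default, raising the value in hdfs-site.xml (the HBase documentation has long suggested 4096) and restarting the DataNode is the commonly recommended remedy.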

    Root Cause Analysis

    java.io.IOException: Block blk_2440422069461309270_3925117 is not valid.
        at org.apache.hadoop.hdfs.server.datanode.FSDataset.getBlockFile(FSDataset.java:734)
        at org.apache.hadoop.hdfs.server.datanode.FSDataset.getLength(FSDataset.java:722)
        at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:92)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:172)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:95)
        at java.lang.Thread.run(Thread.java:619)
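
    For orientation only, the sketch below mimics the shape of the check behind this message: the DataNode keeps a map from block IDs to on-disk files, and a read request for a block missing from that map (for example, one invalidated and deleted after the client obtained its block locations) is refused with an IOException. The class and field names are hypothetical; this is not the real FSDataset code.

    import java.io.File;
    import java.io.IOException;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Hypothetical, heavily simplified stand-in for the DataNode's block map;
    // the real FSDataset tracks per-volume replica state and is far more involved.
    public class BlockMapSketch {
        private final Map<Long, File> blockFiles = new ConcurrentHashMap<>();

        // Same failure shape as the trace above: the requested block id is not
        // (or no longer) known to this DataNode, so the read is rejected.
        public File getBlockFile(long blockId) throws IOException {
            File f = blockFiles.get(blockId);
            if (f == null || !f.exists()) {
                throw new IOException("Block blk_" + blockId + " is not valid.");
            }
            return f;
        }
    }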