java.io.IOException: Could not obtain block: blk_5604690829708125511_15489 file=/usr/collarity/data/urls-new/part-00000/20081110-163426/_0.tis

hadoop-hdfs-dev | Todd Lipcon (JIRA) | 7 years ago
  1. [jira] Reopened: (HDFS-127) DFSClient block read failures cause open DFSInputStream to become unusable

     hadoop-hdfs-dev | 7 years ago | Todd Lipcon (JIRA)
     java.io.IOException: Could not obtain block: blk_5604690829708125511_15489 file=/usr/collarity/data/urls-new/part-00000/20081110-163426/_0.tis
  2. Reading file from HDFS using HDFS Java API (a minimal read sketch follows this list)

     Stack Overflow | 3 years ago | dnivra
     java.io.IOException: Could not obtain block: blk_-747325769320762541_16269493 file=/user/s3t.txt
  3. DataXceiver java.io.InterruptedIOException error on scanning HBase table

     Google Groups | 3 years ago | AnushaGuntaka
     java.io.IOException: Could not seek StoreFileScanner[HFileScanner for reader reader=hdfs://172.20.193.234:9000/assortmentLinking/performance_weekly_sku/fa0fb91bd58f2117443db90278c3a3fe/cf/1597dcfc99e54025bc7b848cfb998b1f, compression=none, cacheConf=CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false], firstKey=SKU128331STORE3942WEEK37/cf:facings/1397826519184/Put, lastKey=SKU129999STORE3966WEEK9/cf:week_id/1397827347036/Put, avgKeyLen=53, avgValueLen=3, entries=120178401, length=7838467097, cur=null] to key SKU128331STORE3942WEEK37/cf:/LATEST_TIMESTAMP/DeleteFamily/vlen=0/ts=0
  4. The number of fd and CLOSE_WAIT keep increasing.

     Google Groups | 6 years ago | Xu-Feng Mao
     java.io.IOException: Got error in response to OP_READ_BLOCK self=/10.150.161.64:55229, remote=/10.150.161.73:50010 for file /hbase/S3Table/d0d5004792ec47e02665d1f0947be6b6/file/8279698872781984241 for block 2791681537571770744_132142063
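
The second report above asks how to read a file through the HDFS Java API. Below is a minimal sketch of such a read; the class name, the fs.defaultFS value, and the NameNode address are illustrative assumptions, while the path /user/s3t.txt is taken from that report. FSDataInputStream extends java.io.DataInputStream, which is why a DataInputStream.read frame appears in the traces on this page.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.nio.charset.StandardCharsets;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsReadExample {
        public static void main(String[] args) throws Exception {
            // Point the client at the NameNode; this URI is illustrative, not from the reports.
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://namenode:9000");

            FileSystem fs = FileSystem.get(conf);
            Path path = new Path("/user/s3t.txt"); // path taken from the Stack Overflow report

            // fs.open() returns an FSDataInputStream (a DataInputStream subclass);
            // "Could not obtain block" IOExceptions surface through its read() calls.
            try (FSDataInputStream in = fs.open(path);
                 BufferedReader reader = new BufferedReader(
                         new InputStreamReader(in, StandardCharsets.UTF_8))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }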

Root Cause Analysis

  1. java.io.IOException

    Could not obtain block: blk_5604690829708125511_15489 file=/usr/collarity/data/urls-new/part-00000/20081110-163426/_0.tis

    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode()
  2. Apache Hadoop HDFS
    DFSClient$DFSInputStream.read
    1. org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1708)
    2. org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1536)
    3. org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1663)
    3 frames
  3. Java RT
    DataInputStream.read
    1. java.io.DataInputStream.read(DataInputStream.java:132)
    1 frame
  4. Apache Nutch
    FsDirectory$DfsIndexInput.readInternal
    1. org.apache.nutch.indexer.FsDirectory$DfsIndexInput.readInternal(FsDirectory.java:174)
    1 frame
  5. Lucene
    SegmentTermDocs.seek
    1. org.apache.lucene.store.BufferedIndexInput.refill(BufferedIndexInput.java:152)
    2. org.apache.lucene.store.BufferedIndexInput.readByte(BufferedIndexInput.java:38)
    3. org.apache.lucene.store.IndexInput.readVInt(IndexInput.java:76)
    4. org.apache.lucene.index.TermBuffer.read(TermBuffer.java:63)
    5. org.apache.lucene.index.SegmentTermEnum.next(SegmentTermEnum.java:131)
    6. org.apache.lucene.index.SegmentTermEnum.scanTo(SegmentTermEnum.java:162)
    7. org.apache.lucene.index.TermInfosReader.scanEnum(TermInfosReader.java:223)
    8. org.apache.lucene.index.TermInfosReader.get(TermInfosReader.java:217)
    9. org.apache.lucene.index.SegmentTermDocs.seek(SegmentTermDocs.java:54)
    9 frames
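
The JIRA report at the top of this page (HDFS-127) describes the failure mode behind this trace: after a block read fails in chooseDataNode, the already-open DFSInputStream can stay unusable. Until a client-side fix is in place, one application-level workaround is to discard the stream, reopen the file, and seek back to the failed position before retrying. The sketch below only illustrates that pattern; the class name, retry budget, and helper signature are assumptions, not code from HDFS-127, Nutch, or the reports above.

    import java.io.IOException;

    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Hypothetical helper: reopen the file on each failed attempt instead of
    // reusing a DFSInputStream that may have become unusable.
    public class ReopeningHdfsReader {
        private static final int MAX_ATTEMPTS = 3; // retry budget is an assumption

        public static int readWithReopen(FileSystem fs, Path path, long offset,
                                         byte[] buf) throws IOException {
            IOException last = null;
            for (int attempt = 0; attempt < MAX_ATTEMPTS; attempt++) {
                try (FSDataInputStream in = fs.open(path)) {
                    in.seek(offset);      // resume at the position that failed
                    return in.read(buf);  // may throw "Could not obtain block"
                } catch (IOException e) {
                    last = e;             // discard this stream; retry with a fresh one
                }
            }
            throw last;
        }
    }

A real caller would also want to distinguish retriable block-read failures from other IOExceptions and back off between attempts rather than retrying immediately.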