java.io.IOException: Could not obtain block: blk_3380512596555557728_1002 file=/hbase/hbase.version

hadoop-hdfs-dev | Bassam Tabbara (JIRA) | 7 years ago
  1. [jira] Created: (HDFS-872) DFSClient 0.20.1 is incompatible with HDFS 0.20.2

     hadoop-hdfs-dev | 7 years ago | Bassam Tabbara (JIRA)
     java.io.IOException: Could not obtain block: blk_3380512596555557728_1002 file=/hbase/hbase.version
  2. Reading file from HDFS using HDFS Java API

     Stack Overflow | 4 years ago | dnivra
     java.io.IOException: Could not obtain block: blk_-747325769320762541_16269493 file=/user/s3t.txt
  3. Re: [Aurelius] Faunus: Incremental loading for nodes and edges?

     Google Groups | 3 years ago | David
     java.io.IOException: Blocklist for /user/graphie/output/job-1/part-r-00000 has changed!
  4. DataXceiver java.io.InterruptedIOException error on scanning Hbase table

     Google Groups | 3 years ago | AnushaGuntaka
     java.io.IOException: Could not seek StoreFileScanner[HFileScanner for reader reader=hdfs://172.20.193.234:9000/assortmentLinking/performance_weekly_sku/fa0fb91bd58f2117443db90278c3a3fe/cf/1597dcfc99e54025bc7b848cfb998b1f, compression=none, cacheConf=CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheCompressed=false], firstKey=SKU128331STORE3942WEEK37/cf:facings/1397826519184/Put, lastKey=SKU129999STORE3966WEEK9/cf:week_id/1397827347036/Put, avgKeyLen=53, avgValueLen=3, entries=120178401, length=7838467097, cur=null] to key SKU128331STORE3942WEEK37/cf:/LATEST_TIMESTAMP/DeleteFamily/vlen=0/ts=0


Root Cause Analysis

  1. java.io.IOException

    Could not obtain block: blk_3380512596555557728_1002 file=/hbase/hbase.version

    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode()
  2. Apache Hadoop HDFS
    DFSClient$DFSInputStream.read
    1. org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1788)
    2. org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1616)
    3. org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1743)
    4. org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1673)
    4 frames
  3. Java RT
    DataInputStream.readUTF
    1. java.io.DataInputStream.readUnsignedShort(DataInputStream.java:320)
    2. java.io.DataInputStream.readUTF(DataInputStream.java:572)
    2 frames
  4. HBase
    FSUtils.checkVersion
    1. org.apache.hadoop.hbase.util.FSUtils.getVersion(FSUtils.java:189)
    2. org.apache.hadoop.hbase.util.FSUtils.checkVersion(FSUtils.java:208)
    2 frames
  5. HBase - Client
    HMaster.<init>
    1. org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:208)
    1 frame
  6. Java RT
    Constructor.newInstance
    1. sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    2. sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    3. sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    4. java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    4 frames
  7. HBase - Client
    HMaster.main
    1. org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1241)
    2. org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1282)
    2 frames