
Recommended solutions based on your search

Solutions on the web

via Google Groups by Oussama Jilal, 1 year ago
Could not obtain block: BP-304127416-10.0.0.7-1465905487911:blk_1073770935_30121 file=/hbase/WALs/datanode-2,16020,1466263181091-splitting/datanode-2%2C16020%2C1466263181091.default.1469654596681
via Stack Overflow by Kshitiz Sharma, 2 years ago
Could not obtain block: BP-1205966836-127.0.0.1-1406828172945:blk_1073742119_1304 file=/notice.html
via Stack Overflow by McKracken, 1 year ago
Could not obtain block: BP-2005327120-10.1.1.55-1467731650291:blk_1073741836_1015 file=/user/myuser/path/to/my/file.txt
via Stack Overflow by Schnuette, 1 year ago
Could not obtain block: BP-863187118-172.17.0.3-1467826253798:blk_1073741856_1032 file=/user/test/wf1.csv
via Stack Overflow by user3033194, 1 year ago
Could not obtain block: BP-971868671-192.168.50.2-1406571670535:blk_1073743276_2475 file=/wikidumps/enwiki-20130904-pages-meta-history3.xml-p000032706p000037161
via Google Groups by Philippe Kernévez, 1 year ago
Could not obtain block: BP-1831277630-10.16.37.124-1484306078618:blk_1073793876_55013 file=/test/inputdata/derby.log
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-304127416-10.0.0.7-1465905487911:blk_1073770935_30121 file=/hbase/WALs/datanode-2,16020,1466263181091-splitting/datanode-2%2C16020%2C1466263181091.default.1469654596681
	at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:882)
	at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:563)
	at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:793)
	at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:840)
	at java.io.DataInputStream.read(DataInputStream.java:100)
	at org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:298)
	at org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:267)
	at org.apache.hadoop.hbase.wal.WALSplitter.getReader(WALSplitter.java:839)
	at org.apache.hadoop.hbase.wal.WALSplitter.getReader(WALSplitter.java:763)
	at org.apache.hadoop.hbase.wal.WALSplitter.splitLogFile(WALSplitter.java:297)
	at org.apache.hadoop.hbase.wal.WALSplitter.splitLogFile(WALSplitter.java:235)
	at org.apache.hadoop.hbase.regionserver.SplitLogWorker$1.exec(SplitLogWorker.java:104)
	at org.apache.hadoop.hbase.regionserver.handler.WALSplitterHandler.process(WALSplitterHandler.java:72)
	at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:129)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
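
For context, here is a minimal sketch (not taken from any of the posts above) of where this exception typically surfaces: reading an HDFS file through the standard FileSystem client API. The path is just an example taken from one of the results above; BlockMissingException is thrown from DFSInputStream.chooseDataNode, as in the trace, when the client cannot obtain a block from any live DataNode.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.BlockMissingException;

public class HdfsReadSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Example file; substitute the path reported in the exception message.
        Path path = new Path("/user/test/wf1.csv");
        try (FileSystem fs = FileSystem.get(conf);
             FSDataInputStream in = fs.open(path)) {
            byte[] buffer = new byte[4096];
            while (in.read(buffer) > 0) {
                // consume the data
            }
        } catch (BlockMissingException e) {
            // Raised when the client has exhausted every DataNode that should hold a block,
            // e.g. after DataNode failures or missing/corrupt block replicas.
            System.err.println("Could not obtain block: " + e.getMessage());
        }
    }
}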