Solutions on the web

via Google Groups by gr...@cloudera.com, 2 years ago
Can not read value at 0 in block -1 in file hdfs://jenkins-parquet-1.ent.cloudera.com:8020/user/hive/warehouse/n1pu/-5600290658369385718--7673663513613548897_995880915_data.0
via Google Groups by Lbchen Chen, 2 years ago
Can not read value at 0 in block -1 in file hdfs://hdws02.houzz.com:8020/user/hive/warehouse/logs.db/test_par/dt=2013-09-30/670256257930807821--5192281563845402708_201244084_data.0
via Stack Overflow by Nagaraj Malaiappan, 1 year ago
Can not read value at 1 in block 0 in file hdfs://quickstart.cloudera:8020/parq/customer/wocomp/part-m-00000.parquet
via GitHub by Fantoccini, 2 years ago
Can not read value at 0 in block -1 in file hdfs://nameservice1/prod/view/warehouse/HLS/LOAN_MASTER_SZLNMST/year=2015/month=07/day=06/part-1436198627511-00000-00000-m-00000.parquet
via Google Groups by Unknown author, 1 year ago
Can not read value at 0 in block 0 in file file:/.../src/test/resources/test_data/test.align.adam/part-r-00000.gz.parquet
via GitHub by Vidhyaparvathy, 2 years ago
Can not read value at 0 in block -1 in file
java.io.EOFException
	at parquet.bytes.BytesUtils.readIntLittleEndianOnOneByte(BytesUtils.java:76)
	at parquet.column.values.dictionary.DictionaryValuesReader.initFromPage(DictionaryValuesReader.java:56)
	at parquet.column.impl.ColumnReaderImpl.readPage(ColumnReaderImpl.java:544)
	at parquet.column.impl.ColumnReaderImpl.checkRead(ColumnReaderImpl.java:509)
	at parquet.column.impl.ColumnReaderImpl.consume(ColumnReaderImpl.java:560)
	at parquet.column.impl.ColumnReaderImpl.<init>(ColumnReaderImpl.java:355)
	at parquet.column.impl.ColumnReadStoreImpl.newMemColumnReader(ColumnReadStoreImpl.java:63)
	at parquet.column.impl.ColumnReadStoreImpl.getColumnReader(ColumnReadStoreImpl.java:58)
	at parquet.io.RecordReaderImplementation.<init>(RecordReaderImplementation.java:265)
	at parquet.io.MessageColumnIO.getRecordReader(MessageColumnIO.java:59)
	at parquet.io.MessageColumnIO.getRecordReader(MessageColumnIO.java:73)
	at parquet.hadoop.InternalParquetRecordReader.checkRead(InternalParquetRecordReader.java:110)
	at parquet.hadoop.InternalParquetRecordReader.nextKeyValue(InternalParquetRecordReader.java:172)
	at parquet.hadoop.ParquetRecordReader.nextKeyValue(ParquetRecordReader.java:130)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:483)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
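
The failing path in the trace is the MapReduce record reader (parquet.hadoop.ParquetRecordReader -> InternalParquetRecordReader -> ColumnReaderImpl), and an EOFException while initializing a dictionary page usually indicates a truncated or corrupt Parquet file. A quick way to check whether the file itself is at fault is to read it outside the job. The following is a minimal sketch, not taken from any of the reports above: it assumes the newer org.apache.parquet coordinates (the trace uses the older parquet.* packages) and uses a placeholder HDFS path that you would replace with the file named in the exception message.

import org.apache.hadoop.fs.Path;
import org.apache.parquet.example.data.Group;
import org.apache.parquet.hadoop.ParquetReader;
import org.apache.parquet.hadoop.example.GroupReadSupport;

public class ParquetReadCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder path (assumption): point this at the file from the exception message.
        Path file = new Path("hdfs://namenode:8020/path/to/suspect.parquet");

        long rows = 0;
        // ParquetReader goes through the same InternalParquetRecordReader as the
        // MapReduce record reader, so a damaged file should fail here the same way.
        try (ParquetReader<Group> reader =
                     ParquetReader.builder(new GroupReadSupport(), file).build()) {
            while (reader.read() != null) {
                rows++;
            }
        }
        System.out.println("Read " + rows + " rows without error");
    }
}

If this standalone read fails with the same EOFException, the file (or one of its blocks) is most likely damaged and typically has to be regenerated; if it reads cleanly, the problem is more likely in the job's input configuration than in the data itself.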