parquet.io.ParquetDecodingException

Can not read value at 0 in block 0 in file file:/home/file/ALL.adam/part-r-00062.gz.parquet

Solutions on the web (31)

  • via GitHub by car2008, 11 months ago
    Can not read value at 0 in block 0 in file file:/home/file/ALL.adam/part-r-00062.gz.parquet
  • via GitHub by car2008, 10 months ago
    Can not read value at 0 in block 0 in file hdfs://192.168.2.85:9000/user/ALL.adam/part-r-00001.gz.parquet
  • Can not read value at 0 in block 0 in file file:/.../src/test/resources/test_data/test.align.adam/part-r-00000.gz.parquet
Stack trace

    parquet.io.ParquetDecodingException: Can not read value at 0 in block 0 in file file:/home/file/ALL.adam/part-r-00062.gz.parquet
      at parquet.hadoop.InternalParquetRecordReader.nextKeyValue(InternalParquetRecordReader.java:228)
      at parquet.hadoop.ParquetRecordReader.nextKeyValue(ParquetRecordReader.java:201)
      at org.apache.spark.rdd.NewHadoopRDD$$anon$1.hasNext(NewHadoopRDD.scala:168)
      at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
      at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
      at scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
      at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
      at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
      at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:197)
      at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:64)
      at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
      at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
      at org.apache.spark.scheduler.Task.run(Task.scala:89)
      at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:745)
    Caused by: java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to org.bdgenomics.formats.avro.Variant
      at org.bdgenomics.formats.avro.Genotype.put(Genotype.java:148)
      at parquet.avro.AvroIndexedRecordConverter.set(AvroIndexedRecordConverter.java:143)
      at parquet.avro.AvroIndexedRecordConverter.access$000(AvroIndexedRecordConverter.java:39)
      at parquet.avro.AvroIndexedRecordConverter$1.add(AvroIndexedRecordConverter.java:78)
      at parquet.avro.AvroIndexedRecordConverter.end(AvroIndexedRecordConverter.java:163)
      at parquet.io.RecordReaderImplementation.read(RecordReaderImplementation.java:413)
      at parquet.hadoop.InternalParquetRecordReader.nextKeyValue(InternalParquetRecordReader.java:209)
      ... 16 more
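What the trace points to: the outer ParquetDecodingException is only a wrapper; the root cause is the ClassCastException raised inside Genotype.put. That is the classic symptom of an Avro schema mismatch: the schema embedded in the .adam Parquet file (written by one bdg-formats version) does not line up with the org.bdgenomics.formats.avro classes on the reader's classpath, so parquet-avro falls back to materializing the nested variant field as a GenericData$Record instead of a Variant. The usual remedies are to read the file with the same ADAM/bdg-formats version that wrote it, or to pin the reader's Avro schema explicitly. Below is a minimal Scala sketch of the second approach; the path comes from the stack trace, while the SparkContext setup and the assumption that the file holds Genotype records are illustrative, not taken from the original thread.

    import org.apache.hadoop.mapreduce.Job
    import org.apache.spark.{SparkConf, SparkContext}
    import org.bdgenomics.formats.avro.Genotype
    import parquet.avro.AvroReadSupport
    import parquet.hadoop.ParquetInputFormat

    object ReadGenotypes {
      def main(args: Array[String]): Unit = {
        val sc  = new SparkContext(new SparkConf().setAppName("read-genotypes"))
        val job = Job.getInstance(sc.hadoopConfiguration)

        // Deserialize Parquet records through parquet-avro's read support...
        ParquetInputFormat.setReadSupportClass(job, classOf[AvroReadSupport[Genotype]])
        // ...and pin the *reader* schema to the Genotype schema compiled into
        // the bdg-formats jar on the classpath, rather than the (possibly
        // older) writer schema stored in the file footer. This avoids the
        // fallback to GenericData$Record for the nested `variant` field.
        AvroReadSupport.setAvroReadSchema(job.getConfiguration, Genotype.getClassSchema)

        val genotypes = sc.newAPIHadoopFile(
          "file:/home/file/ALL.adam",   // directory holding the part-r-*.gz.parquet files
          classOf[ParquetInputFormat[Genotype]],
          classOf[Void],
          classOf[Genotype],
          job.getConfiguration
        ).map(_._2)                     // keep only the record; the key is always null

        println(genotypes.count())
        sc.stop()
      }
    }

Note that pinning the read schema only helps when the writer and reader schemas are compatible under Avro's schema-resolution rules; if the two bdg-formats versions are genuinely incompatible, reading the file with the matching ADAM version is the reliable fix.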
