org.apache.parquet.io.ParquetEncodingException

/Users/heuermh/working/adam/sorted.adam/part-r-00000.gz.parquet invalid: all the files must be contained in the root sorted.adam

Reported via GitHub by heuermh, 6 months ago.
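The check that fails here lives in parquet-mr's ParquetFileWriter.mergeFooters, invoked when Spark's ParquetOutputCommitter writes the _metadata summary file at job commit: every part-file path must be prefixed by the output root. In this trace the root is the relative path sorted.adam while the footer paths are absolute, so the prefix test fails even though the files are exactly where they should be. A minimal Scala sketch of the invariant (a paraphrase, not the parquet-mr source; checkContainedInRoot is an illustrative name):

    import org.apache.parquet.io.ParquetEncodingException

    // Paraphrase of the containment check in ParquetFileWriter.mergeFooters.
    def checkContainedInRoot(root: String, footerPaths: Seq[String]): Unit =
      for (path <- footerPaths)
        if (!path.startsWith(root))
          // A relative root ("sorted.adam") never prefixes an absolute
          // footer path ("/Users/heuermh/.../part-r-00000.gz.parquet"),
          // so this throws even though the files are in place.
          throw new ParquetEncodingException(
            s"$path invalid: all the files must be contained in the root $root")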
Stack trace

org.apache.parquet.io.ParquetEncodingException: /Users/heuermh/working/adam/sorted.adam/part-r-00000.gz.parquet invalid: all the files must be contained in the root sorted.adam
    at org.apache.parquet.hadoop.ParquetFileWriter.mergeFooters(ParquetFileWriter.java:444)
    at org.apache.parquet.hadoop.ParquetFileWriter.writeMetadataFile(ParquetFileWriter.java:420)
    at org.apache.parquet.hadoop.ParquetOutputCommitter.writeMetaDataFile(ParquetOutputCommitter.java:58)
    at org.apache.parquet.hadoop.ParquetOutputCommitter.commitJob(ParquetOutputCommitter.java:48)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1145)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1074)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1074)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1074)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopFile$2.apply$mcV$sp(PairRDDFunctions.scala:994)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopFile$2.apply(PairRDDFunctions.scala:985)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopFile$2.apply(PairRDDFunctions.scala:985)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopFile(PairRDDFunctions.scala:985)
    at org.apache.spark.rdd.InstrumentedPairRDDFunctions.saveAsNewAPIHadoopFile(InstrumentedPairRDDFunctions.scala:477)
    at org.bdgenomics.adam.rdd.ADAMRDDFunctions$$anonfun$saveRddAsParquet$1.apply$mcV$sp(ADAMRDDFunctions.scala:159)
    at org.bdgenomics.adam.rdd.ADAMRDDFunctions$$anonfun$saveRddAsParquet$1.apply(ADAMRDDFunctions.scala:143)
    at org.bdgenomics.adam.rdd.ADAMRDDFunctions$$anonfun$saveRddAsParquet$1.apply(ADAMRDDFunctions.scala:143)
    at scala.Option.fold(Option.scala:157)
    at org.apache.spark.rdd.Timer.time(Timer.scala:48)
    at org.bdgenomics.adam.rdd.ADAMRDDFunctions.saveRddAsParquet(ADAMRDDFunctions.scala:143)
    at org.bdgenomics.adam.rdd.AvroGenomicRDD.saveAsParquet(GenomicRDD.scala:908)
    at org.bdgenomics.adam.rdd.AvroGenomicRDD.saveAsParquet(GenomicRDD.scala:883)
    at org.bdgenomics.adam.cli.Vcf2ADAM.run(Vcf2ADAM.scala:74)
    at org.bdgenomics.utils.cli.BDGSparkCommand$class.run(BDGCommand.scala:55)
    at org.bdgenomics.adam.cli.Vcf2ADAM.run(Vcf2ADAM.scala:53)
    at org.bdgenomics.adam.cli.ADAMMain.apply(ADAMMain.scala:128)
    at org.bdgenomics.adam.cli.ADAMMain$.main(ADAMMain.scala:68)
    at org.bdgenomics.adam.cli.ADAMMain.main(ADAMMain.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
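Two commonly reported workarounds, neither confirmed in this report. First, pass a fully qualified output path (e.g. file:/Users/heuermh/working/adam/sorted.adam instead of the bare sorted.adam) so that the root and the footer paths agree. Second, disable the Parquet job summary so that commitJob never merges footers at all; parquet.enable.summary-metadata is parquet-mr's switch for this. A sketch of the second option:

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch: turn off the _metadata/_common_metadata summary files so that
    // ParquetOutputCommitter.commitJob never reaches mergeFooters.
    val sc = new SparkContext(new SparkConf().setAppName("vcf2adam"))
    sc.hadoopConfiguration.set("parquet.enable.summary-metadata", "false")

The same setting can be supplied at launch time as --conf spark.hadoop.parquet.enable.summary-metadata=false through spark-submit (adam-submit forwards its arguments to spark-submit).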
