java.lang.ClassCastException: org.bdgenomics.formats.avro.Fragment cannot be cast to

Google Groups | someya sayeh | 1 year ago

    adam2vcf

Root Cause Analysis

java.lang.ClassCastException: org.bdgenomics.formats.avro.Fragment cannot be cast to
    at org.bdgenomics.adam.rdd.variation.GenotypeRDDFunctions$$anonfun$10.apply(VariationRDDFunctions.scala:153)
    at org.apache.spark.rdd.RDD$$anonfun$keyBy$1.apply(RDD.scala:1324)
    at org.apache.spark.rdd.RDD$$anonfun$keyBy$1.apply(RDD.scala:1324)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at org.apache.spark.util.collection.ExternalSorter.spillToPartitionFiles(ExternalSorter.scala:371)
    at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:211)
    at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:63)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
    at org.apache.spark.scheduler.Task.run(Task.scala:64)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1204)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1193)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1192)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1192)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:693)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1393)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1354)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
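For context, this is the standard JVM failure mode when an RDD actually holds one Avro record type but a downstream code path treats it as another: the RDD-level cast is unchecked because of type erasure, so the ClassCastException only surfaces inside an executor task once an element is touched, which matches the keyBy frame above. The following minimal Scala sketch reproduces that mechanic under stated assumptions: the Fragment and Genotype case classes, the CastFailureSketch object, and the local[*] setup are illustrative stand-ins, not the ADAM Avro classes or the adam2vcf code path itself.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.rdd.RDD

    // Stand-ins for the record types named in the trace; the real classes are
    // org.bdgenomics.formats.avro.Fragment and the (truncated) target type.
    case class Fragment(readName: String)
    case class Genotype(sampleId: String)

    object CastFailureSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("cast-failure-sketch").setMaster("local[*]"))

        // The data actually contains Fragment records...
        val loaded: RDD[Any] =
          sc.parallelize(Seq[Any](Fragment("read1"), Fragment("read2")))

        // ...but the downstream code assumes Genotype. Due to type erasure this
        // RDD-level cast is unchecked and succeeds immediately.
        val genotypes: RDD[Genotype] = loaded.asInstanceOf[RDD[Genotype]]

        // The ClassCastException is only thrown inside the task that first
        // touches an element -- the analogue of the keyBy frame in the trace.
        genotypes.keyBy(_.sampleId).count()

        sc.stop()
      }
    }

In the adam2vcf report above, this pattern suggests the input handed to adam2vcf contained Fragment records rather than the genotype data the command expects; the sketch only illustrates the Spark/JVM mechanics, not ADAM's own code.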