java.lang.ClassCastException: org.bdgenomics.formats.avro.Fragment cannot be cast to

Google Groups | someya sayeh | 10 months ago

Related results

  1. adam2vcf

     Google Groups | 10 months ago | someya sayeh
     java.lang.ClassCastException: org.bdgenomics.formats.avro.Fragment cannot be cast to
  2. Scala 2.11.5 and Squeryl?

     Google Groups | 2 years ago | Clint Gilbert
     java.lang.ClassCastException: scala.None$ cannot be cast to
  3. Memcached Java client 2.6.1 released - Programming Languages - ITeye News

     iteye.com | 1 year ago
     java.lang.ClassCastException: cannot be cast to
  4. cannot be cast to javax.servlet.Filter | Oracle Community

     oracle.com | 10 months ago
     java.lang.ClassCastException: cannot be cast to javax.servlet.Filter

    Root Cause Analysis

    1. java.lang.ClassCastException

      org.bdgenomics.formats.avro.Fragment cannot be cast to

      at org.bdgenomics.adam.rdd.variation.GenotypeRDDFunctions$$anonfun$10.apply()
    2. org.bdgenomics.adam
      GenotypeRDDFunctions$$anonfun$10.apply
      1. org.bdgenomics.adam.rdd.variation.GenotypeRDDFunctions$$anonfun$10.apply(VariationRDDFunctions.scala:153)
      1 frame
    3. Spark
      RDD$$anonfun$keyBy$1.apply
      1. org.apache.spark.rdd.RDD$$anonfun$keyBy$1.apply(RDD.scala:1324)
      2. org.apache.spark.rdd.RDD$$anonfun$keyBy$1.apply(RDD.scala:1324)
      2 frames
    4. Scala
      Iterator$$anon$11.next
      1. scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
      2. scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
      2 frames
    5. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.util.collection.ExternalSorter.spillToPartitionFiles(ExternalSorter.scala:371)
      2. org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:211)
      3. org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:63)
      4. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
      5. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
      6. org.apache.spark.scheduler.Task.run(Task.scala:64)
      7. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
      7 frames
    6. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
    7. Spark
      DAGScheduler$$anonfun$abortStage$1.apply
      1. org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1204)
      2. org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1193)
      3. org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1192)
      3 frames
    8. Scala
      ArrayBuffer.foreach
      1. scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
      2. scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
      2 frames
    9. Spark
      DAGScheduler$$anonfun$handleTaskSetFailed$1.apply
      1. org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1192)
      2. org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
      3. org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
      3 frames
    10. Scala
      Option.foreach
      1. scala.Option.foreach(Option.scala:236)
      1 frame
    11. Spark
      EventLoop$$anon$1.run
      1. org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:693)
      2. org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1393)
      3. org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1354)
      4. org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
      4 frames
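
    The grouped trace reads as follows: the keyBy closure in GenotypeRDDFunctions (group 2) runs lazily while a ShuffleMapTask feeds records into the sort-shuffle writer (group 5), the cast fails because each record is actually a Fragment rather than the genotype record the closure expects (the target type is truncated in the message, but given the code path it is presumably Genotype), and the driver then aborts the stage through the DAGScheduler path (groups 7-11). The sketch below is a minimal, self-contained reproduction of that failure shape in plain Spark, assuming the input handed to adam2vcf contained fragment records where genotype records were expected. The Fragment and Genotype case classes and the object name are hypothetical stand-ins for illustration only; this is not ADAM's actual code and these are not the real org.bdgenomics.formats.avro classes.

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.rdd.RDD

    // Hypothetical stand-ins for the Avro record types; just enough structure
    // to reproduce the cast failure, not the real org.bdgenomics classes.
    case class Fragment(readName: String)
    case class Genotype(sampleId: String, contig: String, start: Long)

    object ClassCastRepro {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("classcast-repro").setMaster("local[2]"))

        // The data really holds Fragment records, but downstream code was
        // compiled against RDD[Genotype]. Type erasure lets the RDD-level cast
        // "succeed", so nothing fails at this point -- the same situation as a
        // file of the wrong record type reaching a genotype-only code path.
        val fragments: RDD[Fragment] =
          sc.parallelize(Seq(Fragment("read1"), Fragment("read2")))
        val genotypes: RDD[Genotype] = fragments.asInstanceOf[RDD[Genotype]]

        // keyBy is lazy: its closure only runs when a ShuffleMapTask iterates
        // the records into the sort-shuffle writer, which is why the trace
        // shows the ClassCastException under ExternalSorter.insertAll rather
        // than at the keyBy call site.
        val keyed = genotypes.keyBy(g => (g.contig, g.start))

        // Forcing a shuffle triggers the map-side write; each element is a
        // Fragment at runtime, so applying the Genotype => key closure throws
        // "Fragment cannot be cast to Genotype" inside the executor task.
        keyed.groupByKey().count()

        sc.stop()
      }
    }
    ```

    Run against a local master, this fails the shuffle-map stage in the same shape as the report: the ClassCastException is raised inside the executor task (groups 1-6 above) and the driver then aborts the stage through DAGScheduler.abortStage (groups 7-11). That the real adam2vcf input contained fragment rather than genotype data is an inference from the message; the thread itself does not state the expected target type.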