

Solutions on the web

via GitHub by ryan-williams, 1 year ago
Job aborted due to stage failure: Task 2 in stage 6.0 failed 1 times, most recent failure: Lost task 2.0 in stage 6.0 (TID 42, localhost): java.lang.NoSuchMethodError: breeze.linalg.sum$.sumSummableThings(Lscala/Predef$$less$colon$less;Lbreeze/generic/UFunc$UImpl2;)Lbreeze/generic/UFunc$UImpl;
org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 6.0 failed 1 times, most recent failure: Lost task 2.0 in stage 6.0 (TID 42, localhost): java.lang.NoSuchMethodError: breeze.linalg.sum$.sumSummableThings(Lscala/Predef$$less$colon$less;Lbreeze/generic/UFunc$UImpl2;)Lbreeze/generic/UFunc$UImpl;
    at org.hammerlab.guacamole.likelihood.Likelihood$$anonfun$4.apply(Likelihood.scala:161)
    at org.hammerlab.guacamole.likelihood.Likelihood$$anonfun$4.apply(Likelihood.scala:156)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
    at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
    at org.hammerlab.guacamole.likelihood.Likelihood$.likelihoodsOfGenotypes(Likelihood.scala:156)
    at org.hammerlab.guacamole.likelihood.Likelihood$.likelihoodsOfAllPossibleGenotypesFromPileup(Likelihood.scala:78)
    at org.hammerlab.guacamole.commands.SomaticStandard$Caller$.findPotentialVariantAtLocus(SomaticStandardCaller.scala:182)
    at org.hammerlab.guacamole.commands.SomaticStandard$Caller$$anonfun$1.apply(SomaticStandardCaller.scala:95)
    at org.hammerlab.guacamole.commands.SomaticStandard$Caller$$anonfun$1.apply(SomaticStandardCaller.scala:94)
    at org.hammerlab.guacamole.distributed.PileupFlatMapUtils$$anonfun$pileupFlatMapTwoSamples$1.apply(PileupFlatMapUtils.scala:84)
    at org.hammerlab.guacamole.distributed.PileupFlatMapUtils$$anonfun$pileupFlatMapTwoSamples$1.apply(PileupFlatMapUtils.scala:79)
    at org.hammerlab.guacamole.distributed.WindowFlatMapUtils$$anonfun$windowFlatMapWithState$1$$anonfun$apply$1.apply(WindowFlatMapUtils.scala:65)
    at org.hammerlab.guacamole.distributed.WindowFlatMapUtils$$anonfun$windowFlatMapWithState$1$$anonfun$apply$1.apply(WindowFlatMapUtils.scala:55)
    at org.hammerlab.guacamole.distributed.WindowFlatMapUtils$$anonfun$splitPartitionByContigAndMap$2.apply(WindowFlatMapUtils.scala:141)
    at org.hammerlab.guacamole.distributed.WindowFlatMapUtils$$anonfun$splitPartitionByContigAndMap$2.apply(WindowFlatMapUtils.scala:131)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:284)
    at org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:171)
    at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:78)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
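
The failing call is breeze.linalg.sum$.sumSummableThings, and a java.lang.NoSuchMethodError at a call site like this usually means the application was compiled against one Breeze release while the Spark executors load a binary-incompatible one at runtime (Spark's MLlib bundles its own Breeze, so mixed versions on the classpath are a common trigger). Below is a minimal diagnostic sketch, assuming a plain SparkContext application; the object name, app name, and setup are illustrative and not taken from the trace. It prints which Breeze jar the driver and the executors actually resolve:

    import org.apache.spark.{SparkConf, SparkContext}

    object BreezeClasspathCheck {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("breeze-classpath-check"))

        // Jar the driver loaded the breeze.linalg.sum object from.
        val driverJar =
          breeze.linalg.sum.getClass.getProtectionDomain.getCodeSource.getLocation
        println(s"driver breeze: $driverJar")

        // Jar the executors load it from; a mismatch with the version the
        // application was compiled against is what surfaces as
        // NoSuchMethodError inside a task, as in the trace above.
        val executorJars = sc
          .parallelize(1 to 1)
          .map(_ => breeze.linalg.sum.getClass.getProtectionDomain.getCodeSource.getLocation.toString)
          .collect()
        executorJars.foreach(jar => println(s"executor breeze: $jar"))

        sc.stop()
      }
    }

If the driver and executor locations differ, or either differs from the Breeze version the project was built with, the usual way out is to align the project's Breeze dependency with the version bundled by the cluster's Spark build, or to shade one of the copies.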