
Solutions on the web

via Data Science by SaCvP, 1 year ago
Job aborted due to stage failure: Task 0 in stage 10.0 failed 1 times, most recent failure: Lost task 0.0 in stage 10.0 (TID 7, localhost): org.apache.spark.SparkException: Items in a transaction must be unique but got WrappedArray(13873775, 4, 99, 9909, 102113020, 15704, 2012-03-19:00, 6.25, OZ, 4, 11.96).
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 10.0 failed 1 times, most recent failure: Lost task 0.0 in stage 10.0 (TID 7, localhost): org.apache.spark.SparkException: Items in a transaction must be unique but got WrappedArray(13873775, 4, 99, 9909, 102113020, 15704, 2012-03-19:00, 6.25, OZ, 4, 11.96).
	at org.apache.spark.mllib.fpm.FPGrowth$$anonfun$1.apply(FPGrowth.scala:143)
	at org.apache.spark.mllib.fpm.FPGrowth$$anonfun$1.apply(FPGrowth.scala:140)
	at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
	at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:189)
	at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:64)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
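The exception is raised because FPGrowth in spark.mllib requires every item in a transaction to be unique, and the offending WrappedArray above contains the item 4 twice. A common fix is to deduplicate each transaction before calling FPGrowth.run. Below is a minimal sketch of that deduplication in plain Scala (no Spark dependency, so it runs standalone); the transaction values are hypothetical, borrowed from the trace above, and in a real job the same `_.distinct` map would be applied to the RDD passed to FPGrowth, e.g. `new FPGrowth().setMinSupport(0.2).run(transactions.map(_.distinct))`.

```scala
object DedupTransactions {
  // Drop repeated items within each transaction, keeping first occurrences.
  // This mirrors what FPGrowth expects: no duplicates inside a transaction.
  def dedup(transactions: Seq[Array[String]]): Seq[Array[String]] =
    transactions.map(_.distinct)

  def main(args: Array[String]): Unit = {
    // Hypothetical transactions; the first repeats item "4", like the
    // WrappedArray in the stack trace above.
    val txns = Seq(
      Array("13873775", "4", "99", "9909", "4"),
      Array("15704", "6.25", "OZ")
    )
    dedup(txns).foreach(t => println(t.mkString(", ")))
    // prints:
    // 13873775, 4, 99, 9909
    // 15704, 6.25, OZ
  }
}
```

Note that `distinct` silently drops duplicates; if a repeated item actually signals a data-quality problem upstream (e.g. a bad join producing duplicate rows), it may be better to investigate the source data rather than mask it.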