org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 130.0 failed 1 times, most recent failure: Lost task 0.0 in stage 130.0 (TID 132, localhost): java.lang.ClassCastException Driver stacktrace:

Solutions on the web

via GitHub by sudcha, 1 year ago:
Job aborted due to stage failure: Task 0 in stage 130.0 failed 1 times, most recent failure: Lost task 0.0 in stage 130.0 (TID 132, localhost): java.lang.ClassCastException Driver stacktrace:

via activeintelligence.org by Unknown author, 1 year ago:
Job aborted due to stage failure: Task 3343 in stage 33.0 failed 1 times, most recent failure: Lost task 3343.0 in stage 33.0 (TID 147765, localhost): ExecutorLostFailure (executor driver lost) Driver stacktrace:

via Stack Overflow by Zach Moshe, 1 year ago:
Job aborted due to stage failure: Task 2 in stage 40431.0 failed 1 times, most recent failure: Lost task 2.0 in stage 40431.0 (TID 8029, localhost): java.lang.ArrayIndexOutOfBoundsException Driver stacktrace:

via Stack Overflow by user4342532, 11 months ago:
Job aborted due to stage failure: Task 0.0 in stage 0.0 (TID 0) had a not serializable result: org.apache.spark.InterruptibleIterator Serialization stack: - object not serializable (class: org.apache.spark.InterruptibleIterator, value: non-empty

via Stack Overflow by unekwu, 9 months ago:
Job aborted due to stage failure: Task 0.0 in stage 1.0 (TID 4) had a not serializable result: java.util.ArrayList$Itr Serialization stack: - object not serializable (class: java.util.ArrayList$Itr, value: java.util.ArrayList$Itr@e6275ea) - field

via Stack Overflow by srinivas amara, 9 months ago:
Job aborted due to stage failure: Task 1.0 in stage 5.0 (TID 11) had a not serializable result: org.apache.hadoop.io.Text Serialization stack: - object not serializable (class: org.apache.hadoop.io.Text, value: Hadoop) - field (class
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 130.0 failed 1 times, most recent failure: Lost task 0.0 in stage 130.0 (TID 132, localhost): java.lang.ClassCastException Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1450)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1438)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1437)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:811)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1659)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1618)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1607)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:632)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1871)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1884)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1897)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1911)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:893)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.x$4$lzycompute(BinaryClassificationMetrics.scala:190)
at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.x$4(BinaryClassificationMetrics.scala:144)
at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.confusions$lzycompute(BinaryClassificationMetrics.scala:146)
at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.confusions(BinaryClassificationMetrics.scala:146)
at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.createCurve(BinaryClassificationMetrics.scala:221)
at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.roc(BinaryClassificationMetrics.scala:85)
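The trace bottoms out in org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.roc, whose lazily computed confusion counts run a collect() over the (score, label) RDD; that collect is the stage that fails with the ClassCastException. Below is a minimal sketch of that call path, assuming Spark's RDD-based MLlib API; the SparkContext setup, the scoreAndLabels name, and the sample data are illustrative and not taken from the original report. A ClassCastException surfacing at this point usually means the pairs handed to BinaryClassificationMetrics are not genuine (Double, Double) values at runtime (for example, values pulled out of a Row or an Any and forced through asInstanceOf).

import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics
import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

object RocSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("roc-sketch").setMaster("local[*]"))

    // Hypothetical (score, label) pairs; BinaryClassificationMetrics expects RDD[(Double, Double)].
    // A ClassCastException inside roc() typically means the elements are not real Doubles at runtime.
    val scoreAndLabels: RDD[(Double, Double)] = sc.parallelize(Seq(
      (0.9, 1.0), (0.2, 0.0), (0.7, 1.0), (0.1, 0.0)
    ))

    val metrics = new BinaryClassificationMetrics(scoreAndLabels)

    // roc() materializes the binned confusion counts, which runs the collect() seen in the
    // stack trace above (BinaryClassificationMetrics.confusions -> RDD.collect).
    val rocPoints = metrics.roc().collect()
    rocPoints.foreach { case (fpr, tpr) => println(s"FPR=$fpr, TPR=$tpr") }

    sc.stop()
  }
}

If the pairs are produced by casting values of unknown type (for example with row.getAs or asInstanceOf), converting them explicitly to Double at the point where the RDD is built, rather than relying on a cast, is a common way to avoid this kind of runtime failure.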
