org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 20.0 failed 1 times, most recent failure: Lost task 0.0 in stage 20.0 (TID 30, localhost): java.lang.ArrayIndexOutOfBoundsException: 0

GitHub | bolau | 8 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. Collection of empty and non-empty items fails in serialization

     GitHub | 8 months ago | bolau
     org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 20.0 failed 1 times, most recent failure: Lost task 0.0 in stage 20.0 (TID 30, localhost): java.lang.ArrayIndexOutOfBoundsException: 0
     (A hypothetical reproduction sketch follows this list.)
  2. GitHub comment 78#216242788

     GitHub | 10 months ago | drtjre
     org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 12.0 failed 1 times, most recent failure: Lost task 1.0 in stage 12.0 (TID 226, localhost): java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
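
    For context, a minimal sketch of what the first issue describes is shown below. It assumes Spark 1.6.x (matching the CacheManager and InMemoryColumnarTableScan frames in the trace) and uses invented case classes Item and Wrapper; the reporter's actual code is not shown on this page, so treat this only as an illustration of the code path, not a confirmed reproduction.

    // Hypothetical sketch: cache a DataFrame whose rows mix empty and
    // non-empty nested collections, forcing the columnar InMemoryRelation
    // build where the generated UnsafeProjection reads the nested string
    // field. Names and structure are assumptions, not the reporter's code.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    case class Item(name: String)
    case class Wrapper(items: Seq[Item])

    object EmptyCollectionRepro {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setMaster("local[1]").setAppName("repro"))
        val sqlContext = new SQLContext(sc)
        import sqlContext.implicits._

        // One row with an empty collection of structs, one with a non-empty one.
        val df = sc.parallelize(Seq(Wrapper(Seq.empty), Wrapper(Seq(Item("a"))))).toDF()

        // cache() + count() route the rows through MemoryStore.unrollSafely and
        // the InMemoryColumnarTableScan iterator seen in the frames below, which
        // is where the ArrayIndexOutOfBoundsException is reported to surface.
        df.cache()
        df.count()

        sc.stop()
      }
    }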

    Root Cause Analysis

    1. org.apache.spark.SparkException

      Job aborted due to stage failure: Task 0 in stage 20.0 failed 1 times, most recent failure: Lost task 0.0 in stage 20.0 (TID 30, localhost): java.lang.ArrayIndexOutOfBoundsException: 0

      at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.genericGet()
    2. Spark Project Catalyst
      GeneratedClass$SpecificUnsafeProjection.apply
      1. org.apache.spark.sql.catalyst.expressions.GenericInternalRow.genericGet(rows.scala:227)
      2. org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.getAs(rows.scala:35)
      3. org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.getUTF8String(rows.scala:46)
      4. org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getUTF8String(rows.scala:221)
      5. org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
      6. org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
      6 frames
    3. Scala
      Iterator$$anon$11.next
      1. scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
      1 frame
    4. org.apache.spark
      InMemoryRelation$$anonfun$3$$anon$1.next
      1. org.apache.spark.sql.execution.columnar.InMemoryRelation$$anonfun$3$$anon$1.next(InMemoryColumnarTableScan.scala:140)
      2. org.apache.spark.sql.execution.columnar.InMemoryRelation$$anonfun$3$$anon$1.next(InMemoryColumnarTableScan.scala:130)
      2 frames
    5. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:285)
      2. org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:171)
      3. org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:78)
      4. org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
      5. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      6. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      7. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      8. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      9. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      10. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      11. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      12. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      13. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      14. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
      15. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
      16. org.apache.spark.scheduler.Task.run(Task.scala:89)
      17. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
      17 frames
    6. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      3. java.lang.Thread.run(Thread.java:744)
      3 frames