org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 2.0 failed 4 times, most recent failure: Lost task 0.3 in stage 2.0 (TID 8, slave2-172-31-47-102): java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

GitHub | abhaymise | 8 months ago
Here are the best solutions we found on the Internet.
  1. java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

     GitHub | 8 months ago | abhaymise
     org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 2.0 failed 4 times, most recent failure: Lost task 0.3 in stage 2.0 (TID 8, slave2-172-31-47-102): java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

  2. Spark. ~100 million rows. Size exceeds Integer.MAX_VALUE?

     Stack Overflow | 6 months ago | clay
     org.apache.spark.SparkException: Job aborted due to stage failure: Task 22 in stage 1.0 failed 4 times, most recent failure: Lost task 22.3 in stage 1.0 (TID 77, ip-172-31-97-24.us-west-2.compute.internal): java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

  3. Spark Java Error: Size exceeds Integer.MAX_VALUE

     Stack Overflow | 2 years ago | peng
     org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 4.0 failed 4 times, most recent failure: Lost task 1.3 in stage 4.0 (TID 9, workernode0.sparkexperience4a7.d5.internal.cloudapp.net): java.lang.RuntimeException: java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

    Root Cause Analysis

    1. org.apache.spark.SparkException

      Job aborted due to stage failure: Task 0 in stage 2.0 failed 4 times, most recent failure: Lost task 0.3 in stage 2.0 (TID 8, slave2-172-31-47-102): java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

      at sun.nio.ch.FileChannelImpl.map()
    2. Java RT
      FileChannelImpl.map
      1. sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:860)
      1 frame
    3. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.storage.DiskStore$$anonfun$getBytes$2.apply(DiskStore.scala:127)
      2. org.apache.spark.storage.DiskStore$$anonfun$getBytes$2.apply(DiskStore.scala:115)
      3. org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1250)
      4. org.apache.spark.storage.DiskStore.getBytes(DiskStore.scala:129)
      5. org.apache.spark.storage.DiskStore.getBytes(DiskStore.scala:136)
      6. org.apache.spark.storage.BlockManager.doGetLocal(BlockManager.scala:503)
      7. org.apache.spark.storage.BlockManager.getLocal(BlockManager.scala:420)
      8. org.apache.spark.storage.BlockManager.get(BlockManager.scala:625)
      9. org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:44)
      10. org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
      11. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      12. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      13. org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
      14. org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
      15. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
      16. org.apache.spark.scheduler.Task.run(Task.scala:89)
      17. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
      17 frames
    4. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
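
The trace above points at the likely root cause: Spark's DiskStore memory-maps each cached block as a single region via sun.nio.ch.FileChannelImpl.map, and the JVM rejects any mapped region larger than Integer.MAX_VALUE bytes (roughly 2 GB), because MappedByteBuffer is indexed by a Java int. A common workaround reported in the linked threads is to raise the partition count so that no single cached partition exceeds that limit. The sketch below illustrates the sizing arithmetic only; the `PartitionSizing` object and `minPartitions` helper are hypothetical names for this example, not part of Spark's API:

```scala
// Hypothetical sizing helper: each cached Spark block is memory-mapped as
// one region, and FileChannel.map caps a region at Integer.MAX_VALUE bytes.
object PartitionSizing {
  // Per-region limit imposed by the JVM's memory-mapping API (~2 GB).
  val MaxBlockBytes: Long = Int.MaxValue.toLong

  // Smallest partition count such that an evenly distributed dataset of
  // totalBytes keeps every partition at or below MaxBlockBytes
  // (ceiling division, with a floor of one partition).
  def minPartitions(totalBytes: Long): Int =
    math.max(1L, (totalBytes + MaxBlockBytes - 1) / MaxBlockBytes).toInt
}
```

With an estimated dataset size in hand, one would then call something like `rdd.repartition(PartitionSizing.minPartitions(estimatedBytes))` before caching, so each block stays under the mapping limit; in practice a generous margin above this minimum is advisable, since partitions are rarely perfectly even.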