java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

Stack Overflow | eexxoo | 1 week ago
  1. SQL query in Spark/scala Size exceeds Integer.MAX_VALUE

    Stack Overflow | 1 week ago | eexxoo
    java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE
  2. Spark : Size exceeds Integer.MAX_VALUE When Joining 2 Large DFs

    Stack Overflow | 3 months ago | rohitvk
    java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE
  3. Spark. ~100 million rows. Size exceeds Integer.MAX_VALUE?

    Stack Overflow | 6 months ago | clay
    org.apache.spark.SparkException: Job aborted due to stage failure: Task 22 in stage 1.0 failed 4 times, most recent failure: Lost task 22.3 in stage 1.0 (TID 77, ip-172-31-97-24.us-west-2.compute.internal): java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE
  4. java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE

    GitHub | 8 months ago | abhaymise
    org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 2.0 failed 4 times, most recent failure: Lost task 0.3 in stage 2.0 (TID 8, slave2-172-31-47-102): java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE
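All of the reports above bottom out in the same JDK restriction: FileChannel.map() rejects any mapping request larger than Integer.MAX_VALUE bytes (just under 2 GiB), because a MappedByteBuffer is indexed by int. The size check happens before any I/O, so the exception can be reproduced against an empty temp file. A minimal, self-contained sketch (the temp-file name is illustrative):

```java
import java.io.IOException;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MapLimitDemo {
    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("map-limit", ".bin");
        String message = null;
        try (FileChannel ch = FileChannel.open(tmp, StandardOpenOption.READ)) {
            // Ask for one byte more than a MappedByteBuffer can address.
            // The size argument is validated up front, so the empty file is fine.
            ch.map(FileChannel.MapMode.READ_ONLY, 0, Integer.MAX_VALUE + 1L);
        } catch (IllegalArgumentException e) {
            message = e.getMessage();
        } finally {
            Files.deleteIfExists(tmp);
        }
        System.out.println(message);
    }
}
```

Spark's DiskStore memory-maps cached blocks through this same call, which is why a single RDD partition or shuffle block that grows past 2 GiB triggers the error.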


Root Cause Analysis

  1. java.lang.IllegalArgumentException

    Size exceeds Integer.MAX_VALUE

    at sun.nio.ch.FileChannelImpl.map()
  2. Java RT
    FileChannelImpl.map
    1. sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:869)
    1 frame
  3. Spark
    Executor$TaskRunner.run
    1. org.apache.spark.storage.DiskStore$$anonfun$getBytes$2.apply(DiskStore.scala:103)
    2. org.apache.spark.storage.DiskStore$$anonfun$getBytes$2.apply(DiskStore.scala:91)
    3. org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1287)
    4. org.apache.spark.storage.DiskStore.getBytes(DiskStore.scala:105)
    5. org.apache.spark.storage.BlockManager.getLocalValues(BlockManager.scala:439)
    6. org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:672)
    7. org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:330)
    8. org.apache.spark.rdd.RDD.iterator(RDD.scala:281)
    9. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    10. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
    11. org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
    12. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    13. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
    14. org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
    15. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    16. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
    17. org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
    18. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
    19. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
    20. org.apache.spark.scheduler.Task.run(Task.scala:85)
    21. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
    21 frames
  4. Java RT
    Thread.run
    1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    3. java.lang.Thread.run(Thread.java:745)
    3 frames
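Since the trace shows the failure inside DiskStore.getBytes while materializing an RDD block, the usual remedy is to raise the partition count so that no single cached or shuffled block comes anywhere near the 2 GiB cap, e.g. via rdd.repartition(n) or a larger spark.sql.shuffle.partitions. A rough sizing sketch (the dataset size and the per-partition target below are illustrative assumptions, not values taken from the reports above):

```java
public class PartitionSizing {
    // Hard cap: FileChannelImpl.map() cannot map more than Integer.MAX_VALUE bytes.
    static final long MAX_BLOCK_BYTES = Integer.MAX_VALUE; // just under 2 GiB

    /** Smallest partition count that keeps each partition at or under targetBytes. */
    static int minPartitions(long totalBytes, long targetBytes) {
        return (int) ((totalBytes + targetBytes - 1) / targetBytes); // ceiling division
    }

    public static void main(String[] args) {
        long totalBytes = 500L << 30;               // assume a ~500 GiB dataset (illustrative)
        long target = MAX_BLOCK_BYTES / 8;          // stay well under the cap (~256 MiB each)
        System.out.println(minPartitions(totalBytes, target));
    }
}
```

Staying well below the cap (a few hundred MiB per partition) also leaves headroom for skewed keys, which can make one partition much larger than the average.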