java.lang.OutOfMemoryError: Unable to acquire 1073741824 bytes of memory, got 1060044737

Apache's JIRA Issue Tracker | Josh Rosen | 1 year ago

  1. Re: java.lang.OutOfMemoryError: Unable to acquire bytes of memory
    spark-dev | 1 year ago | james
    java.lang.OutOfMemoryError: Unable to acquire 1073741824 bytes of memory, got 1060110796
  2. Apache Spark Developers List - java.lang.OutOfMemoryError: Unable to acquire bytes of memory
    nabble.com | 1 month ago
    java.lang.OutOfMemoryError: Unable to acquire 1073741824 bytes of memory, got 1060110796
  3. [jira] [Created] (SPARK-14363) Executor OOM while trying to acquire new page from the memory manager
    spark-issues | 12 months ago | Sital Kedia (JIRA)
    java.lang.OutOfMemoryError: Unable to acquire 76 bytes of memory, got 0
  4. [jira] [Updated] (SPARK-14363) Executor OOM while trying to acquire new page from the memory manager
    spark-issues | 12 months ago | Sital Kedia (JIRA)
    java.lang.OutOfMemoryError: Unable to acquire 76 bytes of memory, got 0

    Root Cause Analysis

    1. java.lang.OutOfMemoryError
      Unable to acquire 1073741824 bytes of memory, got 1060044737
      at org.apache.spark.memory.MemoryConsumer.allocateArray()
    2. org.apache.spark
      UnsafeExternalSorter.insertRecord
      1. org.apache.spark.memory.MemoryConsumer.allocateArray(MemoryConsumer.java:91)
      2. org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.growPointerArrayIfNecessary(UnsafeExternalSorter.java:295)
      3. org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.insertRecord(UnsafeExternalSorter.java:330)
      3 frames
    3. Spark Project SQL
      Sort$$anonfun$1.apply
      1. org.apache.spark.sql.execution.UnsafeExternalRowSorter.insertRow(UnsafeExternalRowSorter.java:91)
      2. org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:168)
      3. org.apache.spark.sql.execution.Sort$$anonfun$1.apply(Sort.scala:90)
      4. org.apache.spark.sql.execution.Sort$$anonfun$1.apply(Sort.scala:64)
      4 frames
    4. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
      2. org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:728)
      3. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      4. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      5. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      6. org.apache.spark.rdd.ZippedPartitionsRDD2.compute(ZippedPartitionsRDD.scala:88)
      7. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      8. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      9. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      10. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      11. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      12. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      13. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      14. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      15. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      16. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      17. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      18. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
      19. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
      20. org.apache.spark.scheduler.Task.run(Task.scala:89)
      21. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
      21 frames
    5. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
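
    Mitigation Sketch

    The frames above show the SQL Sort operator feeding rows into UnsafeExternalSorter, which fails inside MemoryConsumer.allocateArray while growing its pointer array: the sorter asked for 1073741824 bytes and the memory manager could only grant 1060044737. The following is a minimal, hypothetical Scala sketch (Spark 1.6-era API, matching the classes in the trace) of the kind of job that produces this plan, together with the knobs most commonly adjusted first: more shuffle partitions so each sort task buffers fewer rows, and more executor memory. The object name, data set, partition count, and paths are illustrative only, not taken from this report.

      import org.apache.spark.{SparkConf, SparkContext}
      import org.apache.spark.sql.SQLContext

      object SortOomSketch {
        def main(args: Array[String]): Unit = {
          val conf = new SparkConf()
            .setAppName("sort-oom-sketch")
            .set("spark.executor.memory", "8g") // hypothetical sizing, not from the report

          val sc = new SparkContext(conf)
          val sqlContext = new SQLContext(sc)
          import sqlContext.implicits._

          // More (smaller) shuffle partitions mean each post-shuffle sort task buffers
          // fewer rows, so the pointer array grown in
          // UnsafeExternalSorter.growPointerArrayIfNecessary stays smaller.
          sqlContext.setConf("spark.sql.shuffle.partitions", "2000") // hypothetical value; default is 200

          // Hypothetical data set; sorting it yields the Sort ->
          // UnsafeExternalRowSorter -> UnsafeExternalSorter.insertRecord path seen in the trace.
          val df = sc.parallelize(1 to 10000000)
            .map(i => (i % 100, i.toLong))
            .toDF("key", "value")

          df.sort($"key", $"value")
            .write.parquet("/tmp/sorted-output") // hypothetical output path

          sc.stop()
        }
      }

    Tuning like this only lowers the pressure on the per-task memory manager; the related reports above (e.g. SPARK-14363) track the same "Unable to acquire ... bytes of memory" failure when a new page or pointer array is requested, so treat the sketch as a mitigation rather than a fix.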