java.lang.OutOfMemoryError

GC overhead limit exceeded
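
By default, the HotSpot JVM throws this error when it has spent roughly 98% of recent run time in garbage collection while reclaiming less than about 2% of the heap, meaning the program is barely making progress between collections. The sketch below is a minimal illustration, assuming a HotSpot JVM and a deliberately small heap; it typically reproduces the error by keeping the heap full of reachable objects so each collection recovers almost nothing.

    // Minimal sketch; typically triggers the error on a HotSpot JVM
    // when run with a small heap, e.g.: scala -J-Xmx64m Repro.scala
    object Repro {
      def main(args: Array[String]): Unit = {
        val retained = scala.collection.mutable.Map.empty[Int, String]
        var i = 0
        while (true) {
          // Every allocation stays reachable, so the heap fills up and
          // each GC cycle reclaims almost nothing; once the collector
          // exceeds its overhead limit, the JVM throws
          // java.lang.OutOfMemoryError: GC overhead limit exceeded.
          retained(i) = "x" * 1024
          i += 1
        }
      }
    }

The check can be disabled with -XX:-UseGCOverheadLimit, but that only trades this error for a later java.lang.OutOfMemoryError: Java heap space; the underlying memory pressure still has to be fixed.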

Samebug tips (0)

There are no Samebug tips available for this exception yet. If you know how to solve this issue, help other users by writing a short tip.

Solutions on the web (1,483)

  • via Terracotta by aqt, 11 months ago
    GC overhead limit exceeded
  • via Terracotta by tylercal, 11 months ago
    GC overhead limit exceeded
  • via Terracotta by JavaRef, 11 months ago
    GC overhead limit exceeded
  • Stack trace

    java.lang.OutOfMemoryError: GC overhead limit exceeded
        at scala.collection.IndexedSeqLike$class.iterator(IndexedSeqLike.scala:91)
        at scala.collection.mutable.WrappedArray.iterator(WrappedArray.scala:34)
        at scala.collection.Iterator$.apply(Iterator.scala:63)
        at org.apache.spark.util.collection.ExternalAppendOnlyMap.insert(ExternalAppendOnlyMap.scala:105)
        at org.apache.spark.Aggregator.combineCombinersByKey(Aggregator.scala:93)
        at org.apache.spark.shuffle.hash.HashShuffleReader.read(HashShuffleReader.scala:44)
        at org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:92)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
        at org.apache.spark.scheduler.Task.run(Task.scala:64)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
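
    The frames above place the failure on the reduce side of a Spark shuffle: HashShuffleReader.read feeds Aggregator.combineCombinersByKey, which builds per-key combiners in an ExternalAppendOnlyMap, and that in-memory map is what exhausts the executor heap. Common mitigations are to give executors more heap, let the shuffle use a larger share of it before spilling, or spread the aggregation over more partitions so each task holds a smaller map. The sketch below assumes a Spark 1.x standalone application; the input path, memory sizes, and partition count are illustrative placeholders, not recommendations.

        // Hedged sketch of common mitigations for this trace (Spark 1.x APIs).
        import org.apache.spark.{SparkConf, SparkContext}

        object ShuffleTuning {
          def main(args: Array[String]): Unit = {
            val conf = new SparkConf()
              .setAppName("gc-overhead-mitigation")
              // More executor heap gives the shuffle-side aggregation map
              // (ExternalAppendOnlyMap) room to grow before GC thrashes.
              .set("spark.executor.memory", "4g")
              // Spark 1.x: let shuffle aggregation use a larger share of
              // the heap before spilling to disk (default is 0.2).
              .set("spark.shuffle.memoryFraction", "0.4")
            val sc = new SparkContext(conf)

            // Hypothetical input; the paths are placeholders.
            val pairs = sc.textFile("hdfs:///input/events")
              .map(line => (line.split('\t')(0), 1L))

            // More reduce partitions mean each task aggregates fewer keys,
            // shrinking the in-memory map the stack trace shows overflowing.
            val counts = pairs.reduceByKey(_ + _, 400)

            counts.saveAsTextFile("hdfs:///output/counts")
            sc.stop()
          }
        }

    Of the three knobs, raising the partition count is usually the cheapest to try first, since it needs no cluster reconfiguration.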
