org.apache.spark.SparkException: Job aborted due to stage failure: Task 9 in stage 2.0 failed 1 times, most recent failure: Lost task 9.0 in stage 2.0 (TID 201, localhost): java.lang.OutOfMemoryError: GC overhead limit exceeded

GitHub | car2008 | 8 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. GitHub comment 173#243313005
     GitHub | 8 months ago | car2008
  2. GitHub comment 190#243338857
     GitHub | 8 months ago | car2008

Root Cause Analysis

    1. org.apache.spark.SparkException

      Job aborted due to stage failure: Task 9 in stage 2.0 failed 1 times, most recent failure: Lost task 9.0 in stage 2.0 (TID 201, localhost): java.lang.OutOfMemoryError: GC overhead limit exceeded

      at java.lang.Long.valueOf()
    2. Java RT
      Long.valueOf
      1. java.lang.Long.valueOf(Long.java:840)
      1 frame
    3. Kryo
      Kryo.readClassAndObject
      1. com.esotericsoftware.kryo.serializers.DefaultSerializers$LongSerializer.read(DefaultSerializers.java:113)
      2. com.esotericsoftware.kryo.serializers.DefaultSerializers$LongSerializer.read(DefaultSerializers.java:103)
      3. com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
      3 frames
    4. chill-java
      Tuple2Serializer.read
      1. com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:41)
      2. com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:33)
      2 frames
    5. Kryo
      Kryo.readClassAndObject
      1. com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
      2. com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:338)
      3. com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:293)
      4. com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:648)
      5. com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.read(FieldSerializer.java:605)
      6. com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:221)
      7. com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
      7 frames
    6. Spark
      ExternalAppendOnlyMap$DiskMapIterator.hasNext
      1. org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:228)
      2. org.apache.spark.serializer.DeserializationStream.readValue(Serializer.scala:171)
      3. org.apache.spark.util.collection.ExternalAppendOnlyMap$DiskMapIterator.readNextItem(ExternalAppendOnlyMap.scala:478)
      4. org.apache.spark.util.collection.ExternalAppendOnlyMap$DiskMapIterator.hasNext(ExternalAppendOnlyMap.scala:498)
      4 frames
    7. Scala
      Iterator$$anon$1.hasNext
      1. scala.collection.Iterator$$anon$1.hasNext(Iterator.scala:847)
      1 frame
    8. Spark
      ExternalAppendOnlyMap$ExternalIterator$$anonfun$5.apply
      1. org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.org$apache$spark$util$collection$ExternalAppendOnlyMap$ExternalIterator$$readNextHashCode(ExternalAppendOnlyMap.scala:295)
      2. org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator$$anonfun$5.apply(ExternalAppendOnlyMap.scala:279)
      3. org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator$$anonfun$5.apply(ExternalAppendOnlyMap.scala:277)
      3 frames
    9. Scala
      List.foreach
      1. scala.collection.immutable.List.foreach(List.scala:318)
      1 frame
    10. Spark
      RDD.iterator
      1. org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.<init>(ExternalAppendOnlyMap.scala:277)
      2. org.apache.spark.util.collection.ExternalAppendOnlyMap.iterator(ExternalAppendOnlyMap.scala:253)
      3. org.apache.spark.Aggregator.combineValuesByKey(Aggregator.scala:47)
      4. org.apache.spark.shuffle.BlockStoreShuffleReader.read(BlockStoreShuffleReader.scala:89)
      5. org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:98)
      6. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      7. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      8. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      9. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      10. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      10 frames
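The grouped frames above show where the heap was exhausted: Kryo is deserializing spilled shuffle data from disk (ExternalAppendOnlyMap$DiskMapIterator.readNextItem) while Aggregator.combineValuesByKey merges a ShuffledRDD partition, and the JVM gives up with "GC overhead limit exceeded". The usual remedies are to give executors more heap and to split the aggregation across more, smaller partitions so each task's in-memory map stays small. The sketch below illustrates that kind of tuning for a simple key/value aggregation; the application name, input/output paths, memory size, and partition count are placeholders chosen for illustration, not values from the original GitHub issue.

    import org.apache.spark.{SparkConf, SparkContext}

    // Illustrative tuning only; the sizes, paths and counts below are
    // assumptions, not taken from the original report.
    val conf = new SparkConf()
      .setAppName("gc-overhead-mitigation-sketch")
      // More heap per executor eases GC pressure while spilled shuffle data
      // is read back (the Kryo / DiskMapIterator frames above).
      .set("spark.executor.memory", "8g")
      // The trace shows Kryo is already the serializer; keep it explicit.
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // More partitions mean smaller per-task maps in combineValuesByKey.
      .set("spark.default.parallelism", "400")

    val sc = new SparkContext(conf)

    // A plain key/value aggregation with an explicit, higher partition count
    // spreads the combine step that overflowed the ExternalAppendOnlyMap.
    val counts = sc.textFile("hdfs:///path/to/input")    // placeholder path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1L))
      .reduceByKey(_ + _, 400)

    counts.saveAsTextFile("hdfs:///path/to/output")      // placeholder path
    sc.stop()

If executor memory cannot be raised, increasing the number of reduce partitions (the second argument to reduceByKey, or spark.default.parallelism) is usually the cheapest change, since each task's ExternalAppendOnlyMap then has to merge fewer records at once.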