org.apache.spark.SparkException: Job aborted due to stage failure: Task 9 in stage 2.0 failed 1 times, most recent failure: Lost task 9.0 in stage 2.0 (TID 201, localhost): java.lang.OutOfMemoryError: GC overhead limit exceeded


Solutions on the web (2)

via GitHub by car2008, 1 year ago:
Job aborted due to stage failure: Task 9 in stage 2.0 failed 1 times, most recent failure: Lost task 9.0 in stage 2.0 (TID 201, localhost): java.lang.OutOfMemoryError: GC overhead limit exceeded

via GitHub by car2008, 1 year ago:
Job aborted due to stage failure: Task 9 in stage 2.0 failed 1 times, most recent failure: Lost task 9.0 in stage 2.0 (TID 201, localhost): java.lang.OutOfMemoryError: GC overhead limit exceeded

Stack trace

org.apache.spark.SparkException: Job aborted due to stage failure: Task 9 in stage 2.0 failed 1 times, most recent failure: Lost task 9.0 in stage 2.0 (TID 201, localhost): java.lang.OutOfMemoryError: GC overhead limit exceeded
	at java.lang.Long.valueOf(Long.java:840)
	at com.esotericsoftware.kryo.serializers.DefaultSerializers$LongSerializer.read(DefaultSerializers.java:113)
	at com.esotericsoftware.kryo.serializers.DefaultSerializers$LongSerializer.read(DefaultSerializers.java:103)
	at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
	at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:41)
	at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:33)
	at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
	at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:338)
	at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:293)
	at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:648)
	at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.read(FieldSerializer.java:605)
	at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:221)
	at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
	at org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:228)
	at org.apache.spark.serializer.DeserializationStream.readValue(Serializer.scala:171)
	at org.apache.spark.util.collection.ExternalAppendOnlyMap$DiskMapIterator.readNextItem(ExternalAppendOnlyMap.scala:478)
	at org.apache.spark.util.collection.ExternalAppendOnlyMap$DiskMapIterator.hasNext(ExternalAppendOnlyMap.scala:498)
	at scala.collection.Iterator$$anon$1.hasNext(Iterator.scala:847)
	at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.org$apache$spark$util$collection$ExternalAppendOnlyMap$ExternalIterator$$readNextHashCode(ExternalAppendOnlyMap.scala:295)
	at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator$$anonfun$5.apply(ExternalAppendOnlyMap.scala:279)
	at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator$$anonfun$5.apply(ExternalAppendOnlyMap.scala:277)
	at scala.collection.immutable.List.foreach(List.scala:318)
	at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.<init>(ExternalAppendOnlyMap.scala:277)
	at org.apache.spark.util.collection.ExternalAppendOnlyMap.iterator(ExternalAppendOnlyMap.scala:253)
	at org.apache.spark.Aggregator.combineValuesByKey(Aggregator.scala:47)
	at org.apache.spark.shuffle.BlockStoreShuffleReader.read(BlockStoreShuffleReader.scala:89)
	at org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:98)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
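
Tip

A short note on what the trace shows, in case it helps the next reader: the OutOfMemoryError is thrown on the shuffle-read side (BlockStoreShuffleReader -> Aggregator.combineValuesByKey -> ExternalAppendOnlyMap) while Kryo deserializes spilled (key, value) tuples, and the boxed java.lang.Long allocations in Long.valueOf push the executor heap past the GC overhead limit. Commonly tried mitigations are giving executors more heap, raising the number of shuffle partitions so each task reads a smaller block, and keeping shuffled values compact. The sketch below is illustrative only; the memory size, partition count, and HDFS paths are assumptions, not values from this report.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object GcOverheadTip {
  def main(args: Array[String]): Unit = {
    // All sizes and counts below are illustrative assumptions, not values
    // taken from the failing job; tune them for the actual cluster.
    val conf = new SparkConf()
      .setAppName("gc-overhead-tip")
      // More executor heap gives the shuffle-read aggregation headroom.
      .set("spark.executor.memory", "4g")
      // Keep Kryo, which the job in the trace already uses.
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // Higher default parallelism means smaller shuffle blocks per task.
      .set("spark.default.parallelism", "400")

    val sc = new SparkContext(conf)

    // Hypothetical word-count-style aggregation that shuffles (String, Long)
    // pairs, i.e. the same Tuple2[_, Long] shape seen in the stack trace.
    val counts = sc.textFile("hdfs:///path/to/input")      // assumed path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1L))
      // Passing an explicit partition count to the shuffle operator also
      // shrinks the per-task ExternalAppendOnlyMap that was spilling here.
      .reduceByKey(_ + _, 400)

    counts.saveAsTextFile("hdfs:///path/to/output")        // assumed path
    sc.stop()
  }
}
```

The same settings can be supplied at submit time instead, e.g. spark-submit --conf spark.executor.memory=4g --conf spark.default.parallelism=400 ..., which avoids hard-coding cluster-specific values in the application.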
