java.lang.OutOfMemoryError: GC overhead limit exceeded

Samebug tips

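This error means the JVM spent more than 98% of its time in garbage collection while reclaiming less than 2% of the heap. The fix is either to give the process more heap (for a plain JVM, java -Xmx4g ...) or to hold less data in memory at once. For a Spark job like the one in the trace below, heap is configured per role. The following is a minimal sketch, not the reporter's code; the app name and the 8g sizes are placeholders to tune for your data and cluster:

import org.apache.spark.{SparkConf, SparkContext}

object MemoryConfig {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("bam-reads")             // hypothetical app name
      .setMaster("local[*]")               // remove when submitting to a cluster
      .set("spark.executor.memory", "8g")  // heap available to each executor
    // spark.driver.memory has no effect once the driver JVM is already
    // running, so set it at submit time: spark-submit --driver-memory 8g ...
    val sc = new SparkContext(conf)
    sc.stop()
  }
}

As a last resort, -XX:-UseGCOverheadLimit disables the check, but the process then usually fails later with java.lang.OutOfMemoryError: Java heap space instead.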

Solutions on the web

via GitHub by car2008, 1 year ago: GC overhead limit exceeded
via GitHub by Akado2009, 7 months ago: GC overhead limit exceeded, with the stack trace below:
java.lang.OutOfMemoryError: GC overhead limit exceeded
at htsjdk.samtools.BinaryTagCodec.readTags(BinaryTagCodec.java:282)
at htsjdk.samtools.BAMRecord.decodeAttributes(BAMRecord.java:308)
at htsjdk.samtools.BAMRecord.getAttribute(BAMRecord.java:288)
at org.hammerlab.guacamole.reads.Read$.apply(Read.scala:77)
at org.hammerlab.guacamole.readsets.ReadSets$$anonfun$8.apply(ReadSets.scala:276)
at org.hammerlab.guacamole.readsets.ReadSets$$anonfun$8.apply(ReadSets.scala:266)
at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
at scala.collection.Iterator$class.toStream(Iterator.scala:1143)
at scala.collection.AbstractIterator.toStream(Iterator.scala:1157)
at scala.collection.Iterator$$anonfun$toStream$1.apply(Iterator.scala:1143)
at scala.collection.Iterator$$anonfun$toStream$1.apply(Iterator.scala:1143)
at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1085)
at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1077)
at scala.collection.immutable.Stream.length(Stream.scala:284)
at scala.collection.SeqLike$class.size(SeqLike.scala:106)
at scala.collection.AbstractSeq.size(Seq.scala:40)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:248)
at org.apache.spark.rdd.ParallelCollectionRDD$.slice(ParallelCollectionRDD.scala:154)
at org.apache.spark.rdd.ParallelCollectionRDD.getPartitions(ParallelCollectionRDD.scala:97)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
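
The shape of this trace points at where the memory goes: BAM records are pulled through an Iterator, converted to a Stream (Iterator.toStream), and handed to sc.parallelize, whose ParallelCollectionRDD.slice calls .size and .toArray on the sequence, so every record is materialized in the driver heap at once. A minimal sketch of that pattern, using hypothetical stand-in data rather than the guacamole code itself:

import org.apache.spark.{SparkConf, SparkContext}

object ParallelizePitfall {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("parallelize-pitfall").setMaster("local[*]"))

    // Stand-in for the BAM record iterator in the trace above.
    val records: Iterator[String] = Iterator.fill(10000000)("read-record")

    // ParallelCollectionRDD.slice calls .size and .toArray on the Seq it is
    // given (the Stream.length and TraversableOnce.toArray frames above),
    // forcing the whole stream into driver memory before any partitioning.
    val rdd = sc.parallelize(records.toStream)

    println(rdd.count())
    sc.stop()
  }
}

If the input does not fit on the driver, load it through a distributed input format (for example, a Hadoop InputFormat for BAM via sc.newAPIHadoopFile) instead of parallelizing an in-memory collection, or raise driver memory as in the tip above.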

Users with the same issue

No other users have reported this exception yet.
