Searched Google with the first line of a Java stack trace?

We can recommend more relevant solutions and speed up debugging when you paste your entire stack trace with the exception message. Try a sample exception.

Recommended solutions based on your search

Solutions on the web

java.lang.OutOfMemoryError: GC overhead limit exceeded
    at org.apache.hadoop.hdfs.util.ByteArrayManager$NewByteArrayWithoutLimit.newByteArray(ByteArrayManager.java:308)
    at org.apache.hadoop.hdfs.DFSOutputStream.createPacket(DFSOutputStream.java:197)
    at org.apache.hadoop.hdfs.DFSOutputStream.writeChunkImpl(DFSOutputStream.java:1835)
    at org.apache.hadoop.hdfs.DFSOutputStream.writeChunk(DFSOutputStream.java:1813)
    at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:206)
    at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:163)
    at org.apache.hadoop.fs.FSOutputSummer.flush(FSOutputSummer.java:182)
    at java.io.FilterOutputStream.flush(FilterOutputStream.java:140)
    at java.io.DataOutputStream.flush(DataOutputStream.java:123)
    at org.codehaus.jackson.impl.Utf8Generator.flush(Utf8Generator.java:1091)
    at org.apache.avro.io.JsonEncoder.flush(JsonEncoder.java:67)
    at org.apache.hadoop.mapreduce.jobhistory.EventWriter.write(EventWriter.java:67)
    at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler$MetaInfo.writeEvent(JobHistoryEventHandler.java:1248)
    at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler.handleEvent(JobHistoryEventHandler.java:565)
    at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler$1.run(JobHistoryEventHandler.java:318)
    at java.lang.Thread.run(Thread.java:745)
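The frames above run inside the MapReduce Application Master's JobHistoryEventHandler thread while it flushes job-history events to HDFS, so it is the AM heap, not a map or reduce task heap, that is exhausted. "GC overhead limit exceeded" means the JVM spent roughly more than 98% of its time in garbage collection while recovering less than about 2% of the heap. A common mitigation is to give the Application Master more memory, sketched below; the jar name, driver class, and paths are placeholders, the sizes are illustrative, and passing `-D` generic options requires the driver to use Hadoop's ToolRunner:

```shell
# Raise the MapReduce Application Master container size and its JVM heap
# so the JobHistoryEventHandler has room to buffer and flush events.
hadoop jar my-job.jar com.example.MyDriver \
    -D yarn.app.mapreduce.am.resource.mb=3072 \
    -D yarn.app.mapreduce.am.command-opts=-Xmx2560m \
    /input /output
```

The same properties can instead be set cluster-wide in mapred-site.xml; keep the `-Xmx` value comfortably below the container size so the JVM's non-heap overhead still fits.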