java.lang.OutOfMemoryError: Java heap space


Recommended solutions based on your search

Samebug tips

via oracle.com by Unknown author

An easy way to address an OutOfMemoryError in Java is to increase the maximum heap size with the JVM option -Xmx512M; when the error is caused by an undersized heap rather than a memory leak, raising the limit resolves it immediately.
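One quick way to confirm that the flag actually took effect is to print the maximum heap the running JVM reports. The class below is a minimal sketch (the name HeapCheck is ours, not part of the tip):

public class HeapCheck {
    public static void main(String[] args) {
        // Runtime.maxMemory() reports the largest heap the JVM will attempt to use,
        // which reflects -Xmx (less a small amount of survivor-space accounting).
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap: " + maxMb + " MB");
    }
}

Compile and run with java -Xmx512M HeapCheck; the printed value should be close to 512.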

via Stack Overflow by Eugene Yokota

In Eclipse: go to Run --> Run Configurations, select the project under Maven Build, open the "JRE" tab, and enter -Xmx1024m in the VM arguments field.

This should increase the heap available to that run configuration's builds; -Xmx1024m allows a maximum of 1 GB.
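To see how the limit behaves, the sketch below (our illustration, not part of the answer) allocates 1 MB blocks until the heap is exhausted. Run it with a deliberately small limit such as -Xmx64m and it fails with the same "Java heap space" message shown at the top of this page; a larger -Xmx lets the same code allocate proportionally more before failing.

import java.util.ArrayList;
import java.util.List;

public class HeapExhaustDemo {
    public static void main(String[] args) {
        List<byte[]> blocks = new ArrayList<>();
        int allocatedMb = 0;
        try {
            while (true) {
                blocks.add(new byte[1024 * 1024]); // hold 1 MB per iteration so nothing can be collected
                allocatedMb++;
            }
        } catch (OutOfMemoryError e) {
            blocks.clear(); // release the blocks so the summary below can still allocate
            System.err.println(e); // java.lang.OutOfMemoryError: Java heap space
            System.err.println("Allocated roughly " + allocatedMb + " MB before the heap ran out");
        }
    }
}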

Solutions on the web

via nabble.com by Unknown author, 2 years ago
via GitHub by Timoux, 1 year ago
via GitHub by ahmed-reda2100, 11 months ago
java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:3236)
at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:118)
at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:153)
at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1853)
at java.io.ObjectOutputStream.write(ObjectOutputStream.java:709)
at org.apache.spark.util.Utils$.writeByteBuffer(Utils.scala:183)
at org.apache.spark.scheduler.DirectTaskResult$$anonfun$writeExternal$1.apply$mcV$sp(TaskResult.scala:52)
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1204)
at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1459)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1430)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:44)
at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:101)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
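Reading the frames above: the error is raised while a Spark DirectTaskResult is being serialized into a ByteArrayOutputStream, i.e. while an unusually large task result is being prepared for shipment back to the driver. A common way to hit this is collecting large RDD contents; a common way around it is to aggregate on the cluster instead of collecting, and to give executors more heap at submit time. The snippet below is a sketch under assumed values, not taken from the linked reports:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class TaskResultSketch {
    public static void main(String[] args) {
        // "4g" and the input path are illustrative values, not from the reports above.
        SparkConf conf = new SparkConf()
                .setAppName("task-result-sketch")
                .set("spark.executor.memory", "4g");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<String> lines = sc.textFile("hdfs:///data/big-input.txt");

        // Prefer cluster-side aggregation: count() returns one long per task,
        // whereas collect() would serialize every partition's contents into
        // task results like the one that overflowed the heap above.
        long n = lines.count();
        System.out.println("lines = " + n);

        sc.stop();
    }
}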

Users with the same issue

Reported by 5 users (9 occurrences) over the past 3 months, most recently 5 days ago.
