Recommended solutions based on your search

Samebug tips

  1. via oracle.com by Unknown author

    An easy way to solve an OutOfMemoryError in Java is to increase the maximum heap size using the JVM option -Xmx512M; this often resolves the error right away (see the verification sketch after this list).

  2. via Stack Overflow by Eugene Yokota

    In Eclipse: go to Run --> Run Configurations --> select the project under Maven Build --> open the "JRE" tab --> enter -Xmx1024m in the VM arguments.

    This should increase the memory heap for all builds of the project; the value above is 1 GB.
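
To confirm that an -Xmx setting actually took effect, you can ask the running JVM for the limit it picked up. A minimal sketch (the class name HeapCheck is ours; launch it with whatever flag you set):

    // Minimal sketch: print the maximum heap the running JVM will try to use.
    // Launch with e.g. `java -Xmx1024m HeapCheck` and compare the printed value;
    // it may read slightly below the flag value depending on the collector.
    public class HeapCheck {
        public static void main(String[] args) {
            long maxBytes = Runtime.getRuntime().maxMemory();
            System.out.printf("Max heap: %d MB%n", maxBytes / (1024 * 1024));
        }
    }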

Solutions on the web

via Stack Overflow by Jack, 1 year ago
via JDK Bug System by Webbug Group, 1 year ago
via ne.jp by Unknown author, 1 year ago
via sourcecodebig.com by Unknown author, 1 year ago
Requested array size exceeds VM limit
java.lang.OutOfMemoryError: Requested array size exceeds VM limit
    at java.util.Arrays.copyOf(Arrays.java:2271)
    at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:113)
    at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
    at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:140)
    at org.apache.spark.util.ByteBufferOutputStream.write(ByteBufferOutputStream.scala:41)
    at java.io.ObjectOutputStream$BlockDataOutputStream.drain(ObjectOutputStream.java:1876)
    at java.io.ObjectOutputStream$BlockDataOutputStream.setBlockDataMode(ObjectOutputStream.java:1785)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1188)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
    at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:43)
    at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:295)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:288)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:108)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2056)
    at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:366)
    at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:365)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
    at org.apache.spark.rdd.RDD.map(RDD.scala:365)
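
This variant of OutOfMemoryError differs from plain heap exhaustion: the JVM rejects the allocation because the requested array length exceeds what the VM can address (close to Integer.MAX_VALUE on HotSpot), so raising -Xmx alone may not make it go away. A minimal sketch of the mechanism (HotSpot behavior; the class name is ours):

    // Minimal sketch: on HotSpot, array lengths near Integer.MAX_VALUE are
    // rejected outright, regardless of how much heap is configured.
    // ByteArrayOutputStream doubles its buffer as it grows, which is how the
    // serialization in the trace above can reach this limit.
    public class ArrayLimitDemo {
        public static void main(String[] args) {
            // HotSpot caps array length around Integer.MAX_VALUE - 2, so this
            // throws java.lang.OutOfMemoryError: Requested array size exceeds VM limit.
            byte[] tooBig = new byte[Integer.MAX_VALUE];
            System.out.println(tooBig.length); // never reached
        }
    }

In the trace itself, the error fires while Spark's ClosureCleaner serializes the closure passed to RDD.map, which usually means the closure captures a very large object. One common mitigation, sketched below assuming the captured data is a large read-only lookup table (the names BroadcastSketch, enrich, and hugeTable are hypothetical), is to ship such data as a broadcast variable so it is not embedded in the serialized closure:

    import java.util.Map;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.broadcast.Broadcast;

    // Sketch: broadcast a large read-only table instead of capturing it in the
    // lambda, so the task closure only carries a small Broadcast handle.
    public class BroadcastSketch {
        static JavaRDD<String> enrich(JavaSparkContext sc,
                                      JavaRDD<String> lines,
                                      Map<String, String> hugeTable) {
            Broadcast<Map<String, String>> table = sc.broadcast(hugeTable);
            return lines.map(line -> table.value().getOrDefault(line, line));
        }
    }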