java.lang.OutOfMemoryError: Java heap space

Stack Overflow | jgp | 7 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. Apache Spark out of Java heap space: where does it happen?

    Stack Overflow | 7 months ago | jgp
    java.lang.OutOfMemoryError: Java heap space

  2. java.lang.OutOfMemoryError for simple rdd.count() operation

    Stack Overflow | 2 years ago | santon
    java.lang.OutOfMemoryError: Java heap space

  3. wholeTextFiles OutOfMemoryError: java heap space

    Stack Overflow | 2 years ago | shen
    java.lang.OutOfMemoryError: Java heap space
  4. Spark cluster computing framework

    gmane.org | 11 months ago
    java.lang.OutOfMemoryError: Java heap space


    Root Cause Analysis

    1. java.lang.OutOfMemoryError

      Java heap space

      at java.nio.HeapCharBuffer.<init>()
    2. Java RT
      CharsetDecoder.decode
      1. java.nio.HeapCharBuffer.<init>(HeapCharBuffer.java:57)
      2. java.nio.CharBuffer.allocate(CharBuffer.java:335)
      3. java.nio.charset.CharsetDecoder.decode(CharsetDecoder.java:810)
      3 frames
    3. Hadoop
      Text.toString
      1. org.apache.hadoop.io.Text.decode(Text.java:412)
      2. org.apache.hadoop.io.Text.decode(Text.java:389)
      3. org.apache.hadoop.io.Text.toString(Text.java:280)
      3 frames
    4. org.apache.spark
      JSONRelation$$anonfun$org$apache$spark$sql$execution$datasources$json$JSONRelation$$createBaseRdd$1.apply
      1. org.apache.spark.sql.execution.datasources.json.JSONRelation$$anonfun$org$apache$spark$sql$execution$datasources$json$JSONRelation$$createBaseRdd$1.apply(JSONRelation.scala:105)
      2. org.apache.spark.sql.execution.datasources.json.JSONRelation$$anonfun$org$apache$spark$sql$execution$datasources$json$JSONRelation$$createBaseRdd$1.apply(JSONRelation.scala:105)
      2 frames
    5. Scala
      AbstractIterator.aggregate
      1. scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
      2. scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
      3. scala.collection.Iterator$class.foreach(Iterator.scala:727)
      4. scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
      5. scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:144)
      6. scala.collection.AbstractIterator.foldLeft(Iterator.scala:1157)
      7. scala.collection.TraversableOnce$class.aggregate(TraversableOnce.scala:201)
      8. scala.collection.AbstractIterator.aggregate(Iterator.scala:1157)
      8 frames
    6. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$23.apply(RDD.scala:1135)
      2. org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$23.apply(RDD.scala:1135)
      3. org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$24.apply(RDD.scala:1136)
      4. org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$24.apply(RDD.scala:1136)
      5. org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
      6. org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
      7. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      8. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      9. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      10. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
      11. org.apache.spark.scheduler.Task.run(Task.scala:89)
      12. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
      12 frames
    7. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
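
    The frames above show the executor exhausting the heap while an org.apache.hadoop.io.Text record is decoded to a String inside the JSON data source's createBaseRdd, under an RDD.treeAggregate; in Spark 1.6 that aggregate is typically the JSON schema-inference pass over the raw text. A minimal sketch of one common mitigation, assuming a Spark 1.6-style SQLContext and a made-up schema and input path (neither comes from the original question), is to declare the schema explicitly so the inference scan is skipped:

    // Sketch only: the object name, schema fields, and input path are
    // assumptions for illustration, not taken from the original question.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext
    import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

    object ReadJsonWithExplicitSchema {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("read-json-with-explicit-schema")
        val sc = new SparkContext(conf)
        val sqlContext = new SQLContext(sc) // Spark 1.x API, matching the 1.6-era frames above

        // Declaring the schema up front lets Spark skip the inference pass
        // (the treeAggregate over the raw text) that is failing in the trace.
        val schema = StructType(Seq(
          StructField("id", LongType, nullable = true),
          StructField("name", StringType, nullable = true)
        ))

        val df = sqlContext.read.schema(schema).json("hdfs:///data/events.json") // hypothetical path
        println(df.count())

        sc.stop()
      }
    }

    If individual records are very large, the executor heap may still need to grow (for example --executor-memory on spark-submit, or spark.executor.memory in the configuration); the explicit schema only removes the extra inference scan.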