org.apache.spark.SparkException

Job aborted due to stage failure: Task 13 in stage 8821.0 failed 1 times, most recent failure: Lost task 13.0 in stage 8821.0 (TID 83708, localhost, executor driver): java.lang.OutOfMemoryError: Java heap space


Solutions on the web (2)

  • via Stack Overflow by Masha
  • via Stack Overflow by Lokesh Kumar P

Stack trace

    org.apache.spark.SparkException: Job aborted due to stage failure: Task 13 in stage 8821.0 failed 1 times, most recent failure: Lost task 13.0 in stage 8821.0 (TID 83708, localhost, executor driver): java.lang.OutOfMemoryError: Java heap space
        at java.nio.HeapByteBuffer.<init>(HeapByteBuffer.java:57)
        at java.nio.ByteBuffer.allocate(ByteBuffer.java:335)
        at org.apache.spark.sql.execution.columnar.NullableColumnBuilder$class.build(NullableColumnBuilder.scala:74)
        at org.apache.spark.sql.execution.columnar.ComplexColumnBuilder.build(ColumnBuilder.scala:91)
        at org.apache.spark.sql.execution.columnar.InMemoryRelation$$anonfun$1$$anon$1$$anonfun$next$2.apply(InMemoryRelation.scala:134)
        at org.apache.spark.sql.execution.columnar.InMemoryRelation$$anonfun$1$$anon$1$$anonfun$next$2.apply(InMemoryRelation.scala:133)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
        at org.apache.spark.sql.execution.columnar.InMemoryRelation$$anonfun$1$$anon$1.next(InMemoryRelation.scala:133)
        at org.apache.spark.sql.execution.columnar.InMemoryRelation$$anonfun$1$$anon$1.next(InMemoryRelation.scala:97)
        at org.apache.spark.storage.memory.PartiallyUnrolledIterator.next(MemoryStore.scala:706)
        at org.apache.spark.serializer.SerializationStream.writeAll(Serializer.scala:140)
        at org.apache.spark.serializer.SerializerManager.dataSerializeStream(SerializerManager.scala:170)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1$$anonfun$apply$5.apply(BlockManager.scala:964)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1$$anonfun$apply$5.apply(BlockManager.scala:963)
        at org.apache.spark.storage.DiskStore.put(DiskStore.scala:57)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:963)
        at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:947)
        at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:887)
        at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:947)
        at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:693)
        at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
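
What the trace shows: the task dies while Spark materializes a cached Dataset. InMemoryRelation is building columnar batches (the ByteBuffer.allocate inside NullableColumnBuilder.build) and streaming them through the BlockManager to disk when the heap runs out. The ComplexColumnBuilder frame suggests a complex-typed column (array/map/struct), whose values can make a single batch very large, and "localhost, executor driver" means local mode, where the driver heap is the only heap. Below is a minimal sketch of the usual mitigations, assuming a local-mode job that caches a DataFrame read from Parquet; the path, partition count, and batch size are illustrative, not taken from the original report:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.storage.StorageLevel

    // Hypothetical local-mode session. In local mode the executor runs inside
    // the driver JVM, so heap size is governed by the driver setting; it must
    // be fixed before the JVM starts, e.g. spark-submit --driver-memory 8g.
    val spark = SparkSession.builder()
      .master("local[*]")
      // Fewer rows per columnar batch means smaller ByteBuffer allocations in
      // NullableColumnBuilder.build; the default is 10000 rows per batch.
      .config("spark.sql.inMemoryColumnarStorage.batchSize", "1000")
      .getOrCreate()

    // Illustrative input path.
    val df = spark.read.parquet("/data/events")

    // More, smaller partitions shrink the amount of data each task has to
    // turn into cached batches at once.
    val cached = df.repartition(200).persist(StorageLevel.MEMORY_AND_DISK)
    cached.count() // forces materialization; this is where the OOM surfaced

Of these, shrinking spark.sql.inMemoryColumnarStorage.batchSize tends to help most when a wide complex column is involved, since even a disk-bound persist still builds each columnar batch in memory first; raising --driver-memory is the fallback when the data simply needs a bigger heap.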

Users with the same issue

    tyson925