Solutions on the web

via Stack Overflow by ssyue, 1 year ago
java.lang.OutOfMemoryError: Direct buffer memory
org.apache.spark.shuffle.FetchFailedException: Direct buffer memory
	at org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:323)
	at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:300)
	at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:51)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
	at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:32)
	at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
	at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:167)
	at org.apache.spark.sql.execution.Sort$$anonfun$1.apply(Sort.scala:90)
	at org.apache.spark.sql.execution.Sort$$anonfun$1.apply(Sort.scala:64)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:759)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$21.apply(RDD.scala:759)
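
The frames above show a reduce task failing inside ShuffleBlockFetcherIterator while it pulls remote shuffle blocks: Spark's Netty transport stages those blocks in direct (off-heap) NIO buffers, and once the executor JVM exhausts its direct buffer limit the fetch path surfaces the OutOfMemoryError as a FetchFailedException. Below is a minimal Scala sketch of configuration knobs commonly tuned for this failure; the property names are standard Spark settings, but the concrete values (2g, 24m) and the object/app name are illustrative assumptions, not something taken from this question.

import org.apache.spark.SparkConf

// A sketch, not a verified fix: settings commonly adjusted when shuffle
// fetches die with "OutOfMemoryError: Direct buffer memory".
object DirectBufferTuningSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("direct-buffer-tuning-sketch") // hypothetical app name
      // Raise the executor JVM's cap on direct (off-heap) NIO buffers,
      // which Netty uses to hold shuffle blocks fetched from other nodes.
      // The 2g figure is an assumption; size it to the executor's headroom.
      .set("spark.executor.extraJavaOptions", "-XX:MaxDirectMemorySize=2g")
      // Keep less shuffle data in flight per reduce task (default 48m),
      // so fewer direct buffers are alive at the same time.
      .set("spark.reducer.maxSizeInFlight", "24m")
      // Ask Spark's Netty transport to prefer heap buffers over direct
      // buffers for shuffle I/O, trading some copying for stability.
      .set("spark.shuffle.io.preferDirectBufs", "false")

    // Print the resulting configuration so the sketch runs on its own.
    conf.getAll.sorted.foreach { case (k, v) => println(s"$k=$v") }
  }
}

The same properties can also be supplied at launch time with spark-submit --conf key=value, which avoids rebuilding the application while you experiment with values.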