via GitHub by liangzhaozeng, 1 year ago
Lost task 18.0 in stage 0.0 (TID 1, ulnode373.pv05.siri.apple.com): java.io.IOException: com.esotericsoftware.kryo.KryoException: Error during Java deserialization.
java.lang.ClassNotFoundException: com.databricks.spark.avro.DefaultSource$SerializableConfiguration
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:278)
    at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:625)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at com.esotericsoftware.kryo.serializers.JavaSerializer.read(JavaSerializer.java:63)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
    at org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:229)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$10.apply(TorrentBroadcast.scala:254)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1287)
    at org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:255)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1.apply(TorrentBroadcast.scala:189)
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1253)
    at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:174)
    at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:65)
    at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:65)
    at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:89)
    at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
    at com.databricks.spark.avro.DefaultSource$$anonfun$buildReader$1.apply(DefaultSource.scala:146)
    at com.databricks.spark.avro.DefaultSource$$anonfun$buildReader$1.apply(DefaultSource.scala:145)
    at org.apache.spark.sql.execution.datasources.FileFormat$$anon$1.apply(fileSourceInterfaces.scala:279)
    at org.apache.spark.sql.execution.datasources.FileFormat$$anon$1.apply(fileSourceInterfaces.scala:263)
    at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.nextIterator(FileScanRDD.scala:116)
    at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.hasNext(FileScanRDD.scala:91)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
    at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
    at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:370)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
    at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1682)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1115)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1115)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1897)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1897)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
    at org.apache.spark.scheduler.Task.run(Task.scala:85)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
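The root cause at the bottom of this trace is a plain classpath problem: the Spark executor's class loader cannot resolve com.databricks.spark.avro.DefaultSource$SerializableConfiguration while deserializing a broadcast, which typically means the spark-avro jar is on the driver but was never shipped to the executors (e.g. via spark-submit's --jars or --packages options). The mechanism can be sketched in plain Java — the class name below is taken from the trace and is expected to be absent from an ordinary JVM's classpath:

```java
// A minimal sketch of the failure mode: Class.forName throws
// ClassNotFoundException when no jar on the classpath provides
// the named class, exactly as on the executors above.
public class ClassNotFoundDemo {

    // Returns true if the class resolves on the current classpath.
    static boolean canLoad(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // java.lang.String ships with the JDK, so it always resolves.
        System.out.println("java.lang.String -> "
                + canLoad("java.lang.String"));
        // The Avro data source class is not on this JVM's classpath,
        // mirroring the executor-side failure in the trace.
        System.out.println("spark-avro SerializableConfiguration -> "
                + canLoad("com.databricks.spark.avro.DefaultSource$SerializableConfiguration"));
    }
}
```

Because Kryo's JavaSerializer delegates to ObjectInputStream.resolveClass, the same lookup happens during broadcast deserialization, so the fix is to make the jar visible to every executor JVM, not to change serialization settings.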