java.io.NotSerializableException: java.lang.Object
Serialization stack:
    - object not serializable (class: java.lang.Object, value: java.lang.Object@4395342)
    - writeObject data (class: java.util.HashMap)
    - object (class java.util.HashMap, {UTF-8=java.lang.Object@4395342, WINDOWS-1252=com.mysql.jdbc.SingleByteCharsetConverter@72ffabab, US-ASCII=com.mysql.jdbc.SingleByteCharsetConverter@6f5fa288})
    - field (class: com.mysql.jdbc.ConnectionImpl, name: charsetConverterMap, type: interface java.util.Map)
    - object (class com.mysql.jdbc.JDBC4Connection, com.mysql.jdbc.JDBC4Connection@6761e52a)
    - field (class: final_file.Insert$2, name: conn, type: interface com.mysql.jdbc.Connection)
    - object (class final_file.Insert$2, final_file.Insert$2@45436e66)
    - field (class: org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1, name: f$12, type: interface org.apache.spark.api.java.function.VoidFunction)
    - object (class org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1, <function1>)
	at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
	at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
	at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:101)
	at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:301)
	at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:294)
	at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
	at org.apache.spark.SparkContext.clean(SparkContext.scala:2055)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:919)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:918)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
	at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:918)
	at org.apache.spark.api.java.JavaRDDLike$class.foreachPartition(JavaRDDLike.scala:225)
	at org.apache.spark.api.java.AbstractJavaRDDLike.foreachPartition(JavaRDDLike.scala:46)
	at final_file.Insert.main(Insert.java:59)
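Reading the serialization stack bottom-up: the anonymous function passed to `foreachPartition` (`final_file.Insert$2`) captures a field `conn` of type `com.mysql.jdbc.Connection`. That connection holds a `charsetConverterMap` whose values include a plain `java.lang.Object`, which does not implement `java.io.Serializable`, so Spark's `ClosureCleaner.ensureSerializable` check fails before the job even runs. The failure mode itself is plain Java object serialization; the following sketch (names are illustrative, not from the original program) reproduces it with a stand-in serializable holder that drags a non-serializable field into the object graph, just as `Insert$2` drags in the JDBC connection:

```java
import java.io.ByteArrayOutputStream;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

public class SerializationDemo {
    // Stand-in for Insert$2: itself Serializable, but it pulls a
    // non-serializable value (a bare Object, playing the role of the
    // JDBC Connection's charset-converter map entry) into the graph.
    static class Holder implements Serializable {
        Map<String, Object> charsetConverterMap = new HashMap<>();
        Holder() {
            charsetConverterMap.put("UTF-8", new Object()); // not Serializable
        }
    }

    public static void main(String[] args) throws Exception {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            // HashMap is serializable, but writing it walks its values,
            // and java.lang.Object is not -- same root cause as the trace.
            out.writeObject(new Holder());
            System.out.println("serialized");
        } catch (NotSerializableException e) {
            System.out.println("NotSerializableException: " + e.getMessage());
        }
    }
}
```

The usual fix in Spark code like `Insert.main` is not to make the connection serializable but to avoid capturing it at all: open the JDBC connection *inside* the `foreachPartition` body (once per partition, on the executor) and close it there, or mark the field `transient` and lazily initialize it on the worker.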