org.apache.spark.SparkException: Task not serializable


  • How to read textual lzo seq with scala spark?
    via Stack Overflow by Jas
    org.apache.spark.SparkException: Task not serializable
        at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:304)
        at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:294)
        at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
        at org.apache.spark.SparkContext.clean(SparkContext.scala:2055)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:919)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:918)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
        at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:918)
        at org.apache.spark.api.java.JavaRDDLike$class.foreachPartition(JavaRDDLike.scala:225)
        at org.apache.spark.api.java.AbstractJavaRDDLike.foreachPartition(JavaRDDLike.scala:46)
        at final_file.Insert.main(Insert.java:59)
    Caused by: java.io.NotSerializableException: java.lang.Object
    Serialization stack:
        - object not serializable (class: java.lang.Object, value: java.lang.Object@4395342)
        - writeObject data (class: java.util.HashMap)
        - object (class java.util.HashMap, {UTF-8=java.lang.Object@4395342, WINDOWS-1252=com.mysql.jdbc.SingleByteCharsetConverter@72ffabab, US-ASCII=com.mysql.jdbc.SingleByteCharsetConverter@6f5fa288})
        - field (class: com.mysql.jdbc.ConnectionImpl, name: charsetConverterMap, type: interface java.util.Map)
        - object (class com.mysql.jdbc.JDBC4Connection, com.mysql.jdbc.JDBC4Connection@6761e52a)
        - field (class: final_file.Insert$2, name: conn, type: interface com.mysql.jdbc.Connection)
        - object (class final_file.Insert$2, final_file.Insert$2@45436e66)
        - field (class: org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1, name: f$12, type: interface org.apache.spark.api.java.function.VoidFunction)
        - object (class org.apache.spark.api.java.JavaRDDLike$$anonfun$foreachPartition$1, <function1>)
        at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
        at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
        at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:101)
        at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:301)
        ... 12 more
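    The serialization stack above pinpoints the cause: the anonymous VoidFunction passed to foreachPartition (final_file.Insert$2) holds a com.mysql.jdbc.Connection in its conn field, and JDBC connections are not serializable, so Spark cannot ship the closure from the driver to the executors. The usual fix is to open the connection inside the partition function, so each executor creates its own connection and nothing non-serializable is captured. Below is a minimal sketch in Java; the JDBC URL, credentials, table name, and the save() wrapper are placeholders for illustration, not taken from the original Insert.java:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.util.Iterator;

        import org.apache.spark.api.java.JavaRDD;
        import org.apache.spark.api.java.function.VoidFunction;

        public class Insert {
            static void save(JavaRDD<String> rows) {
                rows.foreachPartition(new VoidFunction<Iterator<String>>() {
                    @Override
                    public void call(Iterator<String> partition) throws Exception {
                        // Open the connection on the executor, inside the closure,
                        // so no Connection field has to be serialized from the driver.
                        try (Connection conn = DriverManager.getConnection(
                                 "jdbc:mysql://host/db", "user", "pass");   // placeholder URL/credentials
                             PreparedStatement ps = conn.prepareStatement(
                                 "INSERT INTO t (col) VALUES (?)")) {       // placeholder table
                            while (partition.hasNext()) {
                                ps.setString(1, partition.next());
                                ps.addBatch();                              // batch the inserts per partition
                            }
                            ps.executeBatch();
                        }
                    }
                });
            }
        }

    Alternatively, keep the field but mark it transient and initialize it lazily on first use inside call(); either way, the Connection object itself must never travel through Spark's closure serializer.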
