org.apache.spark.SparkException: Task not serializable


Solutions on the web

- Stack Overflow: Task not serializable org.apache.spark.SparkException: Task not serializable
- Stack Overflow (Steven.Prgm, 11 months ago): Task not serializable org.apache.spark.SparkException: Task not serializable
- GitHub (danielcsant, 9 months ago): Task not serializable
- GitHub: Task not serializable
- GitHub (huitseeker, 1 year ago): Task not serializable
- spark-dev (Wail Alkowaileet, 3 months ago): Task not serializable
- spark-user (mickdelaney, 1 year ago): Task not serializable
- Stack Overflow: Task not serializable

Stack trace

org.apache.spark.SparkException: Task not serializable
	at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:304)
	at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:294)
	at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
	at org.apache.spark.SparkContext.clean(SparkContext.scala:2067)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:911)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:910)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
	at org.apache.spark.rdd.RDD.foreach(RDD.scala:910)
	at com.personal.practice.kafka.KafkaConsumer$$anonfun$main$2.apply(KafkaConsumer.scala:87)
	at com.personal.practice.kafka.KafkaConsumer$$anonfun$main$2.apply(KafkaConsumer.scala:84)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:661)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:661)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:50)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:50)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:50)
	at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:426)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:49)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:49)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:49)
	at scala.util.Try$.apply(Try.scala:161)
	at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:224)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:224)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:224)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:223)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.NotSerializableException: io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient
Serialization stack:
	- object not serializable (class: io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient, value: io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient@163b6ba7)
	- field (class: scala.runtime.ObjectRef, name: elem, type: class java.lang.Object)
	- object (class scala.runtime.ObjectRef, io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient@163b6ba7)
	- field (class: com.personal.practice.kafka.KafkaConsumer$$anonfun$main$2, name: schemaRegistry$1, type: class scala.runtime.ObjectRef)
	- object (class com.personal.practice.kafka.KafkaConsumer$$anonfun$main$2, )
	- field (class: com.personal.practice.kafka.KafkaConsumer$$anonfun$main$2$$anonfun$apply$1, name: $outer, type: class com.personal.practice.kafka.KafkaConsumer$$anonfun$main$2)
	- object (class com.personal.practice.kafka.KafkaConsumer$$anonfun$main$2$$anonfun$apply$1, )
	at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
	at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
	at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:101)
	at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:301)
	... 30 more
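The serialization stack above shows the closure passed to `rdd.foreach` at `KafkaConsumer.scala:87` capturing a `CachedSchemaRegistryClient` through a driver-side `var` (hence the `scala.runtime.ObjectRef` wrapper). Spark must serialize that closure to ship it to executors, and the Confluent client does not implement `java.io.Serializable`, so the job fails before any task runs. A common fix is to create the client on the executor side, once per partition inside `foreachPartition`, instead of capturing a driver-side instance. A minimal sketch, assuming a byte-array DStream and a `schemaRegistryUrl` config value (names other than those in the trace are hypothetical):

```scala
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient
import org.apache.spark.streaming.dstream.DStream

// `stream` stands in for the DStream consumed in KafkaConsumer.main;
// `schemaRegistryUrl` is an assumed configuration value.
def process(stream: DStream[Array[Byte]], schemaRegistryUrl: String): Unit = {
  stream.foreachRDD { rdd =>
    rdd.foreachPartition { records =>
      // Constructed here, on the executor, so the client is never part of
      // the closure serialized on the driver. 128 is the identity-map capacity.
      val schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryUrl, 128)
      records.foreach { record =>
        // decode and handle each record using schemaRegistry ...
      }
    }
  }
}
```

An alternative with the same effect is to hold the client in a companion object as a `@transient lazy val`, so each executor JVM lazily builds its own instance rather than receiving one over the wire. Either way, the non-serializable object must be instantiated where it is used, not where the closure is defined.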

