Solutions on the web

via Stack Overflow by hsuk, 1 year ago
cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

via Stack Overflow by Rahul Shukla, 1 year ago
cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

via Stack Overflow by renegademonkey, 5 months ago
cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

via Stack Overflow by user3709612, 3 months ago
cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

via Google Groups by rafael, 1 month ago
cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

via Stack Overflow by user1870400, 1 year ago
cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
	at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2063)
	at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1241)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1976)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1894)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1777)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1970)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1894)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1777)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:71)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
	at org.apache.spark.scheduler.Task.run(Task.scala:85)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:722)
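The trace shows the failure inside `ShuffleMapTask` while Java deserialization rebuilds an RDD on an executor. This exception is typically a classpath or version mismatch between the driver and the executors (for example, a job compiled against one Spark version running on a cluster with another, or the application jar not being shipped to the workers). A hedged sketch of the usual remedy, assuming a standalone cluster; the master URL, class name, and jar path below are placeholders, not taken from the threads above:

```shell
# Placeholder values throughout -- adjust master URL, main class, and jar path.
# The key points: compile against the same Spark version the cluster runs,
# and submit the application jar so executors load the same class definitions
# that the driver serialized.
spark-submit \
  --master spark://your-master:7077 \
  --class com.example.YourJob \
  target/your-job-assembly.jar
```

If the SparkContext is created programmatically instead of via spark-submit, the same effect can be had with `SparkConf.setJars(Seq("target/your-job-assembly.jar"))` (path again a placeholder), so that the jar reaches executor classloaders before tasks deserialize.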