java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

Stack Overflow | sourabh | 4 months ago
  1. Define spark udf by reflection on a String

    Stack Overflow | 4 months ago | sourabh
    java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
  2. Spark 1.6.0 throwing ClassCastException in cluster mode, works fine in local mode

    Stack Overflow | 10 months ago | Rahul Shukla
    java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
  3. How to fix java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List to field type scala.collection.Seq?

    Stack Overflow | 2 months ago | user1870400
    java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
  4. com.esotericsoftware.kryo.KryoException

    GitHub | 2 years ago | security08
    java.lang.ClassCastException: cannot assign instance of com.cloudera.oryx.lambda.ValueToWritableFunction to field org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.x$335 of type org.apache.spark.api.java.function.PairFunction in instance of org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1
  5. java.lang.ClassCastException using lambda expressions in spark job on remote server

    Stack Overflow | 2 years ago | SAM
    java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1

Root Cause Analysis

  1. java.lang.ClassCastException

    cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues()
  2. Java RT
    ObjectInputStream.readObject
    1. java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133)
    2. java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305)
    3. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2024)
    4. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
    5. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
    6. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    7. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
    8. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
    9. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
    10. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    11. java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
    11 frames
  3. Spark
    Executor$TaskRunner.run
    1. org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
    2. org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
    3. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    4. org.apache.spark.scheduler.Task.run(Task.scala:85)
    5. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
    5 frames
  4. Java RT
    Thread.run
    1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    3. java.lang.Thread.run(Thread.java:745)
    3 frames
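
This exception during task deserialization is most often reported when the executor resolves the RDD lineage and closure classes through a different classloader than the driver, for example when the application jar is not shipped to the cluster, or when mismatched Spark or Scala versions sit on the executor classpath. A minimal sketch of one commonly suggested mitigation, assuming a standalone cluster master URL and a hypothetical assembly jar path, is to register the application jar on the SparkConf so that executors load the same class definitions the driver serialized against:

    import org.apache.spark.{SparkConf, SparkContext}

    object ShipAppJarExample {
      def main(args: Array[String]): Unit = {
        // Hypothetical path: substitute the assembly jar actually built for this application.
        val appJar = "/path/to/my-app-assembly.jar"

        val conf = new SparkConf()
          .setAppName("classcast-repro")
          .setMaster("spark://master:7077") // assumption: standalone cluster master URL
          // Ship the application jar to every executor so the classes referenced by the
          // serialized RDD lineage resolve to the same definitions on both sides.
          .setJars(Seq(appJar))

        val sc = new SparkContext(conf)
        try {
          // A simple job that exercises task serialization and deserialization on executors.
          val sum = sc.parallelize(1 to 100).map(_ * 2).reduce(_ + _)
          println(s"sum = $sum")
        } finally {
          sc.stop()
        }
      }
    }

The same effect can be had by passing the jar to spark-submit (for example with --jars or by submitting a single assembly jar); the key point is that the driver and executors must see identical Spark and Scala class definitions.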