java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

Stack Overflow | user1870400 | 6 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. How to fix java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List to field type scala.collection.Seq?

     Stack Overflow | 6 months ago | user1870400
     java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
  2. Spark 1.6.0 throwing classcast exception in Cluster Mode works fine in local mode

     Stack Overflow | 1 year ago | Rahul Shukla
     java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
  3. Define spark udf by reflection on a String

     Stack Overflow | 8 months ago | sourabh
     java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
  4. com.esotericsoftware.kryo.KryoException

     GitHub | 2 years ago | security08
     java.lang.ClassCastException: cannot assign instance of com.cloudera.oryx.lambda.ValueToWritableFunction to field org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.x$335 of type org.apache.spark.api.java.function.PairFunction in instance of org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1
  5. java.lang.ClassCastException using lambda expressions in spark job on remote server

     Stack Overflow | 2 years ago | SAM
     java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1
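
Every report above fails the same way: Java deserialization of a task on an executor cannot match a serialized class against the class bytes on the executor's classpath, which is why these jobs typically run in local mode but break on a cluster. A remedy that comes up repeatedly in these threads is making sure the application jar actually ships to the executors. A minimal sketch, assuming a standalone cluster; the app name, master URL, and jar path are placeholders:

    import org.apache.spark.{SparkConf, SparkContext}

    object ShipAppJar {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("classpath-repro")          // placeholder name
          .setMaster("spark://master-host:7077")  // placeholder master URL
          // Ship the assembly jar so task deserialization on the executors
          // resolves the same classes the driver serialized.
          .setJars(Seq("target/scala-2.11/my-app-assembly.jar")) // placeholder path

        val sc = new SparkContext(conf)
        // The closure below is serialized on the driver and deserialized on
        // the executors; its enclosing class must be on their classpath.
        val total = sc.parallelize(1 to 100).map(_ * 2).reduce(_ + _)
        println(s"sum = $total")
        sc.stop()
      }
    }

The command-line equivalent is packaging an assembly jar and launching through spark-submit, which distributes the jar to the executors for you.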

Root Cause Analysis

  1. java.lang.ClassCastException

    cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues()
  2. Java RT
    ObjectInputStream.readObject
    1. java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133)
    2. java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305)
    3. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2006)
    4. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
    5. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    6. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    7. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
    8. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
    9. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    10. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    11. java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
    11 frames
  3. Spark
    Executor$TaskRunner.run
    1. org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
    2. org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
    3. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    4. org.apache.spark.scheduler.Task.run(Task.scala:85)
    5. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
    5 frames
  4. Java RT
    Thread.run
    1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    3. java.lang.Thread.run(Thread.java:745)
    3 frames
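
The trace places the failure entirely on the executor: Executor$TaskRunner hands the serialized task to Spark's JavaDeserializationStream, and the underlying java.io.ObjectInputStream fails while assigning the RDD's dependencies_ field. That points at a driver/executor classpath or classloader mismatch rather than at the job logic, and it also explains why the same code succeeds in local mode, where driver and executors share one JVM and one classloader. Besides setJars above, a jar can also be shipped to an already-running context; a sketch, assuming the pre-built SparkContext sc of spark-shell and a placeholder jar path:

    // Assumes an existing SparkContext `sc`, as in spark-shell.
    // Executors fetch the jar and add it to the classloader used to
    // deserialize subsequent tasks. The path is a placeholder.
    sc.addJar("target/scala-2.11/my-app-assembly.jar")

    // Tasks created after this point can resolve classes from that jar.
    val ok = sc.parallelize(Seq(1, 2, 3)).map(_ + 1).collect()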