java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

Stack Overflow | hsuk | 3 months ago
  1. Apache Spark, working on local but gives error on cluster

     Stack Overflow | 3 months ago | hsuk
     java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

  2. Spark 1.6.0 throwing classcast exception in Cluster Mode works fine in local mode

     Stack Overflow | 10 months ago | Rahul Shukla
     java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

  3. com.esotericsoftware.kryo.KryoException

     GitHub | 2 years ago | security08
     java.lang.ClassCastException: cannot assign instance of com.cloudera.oryx.lambda.ValueToWritableFunction to field org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.x$335 of type org.apache.spark.api.java.function.PairFunction in instance of org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1

  4. java.lang.ClassCastException using lambda expressions in spark job on remote server

     Stack Overflow | 2 years ago | SAM
     java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaRDD$$anonfun$filter$1

  5. GitHub comment 207#104686934

     GitHub | 2 years ago | dsdinter
     java.lang.ClassCastException: cannot assign instance of scala.None$ to field org.apache.spark.scheduler.Task.metrics of type scala.Option in instance of org.apache.spark.scheduler.ResultTask

Root Cause Analysis

  1. java.lang.ClassCastException

    cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD

    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues()
  2. Java RT
    ObjectInputStream.readObject
    1. java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2063)
    2. java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1241)
    3. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1976)
    4. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1894)
    5. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1777)
    6. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
    7. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1970)
    8. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1894)
    9. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1777)
    10. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
    11. java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
    11 frames
  3. Spark
    Executor$TaskRunner.run
    1. org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
    2. org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
    3. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:71)
    4. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
    5. org.apache.spark.scheduler.Task.run(Task.scala:85)
    6. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
    6 frames
  4. Java RT
    Thread.run
    1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    3. java.lang.Thread.run(Thread.java:722)
    3 frames
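
The trace shows the executor's Java deserializer failing to restore an RDD's `dependencies_` field: the executor is reconstructing the task with different classes (or a different Spark version) than the driver used to serialize it. A common trigger is that the application jar never reaches the executor classpath, which is why these jobs work in local mode but fail on a cluster. A minimal, hedged sketch of a submission that ships the jar to every executor (the class name, master URL, and jar paths below are hypothetical placeholders, not taken from the reports above):

```shell
# Ship the application jar (and any extra dependencies) with the job,
# so executors deserialize task closures against the same classes the
# driver serialized. Also ensure the Spark version on the cluster
# matches the version the application was built against.
spark-submit \
  --class com.example.MyApp \
  --master spark://master:7077 \
  --deploy-mode cluster \
  --jars extra-dep.jar \
  my-app.jar
```

When launching a `SparkContext` programmatically (e.g. from an IDE) instead of via `spark-submit`, the equivalent step is passing the jar list through `SparkConf.setJars`; leaving it empty is a frequent cause of exactly this `List$SerializationProxy` cast failure.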