java.lang.ClassCastException: cannot assign instance of com.databricks.spark.csv.package$CsvSchemaRDD$$anonfun$9 to field org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$22.cleanedF$2 of type scala.Function2 in instance of org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$22

GitHub | zzeekk | 4 months ago
  1. ClassCastException when writing CSV-File (see the write sketch after this list)

     GitHub | 4 months ago | zzeekk
     java.lang.ClassCastException: cannot assign instance of com.databricks.spark.csv.package$CsvSchemaRDD$$anonfun$9 to field org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$22.cleanedF$2 of type scala.Function2 in instance of org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$22

  2. Spark - java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDDLike

     Stack Overflow | 5 months ago | Manish Kumar
     java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.f$3 of type org.apache.spark.api.java.function.FlatMapFunction in instance of org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1

  3. GitHub comment 427#170101477

     GitHub | 11 months ago | kevinmeredith
     java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.scalatest.prop.Checkers$$anonfun$doCheck$3.scalaCheckArgs$2 of type scala.collection.immutable.List in instance of org.scalatest.prop.Checkers$$anonfun$doCheck$3
  4. GitHub comment 427#170433961

     GitHub | 11 months ago | sullis
     java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.scalacheck.Test$Failed.args of type scala.collection.immutable.List in instance of org.scalacheck.Test$Failed

  5. Spark 1.6.0 throwing classcast exception in Cluster Mode works fine in local mode

     Stack Overflow | 9 months ago | Rahul Shukla
     java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
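
All of the traces above follow the same pattern: Java deserialization on an executor fails to assign a deserialized closure or collection to a field of the expected type, which usually points to a driver/executor classpath or version mismatch rather than a bug in the job itself. For context, here is a minimal sketch of the spark-csv write path that produces the top trace; the object name, DataFrame contents, and output path are made up, and the spark-csv version is whatever the cluster actually ships.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    // Minimal Spark 1.x job that writes a DataFrame through spark-csv.
    // The save() call serializes an anonymous function from the
    // com.databricks.spark.csv package and ships it to the executors;
    // that deserialization step is where the ClassCastException above
    // is thrown when driver and executors see different spark-csv jars.
    object CsvWriteSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("csv-write-sketch"))
        val sqlContext = new SQLContext(sc)
        import sqlContext.implicits._

        val df = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "value")

        df.write
          .format("com.databricks.spark.csv")
          .option("header", "true")
          .save("/tmp/csv-write-sketch") // hypothetical output path
      }
    }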

Root Cause Analysis

  1. java.lang.ClassCastException

    cannot assign instance of com.databricks.spark.csv.package$CsvSchemaRDD$$anonfun$9 to field org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$22.cleanedF$2 of type scala.Function2 in instance of org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$22

    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues()
  2. Java RT
    ObjectInputStream.readObject
    1. java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2089)
    2. java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1261)
    3. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2006)
    4. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
    5. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    6. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    7. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
    8. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
    9. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    10. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    11. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
    12. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
    13. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    14. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    15. java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
    15 frames
  3. Scala
    $colon$colon.readObject
    1. scala.collection.immutable.$colon$colon.readObject(List.scala:362)
    1 frame
  4. Java RT
    ObjectInputStream.readObject
    1. sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
    2. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    3. java.lang.reflect.Method.invoke(Method.java:497)
    4. java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
    5. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1900)
    6. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    7. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    8. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
    9. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
    10. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    11. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    12. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
    13. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
    14. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
    15. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    16. java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
    16 frames
  5. Spark
    Executor$TaskRunner.run
    1. org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
    2. org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
    3. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
    4. org.apache.spark.scheduler.Task.run(Task.scala:89)
    5. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    5 frames
  6. Java RT
    Thread.run
    1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    3. java.lang.Thread.run(Thread.java:745)
    3 frames
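
The trace shows the failure inside JavaDeserializationStream while the executor is still rebuilding the ResultTask, i.e. before any user code runs. The usual remedy is to make sure exactly one spark-csv version is visible to the driver and every executor, for example by supplying the artifact once via spark-submit --packages instead of mixing it with a bundled jar. As a rough diagnostic sketch, assuming spark-csv is already on the classpath, the snippet below (object and app names are placeholders) prints which jar the connector classes are loaded from on the driver and on the executors; diverging locations or versions are the usual culprit.

    import org.apache.spark.{SparkConf, SparkContext}

    // Diagnostic sketch: report where the spark-csv classes come from
    // on the driver and on each executor.
    object CsvClasspathCheck {
      // Location of the jar that defines the given class, or a marker if
      // the class has no code source (e.g. loaded from the boot classpath).
      private def sourceOf(className: String): String =
        Option(Class.forName(className).getProtectionDomain.getCodeSource)
          .map(_.getLocation.toString)
          .getOrElse("(no code source)")

      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("csv-classpath-check"))
        val probe = "com.databricks.spark.csv.DefaultSource" // class shipped in the spark-csv jar

        println(s"driver   -> ${sourceOf(probe)}")

        // Run the same lookup inside tasks so it executes on the executors.
        sc.parallelize(1 to sc.defaultParallelism, sc.defaultParallelism)
          .map(_ => sourceOf(probe))
          .distinct()
          .collect()
          .foreach(loc => println(s"executor -> $loc"))

        sc.stop()
      }
    }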