java.io.IOException: java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.HashMap$SerializationProxy to field org.apache.spark.executor.TaskMetrics._accumulatorUpdates of type scala.collection.immutable.Map in instance of org.apache.spark.executor.TaskMetrics

GitHub | AliTajeldin | 4 months ago
  1. testcase "test Edd dataPathToEddPath" is generating an exception.

     GitHub | 4 months ago | AliTajeldin
     java.io.IOException: java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.HashMap$SerializationProxy to field org.apache.spark.executor.TaskMetrics._accumulatorUpdates of type scala.collection.immutable.Map in instance of org.apache.spark.executor.TaskMetrics
  2. Apache Spark: Uncaught exception in thread driver-heartbeater

     Stack Overflow | 1 year ago | rakesh
     java.io.IOException: java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.HashMap$SerializationProxy to field org.apache.spark.executor.TaskMetrics._accumulatorUpdates of type scala.collection.immutable.Map in instance of org.apache.spark.executor.TaskMetrics
  3. Spark 1.6.0 executor dies because of ClassCastException and causes timeout

     Stack Overflow | 11 months ago | Alexandru Rosianu
     java.io.IOException: java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.HashMap$SerializationProxy to field org.apache.spark.executor.TaskMetrics._accumulatorUpdates of type scala.collection.immutable.Map in instance of org.apache.spark.executor.TaskMetrics
  4. spark sql dataframe join with renaming in a loop

     Stack Overflow | 10 months ago | user2038119
     java.io.IOException: java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.HashMap$SerializationProxy to field org.apache.spark.executor.TaskMetrics._accumulatorUpdates of type scala.collection.immutable.Map in instance of org.apache.spark.executor.TaskMetrics
  5. [jira] [Commented] (SPARK-12675) Executor dies because of ClassCastException and causes timeout

     spark-issues | 8 months ago | Matt Butler (JIRA)
     java.io.IOException: java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.HashMap$SerializationProxy to field org.apache.spark.executor.TaskMetrics._accumulatorUpdates of type scala.collection.immutable.Map in instance of org.apache.spark.executor.TaskMetrics
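
    Several of the reports above, including the SPARK-12675 thread, describe executors or the driver heartbeater failing with this same ClassCastException. One commonly suggested cause for this family of errors is a second copy of the Spark or Scala classes reaching the executor's user classpath, typically an application assembly shipped via SparkConf.setJars or --jars. The sbt fragment below is only an illustrative sketch of the usual mitigation under that assumption; the project name and versions are placeholders, not values taken from any of the reports.

      // Illustrative build.sbt fragment (placeholder name and versions). Marking the
      // Spark artifacts "provided" keeps spark-core and its Scala library out of the
      // application assembly, so the executor loads a single copy of classes such as
      // TaskMetrics and scala.collection.immutable.HashMap rather than one per classloader.
      name := "example-spark-app"

      scalaVersion := "2.10.6"  // should match the Scala build of the cluster's Spark

      libraryDependencies ++= Seq(
        "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
        "org.apache.spark" %% "spark-sql"  % "1.6.0" % "provided"
      )

    The same idea applies to jars supplied at runtime: letting spark-submit provide Spark's own jars, rather than bundling them into the assembly shipped to executors, avoids shadowing the classes the heartbeat thread deserializes.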


    Root Cause Analysis

    1. java.lang.ClassCastException

      cannot assign instance of scala.collection.immutable.HashMap$SerializationProxy to field org.apache.spark.executor.TaskMetrics._accumulatorUpdates of type scala.collection.immutable.Map in instance of org.apache.spark.executor.TaskMetrics

      at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues()
    2. Java RT
      ObjectInputStream.defaultReadObject
      1. java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2089)
      2. java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1261)
      3. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1999)
      4. java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:501)
      4 frames
    3. Spark
      TaskMetrics.readObject
      1. org.apache.spark.executor.TaskMetrics$$anonfun$readObject$1.apply$mcV$sp(TaskMetrics.scala:220)
      2. org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1160)
      3. org.apache.spark.executor.TaskMetrics.readObject(TaskMetrics.scala:219)
      3 frames
    4. Java RT
      ObjectInputStream.readObject
      1. sun.reflect.GeneratedMethodAccessor53.invoke(Unknown Source)
      2. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      3. java.lang.reflect.Method.invoke(Method.java:483)
      4. java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
      5. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1896)
      6. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
      7. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
      8. java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
      8 frames
    5. Spark
      Executor$$anonfun$org$apache$spark$executor$Executor$$reportHeartBeat$1$$anonfun$apply$6.apply
      1. org.apache.spark.util.Utils$.deserialize(Utils.scala:91)
      2. org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$reportHeartBeat$1$$anonfun$apply$6.apply(Executor.scala:440)
      3. org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$reportHeartBeat$1$$anonfun$apply$6.apply(Executor.scala:430)
      3 frames
    6. Scala
      Option.foreach
      1. scala.Option.foreach(Option.scala:236)
      1 frame
    7. Spark
      Executor$$anonfun$org$apache$spark$executor$Executor$$reportHeartBeat$1.apply
      1. org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$reportHeartBeat$1.apply(Executor.scala:430)
      2. org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$reportHeartBeat$1.apply(Executor.scala:428)
      2 frames
    8. Scala
      AbstractIterable.foreach
      1. scala.collection.Iterator$class.foreach(Iterator.scala:727)
      2. scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
      3. scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
      4. scala.collection.AbstractIterable.foreach(Iterable.scala:54)
      4 frames
    9. Spark
      Executor$$anon$1.run
      1. org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$reportHeartBeat(Executor.scala:428)
      2. org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply$mcV$sp(Executor.scala:472)
      3. org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply(Executor.scala:472)
      4. org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply(Executor.scala:472)
      5. org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
      6. org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:472)
      6 frames
    10. Java RT
      Thread.run
      1. java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      2. java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
      3. java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
      4. java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
      5. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      6. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      7. java.lang.Thread.run(Thread.java:745)
      7 frames
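
    Reading the trace bottom-up: a scheduled thread in the executor (entries 9 and 10) reports a heartbeat, and for each running task's metrics Utils.deserialize (entry 5) rebuilds a TaskMetrics instance from bytes. TaskMetrics.readObject runs inside Utils.tryOrIOException (entry 3), which is why the ClassCastException reaches the caller wrapped in an IOException. The sketch below is not Spark code; it is a self-contained stand-in, with assumed class and field names, for the same Java-serialization round trip. It shows where scala.collection.immutable.HashMap is written out as its SerializationProxy and where readResolve normally restores it before the Map-typed field is assigned, the assignment that fails in entry 1 above.

      import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

      // Not Spark code: FakeMetrics and its field are assumptions that only loosely
      // mirror TaskMetrics._accumulatorUpdates.
      case class FakeMetrics(accumulatorUpdates: Map[Long, Any])

      object HeartbeatRoundTrip {

        def serialize(o: AnyRef): Array[Byte] = {
          val bos = new ByteArrayOutputStream()
          val oos = new ObjectOutputStream(bos)
          oos.writeObject(o) // immutable.HashMap is written out as HashMap$SerializationProxy
          oos.close()
          bos.toByteArray
        }

        def deserialize[T](bytes: Array[Byte]): T = {
          val ois = new ObjectInputStream(new ByteArrayInputStream(bytes))
          // With a single classloader, the proxy's readResolve() rebuilds the HashMap
          // before it is assigned to the Map-typed field; the trace above shows that
          // assignment failing with the proxy still in place.
          ois.readObject().asInstanceOf[T]
        }

        def main(args: Array[String]): Unit = {
          val metrics = FakeMetrics(scala.collection.immutable.HashMap(1L -> 42L))
          val copy = deserialize[FakeMetrics](serialize(metrics))
          println(copy.accumulatorUpdates) // Map(1 -> 42)
        }
      }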