java.io.InvalidClassException: scala.reflect.ClassTag$$anon$1; local class incompatible: stream classdesc serialVersionUID = -4937928798201944954, local class serialVersionUID = -8102093212602380348

JIRA | Hari Sekhon | 2 years ago
  1. 0

    I'm deploying SparkR on a second cluster now (Hortonworks HDP 2.1) but am seeing an error I've seen before when running SparkR:

    {code}
    java.io.InvalidClassException: scala.reflect.ClassTag$$anon$1; local class incompatible: stream classdesc serialVersionUID = -4937928798201944954, local class serialVersionUID = -8102093212602380348
    {code}

    I've gotten past this before by compiling with the right SPARK_HADOOP_VERSION, but I've migrated the exact same version of Spark (the ready-made 1.0.0-bin-hadoop2) to the new cluster, started it the same way (standalone), and I've tried both copying the original SparkR lib and compiling a new version against the more correct Hadoop version 2.4.0; both give the same result. I've tried switching

    {code}
    SPARK_HADOOP_VERSION=2.2.0
    {code}

    to

    {code}
    SPARK_HADOOP_VERSION=2.4.0
    {code}

    Spark hasn't changed since the old cluster (still spark-1.0.0-bin-hadoop2), although it's not clear which version of Hadoop that binary was compiled against, which makes tweaking SPARK_HADOOP_VERSION guesswork. I'm also running the same version of Scala on both clusters, 2.10.3 from the Typesafe RPM. The only other difference is that this is running under the Revolution R distribution, which is really just open-source R-3.0.3 with a few additional libraries. Any ideas on what I need to do to recompile this with the correct settings for spark-1.0.0-bin-hadoop2 and HDP 2.1?

    Full output below:

    {code}
    > library(SparkR)
    Loading required package: rJava
    [SparkR] Initializing with classpath /usr/lib64/Revo-7.2/R-3.0.3/lib64/R/library/SparkR/sparkr-assembly-0.1.jar
    > sc <- sparkR.init(master="spark://myMasterHost:7077", appName="Hari's app")
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/usr/lib64/Revo-7.2/R-3.0.3/lib64/R/library/SparkR/sparkr-assembly-0.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    14/08/04 17:47:04 INFO Slf4jLogger: Slf4jLogger started
    14/08/04 17:47:05 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    > lines <- textFile(sc, "hdfs:/data/file")
    14/08/04 17:47:06 WARN SizeEstimator: Failed to check whether UseCompressedOops is set; assuming yes
    14/08/04 17:47:06 WARN BlockReaderLocal: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
    14/08/04 17:47:06 INFO FileInputFormat: Total input paths to process : 1
    14/08/04 17:47:11 WARN TaskSetManager: Lost TID 0 (task 0.0:0)
    14/08/04 17:47:11 WARN TaskSetManager: Loss was due to java.io.InvalidClassException
    java.io.InvalidClassException: scala.reflect.ClassTag$$anon$1; local class incompatible: stream classdesc serialVersionUID = -4937928798201944954, local class serialVersionUID = -8102093212602380348
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
        at org.apache.spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:61)
        at org.apache.spark.scheduler.ResultTask.readExternal(ResultTask.scala:141)
        at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1837)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:85)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:169)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
    14/08/04 17:47:11 WARN TaskSetManager: Lost TID 1 (task 0.0:1)
    14/08/04 17:47:11 WARN TaskSetManager: Lost TID 2 (task 0.0:1)
    14/08/04 17:47:12 WARN TaskSetManager: Lost TID 4 (task 0.0:1)
    14/08/04 17:47:12 WARN TaskSetManager: Lost TID 3 (task 0.0:0)
    14/08/04 17:47:12 WARN TaskSetManager: Lost TID 5 (task 0.0:0)
    14/08/04 17:47:12 WARN TaskSetManager: Lost TID 7 (task 0.0:0)
    14/08/04 17:47:12 ERROR TaskSetManager: Task 0.0:0 failed 4 times; aborting job
    Error in .jcall(getJRDD(rdd), "Ljava/util/List;", "collect") :
      org.apache.spark.SparkException: Job aborted due to stage failure: Task 0.0:0 failed 4 times, most recent failure: Exception failure in TID 7 on host <launching_host>: java.io.InvalidClassException: scala.reflect.ClassTag$$anon$1; local class incompatible: stream classdesc serialVersionUID = -4937928798201944954, local class serialVersionUID = -8102093212602380348
        java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
        java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
        java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectIn
    {code}

    JIRA | 2 years ago | Hari Sekhon
    java.io.InvalidClassException: scala.reflect.ClassTag$$anon$1; local class incompatible: stream classdesc serialVersionUID = -4937928798201944954, local class serialVersionUID = -8102093212602380348
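For context on the report above: `InvalidClassException: local class incompatible` is thrown by Java deserialization whenever the `serialVersionUID` embedded in the serialized stream differs from the one the receiving JVM computes for its local copy of the class, which is exactly what happens when the driver and executors resolve `scala.reflect.ClassTag$$anon$1` from differently built jars. A minimal sketch of the mechanism (the `Payload` class and the byte-patching trick are illustrative, not from the report; patching the stream's 8-byte UID simulates the "other cluster compiled a different class version" situation in a single JVM):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InvalidClassException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class UidMismatchDemo {
    // Hypothetical demo class, pinned to serialVersionUID = 1 so the
    // stream carries a known UID we can tamper with below.
    static class Payload implements Serializable {
        private static final long serialVersionUID = 1L;
        int value = 42;
    }

    public static void main(String[] args) throws Exception {
        // Serialize normally: the stream's class descriptor embeds UID = 1.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new Payload());
        }
        byte[] bytes = bos.toByteArray();

        // Patch the first 8-byte big-endian 1L in the stream (the class
        // descriptor's serialVersionUID) to 2, as if the sender had a
        // differently built class on its classpath.
        for (int i = 0; i + 8 <= bytes.length; i++) {
            if (bytes[i] == 0 && bytes[i + 1] == 0 && bytes[i + 2] == 0
                    && bytes[i + 3] == 0 && bytes[i + 4] == 0
                    && bytes[i + 5] == 0 && bytes[i + 6] == 0
                    && bytes[i + 7] == 1) {
                bytes[i + 7] = 2;
                break;
            }
        }

        // Deserializing now fails with the same "local class incompatible"
        // message seen in the SparkR trace above.
        try (ObjectInputStream ois =
                new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            ois.readObject();
        } catch (InvalidClassException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

The fix in the reported case follows the same logic in reverse: make every JVM in the job resolve the class from identically built jars, rather than trying to make mismatched jars agree.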
  3. 0

    [SPARKR-72] local class incompatible serialVersionUID - JIRA

    atlassian.net | 1 year ago
    java.io.InvalidClassException: scala.reflect.ClassTag$$anon$1; local class incompatible: stream classdesc serialVersionUID = -4937928798201944954, local class serialVersionUID = -8102093212602380348
  5. 0

    [jira] [Commented] (SPARK-2018) Big-Endian (IBM Power7) Spark Serialization issue

    apache.org | 1 year ago
    java.io.InvalidClassException: scala.reflect.ClassTag$$anon$1; local class incompatible: stream classdesc serialVersionUID = -8102093212602380348, local class serialVersionUID = -4937928798201944954
  6. 0

    Submit application to spark cluster: Error local class incompatible

    Stack Overflow | 7 months ago | Hao WU
    java.io.InvalidClassException: javax.servlet.GenericServlet; local class incompatible: stream classdesc serialVersionUID = 1, local class serialVersionUID = -8592279577370996712


Root Cause Analysis

  1. java.io.InvalidClassException

    scala.reflect.ClassTag$$anon$1; local class incompatible: stream classdesc serialVersionUID = -4937928798201944954, local class serialVersionUID = -8102093212602380348

    at java.io.ObjectStreamClass.initNonProxy()
  2. Java RT
    ObjectInputStream.readObject
    1. java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
    2. java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
    3. java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
    4. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
    5. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    6. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    7. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    8. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    9. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    10. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    11. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    12. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    13. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    14. java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    14 frames
  3. Scala
    $colon$colon.readObject
    1. scala.collection.immutable.$colon$colon.readObject(List.scala:362)
    1 frame
  4. Java RT
    ObjectInputStream.readObject
    1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    4. java.lang.reflect.Method.invoke(Method.java:606)
    5. java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
    6. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
    7. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    8. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    9. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    10. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    11. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    12. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    13. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    14. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    15. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    16. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    17. java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    17 frames
  5. Scala
    $colon$colon.readObject
    1. scala.collection.immutable.$colon$colon.readObject(List.scala:362)
    1 frame
  6. Java RT
    ObjectInputStream.readObject
    1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    4. java.lang.reflect.Method.invoke(Method.java:606)
    5. java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
    6. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
    7. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    8. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    9. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    10. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    11. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    12. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    13. java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    13 frames
  7. Spark
    ResultTask.readExternal
    1. org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
    2. org.apache.spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:61)
    3. org.apache.spark.scheduler.ResultTask.readExternal(ResultTask.scala:141)
    3 frames
  8. Java RT
    ObjectInputStream.readObject
    1. java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1837)
    2. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
    3. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    4. java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    4 frames
  9. Spark
    Executor$TaskRunner.run
    1. org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
    2. org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:85)
    3. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:169)
    3 frames
  10. Java RT
    Thread.run
    1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    3. java.lang.Thread.run(Thread.java:745)
    3 frames
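When diagnosing a mismatch like the one analyzed above, a useful first step is to print the serialVersionUID your local classpath actually computes for the class named in the exception and compare it against the value in the stream. A small sketch using `java.io.ObjectStreamClass` (here `java.util.ArrayList` is a stand-in target, since resolving `scala.reflect.ClassTag$$anon$1` would require the exact Scala jar on the classpath):

```java
import java.io.ObjectStreamClass;
import java.util.ArrayList;

public class PrintUid {
    public static void main(String[] args) {
        // Look up the descriptor the local JVM would use when
        // deserializing; its UID must match the one in the stream,
        // or ObjectInputStream throws InvalidClassException.
        ObjectStreamClass desc = ObjectStreamClass.lookup(ArrayList.class);
        System.out.println(desc.getName() + " serialVersionUID = "
                + desc.getSerialVersionUID());
        // prints: java.util.ArrayList serialVersionUID = 8683452581122892189
    }
}
```

Running this with each cluster's jars on the classpath shows which side computes which UID, and therefore which jar differs.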