java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = -6451051318873184044, local class serialVersionUID = -7435142130326333303

JIRA | ElvisDeng | 3 years ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions found on the web.
  1.

    Hi experts, the SparkR example cannot run in my Spark cluster.

    Spark standalone cluster info:
    URL: spark://SparkMaster:7077
    Workers: 3
    Cores: 12 Total, 0 Used
    Memory: 44.0 GB Total, 0.0 B Used
    Applications: 0 Running, 3 Completed
    Drivers: 0 Running, 0 Completed
    Status: ALIVE

    root@SparkMaster:/data/SparkR-pkg-master# SPARK_MEM=1g ./sparkR examples/pi.R spark://SparkMaster:7077

    ~~~
    Loading required package: SparkR
    Loading required package: methods
    Loading required package: rJava
    [SparkR] Initializing with classpath /data/SparkR-pkg-master/lib/SparkR/sparkr-assembly-0.1.jar
    14/04/17 02:02:20 INFO Slf4jLogger: Slf4jLogger started
    14/04/17 02:02:38 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
    14/04/17 02:02:53 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
    14/04/17 02:03:08 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
    14/04/17 02:03:22 ERROR AppClient$ClientActor: All masters are unresponsive! Giving up.
    14/04/17 02:03:22 ERROR SparkDeploySchedulerBackend: Spark cluster looks dead, giving up.
    Error in .jcall(getJRDD(rdd), "Ljava/util/List;", "collect") :
      org.apache.spark.SparkException: Job aborted: Spark cluster looks down
    Calls: reduce ... collect -> collect -> .local -> .jcall -> .jcheck -> .Call
    Execution halted
    ~~~

    The log in Spark is as below:

    ~~~
    14/04/17 02:05:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it. 
14/04/17 02:05:58 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/endpointManager/reliableEndpointWriter-akka.tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-54/endpointWriter/endpointReader-akka.tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-0#-1459605026] was not delivered. [107] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'. 14/04/17 02:05:58 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40142.133.50.58%3A40070-94#-1641183893] was not delivered. [108] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'. 
14/04/17 02:06:18 ERROR Remoting: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = -6451051318873184044, local class serialVersionUID = -7435142130326333303 java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = -6451051318873184044, local class serialVersionUID = -7435142130326333303 at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:592) at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1621) at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1516) at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1770) at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1349) at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989) at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1914) at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1797) at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1349) at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369) at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136) at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104) at scala.util.Try$.apply(Try.scala:161) at akka.serialization.Serialization.deserialize(Serialization.scala:98) at akka.remote.serialization.MessageContainerSerializer.fromBinary(MessageContainerSerializer.scala:58) at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104) at scala.util.Try$.apply(Try.scala:161) at akka.serialization.Serialization.deserialize(Serialization.scala:98) at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23) at 
akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:55) at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:55) at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:73) at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:764) at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498) at akka.actor.ActorCell.invoke(ActorCell.scala:456) at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237) at akka.dispatch.Mailbox.run(Mailbox.scala:219) at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386) at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) 14/04/17 02:06:18 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/endpointManager/endpointWriter-akka.tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-55/endpointReader-akka.tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-0#-1568863113] was not delivered. [109] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'. 14/04/17 02:06:18 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it. 14/04/17 02:06:18 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40142.133.50.58%3A40072-96#905162390] was not delivered. [110] dead letters encountered. 
This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'. 14/04/17 02:06:38 ERROR Remoting: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = -6451051318873184044, local class serialVersionUID = -7435142130326333303 java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = -6451051318873184044, local class serialVersionUID = -7435142130326333303 at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:592) at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1621) at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1516) at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1770) at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1349) at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989) at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1914) at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1797) at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1349) at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369) at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136) at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104) at scala.util.Try$.apply(Try.scala:161) at akka.serialization.Serialization.deserialize(Serialization.scala:98) at akka.remote.serialization.MessageContainerSerializer.fromBinary(MessageContainerSerializer.scala:58) at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104) at scala.util.Try$.apply(Try.scala:161) at 
akka.serialization.Serialization.deserialize(Serialization.scala:98) at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23) at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:55) at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:55) at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:73) at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:764) at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498) at akka.actor.ActorCell.invoke(ActorCell.scala:456) at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237) at akka.dispatch.Mailbox.run(Mailbox.scala:219) at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386) at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) 14/04/17 02:06:38 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it. 14/04/17 02:06:38 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/endpointManager/endpointWriter-akka.tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-56/endpointReader-akka.tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-0#-2041306928] was not delivered. [111] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'. 
14/04/17 02:06:38 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40142.133.50.58%3A40073-97#-544240507] was not delivered. [112] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'. 14/04/17 02:06:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it. 14/04/17 02:06:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it. 14/04/17 02:06:58 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40142.133.50.58%3A40074-98#938859376] was not delivered. [113] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'. 14/04/17 02:06:58 INFO LocalActorRef: Message [akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40142.133.50.58%3A40074-98#938859376] was not delivered. [114] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'. 14/04/17 02:06:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it. 14/04/17 02:06:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it. 
14/04/17 02:06:58 INFO LocalActorRef: Message [akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-95#-798623901] was not delivered. [115] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'. 14/04/17 02:06:58 ERROR EndpointWriter: AssociationError [akka.tcp://sparkMaster@SparkMaster:7077] -> [akka.tcp://spark@SparkMaster.xxx.com:56718]: Error [Association failed with [akka.tcp://spark@SparkMaster.xxx.com:56718]] [ akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@SparkMaster.xxx.com:56718] Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: SparkMaster.xxx.com/142.133.50.58:56718 ] 14/04/17 02:06:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it. 14/04/17 02:06:58 ERROR EndpointWriter: AssociationError [akka.tcp://sparkMaster@SparkMaster:7077] -> [akka.tcp://spark@SparkMaster.xxx.com:56718]: Error [Association failed with [akka.tcp://spark@SparkMaster.xxx.com:56718]] [ akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@SparkMaster.xxx.com:56718] Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: SparkMaster.xxx.com/142.133.50.58:56718 ] 14/04/17 02:06:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it. ~~~ SparkR run in local is OK. 
    root@SparkMaster:/data/SparkR-pkg-master# ./sparkR examples/pi.R local[2]

    ~~~
    Loading required package: SparkR
    Loading required package: methods
    Loading required package: rJava
    [SparkR] Initializing with classpath /data/SparkR-pkg-master/lib/SparkR/sparkr-assembly-0.1.jar
    14/04/17 02:53:59 INFO Slf4jLogger: Slf4jLogger started
    Pi is roughly 3.1435
    Num elements in RDD 200000
    ~~~

    Spark standalone cluster is OK.

    root@SparkMaster:/data/spark# ./bin/run-example org.apache.spark.examples.SparkPi spark://SparkMaster:7077

    ~~~
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/data/spark/examples/target/scala-2.10/spark-examples-assembly-1.0.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/data/spark/sql/hive/target/scala-2.10/spark-hive-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    14/04/17 01:59:29 INFO SecurityManager: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    14/04/17 01:59:29 INFO SecurityManager: SecurityManager, is authentication enabled: false are ui acls enabled: false users with view permissions: Set(root)
    14/04/17 01:59:30 INFO Slf4jLogger: Slf4jLogger started
    14/04/17 01:59:30 INFO Remoting: Starting remoting
    ...
    14/04/17 01:59:41 INFO DAGScheduler: Completed ResultTask(0, 1)
    14/04/17 01:59:41 INFO DAGScheduler: Stage 0 (reduce at SparkPi.scala:39) finished in 7.443 s
    14/04/17 01:59:41 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
    14/04/17 01:59:41 INFO SparkContext: Job finished: reduce at SparkPi.scala:39, took 7.669905604 s
    Pi is roughly 3.1458
    ...
    ~~~

    JIRA | 3 years ago | ElvisDeng
    java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = -6451051318873184044, local class serialVersionUID = -7435142130326333303
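    The two differing serialVersionUID values above indicate that the SparkR assembly jar and the running master were built from different Spark sources: the differing UIDs across reports suggest ApplicationDescription declares no explicit serialVersionUID, so each build derives its own from the class structure. A minimal diagnostic sketch of how the JVM derives that default; the AppDescription class here is a hypothetical stand-in with invented fields, not Spark's actual class:

    ```java
    import java.io.ObjectStreamClass;
    import java.io.Serializable;

    public class CheckSerialVersion {
        // Hypothetical stand-in for org.apache.spark.deploy.ApplicationDescription,
        // assumed (like the real class in the affected versions) to declare no
        // explicit serialVersionUID.
        static class AppDescription implements Serializable {
            String name;
            int maxCores;
        }

        public static void main(String[] args) {
            // With no explicit serialVersionUID, the JVM hashes the class's name,
            // fields, and methods to derive one, so rebuilding against different
            // Spark sources yields a different UID and deserialization fails.
            long uid = ObjectStreamClass.lookup(AppDescription.class).getSerialVersionUID();
            System.out.println("local serialVersionUID = " + uid);
        }
    }
    ```

    If the client jar and the cluster jars derive the same value for the real class, this particular failure cannot occur; the usual fix is to rebuild the client (here, the SparkR assembly) against the exact Spark version running on the cluster.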
  3. 0

    Standalone spark cluster. Can't submit job programmatically -> java.io.InvalidClassException

    Stack Overflow | 2 years ago | Dr.Khu
    java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = -6451051318873184044, local class serialVersionUID = 583745679236071411
  4. 0

    How to submit a spark Job Programatically

    Stack Overflow | 2 years ago
    java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = 2596819202403185464
  5. 0

    Apache Spark: ERROR local class incompatible when initiating a SparkContext class

    Stack Overflow | 1 year ago | keypoint
    java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = -7685200927816255400


Root Cause Analysis

  1. java.io.InvalidClassException

    org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = -6451051318873184044, local class serialVersionUID = -7435142130326333303

    at java.io.ObjectStreamClass.initNonProxy()
  2. Java RT
    ObjectInputStream.readObject
    1. java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:592)
    2. java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1621)
    3. java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1516)
    4. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1770)
    5. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1349)
    6. java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
    7. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1914)
    8. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1797)
    9. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1349)
    10. java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
    10 frames
  3. Akka Actor
    JavaSerializer$$anonfun$1.apply
    1. akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
    1 frame
  4. Scala
    DynamicVariable.withValue
    1. scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
    1 frame
  5. Akka Actor
    Serialization$$anonfun$deserialize$1.apply
    1. akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
    2. akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
    2 frames
  6. Scala
    Try$.apply
    1. scala.util.Try$.apply(Try.scala:161)
    1 frame
  7. Akka Actor
    Serialization.deserialize
    1. akka.serialization.Serialization.deserialize(Serialization.scala:98)
    1 frame
  8. Akka Remote
    MessageContainerSerializer.fromBinary
    1. akka.remote.serialization.MessageContainerSerializer.fromBinary(MessageContainerSerializer.scala:58)
    1 frame
  9. Akka Actor
    Serialization$$anonfun$deserialize$1.apply
    1. akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
    1 frame
  10. Scala
    Try$.apply
    1. scala.util.Try$.apply(Try.scala:161)
    1 frame
  11. Akka Actor
    Serialization.deserialize
    1. akka.serialization.Serialization.deserialize(Serialization.scala:98)
    1 frame
  12. Akka Remote
    EndpointReader$$anonfun$receive$2.applyOrElse
    1. akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)
    2. akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:55)
    3. akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:55)
    4. akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:73)
    5. akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:764)
    5 frames
  13. Akka Actor
    ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec
    1. akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
    2. akka.actor.ActorCell.invoke(ActorCell.scala:456)
    3. akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
    4. akka.dispatch.Mailbox.run(Mailbox.scala:219)
    5. akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    5 frames
  14. Scala
    ForkJoinWorkerThread.run
    1. scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    2. scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    3. scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    4. scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
    4 frames