java.io.InvalidClassException: hmda.model.institution.Agency$FDIC$; no valid constructor

There are no Samebug tips available for this exception yet. Do you have an idea how to solve this issue? A short tip would help other users who run into it.
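
One tip, based on how Java serialization works: deserializing an object must invoke the no-argument constructor of its closest non-serializable superclass, and if that constructor does not exist (or is not accessible), ObjectInputStream rejects the class with exactly this "no valid constructor" message. In Scala this typically bites a case object or case class (here Agency$FDIC$, i.e. the FDIC case object nested in Agency) whose parent class takes constructor parameters but does not itself extend Serializable. Below is a minimal sketch of the failure and the fix; the shape of the Agency hierarchy is an assumption modeled on the class name in the trace, not hmda's actual code.

~~~scala
import java.io._

// ASSUMED shape of the hierarchy: a non-serializable parent class with no
// no-arg constructor. Deserialization must call the no-arg constructor of
// the first non-serializable superclass, and there is none here.
abstract class Agency(val value: Int, val fullName: String)

object Agency {
  // Case objects are Serializable by default in Scala, so writing succeeds...
  case object FDIC extends Agency(2, "Federal Deposit Insurance Corporation")
}

object Repro extends App {
  val bos = new ByteArrayOutputStream()
  val oos = new ObjectOutputStream(bos)
  oos.writeObject(Agency.FDIC)            // serializing works fine
  oos.close()

  val ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray))
  ois.readObject()                        // ...but reading throws
  // java.io.InvalidClassException: Agency$FDIC$; no valid constructor
}
~~~

The fix is to make the whole hierarchy serializable, e.g. `abstract class Agency(val value: Int, val fullName: String) extends Serializable`, or to give the non-serializable parent a no-arg constructor. After that change, replaying previously journaled events (see the Akka Persistence trace below) should work again as well, since the old stream contains no fields for the parent class.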

  • How to submit a Spark job programmatically
    (via unknown author)
  • GitHub comment 14#61579279
    (via GitHub, by ljzzju)
  • Hi experts, the SparkR example cannot run in my Spark cluster.
    Spark standalone cluster info: URL: spark://SparkMaster:7077; Workers: 3; Cores: 12 total, 0 used; Memory: 44.0 GB total, 0.0 B used; Applications: 0 running, 3 completed; Drivers: 0 running, 0 completed; Status: ALIVE
    root@SparkMaster:/data/SparkR-pkg-master# SPARK_MEM=1g ./sparkR examples/pi.R spark://SparkMaster:7077
    ~~~
    Loading required package: SparkR
    Loading required package: methods
    Loading required package: rJava
    [SparkR] Initializing with classpath /data/SparkR-pkg-master/lib/SparkR/sparkr-assembly-0.1.jar
    14/04/17 02:02:20 INFO Slf4jLogger: Slf4jLogger started
    14/04/17 02:02:38 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
    14/04/17 02:02:53 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
    14/04/17 02:03:08 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
    14/04/17 02:03:22 ERROR AppClient$ClientActor: All masters are unresponsive! Giving up.
    14/04/17 02:03:22 ERROR SparkDeploySchedulerBackend: Spark cluster looks dead, giving up.
    Error in .jcall(getJRDD(rdd), "Ljava/util/List;", "collect") :
      org.apache.spark.SparkException: Job aborted: Spark cluster looks down
    Calls: reduce ... collect -> collect -> .local -> .jcall -> .jcheck -> .Call
    Execution halted
    ~~~
    The log on the Spark master is below; note the ApplicationDescription serialVersionUID mismatch (see the note after this list):
    ~~~
    14/04/17 02:05:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it.
    14/04/17 02:05:58 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/endpointManager/reliableEndpointWriter-akka.tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-54/endpointWriter/endpointReader-akka.tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-0#-1459605026] was not delivered. [107] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
    14/04/17 02:05:58 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40142.133.50.58%3A40070-94#-1641183893] was not delivered. [108] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
    14/04/17 02:06:18 ERROR Remoting: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = -6451051318873184044, local class serialVersionUID = -7435142130326333303
    java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = -6451051318873184044, local class serialVersionUID = -7435142130326333303
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:592)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1621)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1516)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1770)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1349)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1914)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1797)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1349)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
        at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
        at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
        at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
        at scala.util.Try$.apply(Try.scala:161)
        at akka.serialization.Serialization.deserialize(Serialization.scala:98)
        at akka.remote.serialization.MessageContainerSerializer.fromBinary(MessageContainerSerializer.scala:58)
        at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
        at scala.util.Try$.apply(Try.scala:161)
        at akka.serialization.Serialization.deserialize(Serialization.scala:98)
        at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)
        at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:55)
        at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:55)
        at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:73)
        at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:764)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
        at akka.actor.ActorCell.invoke(ActorCell.scala:456)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
        at akka.dispatch.Mailbox.run(Mailbox.scala:219)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
    14/04/17 02:06:18 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/endpointManager/endpointWriter-akka.tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-55/endpointReader-akka.tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-0#-1568863113] was not delivered. [109] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
    14/04/17 02:06:18 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it.
    14/04/17 02:06:18 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40142.133.50.58%3A40072-96#905162390] was not delivered. [110] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
    14/04/17 02:06:38 ERROR Remoting: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = -6451051318873184044, local class serialVersionUID = -7435142130326333303
    java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = -6451051318873184044, local class serialVersionUID = -7435142130326333303
        [stack trace identical to the 02:06:18 trace above]
    14/04/17 02:06:38 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it.
    14/04/17 02:06:38 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/endpointManager/endpointWriter-akka.tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-56/endpointReader-akka.tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-0#-2041306928] was not delivered. [111] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
    14/04/17 02:06:38 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40142.133.50.58%3A40073-97#-544240507] was not delivered. [112] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
    14/04/17 02:06:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it.
    14/04/17 02:06:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it.
    14/04/17 02:06:58 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40142.133.50.58%3A40074-98#938859376] was not delivered. [113] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
    14/04/17 02:06:58 INFO LocalActorRef: Message [akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40142.133.50.58%3A40074-98#938859376] was not delivered. [114] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
    14/04/17 02:06:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it.
    14/04/17 02:06:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it.
    14/04/17 02:06:58 INFO LocalActorRef: Message [akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2Fspark%40SparkMaster.xxx.com%3A56718-95#-798623901] was not delivered. [115] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
    14/04/17 02:06:58 ERROR EndpointWriter: AssociationError [akka.tcp://sparkMaster@SparkMaster:7077] -> [akka.tcp://spark@SparkMaster.xxx.com:56718]: Error [Association failed with [akka.tcp://spark@SparkMaster.xxx.com:56718]] [
      akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@SparkMaster.xxx.com:56718]
      Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: SparkMaster.xxx.com/142.133.50.58:56718
    ]
    14/04/17 02:06:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it.
    14/04/17 02:06:58 ERROR EndpointWriter: AssociationError [akka.tcp://sparkMaster@SparkMaster:7077] -> [akka.tcp://spark@SparkMaster.xxx.com:56718]: Error [Association failed with [akka.tcp://spark@SparkMaster.xxx.com:56718]] [
      akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@SparkMaster.xxx.com:56718]
      Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: SparkMaster.xxx.com/142.133.50.58:56718
    ]
    14/04/17 02:06:58 INFO Master: akka.tcp://spark@SparkMaster.xxx.com:56718 got disassociated, removing it.
    ~~~
    SparkR run in local mode is OK:
    root@SparkMaster:/data/SparkR-pkg-master# ./sparkR examples/pi.R local[2]
    ~~~
    Loading required package: SparkR
    Loading required package: methods
    Loading required package: rJava
    [SparkR] Initializing with classpath /data/SparkR-pkg-master/lib/SparkR/sparkr-assembly-0.1.jar
    14/04/17 02:53:59 INFO Slf4jLogger: Slf4jLogger started
    Pi is roughly 3.1435
    Num elements in RDD 200000
    ~~~
    The Spark standalone cluster itself is OK:
    root@SparkMaster:/data/spark# ./bin/run-example org.apache.spark.examples.SparkPi spark://SparkMaster:7077
    ~~~
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/data/spark/examples/target/scala-2.10/spark-examples-assembly-1.0.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/data/spark/sql/hive/target/scala-2.10/spark-hive-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    14/04/17 01:59:29 INFO SecurityManager: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    14/04/17 01:59:29 INFO SecurityManager: SecurityManager, is authentication enabled: false are ui acls enabled: false users with view permissions: Set(root)
    14/04/17 01:59:30 INFO Slf4jLogger: Slf4jLogger started
    14/04/17 01:59:30 INFO Remoting: Starting remoting
    ...
    14/04/17 01:59:41 INFO DAGScheduler: Completed ResultTask(0, 1)
    14/04/17 01:59:41 INFO DAGScheduler: Stage 0 (reduce at SparkPi.scala:39) finished in 7.443 s
    14/04/17 01:59:41 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
    14/04/17 01:59:41 INFO SparkContext: Job finished: reduce at SparkPi.scala:39, took 7.669905604 s
    Pi is roughly 3.1458
    ...
    ~~~
    (via ElvisDeng)
    • The full stack trace for this exception, raised while an Akka Persistence actor replays its LevelDB journal:
      ~~~
      java.io.InvalidClassException: hmda.model.institution.Agency$FDIC$; no valid constructor
          at java.io.ObjectStreamClass$ExceptionInfo.newInvalidClassException(ObjectStreamClass.java:150)[na:1.8.0_31]
          at java.io.ObjectStreamClass.checkDeserialize(ObjectStreamClass.java:768)[na:1.8.0_31]
          at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)[na:1.8.0_31]
          at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)[na:1.8.0_31]
          at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)[na:1.8.0_31]
          at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)[na:1.8.0_31]
          at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)[na:1.8.0_31]
          at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)[na:1.8.0_31]
          at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)[na:1.8.0_31]
          at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)[na:1.8.0_31]
          at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)[na:1.8.0_31]
          at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)[na:1.8.0_31]
          at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)[na:1.8.0_31]
          at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:242)[akka-actor_2.11-2.4.9.jar:na]
          at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)[scala-library-2.11.8.jar:1.0.0-M1]
          at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:242)[akka-actor_2.11-2.4.9.jar:na]
          at akka.serialization.Serialization$$anonfun$deserialize$2.apply(Serialization.scala:124)[akka-actor_2.11-2.4.9.jar:na]
          at scala.util.Try$.apply(Try.scala:192)[scala-library-2.11.8.jar:1.0.0-M1]
          at akka.serialization.Serialization.deserialize(Serialization.scala:114)[akka-actor_2.11-2.4.9.jar:na]
          at akka.persistence.serialization.MessageSerializer.akka$persistence$serialization$MessageSerializer$$payload(MessageSerializer.scala:216)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.serialization.MessageSerializer.akka$persistence$serialization$MessageSerializer$$persistent(MessageSerializer.scala:198)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.serialization.MessageSerializer.fromBinary(MessageSerializer.scala:69)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.serialization.MessageSerializer.fromBinary(MessageSerializer.scala:28)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.serialization.Serialization$$anonfun$deserialize$3.apply(Serialization.scala:142)[akka-actor_2.11-2.4.9.jar:na]
          at scala.util.Try$.apply(Try.scala:192)[scala-library-2.11.8.jar:1.0.0-M1]
          at akka.serialization.Serialization.deserialize(Serialization.scala:142)[akka-actor_2.11-2.4.9.jar:na]
          at akka.persistence.journal.leveldb.LeveldbStore$class.persistentFromBytes(LeveldbStore.scala:138)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.journal.leveldb.LeveldbJournal.persistentFromBytes(LeveldbJournal.scala:22)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.journal.leveldb.LeveldbRecovery$class.go$1(LeveldbRecovery.scala:47)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.journal.leveldb.LeveldbRecovery$$anonfun$replayMessages$1.apply(LeveldbRecovery.scala:73)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.journal.leveldb.LeveldbRecovery$$anonfun$replayMessages$1.apply(LeveldbRecovery.scala:70)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.journal.leveldb.LeveldbStore$class.withIterator(LeveldbStore.scala:119)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.journal.leveldb.LeveldbJournal.withIterator(LeveldbJournal.scala:22)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.journal.leveldb.LeveldbRecovery$class.replayMessages(LeveldbRecovery.scala:70)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.journal.leveldb.LeveldbJournal.replayMessages(LeveldbJournal.scala:22)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.journal.leveldb.LeveldbRecovery$$anonfun$asyncReplayMessages$1.apply$mcV$sp(LeveldbRecovery.scala:32)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.journal.leveldb.LeveldbRecovery$$anonfun$asyncReplayMessages$1.apply(LeveldbRecovery.scala:32)[akka-persistence_2.11-2.4.9.jar:na]
          at akka.persistence.journal.leveldb.LeveldbRecovery$$anonfun$asyncReplayMessages$1.apply(LeveldbRecovery.scala:32)[akka-persistence_2.11-2.4.9.jar:na]
          at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)[scala-library-2.11.8.jar:na]
          at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)[scala-library-2.11.8.jar:na]
          at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)[akka-actor_2.11-2.4.9.jar:na]
          at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:409)[akka-actor_2.11-2.4.9.jar:na]
          at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)[scala-library-2.11.8.jar:na]
          at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)[scala-library-2.11.8.jar:na]
          at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)[scala-library-2.11.8.jar:na]
          at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)[scala-library-2.11.8.jar:na]
      ~~~
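
Since this trace shows the failure during journal recovery (LeveldbRecovery.replayMessages), the exception will reappear on every restart until the class is fixed as sketched above, because the offending bytes are already in the LevelDB journal. A longer-term tip is to stop persisting events through Java serialization at all and bind the event types to a dedicated serializer. Below is a hedged configuration sketch: only the akka.actor.serializers and serialization-bindings keys are standard Akka configuration, while the serializer class and marker trait names are hypothetical stand-ins.

~~~scala
import com.typesafe.config.ConfigFactory

// Illustrative Akka serialization binding. Everything quoted below except
// the configuration keys is a made-up name standing in for project code.
val serializationConfig = ConfigFactory.parseString("""
  akka.actor {
    serializers {
      # hypothetical custom serializer implementation
      institution-events = "hmda.persistence.serialization.InstitutionEventSerializer"
    }
    serialization-bindings {
      # hypothetical marker trait implemented by all persisted events
      "hmda.persistence.model.InstitutionEvent" = institution-events
    }
  }
""")
~~~

With such a binding in place, newly persisted events no longer depend on Java serialization's constructor rules, though events already written with the Java serializer still need the class-level fix in order to replay.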

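The SparkR report above shows the other common flavor of InvalidClassException: "local class incompatible", with two different serialVersionUID values for org.apache.spark.deploy.ApplicationDescription. That message means the two sides of the wire (here the SparkR driver and the standalone master) were built from different versions of the same class, so the usual fix is to rebuild the SparkR assembly against exactly the Spark version deployed on the cluster, or upgrade the cluster to match. For classes you control, pinning the UID keeps recompiles wire-compatible; a minimal sketch with a hypothetical stand-in class, not Spark's:

~~~scala
import java.io._

// Without @SerialVersionUID the JVM derives a UID from the class shape, so
// two builds that differ by a single member produce incompatible streams.
// Pinning the UID keeps them compatible (the remaining fields must still
// line up for the data to make sense).
@SerialVersionUID(1L)
case class AppDescription(name: String, maxCores: Int)

object RoundTrip extends App {
  val bos = new ByteArrayOutputStream()
  val oos = new ObjectOutputStream(bos)
  oos.writeObject(AppDescription("pi.R", 12))
  oos.close()

  val ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray))
  println(ois.readObject())   // AppDescription(pi.R,12)
}
~~~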