org.apache.spark.SparkException: Error sending message [message = RegisterBlockManager(BlockManagerId(4, ip-10-64-65-168.us-west-2.compute.internal, 37580),277842493,Actor[akka://sparkExecutor/user/BlockManagerActor1#-2041230339])]

JIRA | XW Gong | 2 years ago
  1. 0

    I have an AWS instance running the sequenceiq/docker-spark image (which includes Hadoop and Spark). R and SparkR are installed on the instance. I initialized a Spark context with the following command:

        sc <- sparkR.init(master='spark://ip-10-64-65-168:7077', appName="ER")

    The connection was established successfully, but error messages appeared soon afterwards. The key line is "ERROR TaskSchedulerImpl: Lost an executor 0 (already removed): remote Akka client disassociated", and the same error repeats for executors 1, 2, and so on. I am new to Spark and have been researching this for a few weeks without finding a fix. Any ideas or hints would be very much appreciated! Attached are screenshots of the terminal and the slave UI page. The stderr log from one of the executors follows:

    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    15/02/23 21:02:40 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
    15/02/23 21:02:40 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    15/02/23 21:02:40 INFO SecurityManager: Changing view acls to: root
    15/02/23 21:02:40 INFO SecurityManager: Changing modify acls to: root
    15/02/23 21:02:40 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
    15/02/23 21:02:41 INFO Slf4jLogger: Slf4jLogger started
    15/02/23 21:02:41 INFO Remoting: Starting remoting
    15/02/23 21:02:41 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://driverPropsFetcher@ip-10-64-65-168.us-west-2.compute.internal:46838]
    15/02/23 21:02:41 INFO Utils: Successfully started service 'driverPropsFetcher' on port 46838.
    15/02/23 21:02:41 INFO SecurityManager: Changing view acls to: root
    15/02/23 21:02:41 INFO SecurityManager: Changing modify acls to: root
    15/02/23 21:02:41 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
    15/02/23 21:02:41 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
    15/02/23 21:02:41 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
    15/02/23 21:02:41 INFO Slf4jLogger: Slf4jLogger started
    15/02/23 21:02:41 INFO Remoting: Starting remoting
    15/02/23 21:02:41 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutor@ip-10-64-65-168.us-west-2.compute.internal:48541]
    15/02/23 21:02:41 INFO Utils: Successfully started service 'sparkExecutor' on port 48541.
    15/02/23 21:02:41 INFO CoarseGrainedExecutorBackend: Connecting to driver: akka.tcp://sparkDriver@ip-10-64-65-168.us-west-2.compute.internal:50114/user/CoarseGrainedScheduler
    15/02/23 21:02:41 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
    15/02/23 21:02:41 INFO CoarseGrainedExecutorBackend: Successfully registered with driver
    15/02/23 21:02:41 INFO SecurityManager: Changing view acls to: root
    15/02/23 21:02:41 INFO SecurityManager: Changing modify acls to: root
    15/02/23 21:02:41 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
    15/02/23 21:02:41 INFO AkkaUtils: Connecting to MapOutputTracker: akka.tcp://sparkDriver@ip-10-64-65-168.us-west-2.compute.internal:50114/user/MapOutputTracker
    15/02/23 21:02:41 INFO AkkaUtils: Connecting to BlockManagerMaster: akka.tcp://sparkDriver@ip-10-64-65-168.us-west-2.compute.internal:50114/user/BlockManagerMaster
    15/02/23 21:02:41 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20150223210241-704c
    15/02/23 21:02:41 INFO MemoryStore: MemoryStore started with capacity 265.0 MB
    15/02/23 21:02:41 INFO NettyBlockTransferService: Server created on 37580
    15/02/23 21:02:41 INFO BlockManagerMaster: Trying to register BlockManager
    15/02/23 21:02:41 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkDriver@ip-10-64-65-168.us-west-2.compute.internal:50114] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
    15/02/23 21:03:11 WARN AkkaUtils: Error sending message in 1 attempts
    java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at akka.dispatch.MonitorableThreadFactory$AkkaForkJoinWorkerThread$$anon$3.block(ThreadPoolBuilder.scala:169)
        at scala.concurrent.forkjoin.ForkJoinPool.managedBlock(ForkJoinPool.java:3640)
        at akka.dispatch.MonitorableThreadFactory$AkkaForkJoinWorkerThread.blockOn(ThreadPoolBuilder.scala:167)
        at scala.concurrent.Await$.result(package.scala:107)
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:187)
        at org.apache.spark.storage.BlockManagerMaster.askDriverWithReply(BlockManagerMaster.scala:221)
        at org.apache.spark.storage.BlockManagerMaster.tell(BlockManagerMaster.scala:211)
        at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:51)
        at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:206)
        at org.apache.spark.executor.Executor.<init>(Executor.scala:90)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$receiveWithLogging$1.applyOrElse(CoarseGrainedExecutorBackend.scala:61)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
        at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
        at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
        at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
        at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend.aroundReceive(CoarseGrainedExecutorBackend.scala:36)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
    15/02/23 21:03:14 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkDriver@ip-10-64-65-168.us-west-2.compute.internal:50114] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
    15/02/23 21:03:44 WARN AkkaUtils: Error sending message in 2 attempts
    java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        [stack trace identical to the first attempt]
    15/02/23 21:03:47 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkDriver@ip-10-64-65-168.us-west-2.compute.internal:50114] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
    15/02/23 21:04:17 WARN AkkaUtils: Error sending message in 3 attempts
    java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        [stack trace identical to the first attempt]
    15/02/23 21:04:20 ERROR OneForOneStrategy: Error sending message [message = RegisterBlockManager(BlockManagerId(4, ip-10-64-65-168.us-west-2.compute.internal, 37580),277842493,Actor[akka://sparkExecutor/user/BlockManagerActor1#-2041230339])]
    org.apache.spark.SparkException: Error sending message [message = RegisterBlockManager(BlockManagerId(4, ip-10-64-65-168.us-west-2.compute.internal, 37580),277842493,Actor[akka://sparkExecutor/user/BlockManagerActor1#-2041230339])]
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:201)
        at org.apache.spark.storage.BlockManagerMaster.askDriverWithReply(BlockManagerMaster.scala:221)
        at org.apache.spark.storage.BlockManagerMaster.tell(BlockManagerMaster.scala:211)
        at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:51)
        at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:206)
        at org.apache.spark.executor.Executor.<init>(Executor.scala:90)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$receiveWithLogging$1.applyOrElse(CoarseGrainedExecutorBackend.scala:61)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
        at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
        at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
        at org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
        at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
        at org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend.aroundReceive(CoarseGrainedExecutorBackend.scala:36)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
    Caused by: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at akka.dispatch.MonitorableThreadFactory$AkkaForkJoinWorkerThread$$anon$3.block(ThreadPoolBuilder.scala:169)
        at scala.concurrent.forkjoin.ForkJoinPool.managedBlock(ForkJoinPool.java:3640)
        at akka.dispatch.MonitorableThreadFactory$AkkaForkJoinWorkerThread.blockOn(ThreadPoolBuilder.scala:167)
        at scala.concurrent.Await$.result(package.scala:107)
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:187)
        ... 24 more
    15/02/23 21:04:20 INFO CoarseGrainedExecutorBackend: Connecting to driver: akka.tcp://sparkDriver@ip-10-64-65-168.us-west-2.compute.internal:50114/user/CoarseGrainedScheduler
    15/02/23 21:04:20 ERROR CoarseGrainedExecutorBackend: Driver Disassociated [akka.tcp://sparkExecutor@ip-10-64-65-168.us-west-2.compute.internal:48541] -> [akka.tcp://sparkDriver@ip-10-64-65-168.us-west-2.compute.internal:50114] disassociated! Shutting down.

    JIRA | 2 years ago | XW Gong
    org.apache.spark.SparkException: Error sending message [message = RegisterBlockManager(BlockManagerId(4, ip-10-64-65-168.us-west-2.compute.internal, 37580),277842493,Actor[akka://sparkExecutor/user/BlockManagerActor1#-2041230339])]
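    The "Error sending message in N attempts" warnings above come from Spark 1.x's AkkaUtils.askWithReply, which retries a blocking ask a fixed number of times and, once the attempts are exhausted, wraps the last timeout in a SparkException. A minimal Python sketch of that retry-then-fail shape (the names and defaults here are illustrative, not Spark's actual API):

    ```python
    class SparkException(Exception):
        """Stands in for org.apache.spark.SparkException."""

    class AskTimeout(Exception):
        """Stands in for java.util.concurrent.TimeoutException."""

    def ask_with_reply(send, max_attempts=3, timeout_s=30):
        """Try the ask up to max_attempts times; on final failure, raise a
        SparkException chained to the last timeout, as askWithReply does."""
        last_exc = None
        for attempt in range(1, max_attempts + 1):
            try:
                return send(timeout_s)
            except AskTimeout as exc:
                last_exc = exc
                # Mirrors the log line "WARN AkkaUtils: Error sending message in N attempts"
                print(f"Error sending message in {attempt} attempts")
        raise SparkException(
            "Error sending message [message = RegisterBlockManager(...)]"
        ) from last_exc

    def unreachable_driver(timeout_s):
        """A driver that never answers: every ask times out,
        like the disassociated sparkDriver in the log above."""
        raise AskTimeout(f"Futures timed out after [{timeout_s} seconds]")
    ```

    Calling ask_with_reply(unreachable_driver) prints three warnings and then raises SparkException, matching the sequence of three WARN lines followed by the ERROR OneForOneStrategy entry in the executor log.
    
    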
  3. kafka + spark streaming: empty shuffle read and write

     Stack Overflow | 2 years ago | Jerrysdevil
     java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
  5. [Kafka-Spark-Consumer] Spark-Streaming Job Fails due to Futures timed

     gmane.org | 12 months ago
     java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
  6. Spark cluster computing framework

     gmane.org | 12 months ago
     java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
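All of the reports above bottom out in the same root cause: the executor's blocking ask to the driver (AkkaUtils.askWithReply, here registering the BlockManager) hit the default 30-second Akka ask timeout, after which the driver link disassociated. Typical triggers are a driver stalled under load or GC, or hostname/networking mismatches between container and host, worth checking here since Spark runs inside a sequenceiq/docker-spark container. A common first mitigation is to raise the Akka timeouts. The sketch below is illustrative, not a guaranteed fix: it assumes Spark 1.x, where spark.akka.askTimeout (default 30 seconds) and spark.akka.timeout are the relevant configuration keys, and the values shown are untuned examples. It is a config fragment that needs the Spark 1.x jars on the classpath and a running cluster to do anything useful. With SparkR, the same key/value settings can be passed to sparkR.init via its sparkEnvir list.

```java
import org.apache.spark.SparkConf;

// Hypothetical class name, for illustration only.
public class TimeoutConfSketch {
    public static void main(String[] args) {
        // Spark 1.x Akka settings; the values are illustrative, not tuned.
        SparkConf conf = new SparkConf()
            .setMaster("spark://ip-10-64-65-168:7077")
            .setAppName("ER")
            .set("spark.akka.askTimeout", "120")  // blocking ask timeout, seconds (default 30)
            .set("spark.akka.timeout", "300");    // general Akka communication timeout, seconds
        // Pass `conf` to the SparkContext constructor as usual.
    }
}
```

Raising timeouts only buys headroom; if the driver is genuinely unreachable (wrong advertised hostname, blocked port), registration will still fail, just later.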


Root Cause Analysis

  1. java.util.concurrent.TimeoutException

    Futures timed out after [30 seconds]

    at scala.concurrent.impl.Promise$DefaultPromise.ready()
  2. Scala
    Await$$anonfun$result$1.apply
    1. scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
    2. scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
    3. scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
    3 frames
  3. Akka Actor
    MonitorableThreadFactory$AkkaForkJoinWorkerThread$$anon$3.block
    1. akka.dispatch.MonitorableThreadFactory$AkkaForkJoinWorkerThread$$anon$3.block(ThreadPoolBuilder.scala:169)
    1 frame
  4. Scala
    ForkJoinPool.managedBlock
    1. scala.concurrent.forkjoin.ForkJoinPool.managedBlock(ForkJoinPool.java:3640)
    1 frame
  5. Akka Actor
    MonitorableThreadFactory$AkkaForkJoinWorkerThread.blockOn
    1. akka.dispatch.MonitorableThreadFactory$AkkaForkJoinWorkerThread.blockOn(ThreadPoolBuilder.scala:167)
    1 frame
  6. Scala
    Await$.result
    1. scala.concurrent.Await$.result(package.scala:107)
    1 frame
  7. Spark
    CoarseGrainedExecutorBackend$$anonfun$receiveWithLogging$1.applyOrElse
    1. org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:187)
    2. org.apache.spark.storage.BlockManagerMaster.askDriverWithReply(BlockManagerMaster.scala:221)
    3. org.apache.spark.storage.BlockManagerMaster.tell(BlockManagerMaster.scala:211)
    4. org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:51)
    5. org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:206)
    6. org.apache.spark.executor.Executor.<init>(Executor.scala:90)
    7. org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$receiveWithLogging$1.applyOrElse(CoarseGrainedExecutorBackend.scala:61)
    7 frames
  8. Scala
    AbstractPartialFunction$mcVL$sp.apply
    1. scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
    2. scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
    3. scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
    3 frames
  9. Spark
    ActorLogReceive$$anon$1.apply
    1. org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:53)
    2. org.apache.spark.util.ActorLogReceive$$anon$1.apply(ActorLogReceive.scala:42)
    2 frames
  10. Scala
    PartialFunction$class.applyOrElse
    1. scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
    1 frame
  11. Spark
    ActorLogReceive$$anon$1.applyOrElse
    1. org.apache.spark.util.ActorLogReceive$$anon$1.applyOrElse(ActorLogReceive.scala:42)
    1 frame
  12. Akka Actor
    Actor$class.aroundReceive
    1. akka.actor.Actor$class.aroundReceive(Actor.scala:465)
    1 frame
  13. Spark
    CoarseGrainedExecutorBackend.aroundReceive
    1. org.apache.spark.executor.CoarseGrainedExecutorBackend.aroundReceive(CoarseGrainedExecutorBackend.scala:36)
    1 frame
  14. Akka Actor
    ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec
    1. akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    2. akka.actor.ActorCell.invoke(ActorCell.scala:487)
    3. akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
    4. akka.dispatch.Mailbox.run(Mailbox.scala:220)
    5. akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
    5 frames
  15. Scala
    ForkJoinWorkerThread.run
    1. scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    2. scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    3. scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    4. scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
    4 frames
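The frames above show the standard blocking-await pattern: the executor thread parks in scala.concurrent.Await.result until the driver replies or the ask timeout fires, and the timeout surfaces as java.util.concurrent.TimeoutException. The mechanism can be reproduced in miniature; this is a Java sketch using CompletableFuture in place of a Scala/Akka future, with an invented class name, not Spark's actual code path.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class FutureTimeoutDemo {
    // Block on a reply for at most `seconds`; return true if the wait timed out.
    // Same shape as Await.result(driverActor ? RegisterBlockManager(...), askTimeout).
    public static boolean timesOut(CompletableFuture<String> reply, long seconds) {
        try {
            reply.get(seconds, TimeUnit.SECONDS);
            return false;                     // driver answered in time
        } catch (TimeoutException e) {
            return true;                      // "Futures timed out after [N seconds]"
        } catch (Exception e) {
            return false;                     // interrupted or failed for another reason
        }
    }

    public static void main(String[] args) {
        // A future that is never completed, standing in for a driver that never replies.
        boolean timedOut = timesOut(new CompletableFuture<String>(), 1);
        System.out.println(timedOut ? "timed out" : "completed");  // prints "timed out" after ~1s
    }
}
```

The point of the sketch is that nothing is "broken" on the waiting side: the executor did everything right and simply never heard back, which is why the fix is to look at the driver and the network rather than at the executor.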