

Solutions on the web

via Google Groups by lexk, 1 year ago
Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.lookupTimeout

via Stack Overflow by B. Smith, 4 months ago
Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout

via Stack Overflow by bourneli, 1 year ago
Recipient[Actor[akka://sparkDriver/user/BlockManagerMaster#-213595070]] had already been terminated. This timeout is controlled by spark.rpc.askTimeout

via Stack Overflow by SpringStarter, 1 year ago
Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout

via Stack Overflow by Sparknewbie, 1 year ago
Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.lookupTimeout

via Stack Overflow by Alan, 2 years ago
Cannot receive any reply in 10 seconds. This timeout is controlled by spark.executor.heartbeatInterval
java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
	at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
	at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
	at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
	at scala.concurrent.Await$.result(package.scala:107)
	at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
	at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:97)
	at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:106)
	at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36)
	at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:324)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:336)
	at org.apache.spark.SparkEnv$.createExecutorEnv(SparkEnv.scala:217)
	at org.apache.spark.executor.MesosExecutorBackend.registered(MesosExecutorBackend.scala:75)
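The solutions above share a common remedy: the executor is timing out while looking up the driver's RPC endpoint, and the relevant Spark timeout properties can be raised. A minimal sketch of a spark-defaults.conf fragment follows; the property names (spark.rpc.askTimeout, spark.rpc.lookupTimeout, spark.network.timeout, spark.executor.heartbeatInterval) are standard Spark configuration keys, but the values shown are illustrative assumptions for a congested cluster, not tuned recommendations.

```properties
# Raise the timeout for RPC ask operations (default falls back to spark.network.timeout).
spark.rpc.askTimeout            600s

# Raise the timeout for RPC remote endpoint lookups, which is what
# setupEndpointRefByURI in the stack trace above is waiting on.
spark.rpc.lookupTimeout         600s

# Umbrella default for several network-related timeouts.
spark.network.timeout           600s

# Interval between executor heartbeats to the driver; must stay well
# below spark.network.timeout or executors get marked dead.
spark.executor.heartbeatInterval 30s
```

The same properties can be passed at submit time with `--conf` (for example `spark-submit --conf spark.rpc.askTimeout=600s ...`). Raising timeouts only masks the symptom if the underlying cause is a driver that is unreachable from the executors (firewall, hostname resolution) or an overloaded driver that cannot answer registration requests.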