
Recommended solutions based on your search

Solutions on the web

via cloudera.com by Unknown author, 1 year ago
Recipient[Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-1206596405]] had already been terminated.. This timeout is controlled by spark.rpc.askTimeout at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcEnv.scala:214)
via mahout-user by Dmitriy Lyubimov, 2 years ago
Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:48)
org.apache.spark.rpc.RpcTimeoutException: Recipient[Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-1206596405]] had already been terminated.. This timeout is controlled by spark.rpc.askTimeout
    at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcEnv.scala:214)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcEnv.scala:229)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcEnv.scala:225)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
    at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185)
    at scala.util.Try$.apply(Try.scala:161)
    at scala.util.Failure.recover(Try.scala:185)
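All of the messages above point at the same setting, spark.rpc.askTimeout (which falls back to spark.network.timeout when unset). A minimal sketch of raising it when building the driver configuration is shown below; it is only an illustration, not taken from any of the listed threads, and the 600s value is an arbitrary example to tune for your cluster:

import org.apache.spark.{SparkConf, SparkContext}

object AskTimeoutExample {
  def main(args: Array[String]): Unit = {
    // Raise the RPC ask timeout named in the exception message,
    // plus the broader network timeout several RPC timeouts fall back to.
    val conf = new SparkConf()
      .setAppName("ask-timeout-example")
      .set("spark.rpc.askTimeout", "600s")
      .set("spark.network.timeout", "600s")

    val sc = new SparkContext(conf)
    // ... job logic ...
    sc.stop()
  }
}

The same keys can also be passed on the command line, e.g. spark-submit --conf spark.rpc.askTimeout=600s. Note that when the recipient "had already been terminated", the scheduler endpoint was stopped before the reply arrived, so a longer timeout alone may not resolve the underlying failure.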