org.apache.spark.rpc.RpcTimeoutException: Recipient[Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-1206596405]] had already been terminated.. This timeout is controlled by spark.rpc.askTimeout at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcEnv.scala:214)

cloudera.com | 8 months ago
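
The message points at spark.rpc.askTimeout as the controlling setting, so a common first step is simply to raise it (together with spark.network.timeout, which it defaults to) before the SparkContext is created. A minimal sketch, assuming an ordinary SparkConf-driven driver; the 300s values are illustrative, not recommendations:

    import org.apache.spark.{SparkConf, SparkContext}

    // Raise the RPC ask timeout named in the exception, plus the general
    // network timeout it defaults to; the 120 seconds seen in some of the
    // reports below is the usual default.
    val conf = new SparkConf()
      .setAppName("rpc-timeout-sketch")
      .set("spark.rpc.askTimeout", "300s")
      .set("spark.network.timeout", "300s")

    val sc = new SparkContext(conf)

Note that the "had already been terminated" variant usually means the driver-side endpoint itself went down (for example after an OutOfMemoryError, as in one of the reports below), so the timeout is often a symptom rather than the root cause.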
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. ERROR ActorSystemImpl - Running my spark job on ya... - Cloudera Community

    cloudera.com | 8 months ago
    org.apache.spark.rpc.RpcTimeoutException: Recipient[Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-1206596405]] had already been terminated.. This timeout is controlled by spark.rpc.askTimeout at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcEnv.scala:214)
  2. Futures timed out after [120 seconds]

    gmane.org | 1 year ago
    org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcEnv.scala:214)
  3. Re: Exception in task 0.0 in stage 13.0 (TID 13) java.lang.OutOfMemoryError: Java heap space

    mahout-user | 1 year ago | Dmitriy Lyubimov
    org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:48)
  4. GitHub comment 64#252462840

    GitHub | 7 months ago | yxzf
    org.apache.spark.rpc.RpcTimeoutException: Recipient[Actor[akka://sparkDriver/user/CoarseGrainedScheduler#325208432]] had already been terminated.. This timeout is controlled by spark.rpc.askTimeout
  5. Unable to view Twitter stream using a Spark streaming application

    Stack Overflow | 2 years ago | nikos
    org.apache.spark.rpc.RpcTimeoutException: Recipient[Actor[akka://sparkDriver/user/BlockManagerMaster#1080029491]] had already been terminated.. This timeout is controlled by spark.rpc.askTimeout

Users with the same issue

  1. Nikolay Rybak, 1 time, last 3 months ago
  2. johnxfly, 4 times, last 1 month ago
  3. tyson925, 1 time, last 11 months ago
  1 unregistered visitor

Root Cause Analysis

  1. org.apache.spark.rpc.RpcTimeoutException

    Recipient[Actor[akka://sparkDriver/user/CoarseGrainedScheduler#-1206596405]] had already been terminated.. This timeout is controlled by spark.rpc.askTimeout at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcEnv.scala:214)

    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse()
  2. org.apache.spark
    RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse
    1. org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcEnv.scala:229)
    2. org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcEnv.scala:225)
    2 frames
  3. Scala
    Failure.recover
    1. scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
    2. scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185)
    3. scala.util.Try$.apply(Try.scala:161)
    4. scala.util.Failure.recover(Try.scala:185)
    4 frames
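
The Scala frames above are the standard Failure.recover path: the failed ask future's exception is passed through a partial function (addMessageIfTimeout) that rethrows it with the spark.rpc.askTimeout hint appended. A minimal sketch of the same pattern, not Spark's actual code:

    import java.util.concurrent.TimeoutException
    import scala.util.{Failure, Try}

    // recover() applies the partial function to a Failure; throwing inside
    // it yields a new Failure carrying the more descriptive exception.
    val failed: Try[String] =
      Failure(new TimeoutException("Futures timed out after [120 seconds]"))

    val rewrapped: Try[String] = failed.recover {
      case te: TimeoutException =>
        throw new RuntimeException(
          te.getMessage + ". This timeout is controlled by spark.rpc.askTimeout", te)
    }
    // rewrapped is a Failure holding the rewrapped exception, analogous to
    // the RpcTimeoutException built by createRpcTimeoutException.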