Thread.run() has thrown an RpcTimeoutException

We found this prefix in 2 webpages.
java.util.concurrent.TimeoutException
    at scala.concurrent.impl.Promise$DefaultPromise.ready()
    at scala.concurrent.impl.Promise$DefaultPromise.result()
    at scala.concurrent.Await$$anonfun$result$1.apply()
    at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn()
    at scala.concurrent.Await$.result()
    at org.apache.spark.rpc.RpcTimeout.awaitResult()

org.apache.spark.rpc.RpcTimeoutException
    at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException()
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse()
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse()
    at scala.runtime.AbstractPartialFunction.apply()
    at org.apache.spark.rpc.RpcTimeout.awaitResult()
    at org.apache.spark.rpc.RpcEndpointRef.askWithRetry()
    at org.apache.spark.rpc.RpcEndpointRef.askWithRetry()
    at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.removeExecutor()
    at org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint$$anonfun$receive$1.applyOrElse()
    at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp()
    at org.apache.spark.rpc.netty.Inbox.safelyCall()
    at org.apache.spark.rpc.netty.Inbox.process()
    at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run()
    at java.util.concurrent.ThreadPoolExecutor.runWorker()
    at java.util.concurrent.ThreadPoolExecutor$Worker.run()
    at java.lang.Thread.run()
  1. How to investigate failing dataproc worker processes?
     First: 2 years ago · Last: 2 years ago · Author: sthomps
  2. SPARK Job returning ExitCodeException exitCode=1 : Scala
     First: 2 years ago · Last: 2 years ago · Author: Newbie
Message                                  Number of crashes
Futures timed out after [120 seconds]    2
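The "Futures timed out after [120 seconds]" message matches Spark's default RPC ask timeout of 120s, which governs the Await.result() call visible in RpcTimeout.awaitResult() above. One common mitigation is raising the timeout in spark-defaults.conf; a minimal sketch with illustrative values (300s is an assumption, not a recommendation):

```
# spark-defaults.conf -- illustrative values, assuming the RPC timeout itself is the bottleneck
spark.network.timeout   300s   # umbrella network timeout; several RPC timeouts fall back to it
spark.rpc.askTimeout    300s   # timeout used by RpcEndpointRef.askWithRetry in the trace above
```

Note that a longer timeout only masks the symptom if the real cause is an unresponsive driver or executor (e.g. long GC pauses or a killed YARN container), so it is worth checking executor and NodeManager logs as well.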