org.apache.spark.SparkException: Error sending message [message = Heartbeat(281,[Lscala.Tuple2;@4d9294db,BlockManagerId(281, ip-172-31-7-55.eu-west-1.compute.internal, 52303))]

github.com | 5 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1.

    The thread started by Executor.startDriverHeartbeater can actually terminate the whole executor if AkkaUtils.askWithReply[HeartbeatResponse] throws an exception. I don't think we should quit the executor this way. At the very least, we would want to log a more meaningful exception than simply:

    {code}
    14/09/20 06:38:12 WARN AkkaUtils: Error sending message in 1 attempts
    java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:176)
        at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:379)
    14/09/20 06:38:45 WARN AkkaUtils: Error sending message in 2 attempts
    java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:176)
        at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:379)
    14/09/20 06:39:18 WARN AkkaUtils: Error sending message in 3 attempts
    java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:176)
        at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:379)
    14/09/20 06:39:21 ERROR ExecutorUncaughtExceptionHandler: Uncaught exception in thread Thread[Driver Heartbeater,5,main]
    org.apache.spark.SparkException: Error sending message [message = Heartbeat(281,[Lscala.Tuple2;@4d9294db,BlockManagerId(281, ip-172-31-7-55.eu-west-1.compute.internal, 52303))]
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:190)
        at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:379)
    Caused by: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:176)
        ... 1 more
    {code}

    (A sketch of a more defensive heartbeat loop follows the solutions list below.)

    Apache's JIRA Issue Tracker | 2 years ago | Reynold Xin
    org.apache.spark.SparkException: Error sending message [message = Heartbeat(281,[Lscala.Tuple2;@4d9294db,BlockManagerId(281, ip-172-31-7-55.eu-west-1.compute.internal, 52303))]
  2.

    eco-release-metadata/RELEASENOTES.1.2.0.md at master · aw-was-here/eco-release-metadata · GitHub

    github.com | 5 months ago
    org.apache.spark.SparkException: Error sending message [message = Heartbeat(281,[Lscala.Tuple2;@4d9294db,BlockManagerId(281, ip-172-31-7-55.eu-west-1.compute.internal, 52303))]
  3.

    [SPARK-3612] Executor shouldn't quit if heartbeat message fails to reach the driver - ASF JIRA

    apache.org | 2 years ago
    org.apache.spark.SparkException: Error sending message [message = Heartbeat(281,[Lscala.Tuple2;@4d9294db,BlockManagerId(281, ip-172-31-7-55.eu-west-1.compute.internal, 52303))]
  4.

    Spark cluster computing framework

    gmane.org | 11 months ago
    org.apache.spark.SparkException: Error sending message [message = Heartbeat(2,[Lscala.Tuple2;@64678d48,BlockManagerId(2, node09.demo.hadoop, 50044))]
  5.

    Big data: Spark Streaming throws the following exception after running for a while; how can it be resolved? (CSDN Q&A channel)

    csdn.net | 1 year ago
    org.apache.spark.SparkException: Error sending message [message = Heartbeat(0,[Lscala.Tuple2;@544fc1ff,BlockManagerId(0, iZ94w2tczvjZ, 41595))]
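
The common thread in SPARK-3612 and the reports above is that the Driver Heartbeater thread lets the TimeoutException thrown by AkkaUtils.askWithReply escape, so ExecutorUncaughtExceptionHandler tears down the whole executor. Below is a minimal sketch of the more defensive pattern the JIRA asks for: catch and log heartbeat failures and keep the loop alive. HeartbeatSender, the sendHeartbeat() callback and the 10-second interval are illustrative assumptions, not Spark's actual internals.

{code}
import java.util.concurrent.{Executors, TimeUnit, TimeoutException}

// Illustrative stand-in for the executor's "Driver Heartbeater" thread.
// sendHeartbeat() represents the blocking ask to the driver
// (AkkaUtils.askWithReply[HeartbeatResponse] in Spark 1.x) and may throw.
class HeartbeatSender(sendHeartbeat: () => Unit) {
  private val scheduler = Executors.newSingleThreadScheduledExecutor()

  def start(intervalSeconds: Long = 10L): Unit = {
    val task = new Runnable {
      override def run(): Unit =
        try {
          sendHeartbeat()
        } catch {
          // Log and carry on instead of letting the exception reach the
          // uncaught-exception handler, which would kill the executor.
          case e: TimeoutException =>
            System.err.println(s"WARN heartbeat to driver timed out: ${e.getMessage}")
          case e: Exception =>
            System.err.println(s"WARN failed to send heartbeat: ${e.getMessage}")
        }
    }
    scheduler.scheduleAtFixedRate(task, 0L, intervalSeconds, TimeUnit.SECONDS)
  }

  def stop(): Unit = scheduler.shutdownNow()
}
{code}

This mirrors the spirit of the change SPARK-3612 asks for: a failed heartbeat becomes a logged warning rather than a fatal error for the executor.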

  1. johnxfly 1 time, last 1 month ago
  2. poroszd 2 times, last 2 months ago
  3. Nikolay Rybak 1 time, last 4 months ago
  4. tyson925 6 times, last 5 months ago
  5. kid 1 time, last 6 months ago
3 more registered users
28 unregistered visitors

Root Cause Analysis

  1. java.util.concurrent.TimeoutException

    Futures timed out after [30 seconds]

    at scala.concurrent.impl.Promise$DefaultPromise.ready()
  2. Scala
    Await$.result
    1. scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
    2. scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
    3. scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
    4. scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
    5. scala.concurrent.Await$.result(package.scala:107)
    5 frames
  3. Spark
    Executor$$anon$1.run
    1. org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:176)
    2. org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:379)
    2 frames
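
The frames above show where the 30-second limit is enforced: AkkaUtils.askWithReply blocks on scala.concurrent.Await.result, which throws java.util.concurrent.TimeoutException when the driver's HeartbeatResponse does not arrive in time, typically because the driver is overloaded or stuck in long GC pauses. A self-contained sketch of that failure mode, assuming nothing beyond the Scala standard library:

{code}
import scala.concurrent.{Await, Future, Promise}
import scala.concurrent.duration._

object AwaitTimeoutDemo extends App {
  // A reply that never arrives, standing in for a driver too busy to
  // answer the executor's Heartbeat ask.
  val neverCompleted: Future[String] = Promise[String]().future

  try {
    // Await.result blocks the calling thread; once the duration elapses it
    // throws java.util.concurrent.TimeoutException with the message
    // "Futures timed out after [30 seconds]", as seen in the stack trace.
    Await.result(neverCompleted, 30.seconds)
  } catch {
    case e: java.util.concurrent.TimeoutException =>
      println(s"heartbeat ask failed: ${e.getMessage}")
  }
}
{code}

If the driver is merely slow rather than gone, the usual operational mitigation is to relax the relevant timeouts instead of changing code. Property names and accepted value formats vary across Spark versions (Akka-era settings such as spark.akka.timeout applied on 1.x), so treat the names and values below as assumptions to verify against your version's configuration docs:

{code}
import org.apache.spark.SparkConf

// Assumed property names and values; verify against your Spark version.
val conf = new SparkConf()
  .set("spark.executor.heartbeatInterval", "30s") // how often executors heartbeat the driver
  .set("spark.network.timeout", "300s")           // default timeout for network interactions
{code}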