org.apache.spark.SparkException: Error sending message [message = Heartbeat(driver,[Lscala.Tuple2;@3475518a,BlockManagerId(driver, localhost, 63823))]

GitHub | yxzf | 5 months ago
  1. GitHub comment 1370#233621519
     GitHub | 5 months ago | yxzf
     org.apache.spark.SparkException: Error sending message [message = Heartbeat(driver,[Lscala.Tuple2;@3475518a,BlockManagerId(driver, localhost, 63823))]
  2. GitHub comment 572#249369489
     GitHub | 3 months ago | car2008
     org.apache.spark.SparkException: Error sending message [message = Heartbeat(driver,[Lscala.Tuple2;@d5667a9,BlockManagerId(driver, localhost, 44173))]
  3. java.lang.OutOfMemoryError: Java heap space when running the MNIST example
     GitHub | 8 months ago | romeokienzler
     org.apache.spark.SparkException: Error sending message [message = StopBlockManagerMaster]
  4. Spark Application Not Recovering when Executor Lost
     Stack Overflow | 6 months ago | user481a
     org.apache.spark.SparkException: Error sending message [message = RemoveExecutor(1)]
  5. Apache Spark User List - "CANNOT FIND ADDRESS"
     nabble.com | 8 months ago
     java.util.concurrent.TimeoutException: Futures timed out after [10 seconds]

Root Cause Analysis

java.util.concurrent.TimeoutException: Futures timed out after [10 seconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
    at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
    at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
    at scala.concurrent.Await$.result(package.scala:107)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:101)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$reportHeartBeat(Executor.scala:449)
    at org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply$mcV$sp(Executor.scala:470)
    at org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply(Executor.scala:470)
    at org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply(Executor.scala:470)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1765)
    at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:470)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
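
The trace shows the executor's heartbeat RPC (RpcEndpointRef.askWithRetry, called from Executor.reportHeartBeat) timing out after the default 10 seconds, typically because the driver is too slow to reply, for example under memory or GC pressure as in the OutOfMemoryError report above. Below is a minimal Scala sketch of one commonly reported mitigation: raising the standard spark.network.timeout and spark.executor.heartbeatInterval settings. The application name, local master, and sample job are illustrative assumptions, not taken from the reports above.

    import org.apache.spark.{SparkConf, SparkContext}

    object HeartbeatTimeoutWorkaround {
      def main(args: Array[String]): Unit = {
        // Illustrative configuration: raise the umbrella network/RPC timeout and the
        // heartbeat interval so a momentarily busy driver does not trip the default
        // 10-second Future timeout seen in the stack trace above.
        val conf = new SparkConf()
          .setAppName("heartbeat-timeout-workaround")      // assumed name, not from the reports
          .setMaster("local[*]")                           // assumed local mode, matching BlockManagerId(driver, localhost, ...)
          .set("spark.network.timeout", "300s")            // umbrella timeout for network/RPC interactions
          .set("spark.executor.heartbeatInterval", "30s")  // keep well below spark.network.timeout

        val sc = new SparkContext(conf)
        try {
          // Trivial job just to exercise the heartbeat path.
          println(sc.parallelize(1 to 1000).sum())
        } finally {
          sc.stop()
        }
      }
    }

The same settings can also be passed on the command line without changing application code, e.g. spark-submit --conf spark.network.timeout=300s --conf spark.executor.heartbeatInterval=30s.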