org.apache.spark.SparkException: Error sending message [message = StopAllReceivers]


Solutions on the web (249)

  • via Talend Open Integration Solution by lei ju, 11 months ago
    Error sending message [message = StopAllReceivers]
  • via Google Groups by hart jo, 11 months ago
    Error sending message [message = StopExecutors]
  • Error sending message [message = StopBlockManagerMaster]
Stack trace

    org.apache.spark.SparkException: Error sending message [message = StopAllReceivers]
        at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:118)
        at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:77)
        at org.apache.spark.streaming.scheduler.ReceiverTracker.stop(ReceiverTracker.scala:170)
        at org.apache.spark.streaming.scheduler.JobScheduler.stop(JobScheduler.scala:93)
        at org.apache.spark.streaming.StreamingContext.stop(StreamingContext.scala:709)
        at org.apache.spark.streaming.StreamingContext.stop(StreamingContext.scala:682)
        at org.apache.spark.streaming.api.java.JavaStreamingContext.stop(JavaStreamingContext.scala:662)
        at bigdata.spark_0_1.spark.runJobInTOS(spark.java:898)
        at bigdata.spark_0_1.spark.main(spark.java:773)
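The trace shows StreamingContext.stop() asking the ReceiverTracker RPC endpoint to stop all receivers; the SparkException is raised when that ask fails or times out, for example because executors hosting receivers were already lost or stop() was invoked while the context was mid-shutdown. One common mitigation is to give the RPC layer more time. A minimal sketch of the relevant Spark configuration properties (the values shown are illustrative, not recommendations):

```
# spark-defaults.conf — illustrative values; tune to your cluster
spark.rpc.askTimeout     120s
spark.network.timeout    300s
```

Calling JavaStreamingContext.stop(stopSparkContext, stopGracefully) with stopGracefully set to true also lets receivers drain in-flight data before the tracker is asked to stop, which avoids racing the StopAllReceivers message against a context that is already tearing down.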


Users with the same issue: reported 3 times (5 months ago) and 5 times (7 months ago).