org.apache.spark.SparkException: Job cancelled because SparkContext was shut down

Stack Overflow | user2895478 | 2 years ago
  1. Spark Streaming Stateful Network Word Count (see the sketch after this list)
     Stack Overflow | 2 years ago | user2895478
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
  2. SparkContext shut down if runJob() returns a stream runtime object
     GitHub | 2 years ago | adrianfr
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
  3. Spark OutOfMemoryError when adding executors
     Stack Overflow | 2 years ago | Belag
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
  4. Spark Context shutdown when I parallelize a large list
     Stack Overflow | 2 years ago | StackG
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
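
For reference, a minimal sketch of the stateful network word count that the first item refers to, following the standard Spark Streaming example. The Spark 1.x API, the local[2] master and a socket source on localhost:9999 are assumptions here, not details taken from the original post, and whatever made the SparkContext shut down in that question is not shown on this page.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._ // pair-DStream implicits on older 1.x releases

object StatefulNetworkWordCount {
  def main(args: Array[String]): Unit = {
    // At least 2 local threads: one for the socket receiver, one for processing.
    val conf = new SparkConf().setAppName("StatefulNetworkWordCount").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(1))
    ssc.checkpoint("checkpoint") // updateStateByKey requires a checkpoint directory

    // Running total per word: this batch's counts plus whatever was accumulated so far.
    val updateFunc = (values: Seq[Int], state: Option[Int]) => Some(values.sum + state.getOrElse(0))

    val lines = ssc.socketTextStream("localhost", 9999)
    val wordCounts = lines.flatMap(_.split(" "))
      .map(word => (word, 1))
      .updateStateByKey[Int](updateFunc)
    wordCounts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}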


Root Cause Analysis

org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:639)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:638)
    at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
    at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:638)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.postStop(DAGScheduler.scala:1215)
    at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:201)
    at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:163)
    at akka.actor.ActorCell.terminate(ActorCell.scala:338)
    at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:431)
    at akka.actor.ActorCell.systemInvoke(ActorCell.scala:447)
    at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:262)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:240)
    at akka.dispatch.Mailbox.run(Mailbox.scala:219)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
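
The frames above show DAGScheduler.cleanUpAfterSchedulerStop failing every pending job while the scheduler's actor is stopping, i.e. the SparkContext went down while a job was still in flight. Below is a minimal sketch that reproduces the same (or, on newer Spark versions, a very similar) message; the local master, the artificial sleep and the explicit sc.stop() are assumptions for illustration and are not taken from any of the posts above.

import org.apache.spark.{SparkConf, SparkContext}

object ShutdownDuringJob {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("shutdown-demo").setMaster("local[2]"))

    // Run a deliberately slow job on a background thread.
    val job = new Thread(new Runnable {
      def run(): Unit = {
        try {
          sc.parallelize(1 to 1000000, 8).map { i => Thread.sleep(1); i }.count()
        } catch {
          // The waiting job is failed with:
          //   org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
          case e: org.apache.spark.SparkException => println(e.getMessage)
        }
      }
    })
    job.start()

    Thread.sleep(2000) // let a few tasks start
    sc.stop()          // stopping the context makes the DAGScheduler cancel all in-flight jobs
    job.join()
  }
}

In real deployments the stop() is rarely this explicit: the driver JVM exits or crashes (for example after an OutOfMemoryError, as in the reports listed above), and the resulting context shutdown cancels whatever jobs were still running, surfacing this exception at the call site of the action.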