org.apache.spark.SparkException: Job cancelled because SparkContext was shut down

There are no available Samebug tips for this exception. Do you have an idea how to solve this issue? A short tip would help users who saw this issue last week.

  • Spark OOM Logs, via pastebin (post number 3078231), author unknown
  • GitHub comment 109#130664925, via GitHub by cacti77
  • GitHub comment 109#134049748, via GitHub by velvia
  • GitHub comment 54#71801487, via GitHub by velvia
  • OOM in spark pagerank, via Stack Overflow by Piyush Agal
  • Spark OutOfMemoryError when adding executors, via Stack Overflow by Belag
    org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:699)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:698)
        at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
        at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:698)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1411)
        at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
        at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1346)
        at org.apache.spark.SparkContext.stop(SparkContext.scala:1380)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$$anon$1.run(YarnClientSchedulerBackend.scala:143)
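    A possible tip: this exception means the SparkContext was stopped while jobs were still running. The stack trace shows the shutdown was triggered from YarnClientSchedulerBackend, which typically happens when YARN kills the application, often for exceeding container memory limits, consistent with the OOM reports linked above. Check the YARN NodeManager logs for container-killed messages to confirm, then raise the executor memory and the off-heap overhead YARN accounts for. A hedged sketch of a spark-submit invocation; the memory values are illustrative and workload-dependent, and the application class and jar name are hypothetical placeholders:

```shell
# Give executors more heap and more off-heap overhead so YARN
# does not kill the containers for exceeding their memory limit.
# Tune the numbers to your cluster; these are examples only.
spark-submit \
  --master yarn-client \
  --executor-memory 4g \
  --driver-memory 2g \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  --class com.example.MyApp \
  my-app.jar
```

    If the containers are not being killed by YARN, look instead for an explicit sc.stop() or an uncaught driver-side exception earlier in the driver log: the SparkException above is a symptom of the shutdown, not its cause.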
