org.apache.spark.SparkException: Job cancelled because SparkContext was shut down

spark-user | Timothy Sum Hon Mun | 1 year ago
Your exception is missing from the Samebug knowledge base. Here are the best solutions found on the Internet.
  1. Re: Spark Effects of Driver Memory, Executor Memory, Driver Memory Overhead and Executor Memory Overhead on success of job runs

    spark-user | 1 year ago | Timothy Sum Hon Mun
    org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
  2. Spark cluster computing framework

    gmane.org | 11 months ago
    org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
  3. SparkR Job 100 Minutes Timeout

    Stack Overflow | 1 year ago | Shanika
    org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
  4. Job cancelled because SparkContext was shut down while saving dataframe as hive table

    Stack Overflow | 6 months ago | vatsal mevada
    org.apache.spark.SparkException: Job 2 cancelled because SparkContext was shut down
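Several of the threads above tie this exception to driver or executor memory sizing: when YARN kills a container for exceeding its allotment, the SparkContext is torn down and every in-flight job fails with this message. A hedged sketch of the spark-submit knobs those threads discuss; all values are illustrative and must be tuned to your cluster (the `spark.yarn.*.memoryOverhead` names apply to the Spark 1.x line shown in the stack trace below):

```shell
# Illustrative values only -- size these to your cluster and workload.
# Raising the overheads leaves headroom for off-heap allocations so YARN
# does not kill the container (which would shut down the SparkContext).
spark-submit \
  --master yarn \
  --driver-memory 4g \
  --executor-memory 8g \
  --conf spark.yarn.driver.memoryOverhead=1024 \
  --conf spark.yarn.executor.memoryOverhead=2048 \
  your_app.jar
```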


Root Cause Analysis

  1. org.apache.spark.SparkException

    Job cancelled because SparkContext was shut down

    at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply()
  2. Spark
    DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply
    1. org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:736)
    2. org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:735)
    2 frames
  3. Scala
    HashSet.foreach
    1. scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
    1 frame
  4. Spark
    SparkShutdownHookManager$$anonfun$runAll$1.apply
    1. org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:735)
    2. org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1468)
    3. org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
    4. org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1403)
    5. org.apache.spark.SparkContext.stop(SparkContext.scala:1642)
    6. org.apache.spark.SparkContext$$anonfun$3.apply$mcV$sp(SparkContext.scala:559)
    7. org.apache.spark.util.SparkShutdownHook.run(Utils.scala:2292)
    8. org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(Utils.scala:2262)
    9. org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(Utils.scala:2262)
    10. org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(Utils.scala:2262)
    11. org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772)
    12. org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(Utils.scala:2262)
    13. org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2262)
    14. org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2262)
    14 frames
  5. Scala
    Try$.apply
    1. scala.util.Try$.apply(Try.scala:161)
    1 frame
  6. Spark
    SparkShutdownHookManager$$anon$6.run
    1. org.apache.spark.util.SparkShutdownHookManager.runAll(Utils.scala:2262)
    2. org.apache.spark.util.SparkShutdownHookManager$$anon$6.run(Utils.scala:2244)
    2 frames
  7. Hadoop
    ShutdownHookManager$1.run
    1. org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
    1 frame
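Read bottom-up, the frames show the shutdown path: Hadoop's ShutdownHookManager fires Spark's SparkShutdownHookManager, which runs its registered hooks and calls SparkContext.stop(); that stops the DAGScheduler, whose cleanUpAfterSchedulerStop fails every still-active job with this exception. A toy sketch (not Spark's actual code) of a priority-ordered hook manager in the style of SparkShutdownHookManager, to illustrate why the context-stopping hook runs first:

```python
# Toy sketch mimicking org.apache.spark.util.SparkShutdownHookManager:
# hooks run in descending priority order at shutdown. Spark's hook that
# stops the SparkContext runs before lower-priority cleanup, and stopping
# the context is what cancels any jobs still running.
class ShutdownHookManager:
    def __init__(self):
        self._hooks = []   # (priority, insertion_order, callable)
        self._order = 0

    def add(self, priority, fn):
        self._hooks.append((priority, self._order, fn))
        self._order += 1

    def run_all(self):
        # Highest priority first; insertion order breaks ties.
        for _, _, fn in sorted(self._hooks, key=lambda t: (-t[0], t[1])):
            fn()

events = []
mgr = ShutdownHookManager()
mgr.add(50, lambda: events.append("stop SparkContext (cancels active jobs)"))
mgr.add(25, lambda: events.append("delete temp dirs"))
mgr.run_all()
print(events)
```

If your job dies with this message without you calling `sc.stop()`, the cause is usually external: the driver JVM exiting (OOM, `System.exit`, an uncaught exception on the main thread) or the cluster manager killing the application, which triggers exactly this hook chain.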