org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down

Unix & Linux | Henry | 8 months ago
Here are the best solutions we found on the Internet.
  1. Spark example Pi fails to run on yarn-client mode

     Unix & Linux | 8 months ago | Henry
     org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down

  2. Spark app can run in standalone mode but can't run in yarn cluster

     Stack Overflow | 4 months ago | fuxiuyin
     org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down

  3. Spark streaming with mllib error when running on yarn

     Stack Overflow | 4 hours ago | Eka Cahya Pratama
     org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down

  4. Talend Open Integration Solution | 9 months ago | lei ju
     org.apache.spark.SparkException: Job 1 cancelled because SparkContext was shut down

  5. Job cancelled because SparkContext was shut down while saving dataframe as hive table

     Stack Overflow | 5 months ago | vatsal mevada
     org.apache.spark.SparkException: Job 2 cancelled because SparkContext was shut down
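The first case above is the stock SparkPi example failing in yarn-client mode. For context, this is a typical way that example is launched on a Spark 1.x cluster; the examples jar path varies by distribution and version, so the one shown here is illustrative only:

```shell
# Launch the bundled SparkPi example against YARN in client mode.
# The jar path below is a common Spark 1.x layout, not a guaranteed one.
spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn \
  --deploy-mode client \
  $SPARK_HOME/lib/spark-examples-*.jar 100
```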

Root Cause Analysis

This exception is usually a symptom rather than the root cause: something stopped the SparkContext while the job was still running. In the trace below, YarnClientSchedulerBackend's MonitorThread calls SparkContext.stop after the YARN application terminates, so the actual failure reason is typically found in the YARN application logs rather than in this trace.

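When the YARN application has already terminated, the aggregated container logs usually hold the real error. A typical way to pull them with the Hadoop CLI (the application id here is a placeholder; use the one printed in the Spark console output):

```shell
# Fetch aggregated YARN logs for the failed Spark application.
# Replace <applicationId> with the real id, e.g. application_1479891330_0001.
yarn logs -applicationId <applicationId>
```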
  1. org.apache.spark.SparkException

    Job 0 cancelled because SparkContext was shut down

    at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply()
  2. Spark
    DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply
    1. org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
    2. org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
    2 frames
  3. Scala
    HashSet.foreach
    1. scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
    1 frame
  4. Spark
    RDD.reduce
    1. org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
    2. org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
    3. org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
    4. org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
    5. org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1740)
    6. org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1229)
    7. org.apache.spark.SparkContext.stop(SparkContext.scala:1739)
    8. org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:147)
    9. org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
    10. org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
    11. org.apache.spark.SparkContext.runJob(SparkContext.scala:1952)
    12. org.apache.spark.rdd.RDD$$anonfun$reduce$1.apply(RDD.scala:1025)
    13. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    14. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    15. org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    16. org.apache.spark.rdd.RDD.reduce(RDD.scala:1007)
    16 frames
  5. Spark Examples
    SparkPi.main
    1. org.apache.spark.examples.SparkPi$.main(SparkPi.scala:36)
    2. org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    2 frames
  6. Java RT
    Method.invoke
    1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    4. java.lang.reflect.Method.invoke(Method.java:606)
    4 frames
  7. Spark
    SparkSubmit.main
    1. org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    2. org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    3. org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    4. org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    5. org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    5 frames