org.apache.spark.SparkException: Job 34 cancelled because SparkContext was shut down

GitHub | jmuhlenkamp | 8 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. SparkContext was shut down on long running as_h2o_frame
     GitHub | 8 months ago | jmuhlenkamp
     org.apache.spark.SparkException: Job 34 cancelled because SparkContext was shut down
  2. GitHub comment 109#130664925
     GitHub | 2 years ago | cacti77
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
  3. GitHub comment 109#134049748
     GitHub | 2 years ago | velvia
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
  4. Apache Spark Developers List - Spark fails after 6000s because of akka
     nabble.com | 9 months ago
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
     (a configuration sketch for this kind of timeout follows the list)
  5. Spark example Pi fails to run on yarn-client mode
     Unix & Linux | 11 months ago | Henry
     org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
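
Several of the threads above trace the shutdown back to the cluster rather than to the job itself: the driver loses contact with its executors (akka/network timeouts) or YARN tears the application down, and the SparkContext is stopped while work is still in flight. Below is a minimal sketch, assuming the shutdown is caused by a network or heartbeat timeout; spark.network.timeout and spark.executor.heartbeatInterval are standard Spark settings, but the values shown are only illustrative and need tuning for the actual cluster.

    import org.apache.spark.{SparkConf, SparkContext}

    object LongRunningJob {
      def main(args: Array[String]): Unit = {
        // Sketch: more generous timeouts for a long-running job whose context
        // keeps being shut down by the cluster manager; values are illustrative.
        val conf = new SparkConf()
          .setAppName("long-running-job")
          .set("spark.network.timeout", "600s")           // default is 120s
          .set("spark.executor.heartbeatInterval", "60s") // keep well below spark.network.timeout

        val sc = new SparkContext(conf)
        try {
          // ... long-running work ...
        } finally {
          sc.stop()
        }
      }
    }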

Users with the same issue:

  1. Nikolay Rybak, 3 times, last 4 months ago
  2. tyson925, 1 time, last 5 days ago
  3. bandoca, 1 time, last 1 week ago
  4. rp, 1 time, last 3 weeks ago
  5. johnxfly, 1 time, last 2 months ago
3 unregistered visitors

Root Cause Analysis

org.apache.spark.SparkException: Job 34 cancelled because SparkContext was shut down
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
    at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
    at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
    at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
    at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
    at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1740)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1219)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1739)
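
The frames above show where the message originates: SparkContext.stop() stops the DAGScheduler, and cleanUpAfterSchedulerStop fails every job that is still active with "Job N cancelled because SparkContext was shut down". The job named in the error is therefore usually a victim, not the cause; something else stopped the context (an explicit sc.stop(), a fatal error on the driver, or the cluster manager killing the application) while the job was running. Below is a minimal sketch that reproduces the same failure mode by stopping the context from a second thread mid-job; the extra thread and sleeps exist only to force the race.

    import org.apache.spark.{SparkConf, SparkContext}

    object ShutdownRepro {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("shutdown-repro").setMaster("local[2]"))

        // Stop the context from another thread while the action below is still running.
        new Thread(new Runnable {
          def run(): Unit = { Thread.sleep(2000); sc.stop() }
        }).start()

        // A deliberately slow action: each element sleeps, so the stop() above
        // lands mid-job and count() fails with
        // org.apache.spark.SparkException: Job ... cancelled because SparkContext was shut down
        sc.parallelize(1 to 100, 4).map { i => Thread.sleep(200); i }.count()
      }
    }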