Solutions on the web

via GitHub by jmuhlenkamp, 1 year ago
Job 34 cancelled because SparkContext was shut down

via GitHub by dahaian, 6 months ago
Job 995 cancelled because SparkContext was shut down

via Talend Open Integration Solution by lei ju, 1 year ago

via pastebin.ca by Unknown author, 2 years ago
Job cancelled because SparkContext was shut down

via csdn.net by Unknown author, 1 year ago

via Stack Overflow by Piyush Agal, 2 years ago
Job cancelled because SparkContext was shut down
org.apache.spark.SparkException: Job 34 cancelled because SparkContext was shut down
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
	at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
	at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
	at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
	at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
	at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1740)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1219)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1739)
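The trace above shows the failure mode shared by all of these reports: SparkContext.stop() runs while a job is still active, so the DAGScheduler cancels every outstanding job during cleanUpAfterSchedulerStop. A minimal sketch of one way this can occur (hypothetical driver program; class and variable names are illustrative, and it assumes a Spark dependency and a local master):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical driver that stops the context while a job is still running.
object ShutdownSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("shutdown-sketch").setMaster("local[2]")
    val sc   = new SparkContext(conf)

    // Submit a job from another thread so the driver can keep going.
    val job = new Thread(new Runnable {
      def run(): Unit =
        sc.parallelize(1 to 10000000).map(_ * 2).count()
    })
    job.start()

    // Stopping the context while the job above is in flight causes the
    // scheduler to cancel it, surfacing as:
    //   org.apache.spark.SparkException: Job ... cancelled because SparkContext was shut down
    sc.stop()
    job.join()
  }
}
```

In real applications the stop() call is rarely this explicit: common triggers include a finally block that stops the context before async work completes, user code calling System.exit, or the cluster manager killing the application (for example after a driver or executor failure), all of which funnel through the same SparkContext.stop() path seen in the trace.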