Searched on Google with the first line of a Java stack trace?

We can recommend more relevant solutions and speed up debugging when you paste your entire stack trace with the exception message.

Recommended solutions based on your search

Solutions on the web

via Unix & Linux by Henry, 1 year ago
Job 0 cancelled because SparkContext was shut down

via GitHub by dahaian, 6 months ago
Job 995 cancelled because SparkContext was shut down

via Talend Open Integration Solution by lei ju, 1 year ago

via GitHub by jmuhlenkamp, 1 year ago
Job 34 cancelled because SparkContext was shut down

via Stack Overflow by Appalachian Math, 1 year ago
Job 2 cancelled because SparkContext was shut down

via pastebin.ca by Unknown author, 2 years ago
Job cancelled because SparkContext was shut down
org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
	at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
	at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
	at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
	at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
	at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1740)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1229)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1739)
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:147)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1952)
	at org.apache.spark.rdd.RDD$$anonfun$reduce$1.apply(RDD.scala:1025)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
	at org.apache.spark.rdd.RDD.reduce(RDD.scala:1007)
	at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:36)
	at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
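As the trace shows (the `YarnClientSchedulerBackend$MonitorThread` frame inside `SparkContext.stop`), the DAGScheduler cancels every in-flight job with this message when the SparkContext is torn down, for example because YARN killed the application or another thread called `stop()`. A minimal sketch that reproduces the error, assuming Spark is on the classpath; `ShutdownRepro` and the job body are hypothetical names for illustration:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global
import org.apache.spark.{SparkConf, SparkContext}

object ShutdownRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("shutdown-repro").setMaster("local[2]"))

    // Launch a deliberately slow job on a background thread.
    val job = Future {
      sc.parallelize(1 to 1000).map { i => Thread.sleep(100); i }.reduce(_ + _)
    }

    Thread.sleep(2000) // let the job get scheduled
    // Stopping the context while the job is still running cancels it with:
    // org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
    sc.stop()

    Await.ready(job, 30.seconds)
  }
}
```

In real deployments the `stop()` is rarely in your own code: common triggers are YARN preempting or killing the application (the `MonitorThread` in the trace reacts to that), driver out-of-memory, or a JVM shutdown hook firing, so the fix is usually to find why the context went down rather than to catch this exception.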