org.apache.spark.SparkException: Job cancelled because SparkContext was shut down

Solutions on the web

via pastebin.ca by Unknown author, 2 years ago: Job cancelled because SparkContext was shut down
via csdn.net by Unknown author, 1 year ago
via GitHub by jmuhlenkamp, 1 year ago: Job 34 cancelled because SparkContext was shut down
via GitHub by dahaian, 1 month ago: Job 995 cancelled because SparkContext was shut down
via Talend Open Integration Solution by lei ju, 1 year ago: Job 1 cancelled because SparkContext was shut down
via Stack Overflow by Piyush Agal, 2 years ago: Job cancelled because SparkContext was shut down
org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:699)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:698)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1411)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1346)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1380)
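None of the linked threads is quoted here, but the exception itself has a well-known shape: the DAGScheduler raises it when a job is still running at the moment the SparkContext is stopped, whether by an explicit sc.stop(), a shutdown hook, or the driver JVM exiting. The Scala sketch below illustrates that mechanism only; it is not a fix taken from any of the sources above, and the object name ShutdownRepro and the timings in it are hypothetical.

import org.apache.spark.{SparkConf, SparkContext, SparkException}

object ShutdownRepro {
  def main(args: Array[String]): Unit = {
    // Local two-core context; any master works for this demonstration.
    val conf = new SparkConf().setAppName("shutdown-repro").setMaster("local[2]")
    val sc   = new SparkContext(conf)

    // Stop the context from another thread while the job below is mid-flight.
    new Thread(() => { Thread.sleep(1000); sc.stop() }).start()

    try {
      // A deliberately slow job, so the stop() above lands before it finishes.
      sc.parallelize(1 to 100, numSlices = 4)
        .map { i => Thread.sleep(100); i }
        .count()
    } catch {
      case e: SparkException =>
        // Prints something like: Job 0 cancelled because SparkContext was shut down
        println(e.getMessage)
    }
  }
}

In production the shutdown is usually not in your own code: a driver out-of-memory error, the cluster manager (YARN, Mesos, Kubernetes) killing the application, or an earlier failure that already triggered sc.stop() all end in this same message, so the actual root cause is normally a few lines above this trace in the driver logs.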
