Recommended solutions based on your search

Solutions on the web

via Stack Overflow by Newbie, 1 year ago
Job 2 cancelled because SparkContext was shut down
via Stack Overflow by Aleksander Zendel, 1 year ago
via GitHub by goldjay1231, 2 years ago
Job cancelled because SparkContext was shut down
via spark-dev by Alexander Pivovarov, 2 years ago
Job cancelled because SparkContext was shut down
via nabble.com by Unknown author, 1 year ago
Job cancelled because SparkContext was shut down
org.apache.spark.SparkException: Job 2 cancelled because SparkContext was shut down
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
	at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
	at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
	at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
	at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
	at org.apache.spark.SparkContext$$anonfun$stop$7.apply$mcV$sp(SparkContext.scala:1731)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1229)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1730)
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:147)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
	at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:927)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:926)
	at org.apache.spark.sql.sources.HadoopFsRelation$.listLeafFilesInParallel(interfaces.scala:904)
	at org.apache.spark.sql.sources.HadoopFsRelation$FileStatusCache.listLeafFiles(interfaces.scala:445)
	at org.apache.spark.sql.sources.HadoopFsRelation$FileStatusCache.refresh(interfaces.scala:477)
	at org.apache.spark.sql.sources.HadoopFsRelation.org$apache$spark$sql$sources$HadoopFsRelation$$fileStatusCache$lzycompute(interfaces.scala:489)
	at org.apache.spark.sql.sources.HadoopFsRelation.org$apache$spark$sql$sources$HadoopFsRelation$$fileStatusCache(interfaces.scala:487)
	at org.apache.spark.sql.sources.HadoopFsRelation.cachedLeafStatuses(interfaces.scala:494)
	at org.apache.spark.sql.execution.datasources.json.JSONRelation$$anonfun$4.apply(JSONRelation.scala:110)
	at org.apache.spark.sql.execution.datasources.json.JSONRelation$$anonfun$4.apply(JSONRelation.scala:109)
	at scala.Option.getOrElse(Option.scala:120)
	at org.apache.spark.sql.execution.datasources.json.JSONRelation.dataSchema$lzycompute(JSONRelation.scala:109)
	at org.apache.spark.sql.execution.datasources.json.JSONRelation.dataSchema(JSONRelation.scala:108)
	at org.apache.spark.sql.sources.HadoopFsRelation.schema$lzycompute(interfaces.scala:636)
	at org.apache.spark.sql.sources.HadoopFsRelation.schema(interfaces.scala:635)
	at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:37)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:136)
	at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:263)
	at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:58)
	at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:63)
	at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:65)
	at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:67)
	at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:69)
	at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:71)
	at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:73)
	at $iwC$$iwC$$iwC$$iwC.<init>(<console>:75)
	at $iwC$$iwC$$iwC.<init>(<console>:77)
	at $iwC$$iwC.<init>(<console>:79)
	at $iwC.<init>(<console>:81)
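Reading the trace bottom-up: the `$iwC` wrapper frames show a spark-shell session calling `sqlContext.read.json(...)`, while the top frames show `YarnClientSchedulerBackend$MonitorThread` stopping the SparkContext, which the monitor thread does when the YARN application itself has terminated (for example, killed by the resource manager or failed on the cluster side). So the job was cancelled because the context was already gone, and the root cause is usually in the YARN application logs rather than the driver code. A minimal sketch of a more defensive driver program, assuming a Spark 1.6-era API (the object name, input path, and error message are placeholders, not from the original report):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object ReadJsonExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("read-json-example")
    val sc = new SparkContext(conf)
    try {
      // If the YARN application has already died, the monitor thread will
      // have stopped the context; fail with a clear message instead of
      // letting the DAGScheduler cancel the job mid-flight.
      if (sc.isStopped) {
        sys.error("SparkContext was shut down before the job ran; check `yarn logs -applicationId <appId>`")
      }
      val sqlContext = new SQLContext(sc)
      val df = sqlContext.read.json("/path/to/input")  // placeholder path
      df.show()
    } finally {
      // Stop the context exactly once, at the very end of the driver
      // program; a premature sc.stop() elsewhere produces the same error.
      sc.stop()
    }
  }
}
```

The guard only narrows the failure window; the lasting fix is to find out why YARN killed the application (driver/executor memory limits are a frequent culprit) and to make sure nothing in the job calls `sc.stop()` before the last action completes.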