
Solutions on the web

via Stack Overflow by mahdi62, 1 year ago
via GitHub by fromradio, 1 year ago: SparkContext has been shutdown
via GitHub by geoHeil, 1 year ago: SparkContext has been shutdown
via gitbooks.io by Unknown author, 1 year ago: SparkContext has been shutdown
via wordpress.com by Unknown author, 2 years ago: SparkContext has been shutdown
java.lang.IllegalStateException: SparkContext has been shutdown
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1824)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
	at org.apache.spark.rdd.RDD.count(RDD.scala:1157)
	at UnionStream$$anonfun$creatingFunc$5.apply(UnionStreaming.scala:453)
	at UnionStream$$anonfun$creatingFunc$5.apply(UnionStreaming.scala:451)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:661)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:661)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:50)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:50)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:50)
	at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:426)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:49)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:49)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:49)
	at scala.util.Try$.apply(Try.scala:161)
	at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:224)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:224)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:224)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:223)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
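What the trace means: a Spark Streaming batch invoked an RDD action (here `RDD.count` inside a `foreachRDD` block) after the driver's SparkContext had already been stopped, so `SparkContext.runJob` refused the job. This usually happens either because the application called `stop()` while batches were still queued, or because an earlier fatal error (check the first exception in the driver log, not this one) shut the context down and the job scheduler kept draining pending batches. The sketch below simulates that sequence without any Spark dependency; `MiniContext` is a hypothetical stand-in for the stopped-flag check that guards job submission, not Spark's actual implementation.

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical stand-in for a context that rejects work once stopped,
// mirroring the IllegalStateException seen in the stack trace above.
class MiniContext {
    private final AtomicBoolean stopped = new AtomicBoolean(false);

    void runJob(Runnable job) {
        if (stopped.get()) {
            throw new IllegalStateException("SparkContext has been shutdown");
        }
        job.run();
    }

    void stop() {
        stopped.set(true);
    }
}

public class ShutdownDemo {
    public static void main(String[] args) {
        MiniContext sc = new MiniContext();
        sc.runJob(() -> System.out.println("batch 1 ran"));

        // Context shut down mid-stream, e.g. by stop() or a fatal driver error.
        sc.stop();

        // A still-queued streaming batch now fails exactly like the trace above.
        try {
            sc.runJob(() -> System.out.println("batch 2 ran"));
        } catch (IllegalStateException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

In a real Spark Streaming app the equivalent remedy is to shut down in order: stop the StreamingContext gracefully so in-flight batches finish before the SparkContext goes away (`ssc.stop(true, true)`, i.e. `stopSparkContext = true, stopGracefully = true`), and never call `sc.stop()` while the streaming scheduler is still running.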