java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
bigdata.spark_0_1.spark.runJobInTOS(spark.java:889)
bigdata.spark_0_1.spark.main(spark.java:773)

The currently active SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
bigdata.spark_0_1.spark.runJobInTOS(spark.java:889)
bigdata.spark_0_1.spark.main(spark.java:773)

Solutions on the web

via Talend Open Integration Solution by lei ju, 1 year ago
Cannot call methods on a stopped SparkContext. This stopped SparkContext was created at: org.apache.spark.SparkContext.<init>(SparkContext.scala:83) org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874
via hive-user by Todd, 4 months ago
Cannot call methods on a stopped SparkContext
via Stack Overflow by astha, 11 months ago
Cannot call methods on a stopped SparkContext
via hive-user by Sofia, 4 months ago
Cannot call methods on a stopped SparkContext
via nabble.com by Unknown author, 1 year ago
Cannot call methods on a stopped SparkContext
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
bigdata.spark_0_1.spark.runJobInTOS(spark.java:889)
bigdata.spark_0_1.spark.main(spark.java:773)

The currently active SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
bigdata.spark_0_1.spark.runJobInTOS(spark.java:889)
bigdata.spark_0_1.spark.main(spark.java:773)
at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:107)
at org.apache.spark.SparkContext$$anonfun$parallelize$1.apply(SparkContext.scala:740)
at org.apache.spark.SparkContext$$anonfun$parallelize$1.apply(SparkContext.scala:739)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:725)
at org.apache.spark.SparkContext.parallelize(SparkContext.scala:739)
at org.apache.spark.SparkContext$$anonfun$makeRDD$1.apply(SparkContext.scala:823)
at org.apache.spark.SparkContext$$anonfun$makeRDD$1.apply(SparkContext.scala:823)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:725)
at org.apache.spark.SparkContext.makeRDD(SparkContext.scala:822)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint.org$apache$spark$streaming$scheduler$ReceiverTracker$ReceiverTrackerEndpoint$$startReceiver(ReceiverTracker.scala:585)
at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:116)
at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:204)
at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:215)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
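The failing frame in the trace is SparkContext's internal assertNotStopped check (SparkContext.scala:107): every RDD-creating call such as parallelize or makeRDD first verifies that the context has not been stopped, and throws IllegalStateException otherwise. The error therefore means something stopped the SparkContext before this call — commonly an earlier job failure, or a StreamingContext.stop() call, which by default also stops the underlying SparkContext. The following is a minimal, self-contained sketch of that guard pattern in plain Java; it is not Spark's actual code, and the class and method names below are illustrative only:

```java
// Illustrative sketch (NOT Spark's real classes) of the stopped-context guard
// that produces "Cannot call methods on a stopped SparkContext".
public class StoppedContextDemo {

    static class FakeSparkContext {
        private volatile boolean stopped = false;

        // Analogous to SparkContext.stop(); StreamingContext.stop() reaches
        // this too when its stopSparkContext flag is true (the default).
        void stop() {
            stopped = true;
        }

        // Analogous to the assertNotStopped guard at SparkContext.scala:107.
        private void assertNotStopped() {
            if (stopped) {
                throw new IllegalStateException(
                        "Cannot call methods on a stopped SparkContext.");
            }
        }

        // Analogous to parallelize/makeRDD: the guard runs before any work.
        int[] parallelize(int[] data) {
            assertNotStopped();
            return data;
        }
    }

    public static void main(String[] args) {
        FakeSparkContext sc = new FakeSparkContext();
        sc.parallelize(new int[] {1, 2, 3}); // fine: context is live
        sc.stop();                           // context is now unusable
        try {
            sc.parallelize(new int[] {4, 5, 6}); // fails, as in the trace above
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

In the real API, if you intend to keep using the SparkContext after shutting down streaming, JavaStreamingContext.stop(boolean stopSparkContext) accepts false to leave the underlying context running; otherwise, check the driver logs for an earlier error that stopped the context before the failing call.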

