java.lang.IllegalStateException: Adding new inputs, transformations, and output operations after stopping a context is not supported

Solutions on the web

via nabble.com by Unknown author, 1 year ago
Adding new inputs, transformations, and output operations after stopping a context is not supported
via codegur.com by Unknown author, 1 year ago
Adding new inputs, transformations, and output operations after starting a context is not supported
via Stack Overflow by benteeuwen, 1 year ago
Adding new inputs, transformations, and output operations after starting a context is not supported
via Stack Overflow by Fadhilah Putri, 1 year ago
Adding new inputs, transformations, and output operations after stopping a context is not supported
via Stack Overflow by amit_kumar, 1 year ago
Adding new inputs, transformations, and output operations after starting a context is not supported
java.lang.IllegalStateException: Adding new inputs, transformations, and output operations after stopping a context is not supported
at org.apache.spark.streaming.dstream.DStream.validateAtInit(DStream.scala:224)
at org.apache.spark.streaming.dstream.DStream.<init>(DStream.scala:64)
at org.apache.spark.streaming.dstream.ForEachDStream.<init>(ForEachDStream.scala:26)
at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$2.apply(DStream.scala:642)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:681)
at org.apache.spark.streaming.StreamingContext.withScope(StreamingContext.scala:258)
at org.apache.spark.streaming.dstream.DStream.foreachRDD(DStream.scala:638)
at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1.apply$mcV$sp(DStream.scala:631)
at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1.apply(DStream.scala:629)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:681)
at org.apache.spark.streaming.StreamingContext.withScope(StreamingContext.scala:258)
at org.apache.spark.streaming.dstream.DStream.foreachRDD(DStream.scala:629)
at org.apache.spark.streaming.api.java.JavaDStreamLike$class.foreachRDD(JavaDStreamLike.scala:315)
at com.til.ibeat.script.AggregateDashboardDataKafka$5.run(AggregateDashboardDataKafka.java:210)
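
The trace shows why the exception fires: DStream.validateAtInit rejects the foreachRDD call because it is issued from a background task (the anonymous class AggregateDashboardDataKafka$5.run) after the StreamingContext has already been stopped. Spark Streaming requires the whole DStream graph, including every output operation, to be wired up before start() and left untouched afterwards. The sketch below illustrates that ordering; it is written against the Spark 2.x Java streaming API, and the class name, socket source, host and port are placeholder assumptions rather than details from the original report.

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class StreamingSetupSketch {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf()
                .setAppName("streaming-setup-sketch")
                .setMaster("local[2]");
        JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(5));

        // Placeholder source; the original report reads from Kafka instead.
        JavaDStream<String> lines = ssc.socketTextStream("localhost", 9999);

        // Register every transformation and output operation HERE, before start().
        // Calling foreachRDD (or map, filter, ...) on a DStream after the context
        // has been started or stopped raises the IllegalStateException shown above.
        lines.foreachRDD(rdd -> System.out.println("records in batch: " + rdd.count()));

        ssc.start();             // the DStream graph is frozen from this point on
        ssc.awaitTermination();  // block until stop() is called or an error occurs
    }
}

If per-batch side work is needed (dashboard aggregation, counters, and so on), do it inside a foreachRDD body that was registered before start(), rather than creating new DStream output operations later from a timer or worker thread.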
