java.lang.IllegalStateException: Adding new inputs, transformations, and output operations after starting a context is not supported

Stack Overflow | benteeuwen | 4 months ago
  1. scala | read from file | type mismatch | for-comprehension

    codegur.com | 4 months ago
    java.lang.IllegalStateException: Adding new inputs, transformations, and output operations after starting a context is not supported
  2. Materialize mapWithState stateSnapShots to database for later resume of spark streaming app

    Stack Overflow | 4 months ago | benteeuwen
    java.lang.IllegalStateException: Adding new inputs, transformations, and output operations after starting a context is not supported
  3. Materialize mapWithState stateSnapShots to database for later resume of spark streaming app

    codegur.com | 3 months ago
    java.lang.IllegalStateException: Adding new inputs, transformations, and output operations after starting a context is not supported
  4. How to log data from RDD without doing transformations or output operations in Spark after exception is caught in driver

    Stack Overflow | 9 months ago | amit_kumar
    java.lang.IllegalStateException: Adding new inputs, transformations, and output operations after stopping a context is not supported
  5. Apache Spark User List - java.lang.IllegalArgumentException: requirement failed: No output operations registered, so nothing to execute

    nabble.com | 4 months ago
    java.lang.IllegalStateException: Adding new inputs, transformations, and output operations after stopping a context is not supported

    Root Cause Analysis

    1. java.lang.IllegalStateException

      Adding new inputs, transformations, and output operations after starting a context is not supported

      at org.apache.spark.streaming.dstream.DStream.validateAtInit()
    2. Spark Project Streaming
      DStream$$anonfun$foreachRDD$1.apply
      1. org.apache.spark.streaming.dstream.DStream.validateAtInit(DStream.scala:222)
      2. org.apache.spark.streaming.dstream.DStream.<init>(DStream.scala:64)
      3. org.apache.spark.streaming.dstream.ForEachDStream.<init>(ForEachDStream.scala:34)
      4. org.apache.spark.streaming.dstream.DStream.org$apache$spark$streaming$dstream$DStream$$foreachRDD(DStream.scala:687)
      5. org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1.apply$mcV$sp(DStream.scala:661)
      6. org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1.apply(DStream.scala:659)
      7. org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1.apply(DStream.scala:659)
      7 frames
    3. Spark
      SparkContext.withScope
      1. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
      2. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
      3. org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
      3 frames
    4. Spark Project Streaming
      DStream.foreachRDD
      1. org.apache.spark.streaming.StreamingContext.withScope(StreamingContext.scala:260)
      2. org.apache.spark.streaming.dstream.DStream.foreachRDD(DStream.scala:659)
      2 frames
    5. main.scala.feaUS
      Listener.run
      1. main.scala.feaUS.Listener.run(feaUS.scala:119)
      1 frame
    6. Java RT
      Thread.run
      1. java.lang.Thread.run(Thread.java:745)
      1 frame
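
    The frames above show an output operation (DStream.foreachRDD) being registered from Listener.run (feaUS.scala:119) after the StreamingContext has already been started; DStream.validateAtInit rejects any new input, transformation, or output operation once the context is running. Below is a minimal sketch of that rule, assuming a hypothetical object name, socket source, and batch interval; it is not the original application code.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object StreamingSetupSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("streaming-setup-sketch").setMaster("local[2]")
        val ssc  = new StreamingContext(conf, Seconds(5))

        // Hypothetical input stream; the original app's sources are not shown in the trace.
        val lines = ssc.socketTextStream("localhost", 9999)

        // Register every transformation and output operation BEFORE start().
        lines.foreachRDD { rdd =>
          println(s"batch size: ${rdd.count()}")
        }

        ssc.start()

        // Calling foreachRDD here, on a started context (as Listener.run does above),
        // makes DStream.validateAtInit throw the IllegalStateException.
        // lines.foreachRDD(rdd => rdd.saveAsTextFile("/tmp/out"))

        ssc.awaitTermination()
      }
    }

    In short, a running StreamingContext's DStream graph is immutable: define every foreachRDD before ssc.start(), or stop and rebuild the context when the graph has to change.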