java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
bigdata.spark_0_1.spark.runJobInTOS(spark.java:889)
bigdata.spark_0_1.spark.main(spark.java:773)

The currently active SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
bigdata.spark_0_1.spark.runJobInTOS(spark.java:889)
bigdata.spark_0_1.spark.main(spark.java:773)

Talend Open Integration Solution | lei ju | 9 months ago
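
For context, this failure comes from Spark's internal assertNotStopped() guard: any RDD-creating call made through a SparkContext that has already been shut down is rejected with exactly this IllegalStateException. A minimal, self-contained sketch (not the Talend-generated spark.java job) that triggers the same guard:

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    // Minimal sketch, not the Talend-generated job: it only demonstrates the guard
    // that produces "Cannot call methods on a stopped SparkContext".
    public class StoppedContextRepro {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setAppName("stopped-context-repro")
                    .setMaster("local[2]");
            JavaSparkContext sc = new JavaSparkContext(conf);

            sc.stop(); // the context is now stopped

            // SparkContext.assertNotStopped() rejects this call and throws
            // java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext
            sc.parallelize(Arrays.asList(1, 2, 3)).count();
        }
    }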

    Root Cause Analysis

    1. java.lang.IllegalStateException

      Cannot call methods on a stopped SparkContext.
      This stopped SparkContext was created at:
      org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
      org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
      org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
      org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
      bigdata.spark_0_1.spark.runJobInTOS(spark.java:889)
      bigdata.spark_0_1.spark.main(spark.java:773)

      The currently active SparkContext was created at:
      org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
      org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
      org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
      org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
      bigdata.spark_0_1.spark.runJobInTOS(spark.java:889)
      bigdata.spark_0_1.spark.main(spark.java:773)

      at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped()
    2. Spark
      SparkContext.makeRDD
      1. org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:107)
      2. org.apache.spark.SparkContext$$anonfun$parallelize$1.apply(SparkContext.scala:740)
      3. org.apache.spark.SparkContext$$anonfun$parallelize$1.apply(SparkContext.scala:739)
      4. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
      5. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
      6. org.apache.spark.SparkContext.withScope(SparkContext.scala:725)
      7. org.apache.spark.SparkContext.parallelize(SparkContext.scala:739)
      8. org.apache.spark.SparkContext$$anonfun$makeRDD$1.apply(SparkContext.scala:823)
      9. org.apache.spark.SparkContext$$anonfun$makeRDD$1.apply(SparkContext.scala:823)
      10. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
      11. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
      12. org.apache.spark.SparkContext.withScope(SparkContext.scala:725)
      13. org.apache.spark.SparkContext.makeRDD(SparkContext.scala:822)
      13 frames
    3. Spark Project Streaming
      ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$receive$1.applyOrElse
      1. org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint.org$apache$spark$streaming$scheduler$ReceiverTracker$ReceiverTrackerEndpoint$$startReceiver(ReceiverTracker.scala:585)
      2. org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$receive$1.applyOrElse(ReceiverTracker.scala:477)
      2 frames
    4. org.apache.spark
      Dispatcher$MessageLoop.run
      1. org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:116)
      2. org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:204)
      3. org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
      4. org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:215)
      4 frames
    5. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
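
The frames above show where the call actually failed: the ReceiverTracker was still trying to place a streaming receiver via SparkContext.makeRDD when the context was already stopped, which suggests the driver shut down the SparkContext (or the whole StreamingContext) before or while the receivers were being launched. A minimal sketch of the expected lifecycle ordering, assuming a hypothetical socket-based receiver source rather than whatever sources the Talend job actually wires in:

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class StreamingLifecycleSketch {
        public static void main(String[] args) throws InterruptedException {
            SparkConf conf = new SparkConf()
                    .setAppName("streaming-lifecycle-sketch")
                    .setMaster("local[2]"); // at least 2 cores: one for the receiver, one for processing
            JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(5));

            // Hypothetical receiver-based source used only for illustration.
            JavaReceiverInputDStream<String> lines = ssc.socketTextStream("localhost", 9999);
            lines.print();

            ssc.start();            // launches the ReceiverTracker, which calls makeRDD to place receivers
            ssc.awaitTermination(); // keep the SparkContext alive until streaming actually terminates;
                                    // stopping it earlier leads to the failure traced above

            // For a deliberate shutdown, stop both contexts together and let in-flight batches finish:
            // ssc.stop(true, true);
        }
    }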