java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext

hive-user | Jörn Franke | 1 year ago
Related hive-user threads, each reporting the same java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext:

  1. Re: Executor getting killed when running Hive on Spark (Jörn Franke, 1 year ago)
  2. Re:Re: Unable to start container using hive on spark (Todd, 1 year ago)
  3. Re: Unable to start container using hive on spark (Sofia, 1 year ago)


    Root Cause Analysis

    java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext
        at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:104)
        at org.apache.spark.SparkContext.submitJob(SparkContext.scala:1981)
        at org.apache.spark.rdd.AsyncRDDActions$$anonfun$foreachAsync$1.apply(AsyncRDDActions.scala:118)
        at org.apache.spark.rdd.AsyncRDDActions$$anonfun$foreachAsync$1.apply(AsyncRDDActions.scala:116)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:310)
        at org.apache.spark.rdd.AsyncRDDActions.foreachAsync(AsyncRDDActions.scala:116)
        at org.apache.spark.api.java.JavaRDDLike$class.foreachAsync(JavaRDDLike.scala:690)
        at org.apache.spark.api.java.AbstractJavaRDDLike.foreachAsync(JavaRDDLike.scala:47)
        at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient$JobStatusJob.call(RemoteHiveSparkClient.java:257)
        at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:366)
        at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:335)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
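The trace shows Hive-on-Spark's RemoteDriver calling foreachAsync on a SparkContext that was already shut down, typically because something earlier (for example a killed executor or a container that failed to start, as in the threads above) brought the context down. Spark guards each job submission with an assertNotStopped check and throws once the stop flag is set. Below is a minimal stdlib-only sketch of that guard pattern; the class and method names are illustrative, not Spark's actual source:

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Illustrative sketch of the stopped-context guard (not Spark's real code).
class SketchContext {
    private final AtomicBoolean stopped = new AtomicBoolean(false);

    // Mirrors the idea behind SparkContext.assertNotStopped: fail fast
    // if the context has already been shut down.
    private void assertNotStopped() {
        if (stopped.get()) {
            throw new IllegalStateException(
                "Cannot call methods on a stopped SparkContext");
        }
    }

    // Every job-submitting method checks the flag first, the way
    // SparkContext.submitJob does in the trace above.
    public String submitJob(String name) {
        assertNotStopped();
        return "submitted " + name;
    }

    public void stop() {
        stopped.set(true);
    }
}

public class StoppedContextDemo {
    public static void main(String[] args) {
        SketchContext ctx = new SketchContext();
        System.out.println(ctx.submitJob("count")); // fine while the context is live
        ctx.stop();
        try {
            ctx.submitJob("foreachAsync"); // reproduces the exception in the trace
        } catch (IllegalStateException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

The practical takeaway is that this exception is a symptom, not the root failure: the context was stopped before the job was submitted, so the driver and YARN logs from earlier in the run (executor kills, container launch failures) are where the actual cause usually shows up.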