org.apache.spark.SparkException: Job 2 cancelled because SparkContext was shut down

Stack Overflow | vatsal mevada | 8 months ago
This exception is not yet in the Samebug knowledge base; below are the best solutions found on the web.
  1. Job cancelled because SparkContext was shut down while saving dataframe as hive table
     Stack Overflow | 8 months ago | vatsal mevada
     org.apache.spark.SparkException: Job 2 cancelled because SparkContext was shut down
  2. SparkR Job 100 Minutes Timeout
     Stack Overflow | 1 year ago | Shanika
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
  3. spark sql DataFrame to H2OFrame
     Google Groups | 1 year ago | sunil v
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
  4. Spark cluster computing framework
     gmane.org | 1 year ago
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down

Encountered by: rp (1 time, last 6 days ago), bandoca (1 time, last 1 week ago), tyson925 (4 times, last 1 month ago), johnxfly (1 time, last 1 month ago), Nikolay Rybak (2 times, last 3 months ago), and 3 unregistered visitors.
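
For orientation, the root cause analysis below bottoms out in DataFrameWriter.saveAsTable called from a plain main method (frame groups 19 and 20). A minimal sketch of that pattern against the Spark 1.6-era Java API follows; the input path and table name are placeholders, not taken from the original report:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.SaveMode;
    import org.apache.spark.sql.hive.HiveContext;

    public class Main {
        public static void main(String[] args) {
            JavaSparkContext jsc =
                    new JavaSparkContext(new SparkConf().setAppName("SaveToHive"));
            try {
                // HiveContext backs saveAsTable with the Hive metastore in Spark 1.x,
                // matching the CreateMetastoreDataSourceAsSelect frame in the trace.
                HiveContext hive = new HiveContext(jsc.sc());

                // Placeholder source; the original report does not show how the
                // DataFrame was built.
                DataFrame df = hive.read().json("/path/to/input.json");

                // The call that appears in the trace: saveAsTable -> toRdd ->
                // InsertIntoHadoopFsRelation -> SparkContext.runJob.
                df.write().mode(SaveMode.Overwrite).saveAsTable("my_table");
            } finally {
                // Stop the context exactly once, after the write returns or throws;
                // stopping it (or exiting the JVM) while the job is still running
                // produces exactly this "SparkContext was shut down" cancellation.
                jsc.stop();
            }
        }
    }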

Root Cause Analysis

  1. org.apache.spark.SparkException

    Job 2 cancelled because SparkContext was shut down

    at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply()
  2. Spark
    DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply
    1. org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
    2. org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
    2 frames
  3. Scala
    HashSet.foreach
    1. scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
    1 frame
  4. Spark
    SparkShutdownHookManager$$anonfun$runAll$1.apply
    1. org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
    2. org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
    3. org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
    4. org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
    5. org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1751)
    6. org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230)
    7. org.apache.spark.SparkContext.stop(SparkContext.scala:1750)
    8. org.apache.spark.SparkContext$$anonfun$3.apply$mcV$sp(SparkContext.scala:607)
    9. org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:267)
    10. org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:239)
    11. org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
    12. org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
    13. org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1766)
    14. org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:239)
    15. org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
    16. org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
    16 frames
  5. Scala
    Try$.apply
    1. scala.util.Try$.apply(Try.scala:161)
    1 frame
  6. Spark
    SparkShutdownHookManager$$anon$2.run
    1. org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:239)
    2. org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:218)
    2 frames
  7. Hadoop
    ShutdownHookManager$1.run
    1. org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
    1 frame
  8. Spark
    SparkContext.runJob
    1. org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
    2. org.apache.spark.SparkContext.runJob(SparkContext.scala:1843)
    3. org.apache.spark.SparkContext.runJob(SparkContext.scala:1856)
    4. org.apache.spark.SparkContext.runJob(SparkContext.scala:1933)
    4 frames
  9. org.apache.spark
    InsertIntoHadoopFsRelation$$anonfun$run$1.apply
    1. org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1.apply$mcV$sp(InsertIntoHadoopFsRelation.scala:150)
    2. org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1.apply(InsertIntoHadoopFsRelation.scala:108)
    3. org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1.apply(InsertIntoHadoopFsRelation.scala:108)
    3 frames
  10. Spark Project SQL
    SQLExecution$.withNewExecutionId
    1. org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:53)
    1 frame
  11. org.apache.spark
    InsertIntoHadoopFsRelation.run
    1. org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation.run(InsertIntoHadoopFsRelation.scala:108)
    1 frame
  12. Spark Project SQL
    SparkPlan$$anonfun$execute$5.apply
    1. org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
    2. org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
    3. org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
    4. org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    5. org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    5 frames
  13. Spark
    RDDOperationScope$.withScope
    1. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    1 frame
  14. Spark Project SQL
    QueryExecution.toRdd
    1. org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    2. org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
    3. org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
    3 frames
  15. org.apache.spark
    ResolvedDataSource$.apply
    1. org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:256)
    1 frame
  16. Spark Project Hive
    CreateMetastoreDataSourceAsSelect.run
    1. org.apache.spark.sql.hive.execution.CreateMetastoreDataSourceAsSelect.run(commands.scala:258)
    1 frame
  17. Spark Project SQL
    SparkPlan$$anonfun$execute$5.apply
    1. org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
    2. org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
    3. org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
    4. org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    5. org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    5 frames
  18. Spark
    RDDOperationScope$.withScope
    1. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    1 frame
  19. Spark Project SQL
    DataFrameWriter.saveAsTable
    1. org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    2. org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
    3. org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
    4. org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:251)
    5. org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:221)
    5 frames
  20. Unknown
    Main.main
    1. DataAccessService.saveToHive(DataAccessService.java:48)
    2. Main.main(Main.java:42)
    2 frames
  21. Java RT
    Method.invoke
    1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    4. java.lang.reflect.Method.invoke(Method.java:606)
    4 frames
  22. Spark
    SparkSubmit.main
    1. org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    2. org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    3. org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    4. org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    5. org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    5 frames