org.apache.spark.SparkException: Job cancelled because SparkContext was shut down

nabble.com | 5 months ago
  1. Apache Spark Developers List - Spark fails after 6000s because of akka

     nabble.com | 5 months ago
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
  2. Re: Spark fails after 6000s because of akka

     spark-dev | 1 year ago | Alexander Pivovarov
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
  3. "sparkContext was shut down" while running spark on a large dataset

     Stack Overflow | 1 year ago | Aleksander Zendel
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
  4. Can't run sparkbench in HDP 2.3

     GitHub | 1 year ago | goldjay1231
     org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
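
Two of the threads above blame a timeout ("Spark fails after 6000s because of akka") for the shutdown. As a rough sketch of the settings those discussions usually revolve around - the keys below are real Spark configuration properties, but the values are illustrative and not taken from any of the threads:

    import org.apache.spark.{SparkConf, SparkContext}

    // Illustrative only: raise the network and heartbeat timeouts that are
    // the usual suspects when a long-running job loses its SparkContext.
    val conf = new SparkConf()
      .setAppName("long-running-job")
      .set("spark.network.timeout", "600s")           // default is 120s
      .set("spark.executor.heartbeatInterval", "60s") // keep well below the timeout
    val sc = new SparkContext(conf)

Whether a timeout is actually the cause here has to be confirmed against the YARN and executor logs; the sketch only shows where such settings would go.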


Root Cause Analysis

  1. org.apache.spark.SparkException

    Job cancelled because SparkContext was shut down

    at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply()
  2. Spark
    DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply
    1. org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:703)
    2. org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:702)
    2 frames
  3. Scala
    HashSet.foreach
    1. scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
    1 frame
  4. Spark
    RDD.saveAsTextFile
    1. org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:702)
    2. org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1514)
    3. org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
    4. org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1438)
    5. org.apache.spark.SparkContext$$anonfun$stop$7.apply$mcV$sp(SparkContext.scala:1724)
    6. org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1185)
    7. org.apache.spark.SparkContext.stop(SparkContext.scala:1723)
    8. org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:146)
    9. org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
    10. org.apache.spark.SparkContext.runJob(SparkContext.scala:1824)
    11. org.apache.spark.SparkContext.runJob(SparkContext.scala:1837)
    12. org.apache.spark.SparkContext.runJob(SparkContext.scala:1914)
    13. org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1124)
    14. org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply(PairRDDFunctions.scala:1065)
    15. org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply(PairRDDFunctions.scala:1065)
    16. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
    17. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
    18. org.apache.spark.rdd.RDD.withScope(RDD.scala:310)
    19. org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopDataset(PairRDDFunctions.scala:1065)
    20. org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply$mcV$sp(PairRDDFunctions.scala:989)
    21. org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:965)
    22. org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:965)
    23. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
    24. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
    25. org.apache.spark.rdd.RDD.withScope(RDD.scala:310)
    26. org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:965)
    27. org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply$mcV$sp(PairRDDFunctions.scala:897)
    28. org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:897)
    29. org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$1.apply(PairRDDFunctions.scala:897)
    30. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
    31. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
    32. org.apache.spark.rdd.RDD.withScope(RDD.scala:310)
    33. org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:896)
    34. org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply$mcV$sp(RDD.scala:1430)
    35. org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1409)
    36. org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1.apply(RDD.scala:1409)
    37. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
    38. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
    39. org.apache.spark.rdd.RDD.withScope(RDD.scala:310)
    40. org.apache.spark.rdd.RDD.saveAsTextFile(RDD.scala:1409)
    40 frames
  5. com.radius.distiller
    Execute.main
    1. com.radius.distiller.components.CondenseRecords.saveValidationQa(CondenseRecords.scala:65)
    2. com.radius.distiller.Distiller.runCondenseRecords(Distiller.scala:49)
    3. com.radius.distiller.Execute$.run(Execute.scala:56)
    4. com.radius.distiller.Execute$.main(Execute.scala:33)
    5. com.radius.distiller.Execute.main(Execute.scala)
    5 frames
  6. Java RT
    Method.invoke
    1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    4. java.lang.reflect.Method.invoke(Method.java:606)
    4 frames
  7. Spark
    SparkSubmit.main
    1. org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
    2. org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    3. org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    4. org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    5. org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    5 frames
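
Reading the trace bottom-up: the job fails inside RDD.saveAsTextFile, reached from com.radius.distiller.components.CondenseRecords.saveValidationQa via spark-submit. The exception itself is raised by DAGScheduler.cleanUpAfterSchedulerStop, and frames 5-8 of the Spark block above show SparkContext.stop being invoked from YarnClientSchedulerBackend$MonitorThread.run, i.e. the YARN monitor thread observed the application ending (commonly a killed or failed application master) and shut the context down under the still-running job. A hypothetical minimal sketch of the failing call path (the distiller sources are not shown here, so names and paths are invented for illustration):

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical sketch: the failing action is an ordinary saveAsTextFile.
    // The job itself is not at fault; it is cancelled because the SparkContext
    // is stopped underneath it while the action is still running.
    object CondenseRecordsSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("distiller"))
        sc.textFile("hdfs:///distiller/input")               // illustrative input path
          .map(_.toUpperCase)                                // illustrative transform
          .saveAsTextFile("hdfs:///distiller/validation-qa") // fails with this SparkException
        sc.stop()
      }
    }

Because the cancellation originates in the scheduler shutdown rather than in the action, the fix is to find out why the context was stopped (YARN application logs, application master exit status) rather than to change the save itself.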