org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:703)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:702)
	at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
	at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:702)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1525)
	at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
	at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1449)
	at org.apache.spark.SparkContext$$anonfun$stop$7.apply$mcV$sp(SparkContext.scala:1724)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1184)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1723)
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:146)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1824)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1837)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1914)
	at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1.apply$mcV$sp(InsertIntoHadoopFsRelation.scala:150)
	at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1.apply(InsertIntoHadoopFsRelation.scala:108)
	at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1.apply(InsertIntoHadoopFsRelation.scala:108)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:56)
	at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation.run(InsertIntoHadoopFsRelation.scala:108)
	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
	at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:69)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:140)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:138)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:138)
	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:933)
	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:933)
	at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:197)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:146)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:137)
	at org.apache.spark.sql.DataFrame.save(DataFrame.scala:1808)
	at com.acnielsen.madras.utils.ndx_scala_util$.newHiveTableData(ndx_scala_util.scala:1264)
	at com.acnielsen.madras.utils.ndx_scala_util$.UPDATE(ndx_scala_util.scala:238)
	at com.acnielsen.madras.pkgews_panel_extract$$anonfun$p_signed_rank_yago$1.apply(pkgews_panel_extract.scala:658)
	at com.acnielsen.madras.pkgews_panel_extract$$anonfun$p_signed_rank_yago$1.apply(pkgews_panel_extract.scala:652)

at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at com.acnielsen.madras.pkgews_panel_extract$.p_signed_rank_yago(pkgews_panel_extract.scala:652)
at com.acnielsen.madras.pkgews_panel_extract$.p_main(pkgews_panel_extract.scala:4844)
at com.acnielsen.madras.pkgews_panel_extract$.main(pkgews_panel_extract.scala:4655)
at com.acnielsen.madras.pkgews_panel_extract.main(pkgews_panel_extract.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
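
A note on reading this trace: the frames show the YarnClientSchedulerBackend's MonitorThread calling SparkContext.stop() while a DataFrame save (InsertIntoHadoopFsRelation) was still running. In other words, "Job cancelled because SparkContext was shut down" is usually a symptom, not the root cause: the YARN application was killed underneath the driver (commonly due to exceeding container memory limits or the application master exiting), so the real failure should be looked for in the YARN application logs. As a defensive measure, the driver can fail fast with a clearer message when the context has already been stopped. This is a hedged sketch, not the original application's code; the helper name `runIfContextAlive` and the wrapped save step are illustrative assumptions:

```scala
// Sketch only: assumes a recent Spark version where SparkContext.isStopped
// is available. The helper name is hypothetical, not from the original code.
import org.apache.spark.SparkContext

def runIfContextAlive[T](sc: SparkContext)(step: => T): T = {
  if (sc.isStopped) {
    // The context was shut down underneath us (e.g. YARN killed the app).
    // Surface a pointer to the real diagnostics instead of the generic
    // "Job cancelled because SparkContext was shut down" SparkException.
    sys.error(
      "SparkContext is already stopped; inspect the YARN application logs " +
        "(yarn logs -applicationId <appId>) for the underlying failure")
  }
  step
}

// Illustrative usage around the failing save step:
// runIfContextAlive(sc) {
//   df.write.format("parquet").save(outputPath)
// }
```

This does not prevent the shutdown, but it distinguishes "the job failed" from "the whole application was torn down mid-job", which is the key diagnostic question for this exception.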
