org.apache.spark.SparkException: Job 311 cancelled because SparkContext was shut down' was thrown while evaluating an expression.

via snappydata.io, by an unknown author, 6 months ago
org.apache.spark.SparkException: Job 311 cancelled because SparkContext was shut down' was thrown while evaluating an expression.
at com.pivotal.gemfirexd.internal.impl.tools.ij.utilMain.handleSQLException(utilMain.java:896)
at com.pivotal.gemfirexd.internal.impl.tools.ij.utilMain.doCatch(utilMain.java:795)
at com.pivotal.gemfirexd.internal.impl.tools.ij.utilMain.runScriptGuts(utilMain.java:469)
at com.pivotal.gemfirexd.internal.impl.tools.ij.utilMain.goScript(utilMain.java:336)
at com.pivotal.gemfirexd.tools.internal.MiscTools$1.executeCommand(MiscTools.java:152)
at com.pivotal.gemfirexd.tools.internal.ToolsBase.invoke(ToolsBase.java:135)
at com.pivotal.gemfirexd.tools.internal.MiscTools.main(MiscTools.java:95)
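
Spark's DAGScheduler cancels every in-flight job with the message "Job N cancelled because SparkContext was shut down" when the SparkContext is stopped (or the driver JVM exits) while those jobs are still running. The snippet below is a minimal, standalone sketch of that scenario, not taken from the original SnappyData report; the class name, the local master URL, and the sleep durations are illustrative assumptions.

// Illustrative sketch: stopping the SparkContext from another thread while a
// job is still running reproduces this cancellation message.
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ShutdownDuringJob {
    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf()
                .setAppName("shutdown-during-job")
                .setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Stop the context while the job below is still active,
        // simulating a driver/lead-node shutdown mid-query.
        Thread stopper = new Thread(() -> {
            try {
                Thread.sleep(2000);
            } catch (InterruptedException ignored) {
            }
            sc.stop(); // cancels all in-flight jobs
        });
        stopper.start();

        // Long-running job: each task sleeps, so the job is still running when stop() fires.
        // collect() then fails with
        // org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
        sc.parallelize(Arrays.asList(1, 2, 3, 4), 4)
          .map(x -> { Thread.sleep(30_000); return x; })
          .collect();

        stopper.join();
    }
}

In the trace above, the same cancellation surfaces through GemFireXD's ij tool while it evaluates an expression; a likely cause is that the embedded Spark driver (the SnappyData lead node) was shut down, or failed, while the statement's Spark job was still executing.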
