org.apache.spark.SparkException

'Job 311 cancelled because SparkContext was shut down' was thrown while evaluating an expression.

Solutions on the web

  • via snappydata.io by Unknown author, 2 months ago
    'Job 311 cancelled because SparkContext was shut down' was thrown while evaluating an expression.
  • via GitHub by velvia, 1 year ago
    Job cancelled because SparkContext was shut down
  • via Google Groups by sunil v, 6 months ago
    Job cancelled because SparkContext was shut down
  • Stack trace

    • org.apache.spark.SparkException: 'Job 311 cancelled because SparkContext was shut down' was thrown while evaluating an expression.
        at com.pivotal.gemfirexd.internal.impl.tools.ij.utilMain.handleSQLException(utilMain.java:896)
        at com.pivotal.gemfirexd.internal.impl.tools.ij.utilMain.doCatch(utilMain.java:795)
        at com.pivotal.gemfirexd.internal.impl.tools.ij.utilMain.runScriptGuts(utilMain.java:469)
        at com.pivotal.gemfirexd.internal.impl.tools.ij.utilMain.goScript(utilMain.java:336)
        at com.pivotal.gemfirexd.tools.internal.MiscTools$1.executeCommand(MiscTools.java:152)
        at com.pivotal.gemfirexd.tools.internal.ToolsBase.invoke(ToolsBase.java:135)
        at com.pivotal.gemfirexd.tools.internal.MiscTools.main(MiscTools.java:95)

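This message is raised by Spark's DAGScheduler: when the SparkContext is stopped, whether explicitly via sc.stop() or because the driver JVM exits (for example after an out-of-memory error or a cluster-manager kill), every job that is still active is failed with "Job N cancelled because SparkContext was shut down". The exception itself is therefore a symptom; the first thing to check is the driver and executor logs for whatever caused the shutdown just before the cancellation. Below is a minimal Scala sketch that reproduces the message locally. The object name, RDD sizes, and timings are illustrative only and are not taken from the reports above.

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical reproduction: a long-running job is cancelled because the
    // SparkContext is stopped while the job's tasks are still executing.
    object ContextShutdownRepro {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("context-shutdown-repro").setMaster("local[2]")
        val sc = new SparkContext(conf)

        // Run a slow job on a background thread so the main thread can stop the context.
        val worker = new Thread(new Runnable {
          override def run(): Unit = {
            try {
              sc.parallelize(1 to 1000000, 100)
                .map { i => Thread.sleep(10); i }   // keep tasks busy for a while
                .count()
            } catch {
              // Fails with: org.apache.spark.SparkException:
              //   Job <n> cancelled because SparkContext was shut down
              case e: Exception => println(s"Job failed: ${e.getMessage}")
            }
          }
        })
        worker.start()

        Thread.sleep(2000)  // let some tasks start
        sc.stop()           // stopping the context cancels all active jobs
        worker.join()
      }
    }

In the stack trace above the error only surfaces through GemFireXD's ij tool, which reports the wrapped message; the actual reason the SparkContext was shut down has to be found in the Spark driver log of the SnappyData lead node.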