
Spark Project Core errors

A fast and general engine for large-scale data processing

http://spark.apache.org/
Each error pattern below lists a description, the exception type, the number of hidden stack frames (where applicable), the entry method, and the number of web pages (and known solutions) that reference it.

DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages() has thrown a SparkException

org.apache.spark.SparkException
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages
258 web pages
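This pattern (and the related abortStage / handleTaskSetFailed entries below) is the generic "Job aborted due to stage failure" error raised through the DAGScheduler when a stage's tasks keep failing. A minimal illustrative sketch of how such a failure can be triggered and caught; the job, app name, and master URL are assumptions, not taken from any of the pages counted here:

```scala
import org.apache.spark.{SparkConf, SparkContext, SparkException}

object StageFailureSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("stage-failure-sketch").setMaster("local[2]"))
    try {
      // Tasks that hit an even number throw, so the DAGScheduler eventually aborts the
      // stage and the action fails with
      // "org.apache.spark.SparkException: Job aborted due to stage failure: ..."
      sc.parallelize(1 to 10).map { i =>
        if (i % 2 == 0) throw new RuntimeException(s"bad record $i") else i
      }.count()
    } catch {
      case e: SparkException => println(s"Job failed: ${e.getMessage}")
    } finally {
      sc.stop()
    }
  }
}
```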

DAGScheduler.abortStage() has thrown a SparkException

org.apache.spark.SparkException
5 frames hidden
org.apache.spark.scheduler.DAGScheduler.abortStage
237 web pages

DAGScheduler.handleTaskSetFailed() has thrown a SparkException

org.apache.spark.SparkException
9 frames hidden
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed
217 web pages

SerializationDebugger$.improveException() has thrown a NotSerializableException

java.io.NotSerializableException
org.apache.spark.serializer.SerializationDebugger$.improveException
118 web pages

JavaSerializationStream.writeObject() has thrown a NotSerializableException

java.io.NotSerializableException
1 frame hidden
org.apache.spark.serializer.JavaSerializationStream.writeObject
116 web pages

DAGSchedulerEventProcessLoop.onReceive() has thrown a SparkException

org.apache.spark.SparkException
12 frames hidden
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive
108 web pages

JavaSerializerInstance.serialize() has thrown a NotSerializableException

java.io.NotSerializableException
2 frames hidden
org.apache.spark.serializer.JavaSerializerInstance.serialize
106 web pages
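The NotSerializableException patterns above (SerializationDebugger$.improveException, JavaSerializationStream.writeObject, JavaSerializerInstance.serialize) commonly appear when a closure, task, or task result references an object that is not java.io.Serializable. A minimal illustrative sketch of the result-serialization case; RawRecord is a hypothetical non-serializable class, not something from these pages:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical record type that does not implement java.io.Serializable.
class RawRecord(val id: Int)

object ResultNotSerializableSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("result-not-serializable").setMaster("local[2]"))
    try {
      // The closure itself is fine, but collect() must serialize the RawRecord task
      // results to ship them to the driver; with the default Java serializer this
      // fails, and the resulting job failure names java.io.NotSerializableException: RawRecord.
      sc.parallelize(1 to 3).map(new RawRecord(_)).collect()
    } catch {
      case e: Exception => println(s"${e.getClass.getName}: ${e.getMessage}")
    } finally {
      sc.stop()
    }
  }
}
```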

SparkContext.runJob() has thrown a SparkException

org.apache.spark.SparkException
15 frames hidden
org.apache.spark.SparkContext.runJob
102 web pages

SparkContext.runJob() has thrown a SparkException

org.apache.spark.SparkException
16 frames hidden
org.apache.spark.SparkContext.runJob
101 web pages

ClosureCleaner$.ensureSerializable() has thrown a SparkException

org.apache.spark.SparkException
org.apache.spark.util.ClosureCleaner$.ensureSerializable
87 web pages

ClosureCleaner$.ensureSerializable() has thrown a NotSerializableException

java.io.NotSerializableException
3 frames hidden
org.apache.spark.util.ClosureCleaner$.ensureSerializable
85 web pages

ClosureCleaner$.ensureSerializable() has thrown a SparkException

java.io.NotSerializableException
4 frames hidden
org.apache.spark.util.ClosureCleaner$.ensureSerializable
82 web pages
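The ClosureCleaner$.ensureSerializable entries above are the classic "Task not serializable" rejection: SparkContext.clean() serializes the closure of a transformation before running the job and fails if the closure captures a non-serializable object. A minimal sketch under that assumption; ConnectionPool is a hypothetical class used only for illustration:

```scala
import org.apache.spark.{SparkConf, SparkContext, SparkException}

// A hypothetical helper that does not implement java.io.Serializable.
class ConnectionPool {
  def lookup(id: Int): String = s"value-$id"
}

object TaskNotSerializableSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("closure-capture").setMaster("local[2]"))
    val pool = new ConnectionPool // captured by the closure below
    try {
      // SparkContext.clean -> ClosureCleaner.ensureSerializable rejects the closure
      // before any task runs: SparkException "Task not serializable", with
      // java.io.NotSerializableException: ConnectionPool as the cause.
      sc.parallelize(1 to 3).map(i => pool.lookup(i)).collect()
    } catch {
      case e: SparkException => println(s"Rejected closure: ${e.getMessage}")
    } finally {
      sc.stop()
    }
  }
}
```

A common workaround is to construct the non-serializable object inside the closure (for example per partition with mapPartitions) rather than capturing it from the driver.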

SparkContext.runJob() has thrown a SparkException

org.apache.spark.SparkException
17 frames hidden
org.apache.spark.SparkContext.runJob
81 web pages

PythonRunner.compute() has thrown a SparkException

org.apache.spark.SparkException
2 frames hidden
org.apache.spark.api.python.PythonRunner.compute
81 web pages

SparkSubmit.main() has thrown a SparkException

org.apache.spark.SparkException
11 frames hidden
org.apache.spark.deploy.SparkSubmit.main
81 web pages

RDD.computeOrReadCheckpoint() has thrown a SparkException

org.apache.spark.SparkException
4 frames hidden
org.apache.spark.rdd.RDD.computeOrReadCheckpoint
71 web pages

SparkContext.clean() has thrown a SparkException

java.io.NotSerializableException
7 frames hidden
org.apache.spark.SparkContext.clean
70 web pages

RDD.iterator() has thrown a SparkException

org.apache.spark.SparkException
5 frames hidden
org.apache.spark.rdd.RDD.iterator
69 web pages

PythonRunner.compute() has thrown a PythonException

org.apache.spark.api.python.PythonException
2 frames hidden
org.apache.spark.api.python.PythonRunner.compute
68 web pages

ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean() has thrown a SparkException

org.apache.spark.SparkException
1 frame hidden
org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean
63 web pages

RDD.computeOrReadCheckpoint() has thrown a PythonException

org.apache.spark.api.python.PythonException
4 frames hidden
org.apache.spark.rdd.RDD.computeOrReadCheckpoint
62 web pages

ClosureCleaner$.clean() has thrown a SparkException

org.apache.spark.SparkException
2 frames hidden
org.apache.spark.util.ClosureCleaner$.clean
61 web pages

RDD.iterator() has thrown a PythonException

org.apache.spark.api.python.PythonException
5 frames hidden
org.apache.spark.rdd.RDD.iterator
59 web pages

SparkContext.clean() has thrown a SparkException

org.apache.spark.SparkException
3 frames hidden
org.apache.spark.SparkContext.clean
58 web pages

Absent.get() has thrown an IllegalStateException

java.lang.IllegalStateException
com.google.common.base.Absent.get
58 web pages
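The Absent.get pattern comes from Guava's Optional: calling get() on Optional.absent() always throws an IllegalStateException, and in Spark deployments this is often reported alongside Guava version conflicts on the classpath. A minimal sketch of the exception itself, independent of any Spark job:

```scala
import com.google.common.base.Optional

object AbsentGetSketch {
  def main(args: Array[String]): Unit = {
    val maybe: Optional[String] = Optional.absent[String]()
    try {
      // Absent.get() unconditionally throws:
      // java.lang.IllegalStateException: Optional.get() cannot be called on an absent value
      maybe.get()
    } catch {
      case e: IllegalStateException => println(e.getMessage)
    }
  }
}
```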

SparkContext.<init>() has thrown a SparkException

org.apache.spark.SparkException
3 frames hidden
org.apache.spark.SparkContext.<init>
56 web pages, 1 solution
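SparkContext construction can fail with a SparkException for several configuration problems; one well-known case is a missing master URL. An illustrative sketch of that case only (the actual failures behind the pages counted here may differ):

```scala
import org.apache.spark.{SparkConf, SparkContext, SparkException}

object MissingMasterSketch {
  def main(args: Array[String]): Unit = {
    try {
      // If no master URL is configured anywhere (conf, spark-submit, spark-defaults),
      // the constructor throws:
      // org.apache.spark.SparkException: A master URL must be set in your configuration
      new SparkContext(new SparkConf().setAppName("no-master"))
    } catch {
      case e: SparkException => println(e.getMessage)
    }
  }
}
```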

SparkSubmitUtils$.resolveMavenCoordinates() has thrown a RuntimeException

java.lang.RuntimeException
org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates
52 web pages

Utils$.classForName() has thrown a ClassNotFoundException

java.lang.ClassNotFoundException
5 frames hidden
org.apache.spark.util.Utils$.classForName
49 web pages, 1 solution

SparkContext.runJob() has thrown a SparkException

org.apache.spark.SparkException
18 frames hidden
org.apache.spark.SparkContext.runJob
48 web pages

DAGScheduler.abortStage() has thrown a SparkException

org.apache.spark.SparkException
4 frames hidden
org.apache.spark.scheduler.DAGScheduler.abortStage
45 web pages

SparkSubmit.main() has thrown a RuntimeException

java.lang.RuntimeException
4 frames hidden
org.apache.spark.deploy.SparkSubmit.main
45 web pages

DAGScheduler.cleanUpAfterSchedulerStop() has thrown a SparkException

org.apache.spark.SparkException
3 frames hidden
org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop
44 web pages

Utils$.tryOrIOException() has thrown an IOException

java.io.IOException
org.apache.spark.util.Utils$.tryOrIOException
41 web pages

SparkContext.<init>() has thrown a SparkException

org.apache.spark.SparkException
org.apache.spark.SparkContext.<init>
39 web pages

RpcTimeout.awaitResult() has thrown a TimeoutException

java.util.concurrent.TimeoutException
5 frames hidden
org.apache.spark.rpc.RpcTimeout.awaitResult
38 web pages
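RpcTimeout.awaitResult raises a TimeoutException when an internal RPC (for example heartbeats or executor registration) does not answer within the configured timeout. The relevant timeouts can be raised through configuration; a sketch with illustrative values (300s is an example figure, not a recommendation):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RpcTimeoutTuningSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("rpc-timeout-tuning")
      .setMaster("local[2]")
      .set("spark.network.timeout", "300s") // general network timeout, fallback for RPC asks
      .set("spark.rpc.askTimeout", "300s")  // timeout applied to RPC asks awaited via RpcTimeout
    val sc = new SparkContext(conf)
    sc.stop()
  }
}
```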

SparkSubmit.main() has thrown a ClassNotFoundException

java.lang.ClassNotFoundException
10 frames hidden
org.apache.spark.deploy.SparkSubmit.main
37 web pages

RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException() has thrown an RpcTimeoutException

org.apache.spark.rpc.RpcTimeoutException
org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException
36 web pages

JavaDeserializationStream.readObject() has thrown an IllegalStateException

java.lang.IllegalStateException
7 frames hidden
org.apache.spark.serializer.JavaDeserializationStream.readObject
36 web pages

ClosureCleaner$.getClassReader() has thrown an IllegalArgumentException

java.lang.IllegalArgumentException
3 frames hidden
org.apache.spark.util.ClosureCleaner$.getClassReader
35 web pages

JavaSerializerInstance.deserialize() has thrown an IllegalStateException

java.lang.IllegalStateException
8 frames hidden
org.apache.spark.serializer.JavaSerializerInstance.deserialize
35 web pages

AkkaUtils$.askWithReply() has thrown a TimeoutException

java.util.concurrent.TimeoutException
5 frames hidden
org.apache.spark.util.AkkaUtils$.askWithReply
33 web pages

AkkaUtils$.createActorSystem() has thrown a TimeoutException

java.util.concurrent.TimeoutException
19 frames hidden
org.apache.spark.util.AkkaUtils$.createActorSystem
33 web pages

RDD.computeOrReadCheckpoint() has thrown a SparkException

org.apache.spark.SparkException
3 frames hidden
org.apache.spark.rdd.RDD.computeOrReadCheckpoint
32 web pages

SparkContext.org$apache$spark$SparkContext$$assertNotStopped() has thrown an IllegalStateException

java.lang.IllegalStateException
org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped
32 web pages
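The assertNotStopped pattern means a SparkContext method was called after the context had already been stopped (for example after an explicit sc.stop(), a fatal error, or a notebook/streaming environment shutting it down). A minimal sketch:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object StoppedContextSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("stopped-context").setMaster("local[2]"))
    sc.stop()
    try {
      // Any RDD operation after stop() fails the assertNotStopped check with
      // java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
      sc.parallelize(1 to 3).count()
    } catch {
      case e: IllegalStateException => println(e.getMessage)
    }
  }
}
```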

RpcTimeout.awaitResult() has thrown an RpcTimeoutException

java.util.concurrent.TimeoutException
10 frames hidden
org.apache.spark.rpc.RpcTimeout.awaitResult
32 web pages

RDD.partitions() has thrown an InvalidInputException

org.apache.hadoop.mapred.InvalidInputException
7 frames hidden
org.apache.spark.rdd.RDD.partitions
31 web pages
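RDD.partitions throws Hadoop's InvalidInputException when the input path of a file-based RDD does not exist; because textFile is lazy, the error only surfaces at the first action. A minimal sketch with an obviously made-up path:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object MissingInputSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("missing-input").setMaster("local[2]"))
    try {
      // Computing the partitions for the first action raises
      // org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: ...
      sc.textFile("/path/that/does/not/exist").count()
    } catch {
      case e: Exception => println(s"${e.getClass.getName}: ${e.getMessage}")
    } finally {
      sc.stop()
    }
  }
}
```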

DAGScheduler.stop() has thrown a SparkException

org.apache.spark.SparkException
6 frames hidden
org.apache.spark.scheduler.DAGScheduler.stop
31 web pages

DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage() has thrown a SparkException

org.apache.spark.SparkException
4 frames hidden
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage
31 web pages

RDD.iterator() has thrown a SparkException

org.apache.spark.SparkException
4 frames hidden
org.apache.spark.rdd.RDD.iterator
31 web pages

SparkContext.clean() has thrown an IllegalArgumentException

java.lang.IllegalArgumentException
23 frames hidden
org.apache.spark.SparkContext.clean
31 web pages