org.apache.spark.SparkException: java.nio.channels.ClosedChannelException

Stack Overflow | Navarro | 2 months ago
  1. Kafka - SimpleConsumer not working after port change

     Stack Overflow | 2 months ago | Navarro
     org.apache.spark.SparkException: java.nio.channels.ClosedChannelException

  2. Direct Kafka Stream with PySpark (Apache Spark 1.6)

     Stack Overflow | 9 months ago | cynical biscuit
     org.apache.spark.SparkException: java.nio.channels.ClosedChannelException

  3. GitHub comment 113#158861911

     GitHub | 1 year ago | ssamynathan
     org.apache.spark.SparkException: java.io.EOFException: Received -1 when reading from channel, socket has likely been closed.

  4. Spark Streaming + Kafka: SparkException: Couldn't find leader offsets for Set

     Stack Overflow | 12 months ago | facha
     org.apache.spark.SparkException: Couldn't find leader offsets for Set([test-topic,0])

  5. Spark w/ Embedded Kafka: Couldn't find leader offsets for Set()

     Stack Overflow | 11 months ago | autodidacticon
     org.apache.spark.SparkException: org.apache.spark.SparkException: Couldn't find leader offsets for Set()

    Root Cause Analysis

    1. org.apache.spark.SparkException

      java.nio.channels.ClosedChannelException

      at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply()
    2. Spark Project External Kafka
      KafkaCluster$$anonfun$checkErrors$1.apply
      1. org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
      2. org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
      2 frames
    3. Scala
      Either.fold
      1. scala.util.Either.fold(Either.scala:97)
      1 frame
    4. Spark Project External Kafka
      DirectKafkaInputDStream$DirectKafkaInputDStreamCheckpointData.restore
      1. org.apache.spark.streaming.kafka.KafkaCluster$.checkErrors(KafkaCluster.scala:365)
      2. org.apache.spark.streaming.kafka.DirectKafkaInputDStream$DirectKafkaInputDStreamCheckpointData.restore(DirectKafkaInputDStream.scala:197)
      2 frames
    5. Spark Project Streaming
      DStream$$anonfun$restoreCheckpointData$2.apply
      1. org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:515)
      2. org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:516)
      3. org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:516)
      3 frames
    6. Scala
      List.foreach
      1. scala.collection.immutable.List.foreach(List.scala:318)
      1 frame
    7. Spark Project Streaming
      DStream$$anonfun$restoreCheckpointData$2.apply
      1. org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:516)
      2. org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:516)
      3. org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:516)
      3 frames
    8. Scala
      List.foreach
      1. scala.collection.immutable.List.foreach(List.scala:318)
      1 frame
    9. Spark Project Streaming
      DStream$$anonfun$restoreCheckpointData$2.apply
      1. org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:516)
      2. org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:516)
      3. org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:516)
      3 frames
    10. Scala
      List.foreach
      1. scala.collection.immutable.List.foreach(List.scala:318)
      1 frame
    11. Spark Project Streaming
      DStreamGraph$$anonfun$restoreCheckpointData$2.apply
      1. org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:516)
      2. org.apache.spark.streaming.DStreamGraph$$anonfun$restoreCheckpointData$2.apply(DStreamGraph.scala:151)
      3. org.apache.spark.streaming.DStreamGraph$$anonfun$restoreCheckpointData$2.apply(DStreamGraph.scala:151)
      3 frames
    12. Scala
      ArrayBuffer.foreach
      1. scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
      2. scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
      2 frames
    13. Spark Project Streaming
      StreamingContext$$anonfun$getOrCreate$1.apply
      1. org.apache.spark.streaming.DStreamGraph.restoreCheckpointData(DStreamGraph.scala:151)
      2. org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:158)
      3. org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:877)
      4. org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:877)
      4 frames
    14. Scala
      Option.map
      1. scala.Option.map(Option.scala:145)
      1 frame
    15. Spark Project Streaming
      JavaStreamingContext.getOrCreate
      1. org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:877)
      2. org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:775)
      3. org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
      3 frames
    16. com.ncr.dataplatform
      Runner.main
      1. com.ncr.dataplatform.Runner.main(Runner.java:48)
      1 frame
    17. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:498)
      4 frames
    18. Spark
      SparkSubmit.main
      1. org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
      2. org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
      3. org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
      4. org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
      5. org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      5 frames