org.apache.spark.SparkException: Error creating channel and connection: connection is already closed due to connection error; cause: com.rabbitmq.client.impl.UnknownChannelException: Unknown channel number 77

GitHub | nelsou | 6 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. Error creating channel and connection: Unknown channel number 77

     GitHub | 6 months ago | nelsou
     org.apache.spark.SparkException: Error creating channel and connection: connection is already closed due to connection error; cause: com.rabbitmq.client.impl.UnknownChannelException: Unknown channel number 77
  2. Error creating channel and connection: connection is already closed due to connection error; cause: com.rabbitmq.client.impl.UnknownChannelException: Unknown channel number 1

     GitHub | 4 weeks ago | alankala
     org.apache.spark.SparkException: Error creating channel and connection: connection is already closed due to connection error; cause: com.rabbitmq.client.impl.UnknownChannelException: Unknown channel number 1

Root Cause Analysis

    1. org.apache.spark.SparkException

      Error creating channel and connection: connection is already closed due to connection error; cause: com.rabbitmq.client.impl.UnknownChannelException: Unknown channel number 77

      at org.apache.spark.streaming.rabbitmq.consumer.Consumer$.apply()
    2. org.apache.spark
      RabbitMQRDD.compute
      1. org.apache.spark.streaming.rabbitmq.consumer.Consumer$.apply(Consumer.scala:203)
      2. org.apache.spark.streaming.rabbitmq.distributed.RabbitMQRDD$RabbitMQRDDIterator.getConsumer(RabbitMQRDD.scala:229)
      3. org.apache.spark.streaming.rabbitmq.distributed.RabbitMQRDD$RabbitMQRDDIterator.<init>(RabbitMQRDD.scala:164)
      4. org.apache.spark.streaming.rabbitmq.distributed.RabbitMQRDD.compute(RabbitMQRDD.scala:141)
      4 frames
    3. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
      2. org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
      3. org.apache.spark.rdd.RDD.iterator(RDD.scala:262)
      4. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      5. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
      6. org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
      7. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      8. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
      9. org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
      10. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      11. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
      12. org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
      13. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      14. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
      15. org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
      16. org.apache.spark.rdd.RDD.iterator(RDD.scala:262)
      17. org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:87)
      18. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
      19. org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
      20. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      21. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
      22. org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
      23. org.apache.spark.rdd.RDD.iterator(RDD.scala:262)
      24. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      25. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
      26. org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
      27. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      28. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
      29. org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
      30. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      31. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
      32. org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
      33. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
      34. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
      35. org.apache.spark.scheduler.Task.run(Task.scala:88)
      36. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
      36 frames
    4. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
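
The trace above shows the RabbitMQ Java client shutting down its connection after an UnknownChannelException ("Unknown channel number 77"), and the spark-rabbitmq consumer (Consumer.scala:203) then failing because it tried to create a channel on that already-closed connection. One common defensive pattern is to check that the cached connection is still open, and rebuild it, before asking for a channel. The sketch below is a minimal, hypothetical illustration against the standard RabbitMQ Java client API; the SafeChannelProvider object, host name, and retry-once logic are illustrative assumptions and are not part of the spark-rabbitmq library.

    import com.rabbitmq.client.{AlreadyClosedException, Channel, Connection, ConnectionFactory}

    object SafeChannelProvider {
      // Illustrative settings; real host/credentials would come from the job configuration.
      private val factory = new ConnectionFactory()
      factory.setHost("rabbitmq-host")
      factory.setAutomaticRecoveryEnabled(true)

      @volatile private var connection: Connection = _

      // Return the cached connection, recreating it if it was never opened or has been closed.
      private def openConnection(): Connection = synchronized {
        if (connection == null || !connection.isOpen) {
          connection = factory.newConnection()
        }
        connection
      }

      // Ask for a channel, rebuilding the connection once if it turns out to be already closed.
      def newChannel(): Channel = {
        try {
          openConnection().createChannel()
        } catch {
          case _: AlreadyClosedException =>
            synchronized { connection = null }
            openConnection().createChannel()
        }
      }
    }

Enabling automatic recovery (RabbitMQ Java client 3.3.0+) lets the client re-establish dropped connections on its own; the explicit isOpen check simply avoids handing out channels from a connection that has already been torn down by a connection error.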