org.apache.spark.SparkException: Job aborted due to stage failure: Task 138 in stage 1.0 failed 1 times, most recent failure: Lost task 138.0 in stage 1.0 (TID 362, localhost): com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 4267

GitHub | car2008 | 5 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. GitHub comment 572#247921943

    GitHub | 5 months ago | car2008
    org.apache.spark.SparkException: Job aborted due to stage failure: Task 138 in stage 1.0 failed 1 times, most recent failure: Lost task 138.0 in stage 1.0 (TID 362, localhost): com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 4267
  2. GitHub comment 572#248870687

    GitHub | 5 months ago | car2008
    org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 4.0 failed 1 times, most recent failure: Lost task 0.0 in stage 4.0 (TID 36, localhost): com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 2422353
  3. Checkpoint data corruption in Spark Streaming

    Stack Overflow | 5 months ago | thesamet
    org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 80.0 failed 1 times, most recent failure: Lost task 0.0 in stage 80.0 (TID 17, localhost): com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 13994
  4. Oryx-2 crashed after runing for serval hour - Cloudera Community

    cloudera.com | 1 year ago
    org.apache.spark.SparkException: Job aborted due to stage failure: Task 5 in stage 5211.0 failed 4 times, most recent failure: Lost task 5.3 in stage 5211.0 (TID 204446, hadoop06): com.esotericsoftware.kryo.KryoException: com.ning.compress.lzf.LZFException: Corrupt input data, block did not start with 2 byte signature ('ZV') followed by type byte, 2-byte length)
  5. Re: Oryx-2 crashed after runing for serval hour - Cloudera Community

    cloudera.com | 1 year ago
    org.apache.spark.SparkException: Job aborted due to stage failure: Task 5 in stage 5211.0 failed 4 times, most recent failure: Lost task 5.3 in stage 5211.0 (TID 204446, hadoop06): com.esotericsoftware.kryo.KryoException: com.ning.compress.lzf.LZFException: Corrupt input data, block did not start with 2 byte signature ('ZV') followed by type byte, 2-byte length)
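
    For context, every exception above is raised while Kryo deserializes shuffled or checkpointed data and hits a class ID that the reading side cannot resolve. Spark's tuning guide recommends enabling the Kryo serializer and registering the application's classes explicitly, so the writing and reading executors assign the same IDs. The sketch below is illustrative only; the MyRecord class and the job are assumptions, not code taken from the reports above.

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical application class; stands in for whatever the failing job serializes.
    case class MyRecord(id: Long, payload: Array[Byte])

    object KryoRegistrationSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("kryo-registration-sketch")
          .setMaster("local[2]")
          // Use Kryo instead of Java serialization for RDD/shuffle data.
          .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
          // Register the classes this job ships so writer and reader agree on class IDs.
          .registerKryoClasses(Array(classOf[MyRecord]))
        // Optional, stricter mode: fail at serialization time when an unregistered class
        // is met, instead of failing later during deserialization (may require
        // registering additional classes).
        // conf.set("spark.kryo.registrationRequired", "true")

        val sc = new SparkContext(conf)
        val counts = sc.parallelize(1L to 1000L, 8)
          .map(i => MyRecord(i, Array.fill(8)(i.toByte)))
          .map(r => (r.id % 10, 1L))
          .reduceByKey(_ + _) // the shuffle here is where Kryo round-trips the data
          .collect()
        counts.sortBy(_._1).foreach(println)
        sc.stop()
      }
    }

    Registration order matters: Kryo assigns IDs in the order classes are registered, so executors running different jars or registrators, or a streaming checkpoint written under one registration order and read under another, can produce exactly this "unregistered class ID" failure.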

    Root Cause Analysis

    1. org.apache.spark.SparkException

      Job aborted due to stage failure: Task 138 in stage 1.0 failed 1 times, most recent failure: Lost task 138.0 in stage 1.0 (TID 362, localhost): com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 4267

      at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass()
    2. Kryo
      Kryo.readClassAndObject
      1. com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:119)
      2. com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:610)
      3. com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:721)
      3 frames
    3. Spark
      NextIterator.hasNext
      1. org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:228)
      2. org.apache.spark.serializer.DeserializationStream.readValue(Serializer.scala:171)
      3. org.apache.spark.serializer.DeserializationStream$$anon$2.getNext(Serializer.scala:201)
      4. org.apache.spark.serializer.DeserializationStream$$anon$2.getNext(Serializer.scala:198)
      5. org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:73)
      5 frames
    4. Scala
      Iterator$$anon$11.hasNext
      1. scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
      2. scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
      2 frames
    5. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:32)
      2. org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
      3. org.apache.spark.util.collection.ExternalAppendOnlyMap.insertAll(ExternalAppendOnlyMap.scala:152)
      4. org.apache.spark.Aggregator.combineCombinersByKey(Aggregator.scala:58)
      5. org.apache.spark.shuffle.BlockStoreShuffleReader.read(BlockStoreShuffleReader.scala:83)
      6. org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:98)
      7. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      8. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      9. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      10. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      11. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      12. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      13. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      14. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      15. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
      16. org.apache.spark.scheduler.Task.run(Task.scala:89)
      17. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
      17 frames
    6. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
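
    The trace above ends in DefaultClassResolver.readClass, i.e. the failure happens while Kryo maps a numeric class ID from the shuffle stream back to a registered class. A minimal standalone sketch of how that exception arises is below (the Payload class is an illustrative assumption, not code from this report): the writing Kryo instance has the class registered and therefore emits a compact integer ID, while the reading instance never registered it and cannot resolve that ID.

    import com.esotericsoftware.kryo.Kryo
    import com.esotericsoftware.kryo.io.{Input, Output}

    // Illustrative class; any registered application class behaves the same way.
    class Payload {
      var id: Int = 0
      var name: String = ""
    }

    object UnregisteredClassIdSketch {
      def main(args: Array[String]): Unit = {
        // Writer side: Payload is registered, so Kryo writes its integer class ID.
        val writer = new Kryo()
        writer.register(classOf[Payload])

        val out = new Output(4096)
        val p = new Payload
        p.id = 42
        p.name = "example"
        writer.writeClassAndObject(out, p)
        out.flush()
        val bytes = out.toBytes

        // Reader side: Payload was never registered here, so the ID found in the
        // stream is unknown and readClassAndObject throws
        //   com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: <n>
        val reader = new Kryo()
        reader.readClassAndObject(new Input(bytes))
      }
    }

    That said, the IDs in the reports above (4267, 13994, 2422353) are far larger than any plausible registration table, which usually points at a corrupted or misaligned stream being decoded as a class ID, the same failure mode the LZF "Corrupt input data" variants show, rather than a genuinely missing registration.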