Searched on Google with the first line of a Java stack trace?

We can recommend more relevant solutions and speed up debugging when you paste your entire stack trace with the exception message.

Solutions on the web

via Stack Overflow by Viren, 8 months ago
com.ning.compress.lzf.LZFException: Corrupt input data, block did not start with 2 byte signature ('ZV') followed by type byte, 2-byte length)
The job succeeds many times but also fails many times. It may fail a few times and succeed the next time for the
via Apache's JIRA Issue Tracker by Josh Rosen, 1 year ago
java.io.IOException: failed to uncompress the chunk: FAILED_TO_UNCOMPRESS(5)
via apache.org by Unknown author, 2 years ago
java.io.IOException: failed to uncompress the chunk: FAILED_TO_UNCOMPRESS(5)
via GitHub by johanhaleby, 1 year ago
java.io.EOFException: Unexpected end of ZLIB input stream
via gradle.org by Unknown author, 1 year ago
java.io.IOException: An established connection was aborted by the software in your host machine.
com.ning.compress.lzf.LZFException: Corrupt input data, block did not start with 2 byte signature ('ZV') followed by type byte, 2-byte length)
    at com.ning.compress.lzf.ChunkDecoder._reportCorruptHeader(ChunkDecoder.java:267)
    at com.ning.compress.lzf.impl.UnsafeChunkDecoder.decodeChunk(UnsafeChunkDecoder.java:55)
    at com.ning.compress.lzf.LZFInputStream.readyBuffer(LZFInputStream.java:363)
    at com.ning.compress.lzf.LZFInputStream.read(LZFInputStream.java:193)
    at com.esotericsoftware.kryo.io.Input.fill(Input.java:140)
    at com.esotericsoftware.kryo.io.Input.require(Input.java:155)
    at com.esotericsoftware.kryo.io.Input.readInt(Input.java:337)
    at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:109)
    at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:610)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:721)
    at org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:228)
    at org.apache.spark.serializer.DeserializationStream.readKey(Serializer.scala:169)
    at org.apache.spark.serializer.DeserializationStream$$anon$2.getNext(Serializer.scala:201)
    at org.apache.spark.serializer.DeserializationStream$$anon$2.getNext(Serializer.scala:198)
    at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:73)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
    at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:32)
    at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
    at org.apache.spark.util.collection.ExternalAppendOnlyMap.insertAll(ExternalAppendOnlyMap.scala:152)
    at org.apache.spark.Aggregator.combineCombinersByKey(Aggregator.scala:58)
    at org.apache.spark.shuffle.BlockStoreShuffleReader.read(BlockStoreShuffleReader.scala:91)
    at org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:98)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:87)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:268)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
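
The top frames show where the check fails: during a shuffle read, Kryo pulls bytes through LZFInputStream, readyBuffer asks the chunk decoder for the next block, and _reportCorruptHeader raises the LZFException because the block does not begin with the 'ZV' signature bytes. The check itself is easy to reproduce in isolation; a minimal sketch, assuming the com.ning compress-lzf library on the classpath (the payload string and the corrupted byte are arbitrary):

import com.ning.compress.lzf.LZFException;
import com.ning.compress.lzf.LZFInputStream;
import com.ning.compress.lzf.LZFOutputStream;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class LzfCorruptHeaderDemo {
    public static void main(String[] args) throws IOException {
        // Build a valid LZF stream: every chunk starts with the 2-byte
        // signature 'Z','V', followed by a type byte and a 2-byte length.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (LZFOutputStream out = new LZFOutputStream(buf)) {
            out.write("shuffle block payload".getBytes(StandardCharsets.UTF_8));
        }
        byte[] data = buf.toByteArray();

        // Overwrite the first signature byte to mimic a corrupt block,
        // e.g. a truncated or bit-flipped shuffle file.
        data[0] = (byte) 'X';

        try (LZFInputStream in = new LZFInputStream(new ByteArrayInputStream(data))) {
            in.read(new byte[64]);
        } catch (LZFException e) {
            // Prints the same "Corrupt input data, block did not start with
            // 2 byte signature ('ZV') ..." message seen in the trace above.
            System.out.println(e.getMessage());
        }
    }
}

In the Spark trace the corrupt bytes come from a fetched shuffle block rather than a local file, which is why the same header check fails deep inside Kryo's Input.fill instead of a plain read loop.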