java.lang.RuntimeException: Failed to apply upload policy

GitHub | RainbowTraveller | 4 months ago
  1. GitHub comment 234#238842582
     GitHub | 4 months ago | RainbowTraveller
     java.lang.RuntimeException: Failed to apply upload policy

  2. Checksum Exception when reading from or copying to hdfs in apache hadoop
     Stack Overflow | 4 years ago | lvella
     org.apache.hadoop.fs.ChecksumException: Checksum error: /home/name/Desktop/dtlScaleData/attr.txt at 0

  3. get hadoop ChecksumException: Checksum error
     Stack Overflow | 4 years ago | Xuanzi Han
     org.apache.hadoop.fs.ChecksumException: Checksum error: /crawler/twitcher/tmp/twitcher715632000093292278919867391792973804/Televisions_UK.20120912 at 0

  4. Getting error while implementing a simple sorting program in Mapreduce with zero reduce nodes
     Stack Overflow | 5 years ago | Ravi Trivedi
     org.apache.hadoop.fs.ChecksumException: Checksum error: file:/root/NetBeansProjects/projectAll/output/regionMulti/individual/part-00000 at 0

  5. Multithreading Apache Nutch
     Stack Overflow | 2 years ago | gowthamganguri
     org.apache.hadoop.fs.ChecksumException: Checksum error: file:/tmp/hadoop-root/mapred/system/job_local_0001/job.xml at 24576

This exception has also been reported by tyson925 (7 times, most recently 4 weeks ago), r1chjames (2 times, most recently 6 months ago), and 2 unregistered visitors.

Root Cause Analysis

  1. org.apache.hadoop.fs.ChecksumException

    Checksum error: /home/usr/secor_prod_binaries_1/message_logs/backup/21738_20/dt=20160807/hour=02/Hostname-error_log-20160807-02-00.gz at 2096128

    at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.readChunk()
  2. Hadoop
    FSInputChecker.read
    1. org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.readChunk(ChecksumFileSystem.java:254)
    2. org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:276)
    3. org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:228)
    4. org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:196)
    4 frames
  3. Java RT
    DataInputStream.read
    1. java.io.DataInputStream.read(DataInputStream.java:149)
    1 frame
  4. Hadoop
    DecompressorStream.read
    1. org.apache.hadoop.io.compress.DecompressorStream.getCompressedData(DecompressorStream.java:159)
    2. org.apache.hadoop.io.compress.DecompressorStream.decompress(DecompressorStream.java:143)
    3. org.apache.hadoop.io.compress.DecompressorStream.read(DecompressorStream.java:85)
    3 frames
  5. Java RT
    BufferedInputStream.read
    1. java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
    2. java.io.BufferedInputStream.read(BufferedInputStream.java:254)
    2 frames
  6. com.pinterest.secor
    Consumer.checkUploadPolicy
    1. com.pinterest.secor.io.impl.DelimitedTextFileReaderWriterFactory$DelimitedTextFileReader.next(DelimitedTextFileReaderWriterFactory.java:80)
    2. com.pinterest.secor.uploader.Uploader.trim(Uploader.java:144)
    3. com.pinterest.secor.uploader.Uploader.trimFiles(Uploader.java:177)
    4. com.pinterest.secor.uploader.Uploader.checkTopicPartition(Uploader.java:206)
    5. com.pinterest.secor.uploader.Uploader.applyPolicy(Uploader.java:216)
    6. com.pinterest.secor.consumer.Consumer.checkUploadPolicy(Consumer.java:114)
    6 frames
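What the trace shows: Consumer.checkUploadPolicy asks the Uploader to trim the locally written log files before uploading, and Uploader.trim reads each file back through a DelimitedTextFileReader. On the local filesystem Hadoop wraps reads in ChecksumFileSystem, which verifies every chunk against a hidden ".<name>.crc" sidecar file; when the data file and its sidecar disagree (for example, because the .gz file was modified or truncated outside Hadoop), the read fails with the ChecksumException above, and Secor rewraps it as "Failed to apply upload policy". Below is a minimal sketch, not a Secor fix, of salvaging such a file by disabling verification with Hadoop's FileSystem.setVerifyChecksum; the path is hypothetical and stands in for the backup file from the trace.

    import java.io.IOException;
    import java.io.InputStream;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ChecksumSalvage {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration();

            // FileSystem.getLocal() returns LocalFileSystem, a ChecksumFileSystem that
            // verifies reads against the hidden ".<name>.crc" sidecar next to each file.
            FileSystem fs = FileSystem.getLocal(conf);

            // Hypothetical path, standing in for the Secor backup file from the trace.
            Path damaged = new Path("/tmp/secor_backup/part-00000.gz");

            // Turn off verification so the read no longer throws ChecksumException when
            // the sidecar is stale. Corruption then goes undetected: recovery, not a fix.
            fs.setVerifyChecksum(false);

            long copied = 0;
            try (InputStream in = fs.open(damaged)) {
                byte[] buf = new byte[8192];
                for (int n = in.read(buf); n != -1; n = in.read(buf)) {
                    copied += n;   // replace with real processing / re-writing of the file
                }
            }
            System.out.println("Read " + copied + " bytes without checksum verification");
        }
    }

Deleting the stale .crc sidecar next to the damaged file has the same effect for a single file, since ChecksumFileSystem only verifies when a sidecar exists; either way the underlying data may still be corrupt, so treat this as a recovery step rather than a fix.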