java.io.FileNotFoundException: /usr/local/hadoop/hadoop-hadoop/mapred/local/taskTracker/jobcache/job_200811041109_0003/attempt_200811041109_0003_m_000000_0/output/spill4055.out.index (Too many open files)

apache.org | 4 months ago
  1. [HADOOP-4614] "Too many open files" error while processing a large gzip file - ASF JIRA

     apache.org | 1 year ago
     java.io.FileNotFoundException: /usr/local/hadoop/hadoop-hadoop/mapred/local/taskTracker/jobcache/job_200811041109_0003/attempt_200811041109_0003_m_000000_0/output/spill4055.out.index (Too many open files)
  2. Hadoop getting file not found exception

     Stack Overflow | 3 years ago | Janith
     java.io.FileNotFoundException: /tmp/Jetty_0_0_0_0_50090_secondary__y6aanv (Is a directory)
  3. _metadata files and multiple outputs

     Google Groups | 3 years ago | Levi Bowman
     java.io.IOException: Could not read footer: java.io.IOException: Could not read footer for file RawLocalFileStatus{path=filesrc/test/resources//parquet/year=2014; isDirectory=true; modification_time=1405979408000; access_time=0; owner=; group=; permission=rwxrwxrwx; isSymlink=false}
  4. Spark Can't Read Local File

     Stack Overflow | 6 months ago | shj
     java.io.FileNotFoundException: /tmp/foo_test (Permission denied)

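The headline failure here is the task JVM exhausting its per-process file descriptor limit while the map-side merge opens its spill files. One quick way to confirm descriptor exhaustion from inside a JVM is shown below; this is a minimal sketch, assuming a HotSpot JVM on Linux where the com.sun.management extension is available, and the class name FdUsage is illustrative.

    import java.lang.management.ManagementFactory;
    import java.lang.management.OperatingSystemMXBean;
    import com.sun.management.UnixOperatingSystemMXBean;

    public class FdUsage {
        public static void main(String[] args) {
            OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
            if (os instanceof UnixOperatingSystemMXBean) {
                UnixOperatingSystemMXBean unix = (UnixOperatingSystemMXBean) os;
                // Compare descriptors currently open against the process ceiling (ulimit -n);
                // "Too many open files" is thrown once the two meet.
                System.out.println("open fds: " + unix.getOpenFileDescriptorCount());
                System.out.println("max  fds: " + unix.getMaxFileDescriptorCount());
            }
        }
    }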

Root Cause Analysis

java.io.FileNotFoundException: /usr/local/hadoop/hadoop-hadoop/mapred/local/taskTracker/jobcache/job_200811041109_0003/attempt_200811041109_0003_m_000000_0/output/spill4055.out.index (Too many open files)
    at java.io.FileInputStream.open(Native Method)
    at java.io.FileInputStream.<init>(FileInputStream.java:137)
    at org.apache.hadoop.fs.RawLocalFileSystem$TrackingFileInputStream.<init>(RawLocalFileSystem.java:62)
    at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileInputStream.<init>(RawLocalFileSystem.java:98)
    at org.apache.hadoop.fs.RawLocalFileSystem.open(RawLocalFileSystem.java:168)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:359)
    at org.apache.hadoop.mapred.IndexRecord.readIndexFile(IndexRecord.java:47)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.getIndexInformation(MapTask.java:1339)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.mergeParts(MapTask.java:1237)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:857)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:333)
    at org.apache.hadoop.mapred.Child.main(Child.java:155)