java.io.FileNotFoundException: /tmp/foo_test (Permission denied)

  1. Spark Can't Read Local File

    Stack Overflow | 6 months ago | shj
    java.io.FileNotFoundException: /tmp/foo_test (Permission denied)
  2. AWS Developer Forums: Spark Can't Read Local File ...

    amazon.com | 1 month ago
    java.io.FileNotFoundException: /tmp/foo_test (Permission denied)
  3. Hadoop getting file not found exception

    Stack Overflow | 3 years ago | Janith
    java.io.FileNotFoundException: /tmp/Jetty_0_0_0_0_50090_secondary__y6aanv (Is a directory)
  4. _metadata files and multiple outputs

    Google Groups | 3 years ago | Levi Bowman
    java.io.IOException: Could not read footer: java.io.IOException: Could not read footer for file RawLocalFileStatus{path=filesrc/test/resources//parquet/year=2014; isDirectory=true; modification_time=1405979408000; access_time=0; owner=; group=; permission=rwxrwxrwx; isSymlink=false}
  5. [HADOOP-4614] "Too many open files" error while processing a large gzip file - ASF JIRA

    apache.org | 1 year ago
    java.io.FileNotFoundException: /usr/local/hadoop/hadoop-hadoop/mapred/local/taskTracker/jobcache/job_200811041109_0003/attempt_200811041109_0003_m_000000_0/output/spill4055.out.index (Too many open files)
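Both /tmp/foo_test hits above are the same failure mode: the process opening the file (e.g. a Spark executor running as a different OS user than whoever created the file) fails the read-permission check at open time. A quick diagnostic is to print the owner and POSIX mode of the path and compare them against the user the executor runs as. A minimal sketch (the class name `WhoCanRead` is made up for illustration):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public class WhoCanRead {
    public static void main(String[] args) throws Exception {
        // Default to a fresh temp file so the sketch is self-contained;
        // pass a real path (e.g. /tmp/foo_test) as the first argument.
        Path p = args.length > 0
                ? Paths.get(args[0])
                : Files.createTempFile("foo_test", null);

        // Print who owns the file and what its POSIX mode string is,
        // so it can be compared against the failing process's user.
        Set<PosixFilePermission> perms = Files.getPosixFilePermissions(p);
        System.out.println(p
                + " owner=" + Files.getOwner(p).getName()
                + " mode=" + PosixFilePermissions.toString(perms));
    }
}
```

If the mode comes back as something like `rw-------` and the owner is not the user running the task, the `Permission denied` above follows directly.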

  1. r1chjames 2 times, last 5 months ago
  2. r1chjames 1 time, last 1 day ago
  3. richard77 1 time, last 1 week ago
  4. gehel 2 times, last 2 weeks ago
  5. Kialandei 100 times, last 2 weeks ago
45 more registered users
50 unregistered visitors

Root Cause Analysis

  1. java.io.FileNotFoundException

    /tmp/foo_test (Permission denied)

    at java.io.FileInputStream.open()
  2. Java RT
    FileInputStream.<init>
    1. java.io.FileInputStream.open(Native Method)
    2. java.io.FileInputStream.<init>(FileInputStream.java:146)
    2 frames
  3. Hadoop
    FileSystem.open
    1. org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileInputStream.<init>(RawLocalFileSystem.java:111)
    2. org.apache.hadoop.fs.RawLocalFileSystem.open(RawLocalFileSystem.java:207)
    3. org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:141)
    4. org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:341)
    5. org.apache.hadoop.fs.FileSystem.open(FileSystem.java:771)
    5 frames
  4. Hadoop
    TextInputFormat.getRecordReader
    1. org.apache.hadoop.mapred.LineRecordReader.<init>(LineRecordReader.java:109)
    2. org.apache.hadoop.mapred.TextInputFormat.getRecordReader(TextInputFormat.java:67)
    2 frames
  5. Spark
    Executor$TaskRunner.run
    1. org.apache.spark.rdd.HadoopRDD$$anon$1.<init>(HadoopRDD.scala:237)
    2. org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:208)
    3. org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:101)
    4. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    5. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    6. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    7. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    8. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    9. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    10. org.apache.spark.scheduler.Task.run(Task.scala:89)
    11. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    11 frames
  6. Java RT
    Thread.run
    1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    3. java.lang.Thread.run(Thread.java:745)
    3 frames
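The chain above bottoms out in `java.io.FileInputStream.open`, which reports a permission failure (EACCES) with the same `FileNotFoundException` it uses for a genuinely missing file. A minimal sketch reproducing the error locally (the class name `ReproPermissionDenied` is made up for illustration), including the `canRead()` pre-check that nothing in the stack above performs:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReproPermissionDenied {
    public static void main(String[] args) throws IOException {
        // Create a file and strip its read bit, mimicking /tmp/foo_test
        // being created by a different user.
        Path p = Files.createTempFile("foo_test", null);
        File f = p.toFile();
        f.setReadable(false, false);

        // Caveat: a process running as root bypasses the read bit, so
        // canRead() can still return true there.
        if (!f.canRead()) {
            System.out.println(f + " is not readable by this user");
        } else {
            try (FileInputStream in = new FileInputStream(f)) {
                System.out.println("opened " + f);
            } catch (FileNotFoundException e) {
                // FileInputStream surfaces EACCES as FileNotFoundException,
                // exactly as in the trace above.
                System.out.println(e.getMessage());
            }
        }

        f.setReadable(true, true);
        Files.delete(p);
    }
}
```

Because the executor-side open has no such guard, the fix is on the filesystem side: make the file readable by the user the task runs as on every node that will open it.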