java.io.FileNotFoundException: /home/bioinfo/zhipengcheng/file/tmp/blockmgr-f7ac149c-fd99-45a3-a917-08317e6d044c/21/temp_local_92a22279-47a1-4fd4-a444-bb7b546255c5 (no such file or directory)


Solutions on the web (2106)

  • via GitHub by car2008
  • via Stack Overflow by auxdx

Stack trace

java.io.FileNotFoundException: /home/bioinfo/zhipengcheng/file/tmp/blockmgr-f7ac149c-fd99-45a3-a917-08317e6d044c/21/temp_local_92a22279-47a1-4fd4-a444-bb7b546255c5 (no such file or directory)
    at java.io.FileOutputStream.open0(Native Method)
    at java.io.FileOutputStream.open(FileOutputStream.java:270)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
    at org.apache.spark.storage.DiskBlockObjectWriter.open(DiskBlockObjectWriter.scala:88)
    at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:181)
    at org.apache.spark.util.collection.ExternalAppendOnlyMap.spill(ExternalAppendOnlyMap.scala:206)
    at org.apache.spark.util.collection.ExternalAppendOnlyMap.spill(ExternalAppendOnlyMap.scala:55)
    at org.apache.spark.util.collection.Spillable$class.maybeSpill(Spillable.scala:93)
    at org.apache.spark.util.collection.ExternalAppendOnlyMap.maybeSpill(ExternalAppendOnlyMap.scala:55)
    at org.apache.spark.util.collection.ExternalAppendOnlyMap.insertAll(ExternalAppendOnlyMap.scala:158)
    at org.apache.spark.Aggregator.combineCombinersByKey(Aggregator.scala:58)
    at org.apache.spark.shuffle.BlockStoreShuffleReader.read(BlockStoreShuffleReader.scala:83)
    at org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:98)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
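
The trace shows a shuffle-read spill: ExternalAppendOnlyMap tries to write a temporary spill file into the executor's block manager directory (created under spark.local.dir, by default a subdirectory of /tmp) and finds it missing. A commonly discussed mitigation for this class of failure is to point spark.local.dir at a stable disk that is not purged by system temp cleaners while jobs run. The sketch below only illustrates that setting; the path "/data/spark-tmp" is hypothetical, and this is not a confirmed fix for the case above.

    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal sketch, assuming the failure stems from the default scratch
    // location being removed: configure where Spark writes shuffle/spill files.
    object SpillDirExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("spill-dir-example")
          // "/data/spark-tmp" is a placeholder; pick a disk with enough free
          // space that is not cleaned by tmpwatch/systemd-tmpfiles during jobs.
          .set("spark.local.dir", "/data/spark-tmp")
        val sc = new SparkContext(conf)

        // Any shuffle-heavy stage (reduceByKey forces a shuffle) will now spill
        // under the configured directory instead of the default /tmp location.
        val counts = sc.parallelize(Seq("a", "b", "a"))
          .map(w => (w, 1))
          .reduceByKey(_ + _)
          .collect()
        counts.foreach(println)

        sc.stop()
      }
    }

Note that on a cluster manager such as YARN the local directories are typically supplied by the manager and may override this setting, so the same idea has to be applied in the cluster configuration rather than in application code.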
