java.io.FileNotFoundException

There are no available Samebug tips for this exception. Do you have an idea how to solve this issue? A short tip would help users who saw this issue last week.

  • Reproduce steps:
    1. Create a cluster with an attached DC and a separate YARN cluster.
    2. Run a MapReduce job from the client:

       hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar pi 2 1000
       hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.0.0-cdh4.1.2-tests.jar TestDFSIO -write -nrFiles 5 -fileSize 1000

    Expected result: the MapReduce job runs successfully.

    Actual result (see the two sketches after this list):

      [joe@10 ~]$ hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar pi 2 1000
      Number of Maps = 2
      Samples per Map = 1000
      Wrote input for Map #0
      Wrote input for Map #1
      Starting Job
      13/04/10 06:42:13 INFO input.FileInputFormat: Total input paths to process : 2
      13/04/10 06:42:13 INFO mapreduce.JobSubmitter: number of splits:2
      13/04/10 06:42:13 WARN conf.Configuration: mapred.jar is deprecated. Instead, use mapreduce.job.jar
      13/04/10 06:42:13 WARN conf.Configuration: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
      13/04/10 06:42:13 WARN conf.Configuration: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
      13/04/10 06:42:13 WARN conf.Configuration: mapred.output.value.class is deprecated. Instead, use mapreduce.job.output.value.class
      13/04/10 06:42:13 WARN conf.Configuration: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
      13/04/10 06:42:13 WARN conf.Configuration: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
      13/04/10 06:42:13 WARN conf.Configuration: mapred.job.name is deprecated. Instead, use mapreduce.job.name
      13/04/10 06:42:13 WARN conf.Configuration: mapreduce.reduce.class is deprecated. Instead, use mapreduce.job.reduce.class
      13/04/10 06:42:13 WARN conf.Configuration: mapreduce.inputformat.class is deprecated. Instead, use mapreduce.job.inputformat.class
      13/04/10 06:42:13 WARN conf.Configuration: mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
      13/04/10 06:42:13 WARN conf.Configuration: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
      13/04/10 06:42:13 WARN conf.Configuration: mapreduce.outputformat.class is deprecated. Instead, use mapreduce.job.outputformat.class
      13/04/10 06:42:13 WARN conf.Configuration: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
      13/04/10 06:42:13 WARN conf.Configuration: mapred.output.key.class is deprecated. Instead, use mapreduce.job.output.key.class
      13/04/10 06:42:13 WARN conf.Configuration: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
      13/04/10 06:42:14 INFO mapred.ResourceMgrDelegate: Submitted application application_1365573089037_0010 to ResourceManager at /10.111.88.85:8032
      13/04/10 06:42:14 INFO mapreduce.Job: The url to track the job: http://10.111.88.85:8088/proxy/application_1365573089037_0010/
      13/04/10 06:42:14 INFO mapreduce.Job: Running job: job_1365573089037_0010
      13/04/10 06:42:16 INFO mapreduce.Job: Job job_1365573089037_0010 running in uber mode : false
      13/04/10 06:42:16 INFO mapreduce.Job: map 0% reduce 0%
      13/04/10 06:42:16 INFO mapreduce.Job: Job job_1365573089037_0010 failed with state FAILED due to: Application application_1365573089037_0010 failed 1 times due to AM Container for appattempt_1365573089037_0010_000001 exited with exitCode: -1000 due to:
      RemoteTrace:
      java.io.IOException: No FileSystem for scheme: hdfs
          at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2206)
          at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2213)
          at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:80)
          at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2252)
          at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2234)
          at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:300)
          at org.apache.hadoop.fs.Path.getFileSystem(Path.java:194)
          at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:86)
          at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:49)
          at org.apache.hadoop.yarn.util.FSDownload$1.run(FSDownload.java:157)
          at org.apache.hadoop.yarn.util.FSDownload$1.run(FSDownload.java:155)
          at java.security.AccessController.doPrivileged(Native Method)
          at javax.security.auth.Subject.doAs(Subject.java:396)
          at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
          at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:153)
          at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:49)
          at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
          at java.util.concurrent.FutureTask.run(FutureTask.java:138)
          at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
          at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
          at java.util.concurrent.FutureTask.run(FutureTask.java:138)
          at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
          at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
          at java.lang.Thread.run(Thread.java:662)
      LocalTrace:
      org.apache.hadoop.yarn.exceptions.impl.pb.YarnRemoteExceptionPBImpl: No FileSystem for scheme: hdfs
          at org.apache.hadoop.yarn.server.nodemanager.api.protocolrecords.impl.pb.LocalResourceStatusPBImpl.convertFromProtoFormat(LocalResourceStatusPBImpl.java:217)
          at org.apache.hadoop.yarn.server.nodemanager.api.protocolrecords.impl.pb.LocalResourceStatusPBImpl.getException(LocalResourceStatusPBImpl.java:147)
          at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService$LocalizerRunner.update(ResourceLocalizationService.java:822)
          at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService$LocalizerTracker.processHeartbeat(ResourceLocalizationService.java:492)
          at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.heartbeat(ResourceLocalizationService.java:221)
          at org.apache.hadoop.yarn.server.nodemanager.api.impl.pb.service.LocalizationProtocolPBServiceImpl.heartbeat(LocalizationProtocolPBServiceImpl.java:46)
          at org.apache.hadoop.yarn.proto.LocalizationProtocol$LocalizationProtocolService$2.callBlockingMethod(LocalizationProtocol.java:57)
          at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
          at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:898)
          at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1693)
          at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1689)
          at java.security.AccessController.doPrivileged(Native Method)
          at javax.security.auth.Subject.doAs(Subject.java:396)
          at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
          at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1687)
      .Failing this attempt.. Failing the application.
      13/04/10 06:42:17 INFO mapreduce.Job: Counters: 0
      Job Finished in 4.092 seconds
      java.io.FileNotFoundException: File does not exist: hdfs://10.111.88.85:9000/user/joe/QuasiMonteCarlo_TMP_3_141592654/out/reduce-out
          at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:787)
          at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1704)
          at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1728)
          at org.apache.hadoop.examples.QuasiMonteCarlo.estimatePi(QuasiMonteCarlo.java:314)
          at org.apache.hadoop.examples.QuasiMonteCarlo.run(QuasiMonteCarlo.java:351)
          at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
          at org.apache.hadoop.examples.QuasiMonteCarlo.main(QuasiMonteCarlo.java:360)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
          at java.lang.reflect.Method.invoke(Method.java:597)
          at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
          at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:144)
          at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:68)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
          at java.lang.reflect.Method.invoke(Method.java:597)
          at org.apache.hadoop.util.RunJar.main(RunJar.java:208)

    Thanks, Hong
    by Qiang Tang
  • Debugging a Hive job. - Altiscale
    by Unknown author
  • Hadoop Hive user mailing list
    by Unknown author
    • java.io.FileNotFoundException: File does not exist: hdfs://10.111.88.85:9000/user/joe/QuasiMonteCarlo_TMP_3_141592654/out/reduce-out
          at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:787)
          at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1704)
          at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1728)
          at org.apache.hadoop.examples.QuasiMonteCarlo.estimatePi(QuasiMonteCarlo.java:314)
          at org.apache.hadoop.examples.QuasiMonteCarlo.run(QuasiMonteCarlo.java:351)
          at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
          at org.apache.hadoop.examples.QuasiMonteCarlo.main(QuasiMonteCarlo.java:360)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
          at java.lang.reflect.Method.invoke(Method.java:597)
          at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
          at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:144)
          at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:68)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
          at java.lang.reflect.Method.invoke(Method.java:597)
          at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
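A possible short tip, based only on the logs above: the job actually dies during YARN container localization with java.io.IOException: No FileSystem for scheme: hdfs (thrown from FileSystem.getFileSystemClass inside FSDownload). That error usually means the process touching HDFS cannot resolve the hdfs scheme, typically because the hadoop-hdfs jars are missing from its classpath or because the fs.hdfs.impl mapping was lost (for example when merged or shaded jars overwrite META-INF/services/org.apache.hadoop.fs.FileSystem). The java.io.FileNotFoundException reported on this page is only a downstream symptom: the job failed, so reduce-out was never written. The first sketch below is a minimal, hypothetical pre-flight check (the class name and namenode URI are made up for illustration), assuming hadoop-common and hadoop-hdfs are on the client classpath; it explicitly maps the hdfs scheme to DistributedFileSystem and verifies that an hdfs:// URI resolves before any job is submitted.

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    // Hypothetical pre-flight check: fail fast if the hdfs:// scheme cannot be resolved.
    public class HdfsSchemeCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // Explicit scheme-to-class mapping; a common workaround when the
            // META-INF/services entry for org.apache.hadoop.fs.FileSystem is lost.
            // It only helps if hadoop-hdfs is actually on the classpath.
            conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");

            // Namenode address taken from the failing run above; adjust as needed.
            URI nameNode = URI.create("hdfs://10.111.88.85:9000/");

            // Throws "No FileSystem for scheme: hdfs" here if the scheme still
            // cannot be resolved, before any MapReduce job is involved.
            FileSystem fs = FileSystem.get(nameNode, conf);
            System.out.println("hdfs resolves to " + fs.getClass().getName());
        }
    }

If the same check fails on the cluster side, comparing the output of the hadoop classpath command on each NodeManager against a working client is a reasonable next step; the explicit fs.hdfs.impl setting only helps when the implementation class is actually present.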
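As for the FileNotFoundException itself: QuasiMonteCarlo.estimatePi opens reduce-out with a SequenceFile.Reader right after the job returns, so when the job fails the read produces this second, misleading stack trace. There is nothing to fix in the example beyond the localization problem above, but in your own driver a small guard keeps the real cause visible. The second sketch below is a hypothetical pattern (class, method, and parameter names are invented), assuming a not-yet-submitted Job and an expected output Path.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;

    // Hypothetical driver helper: run the job, and only touch its output when
    // the job succeeded and the expected file is really there.
    public final class GuardedOutputRead {
        public static void runAndCheck(Job job, Path reduceOut, Configuration conf)
                throws Exception {
            // waitForCompletion(true) submits the job, prints progress, and
            // returns false on failure; surface that instead of reading a
            // file that was never written.
            if (!job.waitForCompletion(true)) {
                throw new IllegalStateException("Job " + job.getJobID()
                        + " failed; check the YARN application logs");
            }
            FileSystem fs = reduceOut.getFileSystem(conf);
            if (!fs.exists(reduceOut)) {
                throw new IllegalStateException("Expected output " + reduceOut
                        + " was not written");
            }
            // Safe to open the output here, e.g. with a SequenceFile.Reader.
        }
    }

Checking the waitForCompletion return value first means a failed application surfaces as a job failure rather than as a missing-file error several frames away.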