java.io.FileNotFoundException

There are no available Samebug tips for this exception.

  • I think this h2o-on-hadoop command is failing because it can't find h2o.jar in HDFS at the URI hdfs://ch-0:8020/home/0xdiag/h2o.jar. It's possible /home doesn't exist in HDFS, or /home/0xdiag doesn't (the user is 0xdiag). I think /home doesn't exist:

        hadoop dfs -ls hdfs://ch-0:8020/home
        ls: `hdfs://ch-0:8020/home': No such file or directory

    Maybe the h2o-on-hadoop driver should check that any assumed or necessary HDFS directories actually exist before it does something that might use them. Here's the command line I tried, and the result:

        hadoop jar h2odriver_cdh4_yarn.jar water.hadoop.h2odriver -jt ch-5:8032 -libjars h2o.jar -mapperXmx 5g -nodes 1 -output hdfsOutputDirName -notify h2o_one_node

        Determining driver host interface for mapper->driver callback...
        [Possible callback IP address: 10.71.0.100]
        [Possible callback IP address: 127.0.0.1]
        Using mapper->driver callback IP address and port: 10.71.0.100:58732
        (You can override these with -driverif and -driverport.)
        Driver program compiled with MapReduce V2 (Yarn)
        14/03/24 00:49:53 INFO Configuration.deprecation: mapred.map.child.java.opts is deprecated. Instead, use mapreduce.map.java.opts
        Memory Settings:
            mapred.child.java.opts:     -Xms5g -Xmx5g
            mapred.map.child.java.opts: -Xms5g -Xmx5g
            Extra memory percent:       10
            mapreduce.map.memory.mb:    5632
        14/03/24 00:49:53 INFO Configuration.deprecation: mapred.used.genericoptionsparser is deprecated. Instead, use mapreduce.client.genericoptionsparser.used
        14/03/24 00:49:53 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
        14/03/24 00:49:53 INFO Configuration.deprecation: mapred.map.max.attempts is deprecated. Instead, use mapreduce.map.maxattempts
        14/03/24 00:49:53 INFO Configuration.deprecation: mapred.job.reuse.jvm.num.tasks is deprecated. Instead, use mapreduce.job.jvm.numtasks
        14/03/24 00:49:53 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
        14/03/24 00:49:53 ERROR security.UserGroupInformation: PriviledgedActionException as:0xdiag (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://ch-0:8020/home/0xdiag/h2o.jar
        ERROR: File does not exist: hdfs://ch-0:8020/home/0xdiag/h2o.jar
        java.io.FileNotFoundException: File does not exist: hdfs://ch-0:8020/home/0xdiag/h2o.jar
            at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1116)
            at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1108)
            at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
            at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1108)
            at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
            at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
            at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
            at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
            at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
            at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
            at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
            at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1301)
            at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1298)
            at java.security.AccessController.doPrivileged(Native Method)
            at javax.security.auth.Subject.doAs(Subject.java:415)
            at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
            at org.apache.hadoop.mapreduce.Job.submit(Job.java:1298)
            at water.hadoop.h2odriver.run2(h2odriver.java:785)
            at water.hadoop.h2odriver.run(h2odriver.java:846)
            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
            at water.hadoop.h2odriver.main(h2odriver.java:868)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

    via Kevin Normoyle
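    Building on the tip above, one way to act on the suggestion is to verify the HDFS path yourself and stage the jar before submitting. A minimal sketch; the namenode address ch-0:8020 and the path /home/0xdiag are taken from the report above, so substitute your own cluster's values:

    ```shell
    # Check whether the directory the driver resolves against actually exists in HDFS.
    # `hadoop fs -test -d` exits 0 if the directory exists, non-zero otherwise.
    hadoop fs -test -d hdfs://ch-0:8020/home/0xdiag && echo "directory exists" || echo "directory missing"

    # If it is missing, create it and upload the jar before running the driver:
    hadoop fs -mkdir -p hdfs://ch-0:8020/home/0xdiag
    hadoop fs -put h2o.jar hdfs://ch-0:8020/home/0xdiag/h2o.jar
    ```

    Alternatively, passing -libjars an explicit local URI (e.g. `-libjars file:///path/to/h2o.jar`) may avoid the problem, since the failure here comes from the bare filename being qualified against an HDFS working directory that does not exist.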
  • Gobblin mapreduce job failed
    via GitHub by sdikby
  • GitHub comment 762#191288295
    via GitHub by sdikby
  • java.io.FileNotFoundException in exportSnapshot
    via Akmal Abbasov
  • hbase importTsv FileNotFoundException
    via Stack Overflow by user1244888
  • hbase mapreduce file not found exception
    via Stack Overflow by TheRoyal Llama
  • count function not working in hive
    via Stack Overflow by Aniket
  • Hive Mapreduce Jobs failing
    via Stack Overflow by Karthik
