java.io.FileNotFoundException: File does not exist: hdfs://ch-0:8020/home/0xdiag/h2o.jar

JIRA | Kevin Normoyle | 3 years ago
  1. 0

    I think this h2o-on-hadoop command is failing because it can't create the h2o.jar in HDFS at this URI: hdfs://ch-0:8020/home/0xdiag/h2o.jar. It's possible that /home doesn't exist in HDFS, or that /home/0xdiag doesn't (the user is 0xdiag). I think /home doesn't exist:

        hadoop dfs -ls hdfs://ch-0:8020/home
        ls: `hdfs://ch-0:8020/home': No such file or directory

    Maybe the h2o-on-hadoop driver should check that any assumed or necessary directories actually exist in HDFS before it does something that might use them (a minimal sketch of such a check follows this list). Here is the command line I tried, and the result:

        hadoop jar h2odriver_cdh4_yarn.jar water.hadoop.h2odriver -jt ch-5:8032 -libjars h2o.jar -mapperXmx 5g -nodes 1 -output hdfsOutputDirName -notify h2o_one_node
        Determining driver host interface for mapper->driver callback...
        [Possible callback IP address: 10.71.0.100]
        [Possible callback IP address: 127.0.0.1]
        Using mapper->driver callback IP address and port: 10.71.0.100:58732
        (You can override these with -driverif and -driverport.)
        Driver program compiled with MapReduce V2 (Yarn)
        14/03/24 00:49:53 INFO Configuration.deprecation: mapred.map.child.java.opts is deprecated. Instead, use mapreduce.map.java.opts
        Memory Settings:
            mapred.child.java.opts:     -Xms5g -Xmx5g
            mapred.map.child.java.opts: -Xms5g -Xmx5g
            Extra memory percent:       10
            mapreduce.map.memory.mb:    5632
        14/03/24 00:49:53 INFO Configuration.deprecation: mapred.used.genericoptionsparser is deprecated. Instead, use mapreduce.client.genericoptionsparser.used
        14/03/24 00:49:53 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
        14/03/24 00:49:53 INFO Configuration.deprecation: mapred.map.max.attempts is deprecated. Instead, use mapreduce.map.maxattempts
        14/03/24 00:49:53 INFO Configuration.deprecation: mapred.job.reuse.jvm.num.tasks is deprecated. Instead, use mapreduce.job.jvm.numtasks
        14/03/24 00:49:53 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
        14/03/24 00:49:53 ERROR security.UserGroupInformation: PriviledgedActionException as:0xdiag (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://ch-0:8020/home/0xdiag/h2o.jar
        ERROR: File does not exist: hdfs://ch-0:8020/home/0xdiag/h2o.jar
        java.io.FileNotFoundException: File does not exist: hdfs://ch-0:8020/home/0xdiag/h2o.jar
            at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1116)
            at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1108)
            at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
            at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1108)
            at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
            at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
            at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
            at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
            at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
            at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
            at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
            at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1301)
            at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1298)
            at java.security.AccessController.doPrivileged(Native Method)
            at javax.security.auth.Subject.doAs(Subject.java:415)
            at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
            at org.apache.hadoop.mapreduce.Job.submit(Job.java:1298)
            at water.hadoop.h2odriver.run2(h2odriver.java:785)
            at water.hadoop.h2odriver.run(h2odriver.java:846)
            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
            at water.hadoop.h2odriver.main(h2odriver.java:868)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

    JIRA | 3 years ago | Kevin Normoyle
    java.io.FileNotFoundException: File does not exist: hdfs://ch-0:8020/home/0xdiag/h2o.jar
  2. 0

    Gobblin mapreduce job failed

    GitHub | 10 months ago | sdikby
    java.io.FileNotFoundException: File does not exist: hdfs://localhost:54310/usr/local/gobblin/gobblin-dist/lib/gobblin-metastore-0.6.2-133-g84c12a0.jar
  3. 0

    GitHub comment 762#191288295

    GitHub | 9 months ago | sdikby
    java.io.FileNotFoundException: File does not exist: hdfs://localhost:54310/usr/local/gobblin/gobblin-dist/lib
  4. 0

    java.io.FileNotFoundException in exportSnapshot

    Google Groups | 2 years ago | Akmal Abbasov
    java.io.FileNotFoundException: File does not exist: hdfs://namenode:9000/home/vagrant/hbase-0.98.7-hadoop2/lib/hadoop-mapreduce-client-core-2.5.1.jar
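
    As the first report suggests, a pre-flight check of the HDFS paths the job relies on would surface this failure earlier and more clearly. Below is a minimal sketch of such a check, assuming the standard Hadoop FileSystem client API; the class name HdfsPreflightCheck is made up for illustration, and the h2o.jar / home-directory paths mirror the report rather than any fixed convention.

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class HdfsPreflightCheck {
            public static void main(String[] args) throws Exception {
                // Picks up core-site.xml / hdfs-site.xml, so the default FS is the
                // cluster's (hdfs://ch-0:8020 in the report above).
                Configuration conf = new Configuration();
                FileSystem fs = FileSystem.get(conf);

                // The user's HDFS home directory (/home/0xdiag in the report) and the
                // jar the job expects to find there. Both paths are illustrative.
                Path home = fs.getHomeDirectory();
                Path jar  = new Path(home, "h2o.jar");

                if (!fs.exists(home)) {
                    System.err.println("HDFS home directory does not exist: " + home);
                    System.exit(1);
                }
                if (!fs.exists(jar)) {
                    System.err.println("Required jar not found in HDFS: " + jar);
                    System.exit(1);
                }
                System.out.println("Pre-flight check passed: " + jar);
            }
        }

    Run against the cluster in the report, the first check would fail immediately (since /home does not exist in HDFS), instead of the job dying later inside JobSubmitter.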

    Root Cause Analysis

    1. java.io.FileNotFoundException

      File does not exist: hdfs://ch-0:8020/home/0xdiag/h2o.jar

      at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall()
    2. Apache Hadoop HDFS
      DistributedFileSystem$17.doCall
      1. org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1116)
      2. org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1108)
      2 frames
    3. Hadoop
      FileSystemLinkResolver.resolve
      1. org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
      1 frame
    4. Apache Hadoop HDFS
      DistributedFileSystem.getFileStatus
      1. org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1108)
      1 frame
    5. Hadoop
      Job$10.run
      1. org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
      2. org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
      3. org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
      4. org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
      5. org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
      6. org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
      7. org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
      8. org.apache.hadoop.mapreduce.Job$10.run(Job.java:1301)
      9. org.apache.hadoop.mapreduce.Job$10.run(Job.java:1298)
      9 frames
    6. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:415)
      2 frames
    7. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
      1 frame
    8. Hadoop
      Job.submit
      1. org.apache.hadoop.mapreduce.Job.submit(Job.java:1298)
      1 frame
    9. water.hadoop
      h2odriver.run
      1. water.hadoop.h2odriver.run2(h2odriver.java:785)
      2. water.hadoop.h2odriver.run(h2odriver.java:846)
      2 frames
    10. Hadoop
      ToolRunner.run
      1. org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
      2. org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
      2 frames
    11. water.hadoop
      h2odriver.main
      1. water.hadoop.h2odriver.main(h2odriver.java:868)
      1 frame
    12. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:606)
      4 frames
    13. Hadoop
      RunJar.main
      1. org.apache.hadoop.util.RunJar.main(RunJar.java:212)
      1 frame
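
    One plausible reading of frame groups 2 through 5 above: the bare h2o.jar passed via -libjars reaches the distributed-cache code as a scheme-less, relative path, which the client qualifies against the default filesystem and the user's HDFS working directory (the home directory by default) before calling getFileStatus on it. The sketch below, assuming the standard Hadoop FileSystem/Path client API, reproduces that resolution; on the reporter's cluster it would yield hdfs://ch-0:8020/home/0xdiag/h2o.jar, the URI in the exception.

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class PathResolutionDemo {
            public static void main(String[] args) throws Exception {
                // Assumes fs.defaultFS points at the cluster's NameNode.
                Configuration conf = new Configuration();
                FileSystem fs = FileSystem.get(conf);

                // A scheme-less, relative path, like the "-libjars h2o.jar" argument.
                Path relative = new Path("h2o.jar");

                // Qualification resolves it under the user's HDFS working directory,
                // producing a fully qualified hdfs:// URI.
                Path qualified = fs.makeQualified(relative);
                System.out.println("Resolved to: " + qualified);

                // If that path does not exist in HDFS, this throws the same
                // FileNotFoundException shown at the top of the trace.
                System.out.println(fs.getFileStatus(qualified));
            }
        }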