java.io.FileNotFoundException: File does not exist: /home/hdadmin/homework.dat

Stack Overflow | Harris Morris | 7 months ago
Similar reports from the web:

  1. Hadoop Java programming: compress a directory that contains 10 files and send to HDFS (see the sketch after this list)
     Stack Overflow | 7 months ago | Harris Morris
     java.io.FileNotFoundException: File does not exist: /home/hdadmin/homework.dat
  2. Trouble in RecommenderJob on Hadoop
     Stack Overflow | 2 years ago | qianda66
     java.io.FileNotFoundException: File does not exist: /user/hduser/temp/preparePreferenceMatrix/numUsers.bin
  3. Spark cluster computing framework
     gmane.org | 2 years ago
     java.io.FileNotFoundException: File does not exist: /user/marcel/outputs/output_spark/log0
  4. Crunch, mail # user - Re: LeaseExpiredExceptions and temp side effect files - 2015-08-21, 20:03
     search-hadoop.com | 2 years ago
     org.apache.crunch.CrunchRuntimeException: Could not read runtime node information
  5. Failed to run Kafka MapReduce example
     GitHub | 1 year ago | marianomirabelli
     java.io.FileNotFoundException: File does not exist: /home/hduser/gobblin/gobblin-dist/conf/gobblin-mapreduce.properties
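
For the original question in item 1 (compress a local directory of files and send the archive to HDFS), a minimal sketch of one way to do it is below, assuming java.util.zip for the compression; the local directory, archive name, and destination path are illustrative, not the poster's actual code.

    import java.io.File;
    import java.io.FileInputStream;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipOutputStream;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ZipDirToHdfs {
        public static void main(String[] args) throws Exception {
            File localDir = new File("/home/hdadmin/input");          // local directory holding the files (assumed path)
            Path hdfsTarget = new Path("/user/hdadmin/homework.zip"); // archive destination in HDFS (assumed path)

            // fs.defaultFS in the client configuration must point at the cluster,
            // e.g. hdfs://namenode:9000, or FileSystem.get returns the local FS.
            FileSystem hdfs = FileSystem.get(new Configuration());

            // Write the zip archive straight into an HDFS output stream, so the
            // archive never has to be staged on the local disk first.
            try (ZipOutputStream zip = new ZipOutputStream(hdfs.create(hdfsTarget))) {
                for (File f : localDir.listFiles()) {   // assumes the directory exists
                    if (!f.isFile()) {
                        continue;
                    }
                    zip.putNextEntry(new ZipEntry(f.getName()));
                    try (FileInputStream in = new FileInputStream(f)) {
                        byte[] buf = new byte[8192];
                        int n;
                        while ((n = in.read(buf)) != -1) {
                            zip.write(buf, 0, n);
                        }
                    }
                    zip.closeEntry();
                }
            }
        }
    }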


Root Cause Analysis

  1. java.io.FileNotFoundException
     File does not exist: /home/hdadmin/homework.dat
     at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf()
  2. Apache Hadoop HDFS
    ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod
    1. org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
    2. org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
    3. org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1828)
    4. org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1799)
    5. org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1712)
    6. org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:587)
    7. org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)
    8. org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    8 frames
  3. Hadoop
    Server$Handler$1.run
    1. org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    2. org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    3. org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    4. org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    4 frames
  4. Java RT
    Subject.doAs
    1. java.security.AccessController.doPrivileged(Native Method)
    2. javax.security.auth.Subject.doAs(Subject.java:415)
    2 frames
  5. Hadoop
    Server$Handler.run
    1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    2. org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
    2 frames
  6. Java RT
    Constructor.newInstance
    1. sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    2. sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    3. sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    4. java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    4 frames
  7. Hadoop
    RemoteException.unwrapRemoteException
    1. org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    2. org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    2 frames
  8. Apache Hadoop HDFS
    DistributedFileSystem$3.doCall
    1. org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1242)
    2. org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1227)
    3. org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1215)
    4. org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:303)
    5. org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:269)
    6. org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:261)
    7. org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1540)
    8. org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:303)
    9. org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:299)
    9 frames
  9. Hadoop
    FileSystemLinkResolver.resolve
    1. org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    1 frame
  10. Apache Hadoop HDFS
    DistributedFileSystem.open
    1. org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:299)
    1 frame
  11. Hadoop
    FileSystem.open
    1. org.apache.hadoop.fs.FileSystem.open(FileSystem.java:767)
    1 frame
  12. testing
    ParallelLocalToHdfsCopy.main
    1. testing.ParallelLocalToHdfsCopy.main(ParallelLocalToHdfsCopy.java:76)
    1 frame