org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /usr/pdi/weblogs/raw could only be replicated to 0 nodes, instead of 1

pentaho.com | 5 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. Hadoop Copy files Step in Job [Archive] - Pentaho Community Forums
     pentaho.com | 5 months ago
     org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /usr/pdi/weblogs/raw could only be replicated to 0 nodes, instead of 1
  2. Data Replication error in Hadoop
     Stack Overflow | 5 years ago | Cody
     org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hadoop/testfiles/testfiles/file1.txt could only be replicated to
  3. Trying to put 16gb file onto hdfs
     Google Groups | 5 years ago | Barry, Sean F
     org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hduser/wiki/16gb.txt could only be replicated to 0 nodes, instead of 1
  5. Where is web interface in stand alone operation?
     Google Groups | 6 years ago | A Df
     org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/my-user/input/HadoopInputFile_Request_2011-08-05_162106_1.txt could only be replicated to 0 nodes, instead of 1
  6. hi
     Google Groups | 6 years ago | abhay ratnaparkhi
     org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /input/i1 could only be replicated to 0 nodes, instead of 1


    Root Cause Analysis

    org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /usr/pdi/weblogs/raw could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1271)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:422)
        at sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
        at org.apache.hadoop.ipc.Client.call(Client.java:740)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
        at $Proxy0.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at $Proxy0.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:2937)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2819)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)