java.io.IOException: Failed to add a datanode: nodes.length != original.length + 1, nodes=[127.0.0.1:50010], original=[127.0.0.1:50010]
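What the message means: during an append (or pipeline recovery), the DFSClient asks the namenode for a replacement datanode for the write pipeline and then verifies that the pipeline actually grew by one node. A paraphrase of the invariant behind the message, based on the stack traces below (this is a sketch, not the verbatim Hadoop source):

{code}
import java.io.IOException;
import java.util.Arrays;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

final class PipelineCheck {
    // Paraphrase of the check in DFSOutputStream$DataStreamer.findNewDatanode:
    // after requesting an additional datanode, the new pipeline must be
    // exactly one node longer than the old one.
    static void checkNewDatanode(DatanodeInfo[] original, DatanodeInfo[] nodes)
            throws IOException {
        if (nodes.length != original.length + 1) {
            throw new IOException("Failed to add a datanode: "
                + "nodes.length != original.length + 1"
                + ", nodes=" + Arrays.asList(nodes)
                + ", original=" + Arrays.asList(original));
        }
    }
}
{code}

On a single-datanode cluster there is no extra node to add, so nodes equals original and the check fails.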

There are no available Samebug tips for this exception. Do you have an idea how to solve this issue? A short tip would help users who saw this issue last week.
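One possible tip, offered with the caveat that it trades write durability for availability: this error usually means the cluster has fewer live datanodes than the client's replace-datanode-on-failure policy requires during append or pipeline recovery. On a single-datanode (or very small) cluster, the common workaround is to relax that policy. A minimal sketch, assuming your Hadoop version supports these standard client keys (verify against your version's hdfs-default.xml):

{code}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class RelaxedAppendClient {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Never try to grow the pipeline with a replacement datanode;
        // appropriate only when the cluster cannot supply one (e.g. 1-3 nodes).
        conf.set("dfs.client.block.write.replace-datanode-on-failure.policy", "NEVER");

        // Alternatively, disable the replacement feature outright
        // (either property alone has the same effect here).
        conf.setBoolean("dfs.client.block.write.replace-datanode-on-failure.enable", false);

        // Appends through this FileSystem should no longer fail with
        // "Failed to add a datanode" when no replacement node exists.
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Using " + fs.getUri());
    }
}
{code}

Note that for the WebHDFS repro below, the DFSClient runs inside the datanode serving the request, so the same two properties would need to go into the cluster's hdfs-site.xml (followed by a service restart) rather than into client code.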

  • Create a single-datanode cluster, disable permissions, enable webhdfs, start HDFS, and run the test script.
    Expected result: a file named "test" is created with the content "testtest".
    Actual result: HDFS throws an exception on the second append operation. (A programmatic version of this repro is sketched after the list below.)
    {code}
    ./test.sh
    {"RemoteException":{"exception":"IOException","javaClassName":"java.io.IOException","message":"Failed to add a datanode: nodes.length != original.length + 1, nodes=[127.0.0.1:50010], original=[127.0.0.1:50010]"}}
    {code}
    Log in datanode:
    {code}
    2012-04-02 14:34:21,058 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
    java.io.IOException: Failed to add a datanode: nodes.length != original.length + 1, nodes=[127.0.0.1:50010], original=[127.0.0.1:50010]
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.findNewDatanode(DFSOutputStream.java:778)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.addDatanode2ExistingPipeline(DFSOutputStream.java:834)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:930)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:461)
    2012-04-02 14:34:21,059 ERROR org.apache.hadoop.hdfs.DFSClient: Failed to close file /test
    java.io.IOException: Failed to add a datanode: nodes.length != original.length + 1, nodes=[127.0.0.1:50010], original=[127.0.0.1:50010]
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.findNewDatanode(DFSOutputStream.java:778)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.addDatanode2ExistingPipeline(DFSOutputStream.java:834)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:930)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:461)
    {code}
    test.sh:
    {code}
    #!/bin/sh
    echo "test" > test.txt
    curl -L -X PUT "http://localhost:50070/webhdfs/v1/test?op=CREATE"
    curl -L -X POST -T test.txt "http://localhost:50070/webhdfs/v1/test?op=APPEND"
    curl -L -X POST -T test.txt "http://localhost:50070/webhdfs/v1/test?op=APPEND"
    {code}
    by Zhanwei Wang
  • Hadoop bad connect ack exception
    via Stack Overflow by Istvan
  • Spark 1.2 cannot connect to HDFS on HDP 2.2
    via Stack Overflow by John
    • java.io.IOException: Failed to add a datanode: nodes.length != original.length + 1, nodes=[127.0.0.1:50010], original=[127.0.0.1:50010]
          at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.findNewDatanode(DFSOutputStream.java:778)
          at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.addDatanode2ExistingPipeline(DFSOutputStream.java:834)
          at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:930)
          at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:461)
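For reference, the same failure path can be exercised without WebHDFS. A minimal sketch using the FileSystem API (hypothetical standalone class; assumes the single-datanode cluster described above, with default replace-datanode-on-failure settings, on a version affected by this bug):

{code}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class AppendRepro {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        Path p = new Path("/test"); // same path as the curl repro

        // op=CREATE: create (or overwrite) an empty file.
        fs.create(p, true).close();

        // First op=APPEND: succeeds.
        try (FSDataOutputStream out = fs.append(p)) {
            out.writeBytes("test\n");
        }

        // Second op=APPEND: pipeline recovery tries to add a datanode,
        // and on a one-node cluster the DataStreamer throws
        // "Failed to add a datanode: nodes.length != original.length + 1".
        try (FSDataOutputStream out = fs.append(p)) {
            out.writeBytes("test\n");
        }
    }
}
{code}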
