java.lang.RuntimeException: java.io.IOException: Failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try. (Nodes: current=[DatanodeInfoWithStorage[10.88.131.233:50010,DS-5080e110-5907-4e31-84f2-d7308e722562,DISK], DatanodeInfoWithStorage[10.88.131.235:50010,DS-b5dea108-94a8-4232-a849-eba697a4a3ab,DISK]], original=[DatanodeInfoWithStorage[10.88.131.233:50010,DS-5080e110-5907-4e31-84f2-d7308e722562,DISK], DatanodeInfoWithStorage[10.88.131.235:50010,DS-b5dea108-94a8-4232-a849-eba697a4a3ab,DISK]]). The current failed datanode replacement policy is DEFAULT, and a client may configure this via 'dfs.client.block.write.replace-datanode-on-failure.policy' in its configuration.


via Google Groups by Antonio Si, 6 months ago (same exception, with the following stack frames):
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.findNewDatanode(DFSOutputStream.java:929)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.addDatanode2ExistingPipeline(DFSOutputStream.java:984)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:1131)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.processDatanodeError(DFSOutputStream.java:876)
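
The exception means the HDFS client lost a datanode in its write pipeline and, under the DEFAULT replacement policy, tried to add a new datanode but found none available — which typically happens on small clusters (three or fewer datanodes, or a replication factor close to the cluster size). A common workaround is to relax the client-side replacement behavior in hdfs-site.xml. The property names below are the standard HDFS client settings the message refers to; the values shown are a suggested relaxation for small clusters, not a one-size-fits-all fix:

```xml
<!-- hdfs-site.xml (client side) -->
<property>
  <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
  <!-- Valid values: DEFAULT (the current setting per the message),
       ALWAYS, or NEVER. NEVER skips replacement entirely and is only
       suitable for very small clusters. -->
  <value>DEFAULT</value>
</property>
<property>
  <name>dfs.client.block.write.replace-datanode-on-failure.best-effort</name>
  <!-- Suggested workaround: with best-effort enabled, the client keeps
       writing with the remaining datanodes instead of throwing this
       exception when no replacement can be found. -->
  <value>true</value>
</property>
```

On clusters with fewer than three datanodes, setting the policy to NEVER (or disabling replacement via `dfs.client.block.write.replace-datanode-on-failure.enable` = false) is the commonly cited alternative, at the cost of reduced durability for blocks still being written.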
