org.apache.nifi.processor.exception.ProcessException

IOException thrown from PutHDFS[id=2d926905-af62-4f87-9f23-e1fd8b7bf505]: org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/maria_dev/.HPCamDrv.12.54.53.txt could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
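This message originates on the HDFS NameNode, not in NiFi: when PutHDFS asks for a target for a new block, the NameNode's placement policy finds no datanode it is willing to use. With exactly one datanode running and that same node excluded, the usual causes are a datanode with no remaining DFS capacity, a datanode the client cannot reach on its data-transfer port, or a datanode the NameNode has marked dead. A minimal write probe run outside NiFi can separate an HDFS-side problem from a PutHDFS misconfiguration. The sketch below is an illustration only: it assumes hadoop-client on the classpath, and the NameNode URI and probe path are placeholders for a typical HDP sandbox; adjust them for your cluster.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.FsStatus;
    import org.apache.hadoop.fs.Path;

    public class HdfsWriteProbe {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder NameNode address (typical HDP sandbox); use your fs.defaultFS.
            conf.set("fs.defaultFS", "hdfs://sandbox.hortonworks.com:8020");
            FileSystem fs = FileSystem.get(conf);

            // A lone datanode that is full (or nearly full) is the most common
            // reason it gets excluded from block placement.
            FsStatus status = fs.getStatus();
            System.out.printf("capacity=%d used=%d remaining=%d%n",
                    status.getCapacity(), status.getUsed(), status.getRemaining());

            // If this write also fails with "could only be replicated to 0 nodes",
            // the problem is in HDFS itself, not in the PutHDFS processor.
            try (FSDataOutputStream out =
                         fs.create(new Path("/user/maria_dev/write-probe.txt"), true)) {
                out.writeBytes("probe\n");
            }
        }
    }

If the probe fails the same way, check the datanode's disk usage and its connectivity from the client host before changing anything in the NiFi flow.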

Solutions on the web (16)

  • via hortonworks.com by Unknown author, 9 months ago
    IOException thrown from PutHDFS[id=2d926905-af62-4f87-9f23-e1fd8b7bf505]: org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/maria_dev/.HPCamDrv.12.54.53.txt could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
  • IOException thrown from PutHDFS[id=015a1010-9c64-1ed3-c39b-d19ab2dfe19b]: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/raw/externaltbls/falcon/testing/.1PUGETSLA_PO810.P0125.EDIINV.P20150125.107.20160304025143629.gz":hdfs:hdfs:drwxrwxr-x
  • via nabble.com by Unknown author, 4 months ago
    IOException thrown from ConvertAvroToJSON[id=fb761b66-1010-1157-4673-c5198a522367]: java.io.IOException: Not a data file.
Stack trace

    org.apache.nifi.processor.exception.ProcessException: IOException thrown from PutHDFS[id=2d926905-af62-4f87-9f23-e1fd8b7bf505]: org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/maria_dev/.HPCamDrv.12.54.53.txt could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
        at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1588)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3116)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3040)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:789)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:492)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2151)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2147)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2145)
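Every frame beneath the message above is server-side: the failure starts in the NameNode's block placement (BlockManager.chooseTarget4NewBlock) and is only wrapped into a ProcessException by NiFi. To see why the lone datanode is excluded, you can ask the NameNode for per-datanode statistics through the public DistributedFileSystem API. As before, this is a sketch and the NameNode URI is a sandbox placeholder.

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.hdfs.DistributedFileSystem;
    import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

    public class DatanodeReport {
        public static void main(String[] args) throws Exception {
            // Placeholder NameNode address; replace with your fs.defaultFS.
            URI nameNode = URI.create("hdfs://sandbox.hortonworks.com:8020");
            DistributedFileSystem dfs =
                    (DistributedFileSystem) FileSystem.get(nameNode, new Configuration());

            // One line per datanode: a node reporting ~0 remaining bytes, or one
            // that is decommissioning, is skipped by block placement, which is
            // exactly what "0 nodes instead of minReplication (=1)" describes.
            for (DatanodeInfo dn : dfs.getDataNodeStats()) {
                System.out.printf("%s state=%s remaining=%d bytes%n",
                        dn.getHostName(), dn.getAdminState(), dn.getRemaining());
            }
        }
    }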
