mapreduce.Job: Task Id : attempt_1450793371170_0056_m_000001_0, Status : FAILED
Error: org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/hadoop/output/2016/01/05/17/testdata/unmerged/P_1452354760944/file-m-00001.parquet could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1549)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3200)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:641)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:635)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
    at org.apache.hadoop.ipc.Client.call(Client.java:1468)

Unix & Linux | anurag | 11 months ago
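
The message itself says what went wrong: the NameNode could not place the new block on any datanode even though two datanodes are running, because both were excluded for this write (datanodes are typically excluded when the client cannot reach them, a previous write to them failed, or they have no usable space). A minimal diagnostic sketch, assuming a Hadoop 2.x client with the cluster's core-site.xml and hdfs-site.xml on the classpath; the class name and output format are illustrative:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.hdfs.DistributedFileSystem;
    import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

    public class DatanodeCheck {
        public static void main(String[] args) throws Exception {
            // Picks up core-site.xml / hdfs-site.xml from the classpath, so it talks to
            // the same NameNode the failing job used.
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            if (fs instanceof DistributedFileSystem) {
                DistributedFileSystem dfs = (DistributedFileSystem) fs;
                // One entry per live datanode; no entries, or no remaining space on either
                // node, is consistent with the replication failure above.
                for (DatanodeInfo dn : dfs.getDataNodeStats()) {
                    System.out.printf("%s remaining=%d bytes%n", dn.getHostName(), dn.getRemaining());
                }
            }
        }
    }

The same information is available from the command line with hdfs dfsadmin -report.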
  1.

    could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation

    Unix & Linux | 11 months ago | anurag
    (Same stack trace as shown at the top of this page.)
  2.

    [HADOOP-10558] java.net.UnknownHostException: Invalid host name: local host is: (unknown) - ASF JIRA

    apache.org | 11 months ago
    mapreduce.Job: Job job_1404879176168_0003 failed with state FAILED due to: Application application_1404879176168_0003 failed 2 times due to Error launching appattempt_1404879176168_0003_000002. Got exception: java.net.UnknownHostException: Invalid host name: local host is: (unknown); destination host is: "d1":10088; java.net.UnknownHostException; For more details see:
  3.

    I get this exception every time I try to run a map-reduce job. I went to http://wiki.apache.org/hadoop/UnknownHost, tried every possible solution, and still get the same result: Task Id : attempt_1398945803120_0001_m_000004_0, Status : FAILED Container launch failed for container_1398945803120_0001_01_000006 : java.lang.reflect.UndeclaredThrowableException ... Caused by: com.google.protobuf.ServiceException: java.net.UnknownHostException: Invalid host name: local host is: (unknown); destination host is: ""fatima-HP-ProBook-4520s":8042; java.net.UnknownHostException; For more details see: http://wiki.apache.org/hadoop/UnknownHost (a minimal hostname-resolution check for this error is sketched after this list).

    Apache's JIRA Issue Tracker | 3 years ago | Sami Abobala
    mapreduce.Job: Job job_1404879176168_0003 failed with state FAILED due to: Application application_1404879176168_0003 failed 2 times due to Error launching appattempt_1404879176168_0003_000002. Got exception: java.net.UnknownHostException: Invalid host name: local host is: (unknown); destination host is: "d1":10088; java.net.UnknownHostException; For more details see: http://wiki.apache.org/hadoop/UnknownHost
  4.

    NullPointerException when connect HBase

    Stack Overflow | 8 months ago | Mia
    mapreduce.Job: Task Id : attempt_1445410902006_0015_m_000000_1, Status : FAILED Error: java.lang.NullPointerException at org.apache.hadoop.hbase.client.RpcRetryingCallerFactory.instantiate(RpcRetryingCallerFactory.java:54)
  5.

    [CIS-CMMI-3] Re: [CIS-CMMI-3] Re: [CIS-CMMI-3] Invalid UTF-8 character 0xffff at char exception

    nutch-user | 11 months ago | Kshitij Shukla
    mapreduce.Job: Task Id : attempt_1453472314066_0007_m_000000_0, Status : FAILED Error: org.apache.solr.client.solrj.impl.HttpSolrServer$RemoteSolrException: [com.ctc.wstx.exc.WstxLazyException] Invalid UTF-8 character 0xffff at char #1296459, byte #1310719)
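
For the two UnknownHostException reports (items 2 and 3 above), "local host is: (unknown)" means the JVM could not resolve the machine's own hostname, and the quoted destination hosts could not be reached either; the linked wiki page covers the usual causes. A small JDK-only check of the same lookups, offered as a hedged sketch (the host names are copied from the reports above):

    import java.net.InetAddress;
    import java.net.UnknownHostException;

    public class HostnameCheck {
        public static void main(String[] args) {
            // Hosts taken from the reports above; add your own to test /etc/hosts or DNS.
            String[] hosts = { "d1", "fatima-HP-ProBook-4520s" };
            try {
                // If this throws, the machine's own hostname is not resolvable,
                // which matches "local host is: (unknown)".
                System.out.println("local host = " + InetAddress.getLocalHost());
            } catch (UnknownHostException e) {
                System.out.println("local hostname does not resolve: " + e.getMessage());
            }
            for (String h : hosts) {
                try {
                    System.out.println(h + " -> " + InetAddress.getByName(h).getHostAddress());
                } catch (UnknownHostException e) {
                    System.out.println(h + " does not resolve");
                }
            }
        }
    }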
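
For the HBase NullPointerException (item 4), a failure this early in RpcRetryingCallerFactory.instantiate can happen when the Configuration handed to the HBase client is null or missing the HBase settings, e.g. built with new Configuration() instead of HBaseConfiguration.create(), so hbase-site.xml is never read. A minimal connection-setup sketch, assuming the HBase 1.x client API; the ZooKeeper quorum and table name are placeholders:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Table;

    public class HBaseConnectCheck {
        public static void main(String[] args) throws Exception {
            // HBaseConfiguration.create() layers hbase-site.xml on top of the Hadoop defaults;
            // a plain "new Configuration()" would lack the HBase client settings.
            Configuration conf = HBaseConfiguration.create();
            conf.set("hbase.zookeeper.quorum", "zk1.example.com");  // placeholder quorum
            try (Connection connection = ConnectionFactory.createConnection(conf);
                 Table table = connection.getTable(TableName.valueOf("my_table"))) {  // placeholder table
                System.out.println("connected, table = " + table.getName());
            }
        }
    }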
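
For the Nutch-to-Solr indexing failure (item 5), U+FFFF is not a legal XML character, so the XML writer used for the Solr update request rejects the document. One common workaround is to strip characters that are invalid in XML 1.0 from field values before indexing; a hedged sketch (the class and method names are illustrative):

    public class XmlSanitizer {
        // Drop code points not allowed in XML 1.0 documents:
        // #x9 | #xA | #xD | [#x20-#xD7FF] | [#xE000-#xFFFD] | [#x10000-#x10FFFF].
        public static String stripInvalidXmlChars(String in) {
            StringBuilder out = new StringBuilder(in.length());
            for (int i = 0; i < in.length(); ) {
                int cp = in.codePointAt(i);
                boolean valid = cp == 0x9 || cp == 0xA || cp == 0xD
                        || (cp >= 0x20 && cp <= 0xD7FF)
                        || (cp >= 0xE000 && cp <= 0xFFFD)
                        || (cp >= 0x10000 && cp <= 0x10FFFF);
                if (valid) {
                    out.appendCodePoint(cp);
                }
                i += Character.charCount(cp);
            }
            return out.toString();
        }

        public static void main(String[] args) {
            String dirty = "ok\uFFFFtext";                    // contains the offending 0xFFFF
            System.out.println(stripInvalidXmlChars(dirty));  // prints "oktext"
        }
    }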


    Root Cause Analysis

    1. mapreduce.Job

      Task Id : attempt_1450793371170_0056_m_000001_0, Status : FAILED Error: org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/hadoop/output/2016/01/05/17/testdata/unmerged/P_1452354760944/file-m-00001.parquet could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation. (Server-side stack trace as shown at the top of this page.)

      at org.apache.hadoop.ipc.Client.call()
    2. Hadoop
      ProtobufRpcEngine$Invoker.invoke
      1. org.apache.hadoop.ipc.Client.call(Client.java:1399)
      2. org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:241)
      2 frames
    3. com.sun.proxy
      $Proxy13.addBlock
      1. com.sun.proxy.$Proxy13.addBlock(Unknown Source)
      1 frame
    4. Apache Hadoop HDFS
      ClientNamenodeProtocolTranslatorPB.addBlock
      1. org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399)
      1 frame
    5. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:606)
      4 frames
    6. Hadoop
      RetryInvocationHandler.invoke
      1. org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
      2. org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
      2 frames
    7. com.sun.proxy
      $Proxy14.addBlock
      1. com.sun.proxy.$Proxy14.addBlock(Unknown Source)
      1 frame
    8. Apache Hadoop HDFS
      DFSOutputStream$DataStreamer.run
      1. org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1544)
      2. org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1361)
      3. org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:594)
      3 frames