java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "mr-0x10/192.168.1.180"; destination host is: "mr-0x10.0xdata.loc":8020;

JIRA | Kevin Normoyle | 3 years ago
  1.

    Here's how I started the cloud. CDH5 is on .180.

    2014-02-04 23:33:50.622694 -- #*********************************************************************
    2014-02-04 23:33:50.622763 -- Starting new test: test_with_a_browser.py at build_cloud()
    2014-02-04 23:33:50.622800 -- #*********************************************************************
    2014-02-04 23:33:50.998369 -- java -Xms14G -Xmx14G -ea -jar ../../target/h2o.jar -beta --port=54321 --ice_root=sandbox/ice.TtCO5o --name=pytest-kevin-17276 -hdfs hdfs://192.168.1.180 -hdfs_version=cdh4
    #PID 17302, stdout local-h2o-0.stdout.QtFTCb.log, stderr local-h2o-0.stderr.qRPnxn.log

    h2o stdout:
    11:35:16.806 # Session INFO HTTPD: GET /ImportHdfs.html path=hdfs://192.168.1.180/datasets
    11:35:16.807 # Session INFO WATER: ImportHDFS processing (hdfs://192.168.1.180/datasets)
    11:35:16.810 # Session ERRR WATER:
    + java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "mr-0x10/192.168.1.180"; destination host is: "mr-0x10.0xdata.loc":8020;
    + at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:765)
    + at org.apache.hadoop.ipc.Client.call(Client.java:1165)
    + at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:184)
    + at com.sun.proxy.$Proxy13.getFileInfo(Unknown Source)
    + at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    + at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    + at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    + at java.lang.reflect.Method.invoke(Method.java:597)
    + at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:165)
    + at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:84)
    + at com.sun.proxy.$Proxy13.getFileInfo(Unknown Source)
    + at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:612)
    + at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1366)
    + at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:732)
    + at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1197)
    + at water.persist.PersistHdfs.addFolder(PersistHdfs.java:278)
    + at water.api.ImportHdfs.serve(ImportHdfs.java:52)
    + at water.api.Request.serveGrid(Request.java:129)
    + at water.api.Request.serve(Request.java:108)
    + at water.api.RequestServer.serve(RequestServer.java:315)
    + at water.NanoHTTPD$HTTPSession.run(NanoHTTPD.java:421)
    + at java.lang.Thread.run(Thread.java:662)
    + Caused by: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status
    + at com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
    + at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
    + at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
    + at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
    + at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:850)
    + at org.apache.hadoop.ipc.Client$Connection.run(Client.java:781)

    JIRA | 3 years ago | Kevin Normoyle
    java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "mr-0x10/192.168.1.180"; destination host is: "mr-0x10.0xdata.loc":8020;
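
    Worth noting in the report above: the reporter says CDH5 is running on .180, yet h2o was launched with -hdfs_version=cdh4, so the bundled CDH4-era client likely speaks an older RPC header format than the CDH5 NameNode; that version skew is a plausible cause of the missing callId/status fields. A minimal diagnostic sketch for confirming which Hadoop client version is actually on the classpath (the class name is mine; VersionInfo is a standard Hadoop utility):

        // Diagnostic sketch: print the Hadoop client version on the classpath.
        // If this reports a CDH4/2.0.0-era build while the NameNode runs CDH5,
        // the RPC header protos differ and parsing fails as in the trace above.
        import org.apache.hadoop.util.VersionInfo;

        public class HadoopClientVersion {
            public static void main(String[] args) {
                System.out.println("Hadoop client: " + VersionInfo.getVersion());
                System.out.println("Built from: " + VersionInfo.getBranch()
                        + " by " + VersionInfo.getUser());
            }
        }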
  2.

    I have an exception when accessing HDFS remotely, help please

    Stack Overflow | 3 years ago | zxz
    java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "webserver/127.0.0.1"; destination host is: "222.333.111.77":8020;
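
    All of these reports share the same call path: FileSystem.exists() into DFSClient.getFileInfo() into a NameNode RPC. A self-contained sketch of that call, with a hypothetical host and path (the real values are anonymized in the reports above):

        // Minimal reproduction sketch of the failing call path:
        // FileSystem.exists() -> DFSClient.getFileInfo() -> NameNode RPC.
        // "namenode.example.com" and "/datasets" are placeholders.
        import java.net.URI;
        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class HdfsExistsCheck {
            public static void main(String[] args) throws Exception {
                Configuration conf = new Configuration();
                FileSystem fs = FileSystem.get(
                        URI.create("hdfs://namenode.example.com:8020"), conf);
                // This is the call that throws when client and server
                // RPC versions disagree.
                System.out.println("exists: " + fs.exists(new Path("/datasets")));
                fs.close();
            }
        }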
  3.

    Apache Pig - ERROR 6007: Unable to check name

    Stack Overflow | 4 years ago | kewpiedoll99
    org.apache.pig.backend.datastorage.DataStorageException: ERROR 6007: Unable to check name hdfs://stage-hadoop101.cluster:8020/user/myusername
  4.

    Apache Pig - ERROR 6007: Unable to check name

    qnundrum.com | 4 months ago
    org.apache.pig.backend.datastorage.DataStorageException: ERROR 6007: Unable to check name hdfs://stage-hadoop101.cluster:8020/user/myusername

    Root Cause Analysis

    1. com.google.protobuf.InvalidProtocolBufferException

      Message missing required fields: callId, status

      at com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException()
    2. Protocol Buffer Java API
      UninitializedMessageException.asInvalidProtocolBufferException
      1. com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
      1 frame
    3. Hadoop
      Client$Connection.run
      1. org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
      2. org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
      3. org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
      4. org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:850)
      5. org.apache.hadoop.ipc.Client$Connection.run(Client.java:781)
      5 frames
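
    In short: the client's RpcResponseHeaderProto declares callId and status as required fields, and the bytes returned by a NameNode from a different Hadoop generation do not populate them, so parsing fails in Client$Connection.receiveResponse. The commonly reported resolution for this class of error is to run a client built against the same major Hadoop version as the cluster. A hedged defensive sketch (class name and message text are mine) that turns the opaque failure into an actionable hint:

        // Sketch: wrap an HDFS call and, when the protobuf parse failure above
        // surfaces, re-throw with a version-mismatch hint. Illustrative only.
        import java.io.IOException;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;
        import org.apache.hadoop.util.VersionInfo;

        public class HdfsVersionHint {
            public static boolean existsWithHint(FileSystem fs, Path p)
                    throws IOException {
                try {
                    return fs.exists(p);
                } catch (IOException e) {
                    String msg = String.valueOf(e.getMessage());
                    if (msg.contains("InvalidProtocolBufferException")) {
                        // Re-wrap with the local client version; compare against
                        // `hadoop version` run on the NameNode host.
                        throw new IOException("Likely Hadoop RPC version mismatch: "
                                + "client is " + VersionInfo.getVersion()
                                + "; check the NameNode's Hadoop version.", e);
                    }
                    throw e;
                }
            }
        }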