Solutions on the web

via Google Groups by Unknown author, 1 year ago
Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "ip-172-31-40-120.us-west-2.compute.internal/172.31.40.120"; destination host is: "ip-172-31-37-123.us-west-2.compute.internal":8020;
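
The "Failed to find any Kerberos tgt" message means the client JVM had no valid Kerberos ticket when it opened the RPC connection. A minimal sketch of logging in from a keytab through Hadoop's UserGroupInformation before touching HDFS, assuming a hypothetical principal and keytab path:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KerberosLogin {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Switch the Hadoop client from simple auth to Kerberos.
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);

            // Hypothetical principal and keytab -- substitute your own.
            UserGroupInformation.loginUserFromKeytab(
                "hdfs-user@EXAMPLE.COM",
                "/etc/security/keytabs/hdfs-user.keytab");

            System.out.println("Logged in as "
                + UserGroupInformation.getLoginUser().getUserName());
        }
    }

An explicit keytab login inside the process also sidesteps the common case where kinit was run as a different OS user than the one running the service, so its ticket cache is not visible to the JVM.
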
via Stack Overflow by avinash patil, 1 year ago
Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name; Host Details : local host is: "Securonix-int3.local/10.0.4.36"; destination host is: "sobd189.securonix.com":8020;
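
This variant usually means the client configuration never told the IPC layer which principal the NameNode runs as, so it cannot build the SASL server name. A hedged sketch, assuming a hypothetical nn/_HOST@EXAMPLE.COM service principal (Hadoop expands the _HOST token to the NameNode's fully qualified hostname):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class NamenodePrincipal {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            // Without this property the client fails with
            // "Failed to specify server's Kerberos principal name".
            conf.set("dfs.namenode.kerberos.principal",
                "nn/_HOST@EXAMPLE.COM");

            FileSystem fs = FileSystem.get(
                URI.create("hdfs://sobd189.securonix.com:8020"), conf);
            System.out.println("Connected to " + fs.getUri());
        }
    }

The same property can live in the client's hdfs-site.xml instead of being set programmatically, and a valid ticket is still required, as in the keytab sketch above.
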
via apache.org by Unknown author, 1 year ago
Failed on local exception: java.net.SocketException: Host is down; Host Details : local host is: "client1.example.org/192.168.1.86"; destination host is: "hdfs.example.org":8020;
via gmane.org by Unknown author, 2 years ago
Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is: "hadoop-coc-1/127.0.1.1"; destination host is: "hadoop-coc-2":50070;
via nabble.com by Unknown author, 2 years ago
Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is: "hadoop-coc-1/ "; destination host is: "hadoop-coc-2":50070;
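
Both protobuf reports above point the client at port 50070, which in Hadoop 2.x is the NameNode's HTTP web UI rather than its client RPC port; speaking the binary IPC protocol to an HTTP endpoint is the classic cause of "end-group tag did not match expected tag". A sketch using the RPC port instead (commonly 8020, sometimes 9000, depending on the distribution):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    public class RpcPort {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Port 50070 serves the web UI; the FileSystem client needs
            // the RPC endpoint advertised in fs.defaultFS.
            FileSystem fs = FileSystem.get(
                URI.create("hdfs://hadoop-coc-2:8020"), conf);
            System.out.println("Connected to " + fs.getUri());
        }
    }
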
Stack trace:

javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:413)
    at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:553)
    at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:368)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:722)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:718)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:717)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:368)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1521)
    at org.apache.hadoop.ipc.Client.call(Client.java:1438)
    at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
    at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1982)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1128)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1124)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1124)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1400)
    at alluxio.underfs.hdfs.HdfsUnderFileSystem.exists(HdfsUnderFileSystem.java:207)
    at alluxio.master.file.FileSystemMaster.loadMetadataAndJournal(FileSystemMaster.java:1792)
    at alluxio.master.file.FileSystemMaster.loadMetadataIfNotExistAndJournal(FileSystemMaster.java:1932)
    at alluxio.master.file.FileSystemMaster.getFileInfo(FileSystemMaster.java:447)
    at alluxio.master.file.FileSystemMasterClientServiceHandler$7.call(FileSystemMasterClientServiceHandler.java:155)
    at alluxio.master.file.FileSystemMasterClientServiceHandler$7.call(FileSystemMasterClientServiceHandler.java:152)
    at alluxio.RpcUtils.call(RpcUtils.java:40)
    at alluxio.master.file.FileSystemMasterClientServiceHandler.getStatus(FileSystemMasterClientServiceHandler.java:152)
    at alluxio.thrift.FileSystemMasterClientService$Processor$getStatus.getResult(FileSystemMasterClientService.java:1460)
    at alluxio.thrift.FileSystemMasterClientService$Processor$getStatus.getResult(FileSystemMasterClientService.java:1444)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.thrift.TMultiplexedProcessor.process(TMultiplexedProcessor.java:123)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
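
The trace shows an Alluxio master calling FileSystem.exists against a secured HDFS without usable credentials. A sketch that replays the failing call inside an authenticated context, assuming a hypothetical Alluxio principal and keytab; UserGroupInformation.doAs scopes the Kerberos identity to the wrapped action:

    import java.net.URI;
    import java.security.PrivilegedExceptionAction;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class AuthenticatedExists {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);

            // Hypothetical principal and keytab -- substitute your own.
            UserGroupInformation ugi =
                UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                    "alluxio@EXAMPLE.COM",
                    "/etc/security/keytabs/alluxio.keytab");

            // Replay the call from the trace (FileSystem.exists) as the
            // logged-in principal.
            boolean exists = ugi.doAs(
                (PrivilegedExceptionAction<Boolean>) () -> {
                    FileSystem fs = FileSystem.get(
                        URI.create("hdfs://namenode.example.org:8020"), conf);
                    return fs.exists(new Path("/"));
                });
            System.out.println("Path exists: " + exists);
        }
    }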