

Solutions on the web

via GitHub by robinisme2, 1 year ago
Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "dev-tpe-dn-3-2/192.168.1.112"; destination host is: "dev-tpe-nn-3-1":8020;
via GitHub by kpweiler, 8 months ago
Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "<REDACTED>"; destination host is: "<REDACTED>":8020;
via reddit.com by Unknown author, 1 year ago
Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "namenode1.myhadoop.example.com/172.31.9.109"; destination host is: "namenode2.myhadoop.example.com":8020;
via apache.org by Unknown author, 1 year ago
Failed on local exception: java.net.SocketException: Host is down; Host Details : local host is: "client1.example.org/192.168.1.86"; destination host is: "hdfs.example.org":8020;
via gitbooks.io by Unknown author, 1 year ago
Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "os-u14-2-2.novalocal/172.22.73.243"; destination host is: "os-u14-2-3.novalocal":8020;
org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
	at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:172)
	at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:396)
	at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:560)
	at org.apache.hadoop.ipc.Client$Connection.access$1900(Client.java:375)
	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:730)
	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:726)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:725)
	at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1524)
	at org.apache.hadoop.ipc.Client.call(Client.java:1447)
	at org.apache.hadoop.ipc.Client.call(Client.java:1408)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy25.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:757)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:256)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
	at com.sun.proxy.$Proxy26.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2121)
	at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1215)
	at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1211)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1211)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.LogAggregationService.checkExists(LogAggregationService.java:250)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.LogAggregationService.access$100(LogAggregationService.java:68)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.LogAggregationService$1.run(LogAggregationService.java:278)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.LogAggregationService.createAppDir(LogAggregationService.java:263)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.LogAggregationService.initAppAggregator(LogAggregationService.java:368)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.LogAggregationService.initApp(LogAggregationService.java:322)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.LogAggregationService.handle(LogAggregationService.java:445)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.LogAggregationService.handle(LogAggregationService.java:68)
	at org.apache.hadoop.yarn.event.AsyncDispatcher.dispatch(AsyncDispatcher.java:176)
	at org.apache.hadoop.yarn.event.AsyncDispatcher$1.run(AsyncDispatcher.java:108)
	at java.lang.Thread.run(Thread.java:745)
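
Aside from the SocketException ("Host is down") entries, the variants above share the same root cause: the client has no usable Kerberos credentials when the Hadoop RPC layer negotiates SASL with the NameNode on port 8020, so the TOKEN/KERBEROS selection fails or GSSAPI finds no TGT. As a rough sketch only (not taken from any of the posts above, and with a placeholder principal and keytab path), a Kerberized HDFS client typically needs hadoop.security.authentication set to kerberos and an explicit login before touching the FileSystem:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosHdfsClient {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Tell the Hadoop client to use Kerberos; without this it attempts the
        // SASL negotiation seen in the trace with no valid credentials.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Log in from a keytab so a TGT is available to the GSSAPI layer.
        // Principal and keytab path below are placeholders for this sketch.
        UserGroupInformation.loginUserFromKeytab(
                "hdfs-client@EXAMPLE.COM",
                "/etc/security/keytabs/hdfs-client.keytab");

        // Any HDFS call made after the login runs with the Kerberos identity.
        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.getFileStatus(new Path("/")).getPath());
    }
}

If the process relies on a ticket cache rather than a keytab, running kinit for the same principal before starting the JVM serves the same purpose; for NodeManager-side failures like the LogAggregationService trace above, the service's own keytab and principal settings in the cluster configuration are the usual place to look.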