
Solutions on the web

via hive-dev by Wataru Yukawa (JIRA), 2 years ago
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "..."; destination host is: "...":8020;
via Apache's JIRA Issue Tracker by Wataru Yukawa, 1 year ago
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "..."; destination host is: "...":8020;
via Apache's JIRA Issue Tracker by Wataru Yukawa, 2 years ago
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "..."; destination host is: "...":8020;
via Apache's JIRA Issue Tracker by Karam Singh, 1 year ago
Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "<hostname>/<ip>"; destination host is: "<NameNodeHost>":<FSPort>;
via GitHub by invis87, 1 year ago
Failed on local exception: java.io.IOException: Couldn't setup connection for {{{login@realm}}} to cepng-hadoop-test-3.infr.sel-vpc.onefactor.com/192.168.100.235:8020; Host Details : local host is: "{{{my_host}}}/192.168.100.205"; destination host is: "{{{hadoop_host}}}":8020;
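The "Failed on local exception: ...; Host Details : local host is: ...; destination host is: ...;" wrapper that appears in every report above is Hadoop's convention of decorating a low-level IOException with connection context before rethrowing it, so the client log shows both endpoints. A simplified, hypothetical sketch of that wrapping pattern in plain JDK Java (not Hadoop's actual NetUtils code; the class and method names here are illustrative):

```java
import java.io.IOException;

public class HostDetailWrapper {
    // Imitation of the wrapping seen in the reports above: annotate a
    // low-level IOException with the local and destination host involved,
    // keeping the original exception as the cause for the full stack trace.
    static IOException wrap(String localHost, String destHost, int port, IOException cause) {
        return new IOException("Failed on local exception: " + cause
                + "; Host Details : local host is: \"" + localHost
                + "\"; destination host is: \"" + destHost + "\":" + port + ";", cause);
    }

    public static void main(String[] args) {
        IOException raw = new IOException("Couldn't set up IO streams");
        IOException wrapped = wrap("client-01", "namenode-01", 8020, raw);
        System.out.println(wrapped.getMessage());
    }
}
```

The practical takeaway when reading such a message: the text before "Host Details" is the real failure (here an OutOfMemoryError surfacing as an IOException), while the host details only tell you which connection was being set up when it happened.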
java.lang.OutOfMemoryError: unable to create new native thread
    at java.lang.Thread.start0(Native Method)
    at java.lang.Thread.start(Thread.java:714)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:784)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:791)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:373)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1493)
    at org.apache.hadoop.ipc.Client.call(Client.java:1397)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:773)
    at org.apache.hadoop.ipc.Client.call(Client.java:1431)
    at org.apache.hadoop.ipc.Client.call(Client.java:1358)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
    at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:252)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
    at com.sun.proxy.$Proxy16.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
    at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1315)
    at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1311)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1311)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424)
    at org.apache.hadoop.mapreduce.JobResourceUploader.uploadFiles(JobResourceUploader.java:85)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:95)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:190)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:575)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:570)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:570)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:561)
    at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:431)
    at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1237)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1101)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1096)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:154)
    at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:183)
    at org.apache.hive.service.cli.operation.Operation.run(Operation.java:257)
    at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:261)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:486)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
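The root failure in this trace is `java.lang.OutOfMemoryError: unable to create new native thread`, raised in `Thread.start0` when the JVM cannot obtain a native thread from the operating system. This usually indicates the process has hit an OS-level thread limit (e.g. `ulimit -u` on Linux) or run out of native memory for thread stacks, not heap exhaustion; the trace shows it surfacing inside `Client$Connection.setupIOstreams`, which starts a thread per IPC connection, so a busy or thread-leaking HiveServer2 is a plausible trigger. A minimal, pure-JDK probe for confirming a climbing thread count before the error appears (the class name is illustrative):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class ThreadCountProbe {
    // Snapshot of how many live threads the JVM currently owns. If this
    // number climbs steadily in the lead-up to the OutOfMemoryError, the
    // cause is a thread leak or OS thread limit, not heap pressure.
    static int liveThreads() {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        return threads.getThreadCount();
    }

    public static void main(String[] args) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        System.out.println("live threads: " + liveThreads());
        System.out.println("peak threads: " + threads.getPeakThreadCount());
    }
}
```

Logging these two values periodically (or taking a `jstack` dump) shows whether connection threads are accumulating; comparing the live count against the user's process limit then tells you whether to raise the limit or fix the leak.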