scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, XXX): java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "XXXX"; destination host is: "XXX":9000;

Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. Spark HDFS read fails when using Mesos

     Stack Overflow | 1 year ago | Daniel Zolnai
     scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, XXX): java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "XXXX"; destination host is: "XXX":9000;
     (see the executor classpath sketch after this list)

  2. Users - Issue with Hadoop/Kerberos security as client

     nabble.com | 2 years ago
     java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "soad/192.168.57.232"; destination host is: "rpb-cds-cent6-01.office.datalever.com":8020;

  3. Hibernate 5.0.1 and Java 8; Provider org.hibernate.type.Java8DateTimeTypeContributor not found

     Stack Overflow | 2 years ago | Ruby9191
     java.util.ServiceConfigurationError: org.hibernate.boot.model.TypeContributor: Provider org.hibernate.type.Java8DateTimeTypeContributor not found
  4. jmockit not support testng-6.9.4

     GitHub | 2 years ago | cctvzd7
     java.util.ServiceConfigurationError: org.testng.ITestNGListener: Provider mockit.integration.testng.internal.TestNGRunnerDecorator not found

  5. Try to upgrade version of vert.x to 2.1.1 failed to run testing.

     GitHub | 3 years ago | stream1984
     java.util.ServiceConfigurationError: org.vertx.java.core.VertxFactory: Provider org.vertx.java.core.impl.DefaultVertxFactory not found, compiling:(/var/folders/vy/00ktbtq11qzb2953km0g4r_m0000gp/T/run-test503850378424461436.clj:40:55)
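
For the Spark-on-Mesos report in solution 1, a common line of investigation is whether the executors can load the same Hadoop client classes (and their META-INF/services registrations) as the driver. The sketch below is a hypothetical illustration, not the fix confirmed in that thread: it shows one way to point executors at the Hadoop client jars via the standard spark.executor.extraClassPath setting, with placeholder paths and a placeholder HDFS URI.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class HdfsReadClasspathSketch {
        public static void main(String[] args) {
            // Placeholder path: the idea is that executors must be able to load
            // org.apache.hadoop.security.AnnotatedSecurityInfo (shipped with the
            // Hadoop client jars); otherwise the SecurityInfo lookup shown in the
            // root cause analysis fails on the executor side.
            SparkConf conf = new SparkConf()
                    .setAppName("hdfs-read-sketch")
                    .set("spark.executor.extraClassPath", "/opt/hadoop/share/hadoop/common/*");

            JavaSparkContext sc = new JavaSparkContext(conf);
            try {
                // Same access pattern as the failing job: wholeTextFiles followed by
                // count, which is what drives WholeTextFileRecordReader in the trace.
                long files = sc.wholeTextFiles("hdfs://namenode:9000/data/").count();
                System.out.println("files read: " + files);
            } finally {
                sc.stop();
            }
        }
    }

Note that the read itself runs on the executors, which is why a driver-only classpath can look healthy while every task fails with the exception above.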

Users who hit this exception:

  1. Handemelindo: 86 times, last 2 weeks ago
  2. kid: once, last 6 months ago

  57 unregistered visitors


Root Cause Analysis

  1. java.util.ServiceConfigurationError

     org.apache.hadoop.security.SecurityInfo: Provider org.apache.hadoop.security.AnnotatedSecurityInfo not found
     (see the ServiceLoader sketch after the trace for how this error arises)

     at java.util.ServiceLoader.fail()
  2. Java RT
    ServiceLoader$1.next
    1. java.util.ServiceLoader.fail(ServiceLoader.java:231)
    2. java.util.ServiceLoader.access$300(ServiceLoader.java:181)
    3. java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:365)
    4. java.util.ServiceLoader$1.next(ServiceLoader.java:445)
    4 frames
  3. Hadoop
    Client$Connection$2.run
    1. org.apache.hadoop.security.SecurityUtil.getTokenInfo(SecurityUtil.java:327)
    2. org.apache.hadoop.security.SaslRpcClient.getServerToken(SaslRpcClient.java:263)
    3. org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:219)
    4. org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:159)
    5. org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:396)
    6. org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:553)
    7. org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:368)
    8. org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:722)
    9. org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:718)
    9 frames
  4. Java RT
    Subject.doAs
    1. java.security.AccessController.doPrivileged(Native Method)
    2. javax.security.auth.Subject.doAs(Subject.java:415)
    2 frames
  5. Hadoop
    ProtobufRpcEngine$Invoker.invoke
    1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    2. org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:717)
    3. org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:368)
    4. org.apache.hadoop.ipc.Client.getConnection(Client.java:1521)
    5. org.apache.hadoop.ipc.Client.call(Client.java:1438)
    6. org.apache.hadoop.ipc.Client.call(Client.java:1399)
    7. org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
    7 frames
  6. com.sun.proxy
    $Proxy13.getBlockLocations
    1. com.sun.proxy.$Proxy13.getBlockLocations(Unknown Source)
    1 frame
  7. Apache Hadoop HDFS
    ClientNamenodeProtocolTranslatorPB.getBlockLocations
    1. org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:254)
    1 frame
  8. Java RT
    Method.invoke
    1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    4. java.lang.reflect.Method.invoke(Method.java:606)
    4 frames
  9. Hadoop
    RetryInvocationHandler.invoke
    1. org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    2. org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    2 frames
  10. com.sun.proxy
    $Proxy14.getBlockLocations
    1. com.sun.proxy.$Proxy14.getBlockLocations(Unknown Source)
    1 frame
  11. Apache Hadoop HDFS
    DistributedFileSystem$3.doCall
    1. org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1220)
    2. org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1210)
    3. org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1200)
    4. org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:271)
    5. org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:238)
    6. org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:231)
    7. org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1498)
    8. org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:302)
    9. org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:298)
    9 frames
  12. Hadoop
    FileSystemLinkResolver.resolve
    1. org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    1 frame
  13. Apache Hadoop HDFS
    DistributedFileSystem.open
    1. org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:298)
    1 frame
  14. Hadoop
    FileSystem.open
    1. org.apache.hadoop.fs.FileSystem.open(FileSystem.java:766)
    1 frame
  15. Spark
    WholeTextFileRecordReader.nextKeyValue
    1. org.apache.spark.input.WholeTextFileRecordReader.nextKeyValue(WholeTextFileRecordReader.scala:79)
    1 frame
  16. Hadoop
    CombineFileRecordReader.nextKeyValue
    1. org.apache.hadoop.mapreduce.lib.input.CombineFileRecordReader.nextKeyValue(CombineFileRecordReader.java:69)
    1 frame
  17. Spark
    Executor$TaskRunner.run
    1. org.apache.spark.rdd.NewHadoopRDD$$anon$1.hasNext(NewHadoopRDD.scala:163)
    2. org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
    3. org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1553)
    4. org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121)
    5. org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121)
    6. org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848)
    7. org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848)
    8. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    9. org.apache.spark.scheduler.Task.run(Task.scala:88)
    10. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    10 frames
  18. Java RT
    Thread.run
    1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    3. java.lang.Thread.run(Thread.java:745)
    3 frames
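
For context on the root cause above (this note is an addition, not part of the original trace): java.util.ServiceLoader throws ServiceConfigurationError with a "Provider ... not found" message when a class named in a META-INF/services/<interface> registration file cannot be loaded by the current classloader. Hadoop's SecurityUtil.getTokenInfo uses this mechanism to discover SecurityInfo providers, so the error usually means a registration file naming org.apache.hadoop.security.AnnotatedSecurityInfo is visible while the class itself is not loadable in that classloader, a typical symptom of missing or mismatched Hadoop jars on the executors. The sketch below reproduces the mechanism in isolation with a made-up service interface standing in for SecurityInfo; it is illustrative only.

    import java.util.ServiceConfigurationError;
    import java.util.ServiceLoader;

    // Hypothetical stand-in for org.apache.hadoop.security.SecurityInfo.
    interface SecurityInfoLike {
        String name();
    }

    public class ServiceLoaderFailureDemo {
        public static void main(String[] args) {
            // ServiceLoader scans every META-INF/services/SecurityInfoLike file on the
            // classpath and lazily instantiates each class listed there. Run as-is,
            // with no registration file present, this loop simply finds nothing.
            ServiceLoader<SecurityInfoLike> loader = ServiceLoader.load(SecurityInfoLike.class);
            try {
                for (SecurityInfoLike provider : loader) {
                    System.out.println("Loaded provider: " + provider.name());
                }
            } catch (ServiceConfigurationError e) {
                // This is where "Provider ... not found" surfaces: a registration file
                // names a class that the classloader cannot resolve -- the same failure
                // as frame 1 of the root cause analysis.
                System.err.println("Provider listed but not loadable: " + e.getMessage());
            }
        }
    }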