java.net.NoRouteToHostException: No Route to Host from ip-172-31-13-2/172.31.13.2 to ip-172-31-8-86.us-west-2.compute.internal:8020 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost

Stack Overflow | autodidacticon | 6 months ago
Here are the best solutions we found on the Internet.
  1. 0

    EMR Spark thrift server create table: NoRouteToHost

    Stack Overflow | 6 months ago | autodidacticon
    java.net.NoRouteToHostException: No Route to Host from ip-172-31-13-2/172.31.13.2 to ip-172-31-8-86.us-west-2.compute.internal:8020 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost
  2. 0

    how to start hadoop-yarn-nodemanager on multinode... - Cloudera Community

    cloudera.com | 2 years ago
    java.net.NoRouteToHostException: No Route to Host from hdmachine3.example.com/128.243.29.227 to hdmachine1.example.com:8031 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see:
  3. 0

    Spark App fails to launch : No route to host , random port connections

    Stack Overflow | 2 years ago | Gaurav
    java.net.NoRouteToHostException: No Route to Host from host-192-168-0-27/192.168.0.27 to host-192-168-0-32:47308 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost
  4. 0

    CDH4.3 HA automatic failover failed (QJM) - Grokbase

    grokbase.com | 1 year ago
    java.net.NoRouteToHostException: No route to host; For more details see: at sun.reflect.GeneratedConstructorAccessor14.newInstance(Unknown Source) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
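    Before reading the stack trace below, it is worth confirming basic TCP reachability of the NameNode RPC endpoint named in the exception (ip-172-31-8-86.us-west-2.compute.internal:8020). The following is a minimal diagnostic sketch in plain Java, assuming the same host and port; the class name is hypothetical. Hitting NoRouteToHostException here as well points at an EMR security group, firewall/iptables rule, or stale DNS entry rather than at Spark, Hive, or HDFS itself.

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.net.NoRouteToHostException;
    import java.net.Socket;

    public class NameNodeReachabilityCheck {
        public static void main(String[] args) {
            // Host and port copied from the exception message above; adjust for your cluster.
            String host = "ip-172-31-8-86.us-west-2.compute.internal";
            int port = 8020;

            try (Socket socket = new Socket()) {
                // A short timeout is enough to distinguish "no route" from a slow service.
                socket.connect(new InetSocketAddress(host, port), 5000);
                System.out.println("TCP connection to " + host + ":" + port + " succeeded.");
            } catch (NoRouteToHostException e) {
                // Same failure mode as the Hadoop IPC client: a routing/firewall problem, not HDFS.
                System.err.println("No route to " + host + ":" + port + " -- check security groups / iptables: " + e);
            } catch (IOException e) {
                System.err.println("Connection failed for another reason: " + e);
            }
        }
    }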

    Root Cause Analysis

    1. java.net.NoRouteToHostException

      No Route to Host from ip-172-31-13-2/172.31.13.2 to ip-172-31-8-86.us-west-2.compute.internal:8020 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost

      at sun.reflect.NativeConstructorAccessorImpl.newInstance0()
    2. Java RT
      Constructor.newInstance
      1. sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      2. sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      3. sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      4. java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      4 frames
    3. Hadoop
      ProtobufRpcEngine$Invoker.invoke
      1. org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
      2. org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:758)
      3. org.apache.hadoop.ipc.Client.call(Client.java:1479)
      4. org.apache.hadoop.ipc.Client.call(Client.java:1412)
      5. org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
      5 frames
    4. com.sun.proxy
      $Proxy13.delete
      1. com.sun.proxy.$Proxy13.delete(Unknown Source)
      1 frame
    5. Apache Hadoop HDFS
      ClientNamenodeProtocolTranslatorPB.delete
      1. org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete(ClientNamenodeProtocolTranslatorPB.java:540)
      1 frame
    6. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:498)
      4 frames
    7. Hadoop
      RetryInvocationHandler.invoke
      1. org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
      2. org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
      2 frames
    8. com.sun.proxy
      $Proxy14.delete
      1. com.sun.proxy.$Proxy14.delete(Unknown Source)
      1 frame
    9. Apache Hadoop HDFS
      DistributedFileSystem$14.doCall
      1. org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:2044)
      2. org.apache.hadoop.hdfs.DistributedFileSystem$14.doCall(DistributedFileSystem.java:707)
      3. org.apache.hadoop.hdfs.DistributedFileSystem$14.doCall(DistributedFileSystem.java:703)
      3 frames
    10. Hadoop
      FileSystemLinkResolver.resolve
      1. org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
      1 frame
    11. Apache Hadoop HDFS
      DistributedFileSystem.delete
      1. org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.java:703)
      1 frame
    12. Spark Project Hive
      HiveExternalCatalog.createTable
      1. org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply$mcV$sp(HiveExternalCatalog.scala:185)
      2. org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply(HiveExternalCatalog.scala:152)
      3. org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createTable$1.apply(HiveExternalCatalog.scala:152)
      4. org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:72)
      5. org.apache.spark.sql.hive.HiveExternalCatalog.createTable(HiveExternalCatalog.scala:152)
      5 frames
    13. org.apache.spark
      ExecutedCommandExec.doExecute
      1. org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:226)
      2. org.apache.spark.sql.execution.command.CreateDataSourceTableUtils$.createDataSourceTable(createDataSourceTables.scala:501)
      3. org.apache.spark.sql.execution.command.CreateDataSourceTableCommand.run(createDataSourceTables.scala:105)
      4. org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:60)
      5. org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:58)
      6. org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
      6 frames
    14. Spark Project SQL
      SparkPlan$$anonfun$executeQuery$1.apply
      1. org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
      2. org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
      3. org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:136)
      3 frames
    15. Spark
      RDDOperationScope$.withScope
      1. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      1 frame
    16. Spark Project SQL
      SparkSession.sql
      1. org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:133)
      2. org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:114)
      3. org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:86)
      4. org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:86)
      5. org.apache.spark.sql.Dataset.<init>(Dataset.scala:186)
      6. org.apache.spark.sql.Dataset.<init>(Dataset.scala:167)
      7. org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:65)
      8. org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
      8 frames
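
    For context, the Spark side of this trace corresponds to an ordinary CREATE TABLE issued through SparkSession.sql: ExecutedCommandExec runs CreateDataSourceTableCommand, HiveExternalCatalog.createTable registers the table, and the HDFS client issues a delete whose RPC to the NameNode (port 8020) is what actually fails. A minimal reproduction sketch under those assumptions is shown below; the application name, table name, and SQL statement are illustrative, not taken from the original report.

    import org.apache.spark.sql.SparkSession;

    public class CreateTableRepro {
        public static void main(String[] args) {
            // Hive support is what routes CREATE TABLE through HiveExternalCatalog.createTable.
            SparkSession spark = SparkSession.builder()
                    .appName("create-table-repro")   // hypothetical application name
                    .enableHiveSupport()
                    .getOrCreate();

            // Any data-source CREATE TABLE exercises the path seen in the trace:
            // SparkSession.sql -> ExecutedCommandExec -> HiveExternalCatalog.createTable
            // -> DistributedFileSystem.delete -> Hadoop IPC call to the NameNode on port 8020.
            spark.sql("CREATE TABLE example_table (id INT, name STRING) USING parquet");

            spark.stop();
        }
    }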