com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)

DataStax JIRA | Lukas Stefaniak | 1 year ago
  1.

    I'm using CassandraConnector.withSessionDo inside RDD.mapPartitions to implement a LEFT OUTER JOIN, which the connector does not currently provide. Code to reproduce the issue: https://github.com/lustefaniak/spark-cassandra-connector-bug. Unfortunately I was hit by an exception that wasn't telling me much, since I'm using a single local Cassandra node for testing:

    {code}
    [error] com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)
    [error]     at com.datastax.driver.core.exceptions.NoHostAvailableException.copy(NoHostAvailableException.java:84)
    [error]     at com.datastax.driver.core.DefaultResultSetFuture.extractCauseFromExecutionException(DefaultResultSetFuture.java:271)
    [error]     at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:185)
    [error]     at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:55)
    [error]     at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:47)
    [error]     at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
    [error]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    [error]     at java.lang.reflect.Method.invoke(Method.java:483)
    [error]     at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:33)
    [error]     at com.sun.proxy.$Proxy4.execute(Unknown Source)
    [error]     at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
    [error]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    [error]     at java.lang.reflect.Method.invoke(Method.java:483)
    [error]     at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:33)
    [error]     at com.sun.proxy.$Proxy4.execute(Unknown Source)
    [error]     at buggy.Main$$anonfun$doMappingInPartitions$1$$anonfun$apply$1$$anonfun$apply$2.apply(Main.scala:77)
    [error]     at buggy.Main$$anonfun$doMappingInPartitions$1$$anonfun$apply$1$$anonfun$apply$2.apply(Main.scala:76)
    [error]     at scala.collection.Iterator$$anon$11.next(Iterator.scala:370)
    [error]     at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1555)
    [error]     at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121)
    [error]     at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121)
    [error]     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848)
    [error]     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848)
    [error]     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    [error]     at org.apache.spark.scheduler.Task.run(Task.scala:88)
    [error]     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    [error]     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    [error]     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    [error]     at java.lang.Thread.run(Thread.java:745)
    [error] Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)
    [error]     at com.datastax.driver.core.RequestHandler.reportNoMoreHosts(RequestHandler.java:220)
    [error]     at com.datastax.driver.core.RequestHandler.access$900(RequestHandler.java:45)
    [error]     at com.datastax.driver.core.RequestHandler$SpeculativeExecution.sendRequest(RequestHandler.java:279)
    [error]     at com.datastax.driver.core.RequestHandler.startNewExecution(RequestHandler.java:118)
    [error]     at com.datastax.driver.core.RequestHandler.sendRequest(RequestHandler.java:94)
    [error]     at com.datastax.driver.core.SessionManager.execute(SessionManager.java:565)
    [error]     at com.datastax.driver.core.SessionManager.executeQuery(SessionManager.java:602)
    [error]     at com.datastax.driver.core.SessionManager.executeAsync(SessionManager.java:98)
    [error]     ... 26 more
    {code}

    Just before the exception there is:

    {code}
    [error] 15/11/19 17:05:59 INFO Cluster: New Cassandra host localhost/127.0.0.1:9042 added
    [error] 15/11/19 17:05:59 INFO CassandraConnector: Connected to Cassandra cluster: Mac Cluster
    [error] 15/11/19 17:05:59 INFO Cluster: New Cassandra host localhost/127.0.0.1:9042 added
    [error] 15/11/19 17:05:59 INFO CassandraConnector: Connected to Cassandra cluster: Mac Cluster
    [error] 15/11/19 17:05:59 INFO Cluster: New Cassandra host localhost/127.0.0.1:9042 added
    [error] 15/11/19 17:05:59 INFO CassandraConnector: Connected to Cassandra cluster: Mac Cluster
    [error] 15/11/19 17:05:59 INFO Cluster: New Cassandra host localhost/127.0.0.1:9042 added
    [error] 15/11/19 17:05:59 INFO CassandraConnector: Connected to Cassandra cluster: Mac Cluster
    [error] 15/11/19 17:05:59 INFO CassandraConnector: Disconnected from Cassandra cluster: Mac Cluster
    [error] 15/11/19 17:05:59 INFO CassandraConnector: Disconnected from Cassandra cluster: Mac Cluster
    [error] 15/11/19 17:05:59 INFO CassandraConnector: Disconnected from Cassandra cluster: Mac Cluster
    {code}

    After a lot of debugging, I noticed there is probably a race condition in marking the connection as closed/closing. As a workaround, I found that setting the `spark.cassandra.connection.keep_alive_ms` config to a very big number, longer than the whole computation, let the whole job run without any further issues.

    (Sketches of the withSessionDo/mapPartitions pattern and of the keep_alive_ms workaround follow this solutions list.)

    DataStax JIRA | 1 year ago | Lukas Stefaniak
    com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)
  2.

    Need help getting spark-cassandra-connector to work

    Stack Overflow | 2 years ago
    com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)
  3.

    GitHub comment 85#49526609

    GitHub | 3 years ago | tobert
    com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)
  4.

    GitHub comment 85#51643567

    GitHub | 3 years ago | enigma1510
    com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)
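
The reports above all describe the same pattern: calling CassandraConnector.withSessionDo from inside RDD.mapPartitions to emulate a left outer join. Below is a minimal sketch of that pattern, assuming a hypothetical keyspace test with tables left_table and right_table; the table names, schema, and the eager materialization with toList are illustrative assumptions, not code from the linked reproduction project.

{code}
import com.datastax.spark.connector._
import com.datastax.spark.connector.cql.CassandraConnector
import org.apache.spark.{SparkConf, SparkContext}

object ManualLeftOuterJoin {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("manual-left-outer-join")
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext(conf)

    // CassandraConnector is serializable, so the mapPartitions closure can
    // capture it and open a session on the executors.
    val connector = CassandraConnector(conf)

    // Left side of the join, read as (id, name) pairs.
    val left = sc.cassandraTable("test", "left_table")
      .select("id", "name")
      .map(row => (row.getInt("id"), row.getString("name")))

    val joined = left.mapPartitions { rows =>
      connector.withSessionDo { session =>
        val lookup = session.prepare("SELECT value FROM test.right_table WHERE id = ?")
        rows.map { case (id, name) =>
          // Look up the matching right-side row; None keeps the left row
          // when there is no match (the LEFT OUTER part of the join).
          val matched = Option(session.execute(lookup.bind(Int.box(id))).one())
          (id, name, matched.map(_.getString("value")))
        }.toList.iterator // materialize before withSessionDo releases the session
      }
    }

    joined.collect().foreach(println)
    sc.stop()
  }
}
{code}

Note the toList before returning from withSessionDo: if the mapped iterator is returned lazily, the per-row queries can still be running after the connector has already started releasing the session, which matches the connect/disconnect churn and the "no host was tried" failure quoted in the first report.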

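The workaround quoted in the first report is a configuration change rather than a code change: keep the connection alive longer than the whole job. A sketch of how that might be set, where the one-hour value is an arbitrary example rather than a figure from the report:

{code}
import org.apache.spark.SparkConf

// Workaround from the report: keep connections alive longer than the whole
// computation so the session is not torn down between partitions.
// 3600000 ms (one hour) is an illustrative value only.
val conf = new SparkConf()
  .setAppName("manual-left-outer-join")
  .set("spark.cassandra.connection.host", "127.0.0.1")
  .set("spark.cassandra.connection.keep_alive_ms", "3600000")
{code}
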

    Root Cause Analysis

    1. com.datastax.driver.core.exceptions.NoHostAvailableException

      All host(s) tried for query failed (no host was tried)

      at com.datastax.driver.core.RequestHandler.reportNoMoreHosts()
    2. DataStax Java Driver for Apache Cassandra - Core
      AbstractSession.execute
      1. com.datastax.driver.core.RequestHandler.reportNoMoreHosts(RequestHandler.java:220)
      2. com.datastax.driver.core.RequestHandler.access$900(RequestHandler.java:45)
      3. com.datastax.driver.core.RequestHandler$SpeculativeExecution.sendRequest(RequestHandler.java:279)
      4. com.datastax.driver.core.RequestHandler.startNewExecution(RequestHandler.java:118)
      5. com.datastax.driver.core.RequestHandler.sendRequest(RequestHandler.java:94)
      6. com.datastax.driver.core.SessionManager.execute(SessionManager.java:565)
      7. com.datastax.driver.core.SessionManager.executeQuery(SessionManager.java:602)
      8. com.datastax.driver.core.SessionManager.executeAsync(SessionManager.java:98)
      9. com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:55)
      10. com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:47)
      10 frames
    3. Java RT
      Method.invoke
      1. sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
      2. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      3. java.lang.reflect.Method.invoke(Method.java:483)
      3 frames
    4. spark-cassandra-connector
      SessionProxy.invoke
      1. com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:33)
      1 frame
    5. com.sun.proxy
      $Proxy4.execute
      1. com.sun.proxy.$Proxy4.execute(Unknown Source)
      1 frame
    6. Java RT
      Method.invoke
      1. sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
      2. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      3. java.lang.reflect.Method.invoke(Method.java:483)
      3 frames
    7. spark-cassandra-connector
      SessionProxy.invoke
      1. com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:33)
      1 frame
    8. com.sun.proxy
      $Proxy4.execute
      1. com.sun.proxy.$Proxy4.execute(Unknown Source)
      1 frame
    9. buggy
      Main$$anonfun$doMappingInPartitions$1$$anonfun$apply$1$$anonfun$apply$2.apply
      1. buggy.Main$$anonfun$doMappingInPartitions$1$$anonfun$apply$1$$anonfun$apply$2.apply(Main.scala:77)
      2. buggy.Main$$anonfun$doMappingInPartitions$1$$anonfun$apply$1$$anonfun$apply$2.apply(Main.scala:76)
      2 frames
    10. Scala
      Iterator$$anon$11.next
      1. scala.collection.Iterator$$anon$11.next(Iterator.scala:370)
      1 frame
    11. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1555)
      2. org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121)
      3. org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121)
      4. org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848)
      5. org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848)
      6. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
      7. org.apache.spark.scheduler.Task.run(Task.scala:88)
      8. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
      8 frames
    12. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames