
Recommended solutions based on your search

Solutions on the web

via spark-user by Antonio Giambanco, 1 year ago
", :"security_type", :"security_name", :"date", :"time", :"price", :"currency", :"user_id", :"quantity", :"amount", :"session_id"): All host(s) tried for query failed (no host was tried) at com.datastax.spark.connector.writer.TableWriter.com <http://com.datastax.spark.connector.writer.tablewriter.com/>$ datastax$spark$connector$writer$TableWriter$$prepareStatement(TableWriter. scala:96)
via spark-user by Helena Edelson, 1 year ago
", :"security_type", :"security_name", :"date", :"time", :"price", :"currency", :"user_id", :"quantity", :"amount", :"session_id"): All host(s) tried for query failed (no host was tried) at com.datastax.spark.connector.writer.TableWriter.com <http://com.datastax.spark.connector.writer.tablewriter.com/>$datastax$spark$connector$writer$TableWriter$$prepareStatement(TableWriter.scala:96)
via Stack Overflow by user3376961, 2 years ago
Failed to write statements to keyspacename.tablename.
via DataStax JIRA by Big Bansal, 1 year ago
Failed to write statements to test.words.
java.io.IOException: Failed to prepare statement INSERT INTO "cassandrasink"."transaction" ("event_id", "isin", "security_type", "security_name", "date", "time", "price", "currency", "user_id", "quantity", "amount", "session_id") VALUES (:"event_id", :"isin", :"security_type", :"security_name", :"date", :"time", :"price", :"currency", :"user_id", :"quantity", :"amount", :"session_id"): All host(s) tried for query failed (no host was tried)
        at com.datastax.spark.connector.writer.TableWriter.com$datastax$spark$connector$writer$TableWriter$$prepareStatement(TableWriter.scala:96)
        at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:122)
        at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:120)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:100)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:99)
        at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:151)
        at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:99)
        at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:120)
        at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:36)
        at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:36)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
        at org.apache.spark.scheduler.Task.run(Task.scala:64)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
        at java.lang.Thread.run(Thread.java:722)
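For context, below is a minimal sketch of the write path that appears in this trace (RDDFunctions.saveToCassandra -> TableWriter.write -> CassandraConnector.withSessionDo), written in Scala against the Spark Cassandra Connector. The message "All host(s) tried for query failed (no host was tried)" typically surfaces when the connector cannot reach any Cassandra contact point, so the connection host setting and the existence of the target keyspace/table are the first things to check. The host address and the sample row are placeholder assumptions, not values taken from the reports above.

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._   // adds saveToCassandra to RDDs

object TransactionWriter {
  def main(args: Array[String]): Unit = {
    // Assumption: adjust the contact point to your cluster. If this host is
    // unreachable (or not set), the connector can fail with
    // "All host(s) tried for query failed (no host was tried)".
    val conf = new SparkConf()
      .setAppName("cassandra-sink-example")
      .set("spark.cassandra.connection.host", "127.0.0.1")

    val sc = new SparkContext(conf)

    // Hypothetical sample row matching the column list in the failing INSERT.
    val rows = sc.parallelize(Seq(
      ("evt-1", "US0378331005", "EQ", "Apple Inc.", "2016-01-01", "09:30:00",
        100.0, "USD", "user-1", 10, 1000.0, "sess-1")
    ))

    // The write path seen in the stack trace above; the keyspace and table
    // ("cassandrasink"."transaction") must already exist in Cassandra.
    rows.saveToCassandra("cassandrasink", "transaction",
      SomeColumns("event_id", "isin", "security_type", "security_name", "date",
        "time", "price", "currency", "user_id", "quantity", "amount", "session_id"))

    sc.stop()
  }
}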