
Recommended solutions based on your search

Solutions on the web

via Stack Overflow by Ashis, 2 years ago
Failed to get converter for field "getCompanyid" of type java.lang.String in SampleDBOperation$Employee mapped to column "companyid" of "test.employee"
via DataStax JIRA by SparkNewbie, 1 year ago
Failed to get converter for field "getId" of type java.lang.Integer in com.datastax.spark.demo.JavaDemo$Product mapped to column "id" of "java_api.products"
via DataStax JIRA by Purvi, 2 years ago
Failed to get converter for field "getId" of type java.lang.Integer in com.datastax.spark.demo.JavaDemo$Product mapped to column "id" of "java_api.products"
via DataStax JIRA by Russell Spitzer, 1 year ago
Failed to get converter for field "loc" of type scala.Seq[Point] in KVRow mapped to column "loc" of "test.kv"
via DataStax JIRA by Russell Spitzer, 2 years ago
Failed to get converter for field "loc" of type scala.Seq[Point] in KVRow mapped to column "loc" of "test.kv"
java.lang.IllegalArgumentException: Failed to get converter for field "getCompanyid" of type java.lang.String in SampleDBOperation$Employee mapped to column "companyid" of "test.employee"
	at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$5.apply(MappedToGettableDataConverter.scala:157)
	at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$5.apply(MappedToGettableDataConverter.scala:150)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.immutable.Range.foreach(Range.scala:141)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
	at scala.collection.AbstractTraversable.map(Traversable.scala:105)
	at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.<init>(MappedToGettableDataConverter.scala:150)
	at com.datastax.spark.connector.writer.MappedToGettableDataConverter$.apply(MappedToGettableDataConverter.scala:20)
	at com.datastax.spark.connector.writer.DefaultRowWriter.<init>(DefaultRowWriter.scala:17)
	at com.datastax.spark.connector.writer.DefaultRowWriter$$anon$1.rowWriter(DefaultRowWriter.scala:31)
	at com.datastax.spark.connector.writer.DefaultRowWriter$$anon$1.rowWriter(DefaultRowWriter.scala:29)
	at com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:269)
	at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
	at com.datastax.spark.connector.japi.RDDJavaFunctions.saveToCassandra(RDDJavaFunctions.java:61)
	at com.datastax.spark.connector.japi.RDDAndDStreamCommonJavaFunctions$WriterBuilder.saveToCassandra(RDDAndDStreamCommonJavaFunctions.java:443)
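
For context, the frames at the bottom of the trace are the write path of the DataStax Spark Cassandra Connector's Java API: writerBuilder(...).saveToCassandra() builds a row writer for the bean, and the IllegalArgumentException is raised while MappedToGettableDataConverter is constructed, i.e. before any rows are written. Below is a minimal, hypothetical sketch of code that goes through this path, assuming a CQL table test.employee with a companyid column; the Employee class layout, the extra fields, the connection host and the sample data are illustrative guesses, not taken from the original report.

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

import java.io.Serializable;
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SampleDBOperation {

    // Hypothetical reconstruction of the Employee bean from the exception message.
    // For the write path, the bean mapper generally expects a public (static, if
    // nested) Serializable class with public getters whose property names map to
    // the CQL column names and whose Java types are convertible to the column types.
    public static class Employee implements Serializable {
        private Integer id;
        private String companyid;   // maps to column "companyid" of test.employee

        public Employee() { }
        public Employee(Integer id, String companyid) {
            this.id = id;
            this.companyid = companyid;
        }

        public Integer getId() { return id; }
        public void setId(Integer id) { this.id = id; }
        public String getCompanyid() { return companyid; }
        public void setCompanyid(String companyid) { this.companyid = companyid; }
    }

    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("SampleDBOperation")
                .set("spark.cassandra.connection.host", "127.0.0.1"); // assumption: local Cassandra

        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<Employee> employees = sc.parallelize(Arrays.asList(
                new Employee(1, "C001"),
                new Employee(2, "C002")));

        // This is the call at the bottom of the stack trace. The connector builds a
        // MappedToGettableDataConverter for Employee here, and the
        // "Failed to get converter for field ..." error is thrown during that setup.
        javaFunctions(employees)
                .writerBuilder("test", "employee", mapToRow(Employee.class))
                .saveToCassandra();

        sc.stop();
    }
}

The usual things to check when this converter lookup fails are that each mapped property's Java type can actually be converted to the CQL type of the target column (the Seq[Point] reports above are the UDT/nested-type variant of the same failure) and that the bean is a public, Serializable class, public static if nested, with public getters for every mapped column. If the property names and column names genuinely differ, CassandraJavaUtil also provides mapToRow overloads that accept an explicit field-to-column name mapping.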