com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-10-13 of type class java.time.LocalDate to com.datastax.driver.core.LocalDate.

  1.

    How to use java.util.LocalDate in Cassandra query from Spark?

    Stack Overflow | 2 months ago | Marcin Armatys
    com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-10-13 of type class java.time.LocalDate to com.datastax.driver.core.LocalDate.
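    The message above suggests the connector's LocalDateConverter has no conversion case for java.time.LocalDate, so a value bound against a CQL `date` column needs to arrive as a type the converter already supports, such as the driver's own com.datastax.driver.core.LocalDate. A minimal sketch of that conversion (assuming the DataStax Java driver 2.2+/3.x API; `toDriverDate` is a hypothetical helper name):

    {code:java}
    import java.time.LocalDate
    import com.datastax.driver.core.{LocalDate => DriverLocalDate}

    // Hypothetical helper: rebuild the driver's LocalDate from java.time.LocalDate
    // so the connector's LocalDateConverter receives a type it can handle.
    def toDriverDate(d: LocalDate): DriverLocalDate =
      DriverLocalDate.fromYearMonthDay(d.getYear, d.getMonthValue, d.getDayOfMonth)
    {code}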
  2.

    My Cassandra table is:

    {code:java}
    CREATE TABLE keyspace.wish_counter (
        wish_date           date,
        wish_published_time timeuuid,
        wish_counter_value  counter,
        PRIMARY KEY (wish_date, wish_published_time)
    ) WITH CLUSTERING ORDER BY (wish_published_time ASC)
    {code}

    I'm loading data from Cassandra into a class 'WishCountTable':

    {code:java}
    class WishCountTable extends Serializable {
      var wish_date: DateTime = new DateTime(0)
      var wish_published_time: UUID = new UUID(0L, 0L)
      var wish_counter_value: Long = 0L
    }
    {code}

    Everything is fine, but whenever I try to save data back into Cassandra, I get an error.

    {code:java}
    saveRDD.saveToCassandra(keyspace, "wish_counter")
    {code}

    h4. ERROR:

    {code:java}
    16/03/07 19:18:08 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 5)
    com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-21T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate.
        at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45)
        at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43)
        at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$20.applyOrElse(TypeConverter.scala:447)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
        at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:437)
        at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
        at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:437)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$28.applyOrElse(TypeConverter.scala:756)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:749)
        at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:749)
        at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1$$anonfun$applyOrElse$1.apply$mcVI$sp(MappedToGettableDataConverter.scala:170)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1.applyOrElse(MappedToGettableDataConverter.scala:169)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
        at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.convert(MappedToGettableDataConverter.scala:18)
        at com.datastax.spark.connector.writer.DefaultRowWriter.readColumnValues(DefaultRowWriter.scala:21)
        at com.datastax.spark.connector.writer.BoundStatementBuilder.bind(BoundStatementBuilder.scala:35)
        at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:106)
        at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:31)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at com.datastax.spark.connector.writer.GroupingBatchBuilder.foreach(GroupingBatchBuilder.scala:31)
        at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:155)
        at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:139)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109)
        at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139)
        at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
        at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:139)
        at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37)
        at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:88)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
    16/03/07 19:18:08 ERROR TaskSetManager: Task 0 in stage 1.0 failed 1 times; aborting job
    Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 5, localhost): com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-21T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate.
        at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45)
        at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43)
        at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$20.applyOrElse(TypeConverter.scala:447)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
        at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:437)
        at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
        at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:437)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$28.applyOrElse(TypeConverter.scala:756)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:749)
        at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:749)
        at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1$$anonfun$applyOrElse$1.apply$mcVI$sp(MappedToGettableDataConverter.scala:170)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1.applyOrElse(MappedToGettableDataConverter.scala:169)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
        at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.convert(MappedToGettableDataConverter.scala:18)
        at com.datastax.spark.connector.writer.DefaultRowWriter.readColumnValues(DefaultRowWriter.scala:21)
        at com.datastax.spark.connector.writer.BoundStatementBuilder.bind(BoundStatementBuilder.scala:35)
        at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:106)
        at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:31)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at com.datastax.spark.connector.writer.GroupingBatchBuilder.foreach(GroupingBatchBuilder.scala:31)
        at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:155)
        at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:139)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109)
        at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139)
        at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
        at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:139)
        at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37)
        at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:88)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
    Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1283)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1271)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1270)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1270)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1496)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1458)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1447)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1822)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1835)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1912)
        at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:37)
        at org.qm.UpdateWishTable$.main(UpdateWishTable.scala:93)
        at org.qm.UpdateWishTable.main(UpdateWishTable.scala)
    Caused by: com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-21T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate.
        at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45)
        at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43)
        at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$20.applyOrElse(TypeConverter.scala:447)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
        at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:437)
        at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
        at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:437)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$28.applyOrElse(TypeConverter.scala:756)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:749)
        at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:749)
        at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1$$anonfun$applyOrElse$1.apply$mcVI$sp(MappedToGettableDataConverter.scala:170)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1.applyOrElse(MappedToGettableDataConverter.scala:169)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
        at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.convert(MappedToGettableDataConverter.scala:18)
        at com.datastax.spark.connector.writer.DefaultRowWriter.readColumnValues(DefaultRowWriter.scala:21)
        at com.datastax.spark.connector.writer.BoundStatementBuilder.bind(BoundStatementBuilder.scala:35)
        at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:106)
        at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:31)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at com.datastax.spark.connector.writer.GroupingBatchBuilder.foreach(GroupingBatchBuilder.scala:31)
        at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:155)
        at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:139)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109)
        at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139)
        at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
        at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:139)
        at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37)
        at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:88)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
    16/03/07 19:18:08 ERROR Executor: Exception in task 1.0 in stage 1.0 (TID 6)
    com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-28T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate.
        at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45)
        at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43)
        at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$20.applyOrElse(TypeConverter.scala:447)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
        at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:437)
        at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
        at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:437)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$28.applyOrElse(TypeConverter.scala:756)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:749)
        at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:749)
        at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1$$anonfun$applyOrElse$1.apply$mcVI$sp(MappedToGettableDataConverter.scala:170)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1.applyOrElse(MappedToGettableDataConverter.scala:169)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
        at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.convert(MappedToGettableDataConverter.scala:18)
        at com.datastax.spark.connector.writer.DefaultRowWriter.readColumnValues(DefaultRowWriter.scala:21)
        at com.datastax.spark.connector.writer.BoundStatementBuilder.bind(BoundStatementBuilder.scala:35)
        at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:106)
        at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:31)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at com.datastax.spark.connector.writer.GroupingBatchBuilder.foreach(GroupingBatchBuilder.scala:31)
        at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:155)
        at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:139)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109)
        at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139)
        at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
        at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:139)
        at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37)
        at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:88)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
    {code}

    DataStax JIRA | 9 months ago | Safat Siddiqui
    com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-21T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate.
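    In the issue above, the CQL column wish_date is declared as `date`, which the connector writes through com.datastax.driver.core.LocalDate, and the stack trace shows LocalDateConverter rejecting the org.joda.time.DateTime field. One possible fix, sketched below under the assumption that `loadedRDD` is the RDD[WishCountTable] read from the table (the name is hypothetical), is to convert the joda value to the driver's LocalDate before saving:

    {code:java}
    import com.datastax.driver.core.LocalDate
    import com.datastax.spark.connector._

    // Sketch: map each row to types the connector's converters accept, turning
    // the joda DateTime into the driver's LocalDate before the write.
    val saveRDD = loadedRDD.map { w =>
      (LocalDate.fromYearMonthDay(
         w.wish_date.getYear, w.wish_date.getMonthOfYear, w.wish_date.getDayOfMonth),
       w.wish_published_time,
       w.wish_counter_value)
    }

    saveRDD.saveToCassandra(keyspace, "wish_counter",
      SomeColumns("wish_date", "wish_published_time", "wish_counter_value"))
    {code}

    Declaring wish_date in WishCountTable as com.datastax.driver.core.LocalDate in the first place would avoid the per-row conversion entirely.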
  3. 0

    My Cassandra Table is: {code:java} CREATE TABLE keyspace.wish_counter ( wish_date date, wish_published_time timeuuid, wish_counter_value counter, PRIMARY KEY (wish_date, wish_published_time) ) WITH CLUSTERING ORDER BY (wish_published_time ASC) {code} I'm loading data from Cassandra into a class 'WishCountTable' : {code:java} class WishCountTable extends Serializable { var wish_date: DateTime = new DateTime(0) var wish_published_time: UUID = new UUID(0L, 0L) var wish_counter_value: Long = 0L } {code} Everything is alright but whenever I try to save data into cassandra, I get an error. {code:java} saveRDD.saveToCassandra(keyspace, "wish_counter") {code} h4. ERROR: {code:java} 16/03/07 19:18:08 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 5) com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-21T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate. at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45) at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$20.applyOrElse(TypeConverter.scala:447) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:437) at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:437) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$28.applyOrElse(TypeConverter.scala:756) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:749) at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:749) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1$$anonfun$applyOrElse$1.apply$mcVI$sp(MappedToGettableDataConverter.scala:170) at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1.applyOrElse(MappedToGettableDataConverter.scala:169) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.convert(MappedToGettableDataConverter.scala:18) at com.datastax.spark.connector.writer.DefaultRowWriter.readColumnValues(DefaultRowWriter.scala:21) at com.datastax.spark.connector.writer.BoundStatementBuilder.bind(BoundStatementBuilder.scala:35) at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:106) at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:31) at scala.collection.Iterator$class.foreach(Iterator.scala:727) at com.datastax.spark.connector.writer.GroupingBatchBuilder.foreach(GroupingBatchBuilder.scala:31) at 
com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:155) at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:139) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109) at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139) at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109) at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:139) at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37) at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) at org.apache.spark.scheduler.Task.run(Task.scala:88) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) 16/03/07 19:18:08 ERROR TaskSetManager: Task 0 in stage 1.0 failed 1 times; aborting job Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 5, localhost): com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-21T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate. 
at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45) at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$20.applyOrElse(TypeConverter.scala:447) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:437) at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:437) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$28.applyOrElse(TypeConverter.scala:756) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:749) at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:749) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1$$anonfun$applyOrElse$1.apply$mcVI$sp(MappedToGettableDataConverter.scala:170) at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1.applyOrElse(MappedToGettableDataConverter.scala:169) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.convert(MappedToGettableDataConverter.scala:18) at com.datastax.spark.connector.writer.DefaultRowWriter.readColumnValues(DefaultRowWriter.scala:21) at com.datastax.spark.connector.writer.BoundStatementBuilder.bind(BoundStatementBuilder.scala:35) at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:106) at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:31) at scala.collection.Iterator$class.foreach(Iterator.scala:727) at com.datastax.spark.connector.writer.GroupingBatchBuilder.foreach(GroupingBatchBuilder.scala:31) at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:155) at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:139) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109) at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139) at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109) at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:139) at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37) at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) at 
org.apache.spark.scheduler.Task.run(Task.scala:88) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) Driver stacktrace: at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1283) at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1271) at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1270) at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47) at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1270) at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697) at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697) at scala.Option.foreach(Option.scala:236) at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1496) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1458) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1447) at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48) at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567) at org.apache.spark.SparkContext.runJob(SparkContext.scala:1822) at org.apache.spark.SparkContext.runJob(SparkContext.scala:1835) at org.apache.spark.SparkContext.runJob(SparkContext.scala:1912) at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:37) at org.qm.UpdateWishTable$.main(UpdateWishTable.scala:93) at org.qm.UpdateWishTable.main(UpdateWishTable.scala) Caused by: com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-21T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate. 
at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45) at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$20.applyOrElse(TypeConverter.scala:447) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:437) at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:437) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$28.applyOrElse(TypeConverter.scala:756) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:749) at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:749) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1$$anonfun$applyOrElse$1.apply$mcVI$sp(MappedToGettableDataConverter.scala:170) at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1.applyOrElse(MappedToGettableDataConverter.scala:169) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.convert(MappedToGettableDataConverter.scala:18) at com.datastax.spark.connector.writer.DefaultRowWriter.readColumnValues(DefaultRowWriter.scala:21) at com.datastax.spark.connector.writer.BoundStatementBuilder.bind(BoundStatementBuilder.scala:35) at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:106) at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:31) at scala.collection.Iterator$class.foreach(Iterator.scala:727) at com.datastax.spark.connector.writer.GroupingBatchBuilder.foreach(GroupingBatchBuilder.scala:31) at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:155) at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:139) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109) at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139) at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109) at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:139) at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37) at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) at 
org.apache.spark.scheduler.Task.run(Task.scala:88) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) 16/03/07 19:18:08 ERROR Executor: Exception in task 1.0 in stage 1.0 (TID 6) com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-28T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate. at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45) at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$20.applyOrElse(TypeConverter.scala:447) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:437) at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:437) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$28.applyOrElse(TypeConverter.scala:756) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:749) at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:749) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1$$anonfun$applyOrElse$1.apply$mcVI$sp(MappedToGettableDataConverter.scala:170) at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1.applyOrElse(MappedToGettableDataConverter.scala:169) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.convert(MappedToGettableDataConverter.scala:18) at com.datastax.spark.connector.writer.DefaultRowWriter.readColumnValues(DefaultRowWriter.scala:21) at com.datastax.spark.connector.writer.BoundStatementBuilder.bind(BoundStatementBuilder.scala:35) at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:106) at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:31) at scala.collection.Iterator$class.foreach(Iterator.scala:727) at com.datastax.spark.connector.writer.GroupingBatchBuilder.foreach(GroupingBatchBuilder.scala:31) at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:155) at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:139) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110) at 
com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109) at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139) at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109) at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:139) at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37) at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) at org.apache.spark.scheduler.Task.run(Task.scala:88) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) {code}

    DataStax JIRA | 9 months ago | Safat Siddiqui
    com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-28T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate.
  4. Speed up your debug routine!

    Automated exception search integrated into your IDE

  5. 0

    My Cassandra Table is: {code:java} CREATE TABLE keyspace.wish_counter ( wish_date date, wish_published_time timeuuid, wish_counter_value counter, PRIMARY KEY (wish_date, wish_published_time) ) WITH CLUSTERING ORDER BY (wish_published_time ASC) {code} I'm loading data from Cassandra into a class 'WishCountTable' : {code:java} class WishCountTable extends Serializable { var wish_date: DateTime = new DateTime(0) var wish_published_time: UUID = new UUID(0L, 0L) var wish_counter_value: Long = 0L } {code} Everything is alright but whenever I try to save data into cassandra, I get an error. {code:java} saveRDD.saveToCassandra(keyspace, "wish_counter") {code} h4. ERROR: {code:java} 16/03/07 19:18:08 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 5) com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-21T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate. at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45) at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$20.applyOrElse(TypeConverter.scala:447) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:437) at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:437) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$28.applyOrElse(TypeConverter.scala:756) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:749) at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:749) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1$$anonfun$applyOrElse$1.apply$mcVI$sp(MappedToGettableDataConverter.scala:170) at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1.applyOrElse(MappedToGettableDataConverter.scala:169) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.convert(MappedToGettableDataConverter.scala:18) at com.datastax.spark.connector.writer.DefaultRowWriter.readColumnValues(DefaultRowWriter.scala:21) at com.datastax.spark.connector.writer.BoundStatementBuilder.bind(BoundStatementBuilder.scala:35) at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:106) at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:31) at scala.collection.Iterator$class.foreach(Iterator.scala:727) at com.datastax.spark.connector.writer.GroupingBatchBuilder.foreach(GroupingBatchBuilder.scala:31) at 
com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:155) at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:139) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109) at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139) at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109) at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:139) at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37) at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) at org.apache.spark.scheduler.Task.run(Task.scala:88) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) 16/03/07 19:18:08 ERROR TaskSetManager: Task 0 in stage 1.0 failed 1 times; aborting job Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 5, localhost): com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-21T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate. 
at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45) at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$20.applyOrElse(TypeConverter.scala:447) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:437) at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56) at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:437) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$28.applyOrElse(TypeConverter.scala:756) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:749) at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56) at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:749) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1$$anonfun$applyOrElse$1.apply$mcVI$sp(MappedToGettableDataConverter.scala:170) at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1.applyOrElse(MappedToGettableDataConverter.scala:169) at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.convert(MappedToGettableDataConverter.scala:18) at com.datastax.spark.connector.writer.DefaultRowWriter.readColumnValues(DefaultRowWriter.scala:21) at com.datastax.spark.connector.writer.BoundStatementBuilder.bind(BoundStatementBuilder.scala:35) at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:106) at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:31) at scala.collection.Iterator$class.foreach(Iterator.scala:727) at com.datastax.spark.connector.writer.GroupingBatchBuilder.foreach(GroupingBatchBuilder.scala:31) at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:155) at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:139) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109) at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139) at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109) at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:139) at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37) at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66) at 
org.apache.spark.scheduler.Task.run(Task.scala:88) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) Driver stacktrace: at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1283) at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1271) at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1270) at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47) at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1270) at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697) at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697) at scala.Option.foreach(Option.scala:236) at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1496) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1458) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1447) at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48) at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567) at org.apache.spark.SparkContext.runJob(SparkContext.scala:1822) at org.apache.spark.SparkContext.runJob(SparkContext.scala:1835) at org.apache.spark.SparkContext.runJob(SparkContext.scala:1912) at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:37) at org.qm.UpdateWishTable$.main(UpdateWishTable.scala:93) at org.qm.UpdateWishTable.main(UpdateWishTable.scala) Caused by: com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-21T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate. 
    at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45)
    at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43)
    at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$20.applyOrElse(TypeConverter.scala:447)
    at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
    at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:437)
    at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
    at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:437)
    at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$28.applyOrElse(TypeConverter.scala:756)
    at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
    at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:749)
    at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
    at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:749)
    at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1$$anonfun$applyOrElse$1.apply$mcVI$sp(MappedToGettableDataConverter.scala:170)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1.applyOrElse(MappedToGettableDataConverter.scala:169)
    at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
    at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.convert(MappedToGettableDataConverter.scala:18)
    at com.datastax.spark.connector.writer.DefaultRowWriter.readColumnValues(DefaultRowWriter.scala:21)
    at com.datastax.spark.connector.writer.BoundStatementBuilder.bind(BoundStatementBuilder.scala:35)
    at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:106)
    at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:31)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at com.datastax.spark.connector.writer.GroupingBatchBuilder.foreach(GroupingBatchBuilder.scala:31)
    at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:155)
    at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:139)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109)
    at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139)
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
    at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:139)
    at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37)
    at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
16/03/07 19:18:08 ERROR Executor: Exception in task 1.0 in stage 1.0 (TID 6)
com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-28T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate.
    at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45)
    at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43)
    at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$20.applyOrElse(TypeConverter.scala:447)
    at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
    at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:437)
    at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
    at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:437)
    at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$28.applyOrElse(TypeConverter.scala:756)
    at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
    at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:749)
    at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
    at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:749)
    at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1$$anonfun$applyOrElse$1.apply$mcVI$sp(MappedToGettableDataConverter.scala:170)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$convertPF$1.applyOrElse(MappedToGettableDataConverter.scala:169)
    at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
    at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.convert(MappedToGettableDataConverter.scala:18)
    at com.datastax.spark.connector.writer.DefaultRowWriter.readColumnValues(DefaultRowWriter.scala:21)
    at com.datastax.spark.connector.writer.BoundStatementBuilder.bind(BoundStatementBuilder.scala:35)
    at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:106)
    at com.datastax.spark.connector.writer.GroupingBatchBuilder.next(GroupingBatchBuilder.scala:31)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at com.datastax.spark.connector.writer.GroupingBatchBuilder.foreach(GroupingBatchBuilder.scala:31)
    at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:155)
    at com.datastax.spark.connector.writer.TableWriter$$anonfun$write$1.apply(TableWriter.scala:139)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109)
    at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139)
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
    at com.datastax.spark.connector.writer.TableWriter.write(TableWriter.scala:139)
    at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37)
    at com.datastax.spark.connector.RDDFunctions$$anonfun$saveToCassandra$1.apply(RDDFunctions.scala:37)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
{code}

    DataStax JIRA | 9 months ago | Safat Siddiqui
    com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-21T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate.
  6. 0


    DataStax JIRA | 9 months ago | Safat Siddiqui
    com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-02-28T06:00:00.000+06:00 of type class org.joda.time.DateTime to com.datastax.driver.core.LocalDate.
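
    Both JIRA entries above fail for the same reason: the CQL date column wish_date is mapped to org.joda.time.DateTime, a type the connector's LocalDateConverter has no conversion rule for. Below is a minimal sketch of one possible fix, assuming the table and row class from the report: map the column to the driver's own date type, or convert the joda value just before saving (the DateFix helper object is invented here for illustration).

    {code:java}
    import java.util.UUID
    import com.datastax.driver.core.LocalDate
    import org.joda.time.DateTime

    // Sketch: map the CQL `date` column straight to the driver's LocalDate,
    // the exact type the connector's LocalDateConverter expects to bind.
    class WishCountTable extends Serializable {
      var wish_date: LocalDate = LocalDate.fromDaysSinceEpoch(0)
      var wish_published_time: UUID = new UUID(0L, 0L)
      var wish_counter_value: Long = 0L
    }

    // Alternatively, keep joda DateTime elsewhere in the job and convert it
    // right before saveToCassandra (helper name invented for this sketch):
    object DateFix {
      def toDriverDate(dt: DateTime): LocalDate =
        LocalDate.fromYearMonthDay(dt.getYear, dt.getMonthOfYear, dt.getDayOfMonth)
    }
    {code}

    A custom TypeConverter is another option; see the sketch after the Root Cause Analysis below.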


    Root Cause Analysis

    1. com.datastax.spark.connector.types.TypeConversionException

      Cannot convert object 2016-10-13 of type class java.time.LocalDate to com.datastax.driver.core.LocalDate.

      at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply()
    2. spark-cassandra-connector
      BoundStatementBuilder$$anonfun$8.apply
      1. com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45)
      2. com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43)
      3. com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$14.applyOrElse(TypeConverter.scala:449)
      4. com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
      5. com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:439)
      6. com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
      7. com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:439)
      8. com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$29.applyOrElse(TypeConverter.scala:788)
      9. com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
      10. com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:771)
      11. com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
      12. com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:771)
      13. com.datastax.spark.connector.writer.BoundStatementBuilder$$anonfun$8.apply(BoundStatementBuilder.scala:93)
      13 frames
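
    The root cause above is the java.time flavour of the same mismatch: a java.time.LocalDate reaches BoundStatementBuilder, but the connector only knows how to bind com.datastax.driver.core.LocalDate. A hedged sketch of one workaround, assuming the connector 1.x TypeConverter extension point (the converter object name is invented here):

    {code:java}
    import java.time.{LocalDate => JavaLocalDate}
    import scala.reflect.runtime.universe._
    import com.datastax.driver.core.{LocalDate => DriverLocalDate}
    import com.datastax.spark.connector.types.TypeConverter

    // Hypothetical converter: teaches the connector to turn a
    // java.time.LocalDate into the driver's LocalDate before binding.
    object JavaLocalDateConverter extends TypeConverter[DriverLocalDate] {
      def targetTypeTag = typeTag[DriverLocalDate]
      def convertPF: PartialFunction[Any, DriverLocalDate] = {
        case d: JavaLocalDate =>
          DriverLocalDate.fromYearMonthDay(d.getYear, d.getMonthValue, d.getDayOfMonth)
      }
    }

    // Register once per JVM, before calling saveToCassandra:
    TypeConverter.registerConverter(JavaLocalDateConverter)
    {code}

    When you control the row class, storing com.datastax.driver.core.LocalDate directly avoids the converter entirely, as in the WishCountTable sketch above.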