java.io.IOException

Exception during preparation of SELECT "id", "series", "rollup_minutes", "period_stamp", "event_type", "value" FROM "linkcurrent"."time_series_counters_2015_09" WHERE "id" = ? AND "series" = ? AND "rollup_minutes" = ? ALLOW FILTERING: Cannot convert object 0.0 of type class org.apache.spark.sql.types.Decimal to java.math.BigInteger.

Solutions on the web

  • via DataStax JIRA by Alex Liu, 9 months ago
    Exception during preparation of SELECT "id", "series", "rollup_minutes", "period_stamp", "event_type", "value" FROM "linkcurrent"."time_series_counters_2015_09" WHERE "id" = ? AND "series" = ? AND "rollup_minutes" = ? ALLOW FILTERING: Cannot convert object 0.0 of type class org.apache.spark.sql.types.Decimal to java.math.BigInteger.
  • via Stack Overflow by Gnana, 8 months ago (this and the next entry are a different failure; see the note after this list)
    Exception during preparation of SELECT "id", "name", "parents" FROM "java_api"."products" WHERE token("id") > -1732598212583841281 AND token("id") <= -1668034862038885205 AND id=? ALLOW FILTERING: id cannot be restricted by more than one relation if it includes an Equal
  • via soso.io by Unknown author, 1 year ago
    Exception during preparation of SELECT "role", "id", "fname", "lname" FROM "tester"."empbyrole" WHERE token("role") > -5709068081826432029 AND token("role") <= -5491279024053142424 AND role=? ALLOW FILTERING: role cannot be restricted by more than one relation if it includes an Equal
  • Stack trace

    java.io.IOException: Exception during preparation of SELECT "id", "series", "rollup_minutes", "period_stamp", "event_type", "value" FROM "linkcurrent"."time_series_counters_2015_09" WHERE "id" = ? AND "series" = ? AND "rollup_minutes" = ? ALLOW FILTERING: Cannot convert object 0.0 of type class org.apache.spark.sql.types.Decimal to java.math.BigInteger.
        at com.datastax.spark.connector.rdd.CassandraTableScanRDD.createStatement(CassandraTableScanRDD.scala:188)
        at com.datastax.spark.connector.rdd.CassandraTableScanRDD.com$datastax$spark$connector$rdd$CassandraTableScanRDD$$fetchTokenRange(CassandraTableScanRDD.scala:202)
        at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$13.apply(CassandraTableScanRDD.scala:229)
        at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$13.apply(CassandraTableScanRDD.scala:229)
        at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
        at com.datastax.spark.connector.util.CountingIterator.hasNext(CountingIterator.scala:12)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
        at scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
        at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:308)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
        at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:207)
        at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:70)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        at org.apache.spark.scheduler.Task.run(Task.scala:70)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
    Caused by: com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 0.0 of type class org.apache.spark.sql.types.Decimal to java.math.BigInteger.
        at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:42)
        at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:40)
        at com.datastax.spark.connector.types.TypeConverter$JavaBigIntegerConverter$$anonfun$convertPF$15.applyOrElse(TypeConverter.scala:354)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:40)
        at com.datastax.spark.connector.types.TypeConverter$JavaBigIntegerConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:352)
        at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:53)
        at com.datastax.spark.connector.types.TypeConverter$JavaBigIntegerConverter$.convert(TypeConverter.scala:352)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$26.applyOrElse(TypeConverter.scala:702)
        at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:40)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:695)
        at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:53)
        at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:695)
        at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$9.apply(CassandraTableScanRDD.scala:181)
        at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$9.apply(CassandraTableScanRDD.scala:180)
        at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
        at com.datastax.spark.connector.rdd.CassandraTableScanRDD.createStatement(CassandraTableScanRDD.scala:180)
        ... 18 more
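
    The Caused by frames show where it breaks: CassandraTableScanRDD.createStatement converts each pushed-down filter value with the connector's TypeConverter, and JavaBigIntegerConverter (used for Cassandra varint columns) has no conversion case for org.apache.spark.sql.types.Decimal, so binding the Decimal 0.0 for "rollup_minutes" fails. The cleanest fix is a connector version in which the issue tracked in the DataStax JIRA entry above is resolved; until then, one workaround is to keep the filter from being pushed down, so the comparison runs in Spark rather than in the prepared CQL. A minimal sketch in Scala against the Spark 1.4-era DataFrame API, using the keyspace and table from the error message; the connection host and the filter value are assumptions:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    // Hypothetical setup; the connection host is an assumption.
    val sc = new SparkContext(
      new SparkConf()
        .setAppName("decimal-varint-workaround")
        .set("spark.cassandra.connection.host", "127.0.0.1"))
    val sqlContext = new SQLContext(sc)

    // "pushdown" -> "false" keeps predicates in Spark, so no Spark SQL
    // Decimal is ever bound into the SELECT the connector prepares against
    // the varint column.
    val counters = sqlContext.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map(
        "keyspace" -> "linkcurrent",
        "table"    -> "time_series_counters_2015_09",
        "pushdown" -> "false"))
      .load()

    // The equality now runs as a Spark-side filter instead of the failing
    // "rollup_minutes" = ? bind in the prepared statement.
    val hourly = counters.filter(counters("rollup_minutes") === 60)
    hourly.show()

    The trade-off is a full token-range scan with client-side filtering, so treat this as a stopgap rather than a fix.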
