java.io.IOException: Exception during preparation of SELECT "id", "series", "rollup_minutes", "period_stamp", "event_type", "value" FROM "linkcurrent"."time_series_counters_2015_09" WHERE "id" = ? AND "series" = ? AND "rollup_minutes" = ? ALLOW FILTERING: Cannot convert object 0.0 of type class org.apache.spark.sql.types.Decimal to java.math.BigInteger.

Solutions on the web

via DataStax JIRA by Alex Liu, 1 year ago
Exception during preparation of SELECT "id", "series", "rollup_minutes", "period_stamp", "event_type", "value" FROM "linkcurrent"."time_series_counters_2015_09" WHERE "id" = ? AND "series" = ? AND "rollup_minutes" = ? ALLOW FILTERING: Cannot convert object 0.0 of type class org.apache.spark.sql.types.Decimal to java.math.BigInteger.
via DataStax JIRA by julien sebrien, 1 year ago
Exception during preparation of SELECT FROM "geneticio"."temp_table_427eba306cb511e58bfae82aea3ab28a" WHERE token("internal_index") > ? AND token("internal_index") <= ? ALLOW FILTERING: line 1:8 no viable alternative at input 'FROM'
via Stack Overflow by Mnemosyne, 1 year ago
Exception during preparation of SELECT "uuid", "person", "age" FROM "test"."users" WHERE token("uuid") > ? AND token("uuid") <= ? AND name = Jane ALLOW FILTERING: line 1:232 no viable alternative at input 'ALLOW' (...<= ? AND name = [Jane] ALLOW...)
via GitHub by venkatesh-rudraraju, 1 year ago
Exception during preparation of SELECT "dtid", "created", "deleted" FROM "metricsdb"."device_counts_by_hour" WHERE token("period") > ? AND token("period") <= ? AND period = 2016033123 ALLOW FILTERING: period cannot be restricted by more than one relation if it includes an Equal
via Stack Overflow by Daniel, 9 months ago
Exception during preparation of SELECT "coil_id", "event_time", "car_count", "insert_time" FROM "public"."traffic" WHERE token("coil_id") > ? AND token("coil_id") <= ? ALLOW FILTERING: Could not initialize class com.datastax.spark.connector.types.TypeConverter$
via Stack Overflow by Brent Dorsey, 9 months ago
Exception during preparation of SELECT "item_uuid", "time_series_date", "item_uri" FROM "bug"."per_partition_limit_test" WHERE token("item_uuid") > ? AND token("item_uuid") <= ? AND PER PARTITION LIMIT 1 ALLOW FILTERING: line 1:154 no viable alternative at input 'PARTITION' (...("item_uuid") <= ? AND [PER] PARTITION...)
java.io.IOException: Exception during preparation of SELECT "id", "series", "rollup_minutes", "period_stamp", "event_type", "value" FROM "linkcurrent"."time_series_counters_2015_09" WHERE "id" = ? AND "series" = ? AND "rollup_minutes" = ? ALLOW FILTERING: Cannot convert object 0.0 of type class org.apache.spark.sql.types.Decimal to java.math.BigInteger.
at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:42)
at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:40)
at com.datastax.spark.connector.types.TypeConverter$JavaBigIntegerConverter$$anonfun$convertPF$15.applyOrElse(TypeConverter.scala:354)
at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:40)
at com.datastax.spark.connector.types.TypeConverter$JavaBigIntegerConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:352)
at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:53)
at com.datastax.spark.connector.types.TypeConverter$JavaBigIntegerConverter$.convert(TypeConverter.scala:352)
at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$26.applyOrElse(TypeConverter.scala:702)
at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:40)
at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:695)
at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:53)
at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:695)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$9.apply(CassandraTableScanRDD.scala:181)
at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.createStatement(CassandraTableScanRDD.scala:180)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.com$datastax$spark$connector$rdd$CassandraTableScanRDD$$fetchTokenRange(CassandraTableScanRDD.scala:202)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$13.apply(CassandraTableScanRDD.scala:229)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
at scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:308)
at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:207)
at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:70)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
at org.apache.spark.scheduler.Task.run(Task.scala:70)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
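The failing frames above (TypeConverter.convert, JavaBigIntegerConverter's convertPF, applyOrElse) follow a partial-function conversion pattern: a converter accepts only the JVM types its partial function lists and rejects everything else with the "Cannot convert object ..." message. The following is a minimal, self-contained sketch of that pattern with toy names; it is illustrative, not the connector's actual code, but it shows why a decimal value like 0.0 falls through when a java.math.BigInteger is expected.

```scala
import java.math.{BigDecimal => JBigDecimal, BigInteger}

// Toy stand-in for the converter pattern visible in the stack trace:
// convertPF enumerates the accepted input types; applyOrElse raises the
// "Cannot convert object ..." error for anything not listed.
object ToyBigIntegerConverter {
  private val convertPF: PartialFunction[Any, BigInteger] = {
    case x: BigInteger        => x
    case x: java.lang.Integer => BigInteger.valueOf(x.longValue)
    case x: java.lang.Long    => BigInteger.valueOf(x.longValue)
    case x: BigInt            => x.underlying
    // No case for decimal types: a decimal value such as 0.0 falls
    // through, mirroring the failure in the trace above.
  }

  def convert(obj: Any): BigInteger =
    convertPF.applyOrElse(obj, (x: Any) =>
      throw new IllegalArgumentException(
        s"Cannot convert object $x of type ${x.getClass} to java.math.BigInteger."))
}

object Demo {
  def main(args: Array[String]): Unit = {
    println(ToyBigIntegerConverter.convert(42L)) // prints 42
    try ToyBigIntegerConverter.convert(new JBigDecimal("0.0"))
    catch { case e: IllegalArgumentException => println(e.getMessage) }
  }
}
```

In practice this shape of error usually means the Cassandra column (here likely a varint, which the Java driver maps to java.math.BigInteger) is being compared against a decimal-typed value on the Spark side; making the predicate value integral before it reaches the connector is the usual direction of the fix.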

Users with the same issue

Unknown user, once, 1 year ago
