Recommended solutions based on your search

Solutions on the web

via DataStax JIRA by Alex Liu, 2 years ago
Job aborted due to stage failure: Task 0 in stage 16.0 failed 4 times, most recent failure: Lost task 0.3 in stage 16.0 (TID 25, 127.0.0.1): java.io.IOException: Exception during preparation of SELECT "id", "series", "rollup_minutes", "period_stamp
via DataStax JIRA by julien sebrien, 2 years ago
Job aborted due to stage failure: Task 0 in stage 2.0 failed 4 times, most recent failure: Lost task 0.3 in stage 2.0 (TID 5, Julien-Spectre): java.io.IOException: Exception during preparation of SELECT FROM "geneticio
via Stack Overflow by Aleksey Kiselev, 1 year ago
Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.io.IOException: Exception during preparation of SELECT FROM "chat"."dictionary" WHERE token("value_id") > ? AND token("value_id") <= ? ALLOW FILTERING: line 1:8 no viable alternative at input 'FROM' (SELECT [FROM]...)
via Stack Overflow by Mnemosyne, 10 months ago
Job aborted due to stage failure: Task 0 in stage 8463.0 failed 4 times, most recent failure: Lost task 0.3 in stage 8463.0 (TID 25187, worker2, executor 28): java.io.IOException: Exception during preparation of SELECT "sha256", "id", "label
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 16.0 failed 4 times, most recent failure: Lost task 0.3 in stage 16.0 (TID 25, 127.0.0.1): java.io.IOException: Exception during preparation of SELECT "id", "series", "rollup_minutes", "period_stamp", "event_type", "value" FROM "linkcurrent"."time_series_counters_2015_09" WHERE "id" = ? AND "series" = ? AND "rollup_minutes" = ? ALLOW FILTERING: Cannot convert object 0.0 of type class org.apache.spark.sql.types.Decimal to java.math.BigInteger.
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.createStatement(CassandraTableScanRDD.scala:188)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.com$datastax$spark$connector$rdd$CassandraTableScanRDD$$fetchTokenRange(CassandraTableScanRDD.scala:202)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$13.apply(CassandraTableScanRDD.scala:229)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$13.apply(CassandraTableScanRDD.scala:229)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at com.datastax.spark.connector.util.CountingIterator.hasNext(CountingIterator.scala:12)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
    at scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
    at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:308)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
    at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:207)
    at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:70)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
    at org.apache.spark.scheduler.Task.run(Task.scala:70)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 0.0 of type class org.apache.spark.sql.types.Decimal to java.math.BigInteger.
    at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:42)
    at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:40)
    at com.datastax.spark.connector.types.TypeConverter$JavaBigIntegerConverter$$anonfun$convertPF$15.applyOrElse(TypeConverter.scala:354)
    at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:40)
    at com.datastax.spark.connector.types.TypeConverter$JavaBigIntegerConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:352)
    at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:53)
    at com.datastax.spark.connector.types.TypeConverter$JavaBigIntegerConverter$.convert(TypeConverter.scala:352)
    at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$26.applyOrElse(TypeConverter.scala:702)
    at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:40)
    at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:695)
    at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:53)
    at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:695)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$9.apply(CassandraTableScanRDD.scala:181)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$9.apply(CassandraTableScanRDD.scala:180)
    at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.createStatement(CassandraTableScanRDD.scala:180)
    ... 18 more
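The root cause in the trace is the connector-side TypeConversionException: a bind value for the pushed-down WHERE clause arrives as org.apache.spark.sql.types.Decimal, but the target Cassandra column is a varint, and the connector's JavaBigIntegerConverter has no case for Decimal. A workaround that fits this class of failure is to cast the offending column to an integral type on the Spark side before the filter or join is pushed down. The sketch below illustrates the idea; the keyspace, table, and column names are taken from the trace, while the SparkSession setup, the connection host, and the rollup_lookups input table are assumptions, and it uses the newer DataFrame API rather than whatever the original job ran.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.LongType

// Hypothetical workaround sketch: only the Cassandra table and column names
// come from the stack trace above; everything else is assumed.
object DecimalToVarintWorkaround {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("decimal-to-varint-workaround")
      .config("spark.cassandra.connection.host", "127.0.0.1") // assumed host
      .getOrCreate()

    // Load the Cassandra table through the connector's DataFrame source.
    val counters = spark.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "linkcurrent",
                   "table"    -> "time_series_counters_2015_09"))
      .load()

    // Assume a driving DataFrame whose rollup_minutes column was inferred as
    // DecimalType (e.g. it came from a numeric aggregate). Joining or
    // filtering on it directly pushes a Decimal bind value down to the
    // connector, which cannot convert it to java.math.BigInteger for the
    // varint column -- the TypeConversionException seen above.
    val lookups = spark.table("rollup_lookups") // assumed input table

    // Workaround: cast the Decimal column to an integral type on the Spark
    // side, so the pushed-down value is one the BigInteger converter accepts.
    val fixed = lookups.withColumn("rollup_minutes",
      col("rollup_minutes").cast(LongType))

    fixed.join(counters, Seq("id", "series", "rollup_minutes")).show()

    spark.stop()
  }
}

If casting is not an option, the connector's DataFrame source also accepts a pushdown option; setting it to "false" in the read options should keep the filter on the Spark side and sidestep the conversion entirely, at the cost of scanning more rows from Cassandra.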