java.io.IOException: Exception during preparation of SELECT "dtid", "created", "deleted" FROM "metricsdb"."device_counts_by_hour" WHERE token("period") > ? AND token("period") <= ? AND period = 2016033123 ALLOW FILTERING: period cannot be restricted by more than one relation if it includes an Equal

GitHub | venkatesh-rudraraju | 10 months ago
  1. Not allowed to query with all of partition Keys(cassandra) with Equals

    GitHub | 10 months ago | venkatesh-rudraraju
    java.io.IOException: Exception during preparation of SELECT "dtid", "created", "deleted" FROM "metricsdb"."device_counts_by_hour" WHERE token("period") > ? AND token("period") <= ? AND period = 2016033123 ALLOW FILTERING: period cannot be restricted by more than one relation if it includes an Equal
  2. collector fails after upgrading from 1.39.9 to 1.40.2

    GitHub | 9 months ago | liyichao
    com.datastax.driver.core.exceptions.InvalidQueryException: Partition KEY part service_name cannot be restricted by IN relation (only the last part of the partition key can)

    Root Cause Analysis

    1. com.datastax.driver.core.exceptions.InvalidQueryException

      period cannot be restricted by more than one relation if it includes an Equal

      at com.datastax.driver.core.exceptions.InvalidQueryException.copy()
    2. DataStax Java Driver for Apache Cassandra - Core
      AbstractSession.prepare
      1. com.datastax.driver.core.exceptions.InvalidQueryException.copy(InvalidQueryException.java:50)[cassandra-driver-core-3.0.0.jar:na]
      2. com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:37)[cassandra-driver-core-3.0.0.jar:na]
      3. com.datastax.driver.core.AbstractSession.prepare(AbstractSession.java:110)[cassandra-driver-core-3.0.0.jar:na]
      3 frames
    3. spark-cassandra-connector
      SessionProxy.invoke
      1. com.datastax.spark.connector.cql.PreparedStatementCache$.prepareStatement(PreparedStatementCache.scala:45)[spark-cassandra-connector_2.10-1.6.0-M2.jar:1.6.0-M2]
      2. com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:28)[spark-cassandra-connector_2.10-1.6.0-M2.jar:1.6.0-M2]
      2 frames
    4. com.sun.proxy
      $Proxy21.prepare
      1. com.sun.proxy.$Proxy21.prepare(Unknown Source)[na:na]
      1 frame
    5. spark-cassandra-connector
      CassandraTableScanRDD$$anonfun$18.apply
      1. com.datastax.spark.connector.rdd.CassandraTableScanRDD.createStatement(CassandraTableScanRDD.scala:274)[spark-cassandra-connector_2.10-1.6.0-M2.jar:1.6.0-M2]
      2. com.datastax.spark.connector.rdd.CassandraTableScanRDD.com$datastax$spark$connector$rdd$CassandraTableScanRDD$$fetchTokenRange(CassandraTableScanRDD.scala:302)[spark-cassandra-connector_2.10-1.6.0-M2.jar:1.6.0-M2]
      3. com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$18.apply(CassandraTableScanRDD.scala:328)[spark-cassandra-connector_2.10-1.6.0-M2.jar:1.6.0-M2]
      4. com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$18.apply(CassandraTableScanRDD.scala:328)[spark-cassandra-connector_2.10-1.6.0-M2.jar:1.6.0-M2]
      4 frames
    6. Scala
      Iterator$$anon$13.hasNext
      1. scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)[scala-library-2.10.5.jar:na]
      1 frame
    7. spark-cassandra-connector
      CountingIterator.hasNext
      1. com.datastax.spark.connector.util.CountingIterator.hasNext(CountingIterator.scala:12)[spark-cassandra-connector_2.10-1.6.0-M2.jar:1.6.0-M2]
      1 frame
    8. Scala
      Iterator$$anon$11.hasNext
      1. scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)[scala-library-2.10.5.jar:na]
      1 frame
    9. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:126)[spark-core_2.10-1.6.0.jar:1.6.0]
      2. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)[spark-core_2.10-1.6.0.jar:1.6.0]
      3. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)[spark-core_2.10-1.6.0.jar:1.6.0]
      4. org.apache.spark.scheduler.Task.run(Task.scala:89)[spark-core_2.10-1.6.0.jar:1.6.0]
      5. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)[spark-core_2.10-1.6.0.jar:1.6.0]
      5 frames
    10. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)[na:1.8.0_60]
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)[na:1.8.0_60]
      3. java.lang.Thread.run(Thread.java:745)[na:1.8.0_60]
      3 frames
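The trace shows where the conflict originates: `CassandraTableScanRDD.createStatement` prepares one statement per token range, appending the user's `.where(...)` filter to the connector's own `token("period") > ? AND token("period") <= ?` bounds. Because `period` is a partition key column, Cassandra rejects the combination of a token-range restriction and an equality on the same column. A minimal sketch of the mechanism and the usual workaround — the helper names here are hypothetical, not the connector's actual code:

```java
// Hypothetical sketch of how the invalid CQL arises: the connector splits the
// table scan by token range, then appends the user's filter verbatim, so an
// equality on partition key "period" collides with the token("period") bounds.
public class CqlSketch {

    // Mimics the connector's per-token-range statement plus the user filter.
    static String scanStatement(String userWhere) {
        String cql = "SELECT \"dtid\", \"created\", \"deleted\" "
                + "FROM \"metricsdb\".\"device_counts_by_hour\" "
                + "WHERE token(\"period\") > ? AND token(\"period\") <= ?";
        if (!userWhere.isEmpty()) {
            // The conflicting restriction on "period" is appended here.
            cql += " AND " + userWhere;
        }
        return cql + " ALLOW FILTERING";
    }

    // Workaround: when the full partition key is pinned, issue a direct
    // single-partition lookup instead of combining it with token ranges.
    static String partitionLookup(long period) {
        return "SELECT \"dtid\", \"created\", \"deleted\" "
                + "FROM \"metricsdb\".\"device_counts_by_hour\" "
                + "WHERE \"period\" = " + period;
    }

    public static void main(String[] args) {
        System.out.println(scanStatement("period = 2016033123"));
        System.out.println(partitionLookup(2016033123L));
    }
}
```

In practice this error tends to appear when `.where()` restricts every partition key column of the table. Running the single-partition query directly through the driver (for example inside `CassandraConnector.withSessionDo`), or joining an RDD of partition keys with `joinWithCassandraTable`, avoids the token-range scan and the conflicting predicates.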