com.datastax.driver.core.exceptions.InvalidQueryException: Invalid null value for partition key part key


  • {code}
    com.datastax.driver.core.exceptions.InvalidQueryException: Invalid null value for partition key part key
        at com.datastax.driver.core.exceptions.InvalidQueryException.copy(InvalidQueryException.java:35)
        at com.datastax.driver.core.DefaultResultSetFuture.extractCauseFromExecutionException(DefaultResultSetFuture.java:289)
        at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:205)
        at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:52)
        at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:33)
        at com.sun.proxy.$Proxy12.execute(Unknown Source)
        at com.datastax.spark.connector.rdd.CassandraJoinRDD$$anonfun$fetchIterator$1.apply(CassandraJoinRDD.scala:233)
        at com.datastax.spark.connector.rdd.CassandraJoinRDD$$anonfun$fetchIterator$1.apply(CassandraJoinRDD.scala:231)
        at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:396)
        at com.datastax.spark.connector.util.CountingIterator.hasNext(CountingIterator.scala:12)
        at scala.collection.Iterator$class.foreach(Iterator.scala:750)
        at com.datastax.spark.connector.util.CountingIterator.foreach(CountingIterator.scala:4)
        at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
        at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:295)
        at com.datastax.spark.connector.util.CountingIterator.to(CountingIterator.scala:4)
        at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:287)
        at com.datastax.spark.connector.util.CountingIterator.toBuffer(CountingIterator.scala:4)
        at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:274)
        at com.datastax.spark.connector.util.CountingIterator.toArray(CountingIterator.scala:4)
        at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:885)
        at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:885)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1765)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1765)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
        at org.apache.spark.scheduler.Task.run(Task.scala:70)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:744)
    {code}
    This came up on Jenkins for a doc-only change PR. We've also seen Jenkins intermittently fail the actor streaming job, which uses the same API.
    via Russell Spitzer
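A short tip for this exception: the driver rejects any statement that binds null to a partition-key column, and `CassandraJoinRDD` issues one such SELECT per left-side record, so a single record with a null key field fails the whole task. A minimal sketch of the fix, in plain Scala (the `Record` case class and the sample data are illustrative assumptions, not taken from the trace): drop null-keyed records before calling `joinWithCassandraTable`.

```scala
// Illustrative sketch only: Cassandra rejects a null partition-key value,
// so filter such records out before they reach the per-key queries that
// CassandraJoinRDD runs. Record, its fields, and the data are assumptions.
case class Record(key: String, value: Int)

object NullKeyFilter {
  // Keep only records whose partition-key field is non-null.
  def joinable(rows: Seq[Record]): Seq[Record] =
    rows.filter(_.key != null)

  def main(args: Array[String]): Unit = {
    val rows = Seq(Record("a", 1), Record(null, 2), Record("b", 3))
    // In a real job this filter would precede
    // rdd.joinWithCassandraTable("ks", "table").
    println(joinable(rows)) // List(Record(a,1), Record(b,3))
  }
}
```

In a Spark job the same `filter` goes on the RDD immediately before the join; the point is that the null check must happen on the Spark side, since the server can only reject the query.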
  • Pig filter fails due to unexpected data
    via Unknown author
