java.lang.AbstractMethodError: pyspark_cassandra.DeferringRowReader.read(Lcom/datastax/driver/core/Row;Lcom/datastax/spark/connector/CassandraRowMetadata;)Ljava/lang/Object;

GitHub | orencp | 7 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. AbstractMethodError while using pyspark with cassandra

     Stack Overflow | 7 months ago | orenco
     java.lang.AbstractMethodError: pyspark_cassandra.DeferringRowReader.read(Lcom/datastax/driver/core/Row;Lcom/datastax/spark/connector/CassandraRowMetadata;)Ljava/lang/Object;

  2. AbstractMethodError while using pyspark with cassandra

     GitHub | 7 months ago | orencp
     java.lang.AbstractMethodError: pyspark_cassandra.DeferringRowReader.read(Lcom/datastax/driver/core/Row;Lcom/datastax/spark/connector/CassandraRowMetadata;)Ljava/lang/Object;

    Root Cause Analysis

    1. java.lang.AbstractMethodError

      pyspark_cassandra.DeferringRowReader.read(Lcom/datastax/driver/core/Row;Lcom/datastax/spark/connector/CassandraRowMetadata;)Ljava/lang/Object;

      at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$17.apply()
    2. spark-cassandra-connector
      CassandraTableScanRDD$$anonfun$17.apply
      1. com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$17.apply(CassandraTableScanRDD.scala:315)
      2. com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$17.apply(CassandraTableScanRDD.scala:315)
      2 frames
    3. Scala
      Iterator$$anon$13.next
      1. scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
      2. scala.collection.Iterator$$anon$13.next(Iterator.scala:372)
      2 frames
    4. spark-cassandra-connector
      CountingIterator.next
      1. com.datastax.spark.connector.util.CountingIterator.next(CountingIterator.scala:16)
      1 frame
    5. Scala
      AbstractIterator.foreach
      1. scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
      2. scala.collection.Iterator$GroupedIterator.takeDestructively(Iterator.scala:914)
      3. scala.collection.Iterator$GroupedIterator.go(Iterator.scala:929)
      4. scala.collection.Iterator$GroupedIterator.fill(Iterator.scala:968)
      5. scala.collection.Iterator$GroupedIterator.hasNext(Iterator.scala:972)
      6. scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
      7. scala.collection.Iterator$class.foreach(Iterator.scala:727)
      8. scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
      8 frames
    6. Spark
      PythonRunner$WriterThread.run
      1. org.apache.spark.api.python.PythonRDD$.writeIteratorToStream(PythonRDD.scala:452)
      2. org.apache.spark.api.python.PythonRunner$WriterThread$$anonfun$run$3.apply(PythonRDD.scala:280)
      3. org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1766)
      4. org.apache.spark.api.python.PythonRunner$WriterThread.run(PythonRDD.scala:239)
      4 frames
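A `java.lang.AbstractMethodError` at a `RowReader.read` call like this typically signals a binary mismatch: pyspark_cassandra was compiled against a spark-cassandra-connector whose `RowReader` interface did not yet take a `CassandraRowMetadata` argument, while a newer connector is on the runtime classpath (or vice versa). The usual remedy is to make the two artifacts agree on the connector version. The sketch below is illustrative only — the package coordinates, version numbers, and `my_job.py` are assumptions, not values taken from this report; check which connector version your pyspark-cassandra build was actually compiled against:

```shell
# Illustrative assumption: pin both packages to a matching connector line
# so that the RowReader.read signatures agree at compile time and runtime.
spark-submit \
  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.5,anguenot:pyspark-cassandra:0.7.0 \
  --conf spark.cassandra.connection.host=127.0.0.1 \
  my_job.py
```

Mixing a locally built pyspark-cassandra jar with a `--packages`-resolved connector of a different major version reintroduces the same linkage error, so prefer resolving both through one mechanism.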