java.lang.IllegalArgumentException: Failed to map constructor parameter timestamp in Bar to a column of MyNamespace

Stack Overflow | Luke | 7 months ago
  1. Mapping cassandra row to parametrized type in Spark RDD

    Stack Overflow | 7 months ago | Luke
    java.lang.IllegalArgumentException: Failed to map constructor parameter timestamp in Bar to a column of MyNamespace
  2. You are probably trying to create an actor with actorSystem.actorOf(Props(classOf[AnyActor], args...)) where args do not match the AnyActor constructor parameters (see the sketch after this list).
  3. Properties of subclasses may have to be referenced in a different way. More on this here: https://goo.gl/FvnBXb
  4. Some bots are sending malformed HTTP requests to your site. Try to find their IP addresses in the access logs, then ask the bot operators to fix them or blacklist the addresses.
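    A minimal sketch of the mismatch described in tip 2, assuming a hypothetical MyActor with a (String, Int) constructor: Props(classOf[...], args...) is not checked at compile time, so an argument list that does not line up with the actor's constructor only fails when the actor is created.

      import akka.actor.{Actor, ActorSystem, Props}

      // Hypothetical actor with a two-argument constructor.
      class MyActor(name: String, retries: Int) extends Actor {
        def receive = {
          case msg => println(s"$name (retries=$retries) received: $msg")
        }
      }

      object PropsMismatchExample extends App {
        val system = ActorSystem("demo")

        // Compiles, but fails at actor creation time, because a single argument
        // cannot be matched against the (String, Int) constructor:
        // val broken = system.actorOf(Props(classOf[MyActor], "worker"))

        // The argument list matches the constructor, so creation succeeds.
        val ok = system.actorOf(Props(classOf[MyActor], "worker", 3))
        ok ! "hello"

        system.terminate()
      }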

    Root Cause Analysis

    1. java.lang.IllegalArgumentException

      Failed to map constructor parameter timestamp in Bar to a column of MyNamespace

      at com.datastax.spark.connector.mapper.DefaultColumnMapper$$anonfun$4$$anonfun$apply$1.apply()
    2. spark-cassandra-connector
      DefaultColumnMapper$$anonfun$4$$anonfun$apply$1.apply
      1. com.datastax.spark.connector.mapper.DefaultColumnMapper$$anonfun$4$$anonfun$apply$1.apply(DefaultColumnMapper.scala:78)
      2. com.datastax.spark.connector.mapper.DefaultColumnMapper$$anonfun$4$$anonfun$apply$1.apply(DefaultColumnMapper.scala:78)
      2 frames
    3. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    4. spark-cassandra-connector
      DefaultColumnMapper$$anonfun$4.apply
      1. com.datastax.spark.connector.mapper.DefaultColumnMapper$$anonfun$4.apply(DefaultColumnMapper.scala:78)
      2. com.datastax.spark.connector.mapper.DefaultColumnMapper$$anonfun$4.apply(DefaultColumnMapper.scala:76)
      2 frames
    5. Scala
      TraversableLike$WithFilter.map
      1. scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
      2. scala.collection.immutable.List.foreach(List.scala:318)
      3. scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
      3 frames
    6. spark-cassandra-connector
      CassandraTableScanRDD.getPartitions
      1. com.datastax.spark.connector.mapper.DefaultColumnMapper.columnMapForReading(DefaultColumnMapper.scala:76)
      2. com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter.<init>(GettableDataToMappedTypeConverter.scala:56)
      3. com.datastax.spark.connector.rdd.reader.ClassBasedRowReader.<init>(ClassBasedRowReader.scala:23)
      4. com.datastax.spark.connector.rdd.reader.ClassBasedRowReaderFactory.rowReader(ClassBasedRowReader.scala:48)
      5. com.datastax.spark.connector.rdd.reader.ClassBasedRowReaderFactory.rowReader(ClassBasedRowReader.scala:43)
      6. com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.rowReader(CassandraTableRowReaderProvider.scala:48)
      7. com.datastax.spark.connector.rdd.CassandraTableScanRDD.rowReader$lzycompute(CassandraTableScanRDD.scala:59)
      8. com.datastax.spark.connector.rdd.CassandraTableScanRDD.rowReader(CassandraTableScanRDD.scala:59)
      9. com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.verify(CassandraTableRowReaderProvider.scala:147)
      10. com.datastax.spark.connector.rdd.CassandraTableScanRDD.verify(CassandraTableScanRDD.scala:59)
      11. com.datastax.spark.connector.rdd.CassandraTableScanRDD.getPartitions(CassandraTableScanRDD.scala:143)
      11 frames
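
    The frames above show where this fails: DefaultColumnMapper.columnMapForReading runs while CassandraTableScanRDD builds its row reader (triggered from getPartitions, i.e. when the RDD is first evaluated), and it must pair every constructor parameter of the target class with a table column; the parameter timestamp in Bar matched no column of the queried table, hence the IllegalArgumentException. Below is a minimal sketch of the usual fix, with assumed keyspace, table, and column names (my_keyspace.bar with columns id and event_time): name the case class parameters so that the connector's default camelCase-to-snake_case mapping resolves each of them to a column.

      import com.datastax.spark.connector._            // adds cassandraTable to SparkContext
      import org.apache.spark.{SparkConf, SparkContext}

      // Assumed schema:
      //   CREATE TABLE my_keyspace.bar (id int PRIMARY KEY, event_time timestamp);
      //
      // Every constructor parameter must resolve to a column. A parameter named
      // timestamp would match nothing in this table; eventTime maps to the
      // event_time column via the default camelCase -> snake_case rule.
      case class Bar(id: Int, eventTime: java.util.Date)

      object BarReadExample extends App {
        val conf = new SparkConf()
          .setAppName("bar-read")
          .setMaster("local[2]")                               // assumed local run
          .set("spark.cassandra.connection.host", "127.0.0.1") // assumed contact point
        val sc = new SparkContext(conf)

        // Builds a CassandraTableScanRDD[Bar]; the column mapping in the trace
        // above happens when this RDD is evaluated.
        val bars = sc.cassandraTable[Bar]("my_keyspace", "bar")
        bars.collect().foreach(println)

        sc.stop()
      }

    If the parameter names cannot be made to match the columns, the connector also accepts a custom ColumnMapper for the target class in place of the DefaultColumnMapper seen in the trace.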