java.lang.IllegalArgumentException: Failed to get converter for field "getId" of type java.lang.Integer in com.datastax.spark.demo.JavaDemo$Product mapped to column "id" of "java_api.products"

DataStax JIRA | Purvi | 1 year ago
  1. 0

    Java API for Spark Cassandra Connector - tutorial for blog post · GitHub

    github.com | 5 months ago
    java.lang.IllegalArgumentException: Failed to get converter for field "getId" of type java.lang.Integer in com.datastax.spark.demo.JavaDemo$Product mapped to column "id" of "java_api.products"
  2. 0

    I am trying to run the JavaDemo sample from the DataStax tutorial on using the Spark Cassandra Connector from Java. The blog post: http://www.datastax.com/dev/blog/accessing-cassandra-from-spark-in-java Complete code can be found at: https://gist.github.com/jacek-lewandowski/278bfc936ca990bee35a I am getting the following error (see the sketch after the last entry in this list for the bean and save call in question):
    {noformat}
    Exception in thread "main" java.lang.IllegalArgumentException: Failed to get converter for field "getId" of type java.lang.Integer in com.datastax.spark.demo.JavaDemo$Product mapped to column "id" of "java_api.products"
      at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$5.apply(MappedToGettableDataConverter.scala:155)
      at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$5.apply(MappedToGettableDataConverter.scala:148)
      at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
      at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
      at scala.collection.immutable.Range.foreach(Range.scala:141)
      at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
      at scala.collection.AbstractTraversable.map(Traversable.scala:105)
      at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.<init>(MappedToGettableDataConverter.scala:148)
      at com.datastax.spark.connector.writer.MappedToGettableDataConverter$.apply(MappedToGettableDataConverter.scala:18)
      at com.datastax.spark.connector.writer.DefaultRowWriter.<init>(DefaultRowWriter.scala:17)
      at com.datastax.spark.connector.writer.DefaultRowWriter$$anon$1.rowWriter(DefaultRowWriter.scala:31)
      at com.datastax.spark.connector.writer.DefaultRowWriter$$anon$1.rowWriter(DefaultRowWriter.scala:29)
      at com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:269)
      at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
      at com.datastax.spark.connector.japi.RDDJavaFunctions.saveToCassandra(RDDJavaFunctions.java:61)
      at com.datastax.spark.connector.japi.RDDAndDStreamCommonJavaFunctions$WriterBuilder.saveToCassandra(RDDAndDStreamCommonJavaFunctions.java:443)
      at com.datastax.spark.demo.JavaDemo.generateData(JavaDemo.java:76)
      at com.datastax.spark.demo.JavaDemo.run(JavaDemo.java:37)
      at com.datastax.spark.demo.JavaDemo.main(JavaDemo.java:205)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:497)
      at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
      at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
      at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
      at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
      at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    {noformat}

    DataStax JIRA | 1 year ago | Purvi
    java.lang.IllegalArgumentException: Failed to get converter for field "getId" of type java.lang.Integer in com.datastax.spark.demo.JavaDemo$Product mapped to column "id" of "java_api.products"
  3. 0

    Spark (1.5.2) & Cassandra (2.2.5) Java Connector exception

    Stack Overflow | 8 months ago | Ashis
    java.lang.IllegalArgumentException: Failed to get converter for field "getCompanyid" of type java.lang.String in SampleDBOperation$Employee mapped to column "companyid" of "test.employee"
  4. 0

    Given tables {code} CREATE TYPE Point (x Int, y int) ; CREATE TABLE kv (k int PRIMARY KEY, loc list<frozen<Point>> ) ; INSERT INTO kv (k , loc ) VALUES ( 1, [{x:1,y:1}]) ; {code} Reading works correctly but writing fails {code} case class Point (x: Int, y: Int) case class KVRow (k: Int, loc: Seq[Point]) sc.cassandraTable[KVRow]("test", "kv").collect sc.cassandraTable[KVRow]("test", "kv").saveToCassandra("test", "kv") {code} Exception on write {code} java.lang.IllegalArgumentException: Failed to get converter for field "loc" of type scala.Seq[Point] in KVRow mapped to column "loc" of "test.kv" at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$5.apply(MappedToGettableDataConverter.scala:164) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$5.apply(MappedToGettableDataConverter.scala:157) at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) at scala.collection.immutable.Range.foreach(Range.scala:141) at scala.collection.TraversableLike$class.map(TraversableLike.scala:244) at scala.collection.AbstractTraversable.map(Traversable.scala:105) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.<init>(MappedToGettableDataConverter.scala:157) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$.apply(MappedToGettableDataConverter.scala:20) at com.datastax.spark.connector.writer.DefaultRowWriter.<init>(DefaultRowWriter.scala:17) at com.datastax.spark.connector.writer.DefaultRowWriter$$anon$1.rowWriter(DefaultRowWriter.scala:31) at com.datastax.spark.connector.writer.DefaultRowWriter$$anon$1.rowWriter(DefaultRowWriter.scala:29) at com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:280) at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:60) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:65) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:67) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:69) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:71) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:73) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:75) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:77) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:79) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:81) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:83) at 
$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:85) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:87) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:89) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:91) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:93) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:95) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:97) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:99) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:101) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:103) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:105) at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:107) at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:109) at $iwC$$iwC$$iwC$$iwC.<init>(<console>:111) at $iwC$$iwC$$iwC.<init>(<console>:113) at $iwC$$iwC.<init>(<console>:115) at $iwC.<init>(<console>:117) at <init>(<console>:119) at .<init>(<console>:123) at .<clinit>(<console>) at .<init>(<console>:7) at .<clinit>(<console>) at $print(<console>) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065) at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346) at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840) at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871) at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819) at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857) at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902) at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814) at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657) at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665) at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670) at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997) at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135) at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945) at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059) at com.datastax.bdp.spark.SparkReplMain$.main(SparkReplMain.scala:16) at com.datastax.bdp.spark.SparkReplMain.main(SparkReplMain.scala) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at 
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731) at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181) at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121) at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:47) at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala) Caused by: java.lang.ClassNotFoundException: $line48.$read at java.net.URLClassLoader.findClass(URLClassLoader.java:381) at java.lang.ClassLoader.loadClass(ClassLoader.java:424) at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331) at java.lang.ClassLoader.loadClass(ClassLoader.java:357) at java.lang.Class.forName0(Native Method) at java.lang.Class.forName(Class.java:348) at scala.reflect.runtime.JavaMirrors$JavaMirror.javaClass(JavaMirrors.scala:500) at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$classToJava$1.apply(JavaMirrors.scala:1170) at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$classToJava$1.apply(JavaMirrors.scala:1162) at scala.reflect.runtime.TwoWayCache.toJava(TwoWayCache.scala:49) at scala.reflect.runtime.JavaMirrors$JavaMirror.classToJava(JavaMirrors.scala:1162) at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$classToJava$1.apply(JavaMirrors.scala:1177) at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$classToJava$1.apply(JavaMirrors.scala:1162) at scala.reflect.runtime.TwoWayCache.toJava(TwoWayCache.scala:49) at scala.reflect.runtime.JavaMirrors$JavaMirror.classToJava(JavaMirrors.scala:1162) at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$classToJava$1.apply(JavaMirrors.scala:1177) at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$classToJava$1.apply(JavaMirrors.scala:1162) at scala.reflect.runtime.TwoWayCache.toJava(TwoWayCache.scala:49) at scala.reflect.runtime.JavaMirrors$JavaMirror.classToJava(JavaMirrors.scala:1162) at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$classToJava$1.apply(JavaMirrors.scala:1177) at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$classToJava$1.apply(JavaMirrors.scala:1162) at scala.reflect.runtime.TwoWayCache.toJava(TwoWayCache.scala:49) at scala.reflect.runtime.JavaMirrors$JavaMirror.classToJava(JavaMirrors.scala:1162) at scala.reflect.runtime.JavaMirrors$JavaMirror.typeToJavaClass(JavaMirrors.scala:1258) at scala.reflect.runtime.JavaMirrors$JavaMirror.runtimeClass(JavaMirrors.scala:202) at scala.reflect.runtime.JavaMirrors$JavaMirror.runtimeClass(JavaMirrors.scala:65) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.<init>(MappedToGettableDataConverter.scala:133) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$.apply(MappedToGettableDataConverter.scala:20) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.converter(MappedToGettableDataConverter.scala:111) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.com$datastax$spark$connector$writer$MappedToGettableDataConverter$$anon$$converter(MappedToGettableDataConverter.scala:125) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.converter(MappedToGettableDataConverter.scala:63) at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.com$datastax$spark$connector$writer$MappedToGettableDataConverter$$anon$$converter(MappedToGettableDataConverter.scala:125) at 
com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$5.apply(MappedToGettableDataConverter.scala:161) ... 80 more {code} This may only affect the shell (because of the CNF exception), but we should still add some tests to make sure this works.

    DataStax JIRA | 8 months ago | Russell Spitzer
    java.lang.IllegalArgumentException: Failed to get converter for field "loc" of type scala.Seq[Point] in KVRow mapped to column "loc" of "test.kv"
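
All of the JavaDemo and Stack Overflow reports above fail at the same point: while saveToCassandra is building its row writer, MappedToGettableDataConverter cannot obtain a converter for one of the mapped bean properties ("getId"/java.lang.Integer in the JavaDemo reports, "getCompanyid"/java.lang.String in the Stack Overflow one). The last entry hits the same converter lookup for a UDT field, but its underlying cause is a ClassNotFoundException for a class defined in the Spark REPL, so it is a separate problem. For reference, here is a minimal sketch of the bean and save call, assuming the table and bean shape from the linked blog post and gist; the name and parents fields, the sample data, and the ProductSaveSketch class name are taken from that tutorial or invented for illustration, not from the traces above. It shows the conventions the default JavaBean mapping generally expects: a public (static, if nested) Serializable bean whose getter/setter pairs line up with the column names. If a bean already looks like this and even a basic type such as java.lang.Integer fails to get a converter, a mismatch between the connector artifact and the Spark/Scala versions on the classpath is worth checking before the bean itself.

{code}
import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

import java.io.Serializable;
import java.util.Arrays;
import java.util.List;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class ProductSaveSketch {

    // Bean shape roughly as in the tutorial: public, static (because it is nested),
    // Serializable, with a no-arg constructor and getters/setters named after the
    // columns of java_api.products (id, name, parents).
    public static class Product implements Serializable {
        private Integer id;
        private String name;
        private List<Integer> parents;

        public Product() { }

        public Product(Integer id, String name, List<Integer> parents) {
            this.id = id;
            this.name = name;
            this.parents = parents;
        }

        public Integer getId() { return id; }
        public void setId(Integer id) { this.id = id; }

        public String getName() { return name; }
        public void setName(String name) { this.name = name; }

        public List<Integer> getParents() { return parents; }
        public void setParents(List<Integer> parents) { this.parents = parents; }
    }

    public static void saveProducts(JavaSparkContext sc) {
        JavaRDD<Product> products = sc.parallelize(Arrays.asList(
                new Product(1, "product 1", Arrays.asList(0)),
                new Product(2, "product 2", Arrays.asList(0))));

        // Same entry point that appears at the top of the Java frames in the traces:
        // WriterBuilder.saveToCassandra -> RDDJavaFunctions.saveToCassandra -> TableWriter.
        javaFunctions(products)
                .writerBuilder("java_api", "products", mapToRow(Product.class))
                .saveToCassandra();
    }
}
{code}

In every trace above the exception is raised inside this call, while the row writer is being constructed (TableWriter / DefaultRowWriter / MappedToGettableDataConverter) and before any rows are written.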

    Root Cause Analysis

    1. java.lang.IllegalArgumentException

      Failed to get converter for field "getId" of type java.lang.Integer in com.datastax.spark.demo.JavaDemo$Product mapped to column "id" of "java_api.products"

      at com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$5.apply()
    2. spark-cassandra-connector
      MappedToGettableDataConverter$$anon$1$$anonfun$5.apply
      1. com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$5.apply(MappedToGettableDataConverter.scala:155)
      2. com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1$$anonfun$5.apply(MappedToGettableDataConverter.scala:148)
      2 frames
    3. Scala
      AbstractTraversable.map
      1. scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
      2. scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
      3. scala.collection.immutable.Range.foreach(Range.scala:141)
      4. scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
      5. scala.collection.AbstractTraversable.map(Traversable.scala:105)
      5 frames
    4. spark-cassandra-connector
      RDDFunctions.saveToCassandra
      1. com.datastax.spark.connector.writer.MappedToGettableDataConverter$$anon$1.<init>(MappedToGettableDataConverter.scala:148)
      2. com.datastax.spark.connector.writer.MappedToGettableDataConverter$.apply(MappedToGettableDataConverter.scala:18)
      3. com.datastax.spark.connector.writer.DefaultRowWriter.<init>(DefaultRowWriter.scala:17)
      4. com.datastax.spark.connector.writer.DefaultRowWriter$$anon$1.rowWriter(DefaultRowWriter.scala:31)
      5. com.datastax.spark.connector.writer.DefaultRowWriter$$anon$1.rowWriter(DefaultRowWriter.scala:29)
      6. com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:269)
      7. com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
      7 frames
    5. com.datastax.spark
      JavaDemo.main
      1. com.datastax.spark.connector.japi.RDDJavaFunctions.saveToCassandra(RDDJavaFunctions.java:61)
      2. com.datastax.spark.connector.japi.RDDAndDStreamCommonJavaFunctions$WriterBuilder.saveToCassandra(RDDAndDStreamCommonJavaFunctions.java:443)
      3. com.datastax.spark.demo.JavaDemo.generateData(JavaDemo.java:76)
      4. com.datastax.spark.demo.JavaDemo.run(JavaDemo.java:37)
      5. com.datastax.spark.demo.JavaDemo.main(JavaDemo.java:205)
      5 frames
    6. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:497)
      4 frames
    7. Spark
      SparkSubmit.main
      1. org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
      2. org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
      3. org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
      4. org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
      5. org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      5 frames
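
Every frame group above sits on the write path entered from JavaDemo.generateData (JavaDemo.java:76): WriterBuilder.saveToCassandra leads through TableWriter and DefaultRowWriter into MappedToGettableDataConverter, where the converter lookup fails. One way to narrow such a failure down is to check whether the same bean maps cleanly on the read side. The sketch below is a hedged diagnostic, not part of the original reports: it reuses the hypothetical Product bean from the sketch after the results list, and the ProductReadCheck class name, the master/host arguments, and the conclusion in the comments are illustrative assumptions.

{code}
import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapRowTo;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class ProductReadCheck {

    public static void main(String[] args) {
        // args[0]: Spark master URL, args[1]: Cassandra contact point.
        SparkConf conf = new SparkConf()
                .setAppName("ProductReadCheck")
                .setMaster(args[0])
                .set("spark.cassandra.connection.host", args[1]);
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Read java_api.products back into the same bean used for writing.
        // If this mapping works but saveToCassandra still cannot find a converter
        // for a basic type such as java.lang.Integer, the write-side classpath
        // (connector vs. Spark/Scala versions) is the more likely culprit than
        // the bean definition itself.
        JavaRDD<ProductSaveSketch.Product> products = javaFunctions(sc)
                .cassandraTable("java_api", "products",
                        mapRowTo(ProductSaveSketch.Product.class));

        System.out.println("Read " + products.count() + " products");

        sc.stop();
    }
}
{code}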