java.io.NotSerializableException


  • {code:java}
val Keywords = Seq("love", "hate", ":-)", ":)", ":-(", ":(")
val sc = new SparkContext(conf)
val batchDuration: Duration = Seconds(5)
val ssc = new StreamingContext(sc, batchDuration)
val stream: ReceiverInputDStream[Status] =
  TwitterUtils.createStream(ssc, Some(authorization), Nil, StorageLevel.MEMORY_ONLY_SER_2)

stream.flatMap(_.getText.toLowerCase.split("""\s+"""))
  .filter(Keywords.contains(_))
  .countByValueAndWindow(batchDuration, batchDuration)
  .transform((rdd, time) => rdd.map { case (keyword, count) => (keyword, count, now(time)) })
  .repartitionByCassandraReplica(KEYSPACE, "stream_read")
  .transform(rdd => rdd.joinWithCassandraTable(KEYSPACE, "stream_read", AllColumns, SomeColumns("key")))
  .map { case ((keyword, count, date), row) => (row.getString("value"), count, date) }
  .saveToCassandra(KEYSPACE, TABLE, SomeColumns("keyword", "count", "interval"))
{code}
    Exception:
{code}
2015-06-19 11:18:40,442 [sparkDriver-akka.actor.default-dispatcher-4] ERROR akka.actor.OneForOneStrategy - com.datastax.spark.connector.writer.DefaultRowWriter$$anon$1
java.io.NotSerializableException: com.datastax.spark.connector.writer.DefaultRowWriter$$anon$1
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1183)
{code}
    I had a similar NotSerializableException when using *joinWithCassandraTable()*, so I switched to *transform(rdd => rdd.joinWithCassandraTable(...))*. For the record, I'm using connector version *1.3.0-M1*; I had the same issue with *1.2.1*, and I guess the issue is still there in *1.4.0-M1*.
    via DOAN DuyHai
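The failure mode underneath is plain Java serialization: when Spark ships a task closure to executors, `ObjectOutputStream` walks every object the closure captures, and the first field whose class does not implement `Serializable` aborts with exactly this exception (the `writeObject0` frame in the trace above). A minimal stand-alone sketch of that mechanism, with hypothetical class names standing in for the connector's anonymous `DefaultRowWriter$$anon$1`:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class NotSerializableRepro {
    // Plain class that does NOT implement Serializable -- a stand-in for the
    // connector's anonymous RowWriter instance (hypothetical name).
    static class NonSerializableWriter { }

    // A Serializable holder that captures the writer by field, the same shape
    // as a Spark closure capturing a non-serializable object from the driver.
    static class Task implements Serializable {
        final NonSerializableWriter writer = new NonSerializableWriter();
    }

    public static void main(String[] args) throws IOException {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            // Serialization recurses into the `writer` field and fails there,
            // just like the driver-side error in the log above.
            out.writeObject(new Task());
        } catch (NotSerializableException e) {
            System.out.println("caught java.io.NotSerializableException: " + e.getMessage());
        }
    }
}
```

This is why moving the call inside `transform(rdd => rdd.joinWithCassandraTable(...))` can sidestep the error: the non-serializable object is then constructed per batch on the RDD rather than captured once in the DStream closure that Spark checkpoints and ships.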
