java.lang.ClassCastException: io.confluent.kafka.serializers.NonRecordContainer cannot be cast to java.lang.CharSequence

Google Groups | Unknown author | 5 months ago
Related issues:

  1. issues about converting json-format data in Kafka to parquet-format using Kafka Connector?

     Google Groups | 5 months ago | Unknown author
     java.lang.ClassCastException: io.confluent.kafka.serializers.NonRecordContainer cannot be cast to java.lang.CharSequence

  2. 'Re: How to use java-class with JSON schema?' - MARC

     marc.info | 2 months ago
     java.lang.ClassCastException: com.mediamath.data.util.Timestamp cannot be cast to java.lang.CharSequence

  3. collectionFormat shouldn't be applied when parsing body

     GitHub | 2 months ago | cmpitg
     java.lang.ClassCastException: clojure.lang.PersistentArrayMap cannot be cast to java.lang.CharSequence


    Root Cause Analysis

    1. java.lang.ClassCastException

      io.confluent.kafka.serializers.NonRecordContainer cannot be cast to java.lang.CharSequence

      at org.apache.avro.generic.GenericDatumWriter.writeString()
    2. Apache Avro
      DataFileWriter.append
      1. org.apache.avro.generic.GenericDatumWriter.writeString(GenericDatumWriter.java:213)
      2. org.apache.avro.generic.GenericDatumWriter.writeString(GenericDatumWriter.java:208)
      3. org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:76)
      4. org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:58)
      5. org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:290)
    3. io.confluent.connect
      HdfsSinkTask.put
      1. io.confluent.connect.hdfs.avro.AvroRecordWriterProvider$1.write(AvroRecordWriterProvider.java:64)
      2. io.confluent.connect.hdfs.avro.AvroRecordWriterProvider$1.write(AvroRecordWriterProvider.java:59)
      3. io.confluent.connect.hdfs.TopicPartitionWriter.writeRecord(TopicPartitionWriter.java:487)
      4. io.confluent.connect.hdfs.TopicPartitionWriter.write(TopicPartitionWriter.java:264)
      5. io.confluent.connect.hdfs.DataWriter.write(DataWriter.java:234)
      6. io.confluent.connect.hdfs.HdfsSinkTask.put(HdfsSinkTask.java:91)
    4. org.apache.kafka
      WorkerTask.run
      1. org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:384)
      2. org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:228)
      3. org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:171)
      4. org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:143)
      5. org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:140)
      6. org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:175)
    5. Java RT
      Thread.run
      1. java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      2. java.util.concurrent.FutureTask.run(FutureTask.java:266)
      3. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      4. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      5. java.lang.Thread.run(Thread.java:745)
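The root cause above is Avro's GenericDatumWriter.writeString performing an unchecked cast of the datum to CharSequence (GenericDatumWriter.java:213 in the trace): when the HDFS sink hands it a Confluent NonRecordContainer, the wrapper KafkaAvroSerializer uses for non-record top-level values, instead of a plain string, that cast throws. A minimal self-contained sketch of the failure mode, using a hypothetical stand-in wrapper class rather than the real Avro/Confluent code:

```java
// Minimal sketch of why the cast fails. NonRecordContainerStub is a
// stand-in for io.confluent.kafka.serializers.NonRecordContainer
// (an assumption for illustration, not the real Confluent class).
public class CastDemo {

    // Wrapper holding a non-record top-level value, analogous to how
    // Confluent's NonRecordContainer wraps primitives.
    static final class NonRecordContainerStub {
        final Object value;
        NonRecordContainerStub(Object value) { this.value = value; }
        Object getValue() { return value; }
    }

    // Mirrors the unchecked cast inside GenericDatumWriter.writeString.
    static String writeString(Object datum) {
        return ((CharSequence) datum).toString();
    }

    public static void main(String[] args) {
        // A plain String is a CharSequence, so this succeeds.
        System.out.println(writeString("hello"));

        // The wrapper is NOT a CharSequence, so the cast throws,
        // reproducing "NonRecordContainer cannot be cast to CharSequence".
        try {
            writeString(new NonRecordContainerStub("hello"));
        } catch (ClassCastException e) {
            System.out.println("caught: " + e);
        }

        // Unwrapping the value before handing it to the writer avoids
        // the error, since the underlying datum is a real String.
        System.out.println(writeString(new NonRecordContainerStub("world").getValue()));
    }
}
```

In practice this error usually means the sink received data whose runtime type does not match the Avro schema it was given, e.g. JSON-formatted records fed to an Avro-expecting connector; the usual remedy is aligning the connector's key.converter/value.converter settings with the actual wire format rather than unwrapping values by hand.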