java.lang.IllegalArgumentException: Class is not registered: com.databricks.spark.avro.DefaultSource$SerializableConfiguration
Note: To register this class use: kryo.register(com.databricks.spark.avro.DefaultSource$SerializableConfiguration.class);

GitHub comment 147#245684072 | fooblahblah | 3 months ago
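The Note in the exception points at Kryo's registration requirement: the job was run with `spark.kryo.registrationRequired=true`, and spark-avro's internal `SerializableConfiguration` wrapper was never registered. A minimal sketch of the usual workarounds in `spark-defaults.conf` — the property names are standard Spark configuration, but whether registering the private `DefaultSource$SerializableConfiguration` class by name works depends on the spark-avro version, so treat Option B as an assumption:

```properties
# Use Kryo for serialization
spark.serializer                 org.apache.spark.serializer.KryoSerializer

# Option A: relax the requirement so unregistered classes fall back to
# writing the full class name (this is Spark's default behaviour)
spark.kryo.registrationRequired  false

# Option B (assumption): keep registrationRequired=true and register the
# offending class by its fully-qualified name; inner classes use '$'
# spark.kryo.classesToRegister   com.databricks.spark.avro.DefaultSource$SerializableConfiguration
```

Option A trades a few bytes per serialized object (Kryo writes the full class name) for not having to enumerate every class the job touches.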
Related exceptions:

  1. Java Kryonet [Class is not registered Exception]

     Stack Overflow | 5 years ago | nebula
     java.lang.IllegalArgumentException: Class is not registered: client.SomeRequest

  2. java kryonet - ChatMessage - Class is not registered

     Stack Overflow | 1 year ago | Ray Tayek
     java.lang.IllegalArgumentException: Class is not registered: com.esotericsoftware.kryonet.examples.chat.Network$RegisterName
     Note: To register this class use: kryo.register(com.esotericsoftware.kryonet.examples.chat.Network$RegisterName.class);

  3. Spark, mail # user - How to register array class with Kyro in spark-defaults.conf - 2015-07-30, 15:08

     search-hadoop.com | 1 year ago
     java.lang.IllegalArgumentException: Class is not registered: ltn.analytics.es.EsDoc[]
     Note: To register this class use: kryo.register(ltn.analytics.es.EsDoc[].class);

Root Cause Analysis

  1. java.lang.IllegalArgumentException

Class is not registered: com.databricks.spark.avro.DefaultSource$SerializableConfiguration
    Note: To register this class use: kryo.register(com.databricks.spark.avro.DefaultSource$SerializableConfiguration.class);

    at com.esotericsoftware.kryo.Kryo.getRegistration()
  2. Kryo
    Kryo.writeClassAndObject
    1. com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)
    2. com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)
    3. com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)
    4. com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)
    4 frames
  3. Spark
    SparkContext.broadcast
    1. org.apache.spark.serializer.KryoSerializationStream.writeObject(KryoSerializer.scala:195)
    2. org.apache.spark.broadcast.TorrentBroadcast$$anonfun$blockifyObject$2.apply(TorrentBroadcast.scala:236)
    3. org.apache.spark.broadcast.TorrentBroadcast$$anonfun$blockifyObject$2.apply(TorrentBroadcast.scala:236)
    4. org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1287)
    5. org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:237)
    6. org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:107)
    7. org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:86)
    8. org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
    9. org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:56)
    10. org.apache.spark.SparkContext.broadcast(SparkContext.scala:1370)
    10 frames
  4. com.databricks.spark
    DefaultSource.buildReader
    1. com.databricks.spark.avro.DefaultSource.buildReader(DefaultSource.scala:144)
    1 frame
  5. org.apache.spark
    FileFormat$class.buildReaderWithPartitionValues
    1. org.apache.spark.sql.execution.datasources.FileFormat$class.buildReaderWithPartitionValues(fileSourceInterfaces.scala:260)
    1 frame
  6. com.databricks.spark
    DefaultSource.buildReaderWithPartitionValues
    1. com.databricks.spark.avro.DefaultSource.buildReaderWithPartitionValues(DefaultSource.scala:46)
    1 frame
  7. org.apache.spark
    FileSourceStrategy$.apply
    1. org.apache.spark.sql.execution.datasources.FileSourceStrategy$.apply(FileSourceStrategy.scala:112)
    1 frame
  8. Spark Project Catalyst
    QueryPlanner$$anonfun$1.apply
    1. org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:60)
    1 frame