org.hibernate.search.exception.SearchException

HSEARCH000083: Unable to serialize List<LuceneWork>


Stack trace

org.hibernate.search.exception.SearchException: HSEARCH000083: Unable to serialize List<LuceneWork>
	at org.hibernate.search.indexes.serialization.impl.LuceneWorkSerializerImpl.toSerializedModel(LuceneWorkSerializerImpl.java:109)
	at org.hibernate.search.backend.jms.impl.JmsBackendQueueTask.run(JmsBackendQueueTask.java:61)
	at org.hibernate.search.backend.jms.impl.JmsBackendQueueProcessor.applyWork(JmsBackendQueueProcessor.java:88)
	at org.hibernate.search.indexes.spi.DirectoryBasedIndexManager.performOperations(DirectoryBasedIndexManager.java:112)
	at org.hibernate.search.backend.impl.WorkQueuePerIndexSplitter.commitOperations(WorkQueuePerIndexSplitter.java:49)
	at org.hibernate.search.backend.impl.BatchedQueueingProcessor.performWorks(BatchedQueueingProcessor.java:81)
	at org.hibernate.search.backend.impl.PostTransactionWorkQueueSynchronization.flushWorks(PostTransactionWorkQueueSynchronization.java:114)
	at org.hibernate.search.backend.impl.TransactionalWorker.flushWorks(TransactionalWorker.java:165)
	at org.hibernate.search.impl.FullTextSessionImpl.flushToIndexes(FullTextSessionImpl.java:87)
	at com.sobis.jaf.JAFApplication.createIndexFor(JAFApplication.java:919)
	at com.sobis.jaf.JAFApplication.createIndexAndVerify(JAFApplication.java:820)
	at com.sobis.jaf.JAFApplication.createIndex(JAFApplication.java:796)
	at com.sobis.jaf.JAFApplication.createIndex(JAFApplication.java:672)
	at com.sobis.jaf.JAFApplication$1.performAction(JAFApplication.java:486)
	at com.sobis.jaf.services.thread.JAFThread.run(JAFThread.java:71)
Caused by: java.lang.IllegalStateException: TokenStream contract violation: reset()/close() call missing, reset() called multiple times, or subclass does not call super.reset(). Please see Javadocs of TokenStream class for more information about the correct consuming workflow.
	at org.apache.lucene.analysis.Tokenizer$1.read(Tokenizer.java:111)
	at org.apache.lucene.analysis.core.KeywordTokenizer.incrementToken(KeywordTokenizer.java:68)
	at org.hibernate.search.indexes.serialization.impl.CopyTokenStream.createAttributeLists(CopyTokenStream.java:85)
	at org.hibernate.search.indexes.serialization.impl.CopyTokenStream.buildSerializableTokenStream(CopyTokenStream.java:39)
	at org.hibernate.search.indexes.serialization.spi.LuceneFieldContext.getTokenStream(LuceneFieldContext.java:137)
	at org.hibernate.search.indexes.serialization.avro.impl.AvroSerializer.addFieldWithTokenStreamData(AvroSerializer.java:281)
	at org.hibernate.search.indexes.serialization.impl.LuceneWorkSerializerImpl.serializeField(LuceneWorkSerializerImpl.java:237)
	at org.hibernate.search.indexes.serialization.impl.LuceneWorkSerializerImpl.serializeDocument(LuceneWorkSerializerImpl.java:175)
	at org.hibernate.search.indexes.serialization.impl.LuceneWorkSerializerImpl.toSerializedModel(LuceneWorkSerializerImpl.java:97)
	... 14 more
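The root cause is Lucene's TokenStream consuming contract: reset() must be called exactly once before the first incrementToken(), and the Avro serializer in CopyTokenStream consumed a stream for which that had not happened. As a sketch only, here is a minimal stdlib-only model of the contract's state checks; the class and method names (TokenStreamContract, consumeCorrectly) are illustrative stand-ins, not the real Lucene API:

```java
// Illustrative model of the TokenStream consuming workflow:
// reset() -> incrementToken()* -> (end()/close() in real Lucene).
// Not the real org.apache.lucene.analysis.TokenStream class.
public class TokenStreamContract {
    private boolean wasReset = false;
    private int remaining = 3; // pretend the stream holds three tokens

    public void reset() {
        if (wasReset) {
            // Mirrors "reset() called multiple times" from the exception message.
            throw new IllegalStateException("reset() called multiple times");
        }
        wasReset = true;
    }

    public boolean incrementToken() {
        if (!wasReset) {
            // Mirrors the state the serializer hit: consuming without reset().
            throw new IllegalStateException(
                "TokenStream contract violation: reset() call missing");
        }
        return remaining-- > 0; // true while tokens remain
    }

    // Correct consumption: reset once, then drain with incrementToken().
    public static int consumeCorrectly(TokenStreamContract ts) {
        ts.reset();
        int tokens = 0;
        while (ts.incrementToken()) {
            tokens++;
        }
        return tokens;
    }
}
```

In the real stack above, the fix is to ensure any custom or pre-built TokenStream handed to Hibernate Search is in a freshly resettable state (and that subclasses call super.reset()), so CopyTokenStream can consume it during serialization.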
