java.lang.RuntimeException: CrawlServer must deserialize in a ToeThread or CheckpointingThread

JIRA | Gordon Mohr | 9 years ago
  1.

    Magin got a lot of these while recovering from a custom recover log for the 2BC supplemental crawl (a sketch of the guard pattern behind this error follows the results list below):

    09/30/2007 05:57:55 +0000 SEVERE org.archive.crawler.frontier.RecoveryJournal importQueuesFromLog exception during lo
    java.lang.RuntimeException: CrawlServer must deserialize in a ToeThread or CheckpointingThread
        at org.archive.crawler.datamodel.CrawlServer.readObject(CrawlServer.java:245)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:974)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1846)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1753)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:351)
        at com.sleepycat.bind.serial.SerialBinding.entryToObject(SerialBinding.java:122)
        at com.sleepycat.collections.DataView.makeValue(DataView.java:588)
        at com.sleepycat.collections.DataCursor.getCurrentValue(DataCursor.java:356)
        at com.sleepycat.collections.DataCursor.initForPut(DataCursor.java:820)
        at com.sleepycat.collections.DataCursor.put(DataCursor.java:758)
        at com.sleepycat.collections.StoredContainer.put(StoredContainer.java:300)
        at com.sleepycat.collections.StoredMap.put(StoredMap.java:248)
        at org.archive.util.CachedBdbMap.expungeStaleEntry(CachedBdbMap.java:562)
        at org.archive.util.CachedBdbMap.expungeStaleEntries(CachedBdbMap.java:533)
        at org.archive.util.CachedBdbMap.get(CachedBdbMap.java:358)
        at org.archive.crawler.datamodel.ServerCache.getServerFor(ServerCache.java:93)
        at org.archive.crawler.datamodel.ServerCache.getServerFor(ServerCache.java:125)
        at org.archive.crawler.frontier.AbstractFrontier.tally(AbstractFrontier.java:412)
        at org.archive.crawler.frontier.AbstractFrontier.doJournalAdded(AbstractFrontier.java:435)
        at org.archive.crawler.frontier.WorkQueueFrontier.receive(WorkQueueFrontier.java:450)
        at org.archive.crawler.util.SetBasedUriUniqFilter.add(SetBasedUriUniqFilter.java:90)
        at org.archive.crawler.frontier.WorkQueueFrontier.schedule(WorkQueueFrontier.java:432)
        at org.archive.crawler.frontier.RecoveryJournal.importQueuesFromLog(RecoveryJournal.java:341)
        at org.archive.crawler.frontier.RecoveryJournal.access$000(RecoveryJournal.java:61)
        at org.archive.crawler.frontier.RecoveryJournal$1.run(RecoveryJournal.java:174)
        at java.lang.Thread.run(Thread.java:619)

    Ultimately, the crawl died with an OOME, probably related.

    JIRA | 9 years ago | Gordon Mohr
    java.lang.RuntimeException: CrawlServer must deserialize in a ToeThread or CheckpointingThread
  2.

    When booting JBoss: Could not deserialize info timer - ClassNotFoundException

    Stack Overflow | 4 months ago | Tom Brito
    java.lang.RuntimeException: Could not deserialize info in timer
  3.

    source mySql save

    Google Groups | 8 years ago | clogan
    java.lang.RuntimeException: You must include a Client-Id or Create-Client-Id header
  4.

    Exception thrown by NettyClientRouter during message decoding

    GitHub | 5 months ago | heidi-ann
    io.netty.handler.codec.DecoderException: java.lang.RuntimeException: Attempt to deserialize a message which is not a CorfuMsg, Marker = 4 but expected 0xC0FC0FC0
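
    What the guard means

    The exception above comes from CrawlServer.readObject() refusing to run because the deserializing thread is the anonymous import thread started by RecoveryJournal (a plain java.lang.Thread), not a ToeThread or CheckpointingThread. The following is a minimal, hypothetical Java sketch of that guard pattern; the class names CrawlServerLike and ToeLikeThread are stand-ins, and this is not the actual Heritrix source.

        import java.io.*;

        // Illustrative sketch only, not Heritrix code: a Serializable class whose
        // readObject() checks the calling thread's type and refuses to deserialize
        // anywhere else, which is the pattern the stack trace above points at.
        public class ThreadCheckedDeserialization {

            // Stand-in for an "approved" thread type such as ToeThread.
            static class ToeLikeThread extends Thread {
                ToeLikeThread(Runnable r) { super(r); }
            }

            static class CrawlServerLike implements Serializable {
                private static final long serialVersionUID = 1L;
                final String name;
                CrawlServerLike(String name) { this.name = name; }

                private void readObject(ObjectInputStream in)
                        throws IOException, ClassNotFoundException {
                    in.defaultReadObject();
                    // The guard: deserialization is only allowed on approved threads.
                    if (!(Thread.currentThread() instanceof ToeLikeThread)) {
                        throw new RuntimeException(
                            "CrawlServerLike must deserialize in a ToeLikeThread");
                    }
                }
            }

            static byte[] serialize(Object o) throws IOException {
                ByteArrayOutputStream bos = new ByteArrayOutputStream();
                try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
                    out.writeObject(o);
                }
                return bos.toByteArray();
            }

            static Object deserialize(byte[] bytes)
                    throws IOException, ClassNotFoundException {
                try (ObjectInputStream in =
                        new ObjectInputStream(new ByteArrayInputStream(bytes))) {
                    return in.readObject();
                }
            }

            public static void main(String[] args) throws Exception {
                byte[] bytes = serialize(new CrawlServerLike("example.org"));

                // A plain Thread, like RecoveryJournal's import thread, trips the guard.
                Thread plain = new Thread(() -> {
                    try {
                        deserialize(bytes);
                    } catch (Exception e) {
                        System.out.println("plain thread: " + e);
                    }
                });
                plain.start();
                plain.join();

                // The approved thread type deserializes the same bytes successfully.
                Thread approved = new ToeLikeThread(() -> {
                    try {
                        System.out.println("approved thread: "
                            + ((CrawlServerLike) deserialize(bytes)).name);
                    } catch (Exception e) {
                        System.out.println("approved thread failed: " + e);
                    }
                });
                approved.start();
                approved.join();
            }
        }

    Run as-is, the plain thread fails with a RuntimeException analogous to the one reported above, while the ToeLikeThread succeeds; a RuntimeException thrown from readObject() propagates straight out of ObjectInputStream.readObject() to the caller, which matches the reflection and ObjectInputStream frames in the trace.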

    Root Cause Analysis

    1. java.lang.RuntimeException

      CrawlServer must deserialize in a ToeThread or CheckpointingThread

      at org.archive.crawler.datamodel.CrawlServer.readObject()
    2. org.archive.crawler
      CrawlServer.readObject
      1. org.archive.crawler.datamodel.CrawlServer.readObject(CrawlServer.java:245)
      1 frame
    3. Java RT
      ObjectInputStream.readObject
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      4. java.lang.reflect.Method.invoke(Method.java:597)
      5. java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:974)
      6. java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1846)
      7. java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1753)
      8. java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
      9. java.io.ObjectInputStream.readObject(ObjectInputStream.java:351)
      9 frames
    4. Berkeley DB Java Edition
      StoredMap.put
      1. com.sleepycat.bind.serial.SerialBinding.entryToObject(SerialBinding.java:122)
      2. com.sleepycat.collections.DataView.makeValue(DataView.java:588)
      3. com.sleepycat.collections.DataCursor.getCurrentValue(DataCursor.java:356)
      4. com.sleepycat.collections.DataCursor.initForPut(DataCursor.java:820)
      5. com.sleepycat.collections.DataCursor.put(DataCursor.java:758)
      6. com.sleepycat.collections.StoredContainer.put(StoredContainer.java:300)
      7. com.sleepycat.collections.StoredMap.put(StoredMap.java:248)
      7 frames
    5. webarchive-commons
      CachedBdbMap.get
      1. org.archive.util.CachedBdbMap.expungeStaleEntry(CachedBdbMap.java:562)
      2. org.archive.util.CachedBdbMap.expungeStaleEntries(CachedBdbMap.java:533)
      3. org.archive.util.CachedBdbMap.get(CachedBdbMap.java:358)
      3 frames
    6. org.archive.crawler
      RecoveryJournal$1.run
      1. org.archive.crawler.datamodel.ServerCache.getServerFor(ServerCache.java:93)
      2. org.archive.crawler.datamodel.ServerCache.getServerFor(ServerCache.java:125)
      3. org.archive.crawler.frontier.AbstractFrontier.tally(AbstractFrontier.java:412)
      4. org.archive.crawler.frontier.AbstractFrontier.doJournalAdded(AbstractFrontier.java:435)
      5. org.archive.crawler.frontier.WorkQueueFrontier.receive(WorkQueueFrontier.java:450)
      6. org.archive.crawler.util.SetBasedUriUniqFilter.add(SetBasedUriUniqFilter.java:90)
      7. org.archive.crawler.frontier.WorkQueueFrontier.schedule(WorkQueueFrontier.java:432)
      8. org.archive.crawler.frontier.RecoveryJournal.importQueuesFromLog(RecoveryJournal.java:341)
      9. org.archive.crawler.frontier.RecoveryJournal.access$000(RecoveryJournal.java:61)
      10. org.archive.crawler.frontier.RecoveryJournal$1.run(RecoveryJournal.java:174)
      10 frames
    7. Java RT
      Thread.run
      1. java.lang.Thread.run(Thread.java:619)
      1 frame
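
    Why a cache read ends up deserializing a CrawlServer

    Per the chain above, ServerCache.getServerFor() calls CachedBdbMap.get(), which first expunges stale entries by writing them back through StoredMap.put(); Berkeley DB's DataCursor.initForPut() then reads back (deserializes) the value already stored under that key, on whatever thread made the call. During a recovery-log import that caller is RecoveryJournal's anonymous thread rather than a ToeThread, so the guard in CrawlServer.readObject() fires. The sketch below imitates that path with stand-in classes (GuardedValue, ApprovedThread, BackingStore); it is an illustration of the mechanism under those assumptions, not the Heritrix or Berkeley DB code.

        import java.io.*;
        import java.util.HashMap;
        import java.util.Map;

        // Illustrative sketch only: a backing "stored map" whose put() deserializes
        // the record it is about to overwrite, so even cache maintenance performed
        // inside a get() can trigger deserialization on the calling thread.
        public class EvictionTriggersDeserialization {

            // Stand-in for the approved thread type (never started here; it only
            // exists so the guard has something legitimate to check for).
            static class ApprovedThread extends Thread {
                ApprovedThread(Runnable r) { super(r); }
            }

            static class GuardedValue implements Serializable {
                private static final long serialVersionUID = 1L;
                final String host;
                GuardedValue(String host) { this.host = host; }

                private void readObject(ObjectInputStream in)
                        throws IOException, ClassNotFoundException {
                    in.defaultReadObject();
                    if (!(Thread.currentThread() instanceof ApprovedThread)) {
                        throw new RuntimeException(
                            "GuardedValue must deserialize in an ApprovedThread");
                    }
                }
            }

            // put() mirrors DataCursor.initForPut() -> getCurrentValue(): the existing
            // record is deserialized before the new one is written.
            static class BackingStore {
                private final Map<String, byte[]> records = new HashMap<>();

                void put(String key, GuardedValue value) throws Exception {
                    byte[] existing = records.get(key);
                    if (existing != null) {
                        try (ObjectInputStream in = new ObjectInputStream(
                                new ByteArrayInputStream(existing))) {
                            in.readObject();   // deserializes on the caller's thread
                        }
                    }
                    ByteArrayOutputStream bos = new ByteArrayOutputStream();
                    try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
                        out.writeObject(value);
                    }
                    records.put(key, bos.toByteArray());
                }
            }

            public static void main(String[] args) throws Exception {
                BackingStore store = new BackingStore();

                // First write: nothing to read back yet, so no guard is involved.
                store.put("example.org", new GuardedValue("example.org"));

                // Second write, as a stale-entry expunge would do from inside get():
                // the existing record is deserialized on this plain thread and the
                // guard fires, just as in the RecoveryJournal import stack trace.
                try {
                    store.put("example.org", new GuardedValue("example.org"));
                } catch (RuntimeException e) {
                    System.out.println("expunge on plain thread failed: " + e);
                }
            }
        }

    The point of the sketch is that the failing thread never asked to deserialize anything directly; the deserialization is a side effect of cache maintenance, which is why the error surfaces from a call as innocuous-looking as ServerCache.getServerFor().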