java.lang.RuntimeException

There are no available Samebug tips for this exception.

  • Magin got a lot of these while doing a recovery from a custom recover log for the 2BC supplemental crawl (a minimal sketch of the thread-type guard named in the message follows below):

    09/30/2007 05:57:55 +0000 SEVERE org.archive.crawler.frontier.RecoveryJournal importQueuesFromLog exception during lo
    java.lang.RuntimeException: CrawlServer must deserialize in a ToeThread or CheckpointingThread
        at org.archive.crawler.datamodel.CrawlServer.readObject(CrawlServer.java:245)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:974)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1846)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1753)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:351)
        at com.sleepycat.bind.serial.SerialBinding.entryToObject(SerialBinding.java:122)
        at com.sleepycat.collections.DataView.makeValue(DataView.java:588)
        at com.sleepycat.collections.DataCursor.getCurrentValue(DataCursor.java:356)
        at com.sleepycat.collections.DataCursor.initForPut(DataCursor.java:820)
        at com.sleepycat.collections.DataCursor.put(DataCursor.java:758)
        at com.sleepycat.collections.StoredContainer.put(StoredContainer.java:300)
        at com.sleepycat.collections.StoredMap.put(StoredMap.java:248)
        at org.archive.util.CachedBdbMap.expungeStaleEntry(CachedBdbMap.java:562)
        at org.archive.util.CachedBdbMap.expungeStaleEntries(CachedBdbMap.java:533)
        at org.archive.util.CachedBdbMap.get(CachedBdbMap.java:358)
        at org.archive.crawler.datamodel.ServerCache.getServerFor(ServerCache.java:93)
        at org.archive.crawler.datamodel.ServerCache.getServerFor(ServerCache.java:125)
        at org.archive.crawler.frontier.AbstractFrontier.tally(AbstractFrontier.java:412)
        at org.archive.crawler.frontier.AbstractFrontier.doJournalAdded(AbstractFrontier.java:435)
        at org.archive.crawler.frontier.WorkQueueFrontier.receive(WorkQueueFrontier.java:450)
        at org.archive.crawler.util.SetBasedUriUniqFilter.add(SetBasedUriUniqFilter.java:90)
        at org.archive.crawler.frontier.WorkQueueFrontier.schedule(WorkQueueFrontier.java:432)
        at org.archive.crawler.frontier.RecoveryJournal.importQueuesFromLog(RecoveryJournal.java:341)
        at org.archive.crawler.frontier.RecoveryJournal.access$000(RecoveryJournal.java:61)
        at org.archive.crawler.frontier.RecoveryJournal$1.run(RecoveryJournal.java:174)
        at java.lang.Thread.run(Thread.java:619)

    Ultimately, the crawl died with an OOME, probably related.
    reported by Gordon Mohr
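
The trace itself tells most of the story: the recovery-journal import runs in a plain java.lang.Thread (RecoveryJournal$1.run and Thread.run at the bottom of the trace), and while scheduling recovered URIs it goes through ServerCache into CachedBdbMap, where evicting a stale entry makes the BDB serial binding deserialize a cached CrawlServer on that same thread. CrawlServer.readObject apparently guards against exactly this and refuses unless the caller is a ToeThread or CheckpointingThread. The sketch below is not Heritrix source; it only reproduces that guard pattern in isolation, with GuardedRecord and WorkerThread as hypothetical stand-ins.

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class ThreadGuardedDeserialization {

    // Stand-in for an "approved" crawler worker thread such as ToeThread.
    static class WorkerThread extends Thread {
        WorkerThread(Runnable r) { super(r); }
    }

    // Object that refuses to be revived outside a WorkerThread,
    // mirroring the guard the exception message suggests.
    static class GuardedRecord implements Serializable {
        private static final long serialVersionUID = 1L;
        String payload = "example";

        private void readObject(ObjectInputStream in)
                throws IOException, ClassNotFoundException {
            if (!(Thread.currentThread() instanceof WorkerThread)) {
                throw new RuntimeException(
                        "GuardedRecord must deserialize in a WorkerThread");
            }
            in.defaultReadObject();
        }
    }

    public static void main(String[] args) throws Exception {
        // Serialize one record up front.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new GuardedRecord());
        }
        final byte[] bytes = bos.toByteArray();

        Runnable revive = () -> {
            try (ObjectInputStream ois =
                         new ObjectInputStream(new ByteArrayInputStream(bytes))) {
                ois.readObject();
                System.out.println(Thread.currentThread().getName() + ": deserialized OK");
            } catch (Exception e) {
                System.out.println(Thread.currentThread().getName() + ": " + e);
            }
        };

        // A plain Thread, like the recovery-journal import thread, trips the guard...
        Thread plain = new Thread(revive, "plain-thread");
        plain.start();
        plain.join();

        // ...while the approved worker-thread type deserializes normally.
        WorkerThread worker = new WorkerThread(revive);
        worker.setName("worker-thread");
        worker.start();
        worker.join();
    }
}

If that reading is right, a workaround would be to run the recovery import on a thread type the guard accepts, or to relax the guard for recovery and checkpoint threads; that is an assumption drawn from the trace, not something the report confirms, and the sketch only demonstrates the failure mode rather than a fix.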