java.util.NoSuchElementException

There are no available Samebug tips for this exception. Do you have an idea how to solve this issue? A short tip would help users who saw this issue last week.
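One pattern worth noting: every `NoSuchElementException` trace collected below bottoms out in `java.util.TreeSet.first()` or `java.util.TreeMap.firstKey()`, and both of those methods throw `NoSuchElementException` when the collection is empty. As a generic tip (not a fix for any specific library below), a minimal sketch of the failure and two common guards:

```java
import java.util.NoSuchElementException;
import java.util.TreeSet;

public class FirstElementDemo {
    public static void main(String[] args) {
        TreeSet<String> set = new TreeSet<>();

        // TreeSet.first() throws NoSuchElementException on an empty set.
        try {
            set.first();
        } catch (NoSuchElementException e) {
            System.out.println("caught: " + e.getClass().getSimpleName());
        }

        // Guard 1: check isEmpty() before calling first().
        String first = set.isEmpty() ? "<empty>" : set.first();
        System.out.println(first); // prints "<empty>"

        // Guard 2: pollFirst() returns null instead of throwing.
        System.out.println(set.pollFirst()); // prints "null"
    }
}
```

So when this exception appears in third-party code, the question to ask is why the set or map in the failing frame was empty at that moment, not how to catch the exception itself.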

  • GitHub comment 2#237858229
    via GitHub by RhythmNz
  • Problem with reading a XML file
    by Emilio Resende
  • I started Heritrix 1.14.4 on Friday morning to crawl only text/html files, beginning with yahoo.com. On Monday morning I saw that Heritrix had simply stopped crawling; heap space looks fine. The log says (last entry):

    2010-12-10T18:04:49.904Z -5 - http://kr.promotion.yahoo.com/worldcup2010/ RRLLLLLLLE http://kr.sports.yahoo.com/event/wc2010/ no-type #020 - - - err=com.sleepycat.util.RuntimeExceptionWrapper
    com.sleepycat.util.RuntimeExceptionWrapper: (JE 3.3.82) fetchTarget of 0x50/0x8590e3 parent IN=266461 IN class=com.sleepycat.je.tree.BIN lastFullVersion=0x77/0x6dcaa0 parent.getDirty()=true state=0
    com.sleepycat.je.log.LogFileNotFoundException: (JE 3.3.82) 0x50/0x8590e3 (JE 3.3.82) Couldn't open file /home/oli/Desktop/heritrix-1.14.4/jobs/nixdefault-20101210100246162/state/00000050.jdb: /home/oli/Desktop/heritrix-1.14.4/jobs/nixdefault-20101210100246162/state/00000050.jdb (Too many open files)
        at com.sleepycat.collections.StoredContainer.convertException(StoredContainer.java:466)
        at com.sleepycat.collections.StoredContainer.getValue(StoredContainer.java:306)
        at com.sleepycat.collections.StoredMap.get(StoredMap.java:227)
        at org.archive.util.ObjectIdentityBdbCache.getOrUse(ObjectIdentityBdbCache.java:264)
        at org.archive.util.ObjectIdentityBdbCache.getOrUse(ObjectIdentityBdbCache.java:75)
        at org.archive.crawler.datamodel.ServerCache.getServerFor(ServerCache.java:100)
        at org.archive.crawler.datamodel.ServerCache.getServerFor(ServerCache.java:124)
        at org.archive.crawler.prefetch.PreconditionEnforcer.considerDnsPreconditions(PreconditionEnforcer.java:227)
        at org.archive.crawler.prefetch.PreconditionEnforcer.innerProcess(PreconditionEnforcer.java:111)
        at org.archive.crawler.framework.Processor.process(Processor.java:109)
        at org.archive.crawler.framework.ToeThread.processCrawlUri(ToeThread.java:306)
        at org.archive.crawler.framework.ToeThread.run(ToeThread.java:154)

    but it is dated from yesterday.

    I think there are some file handles left open - I use the write-to-harddisk writer. I changed /etc/security/limits.conf as described in the FAQ, but the process is still doing nothing; it is just idle, consuming memory. When I go to "Reports" in the web frontend I get:

    An error occured
    java.util.NoSuchElementException
    java.util.NoSuchElementException
        at java.util.TreeMap.key(TreeMap.java:1223)
        at java.util.TreeMap.firstKey(TreeMap.java:284)
        at java.util.TreeSet.first(TreeSet.java:394)
        at org.archive.crawler.framework.ToePool.singleLineReportTo(ToePool.java:276)
        at org.archive.util.ArchiveUtils.singleLineReport(ArchiveUtils.java:728)
        at org.archive.crawler.framework.ToePool.singleLineReport(ToePool.java:296)
        at org.archive.crawler.framework.CrawlController.oneLineReportThreads(CrawlController.java:1628)
        at org.archive.crawler.admin.CrawlJob.getThreadOneLine(CrawlJob.java:930)
        at org.archive.crawler.jspc.admin.reports_jsp._jspService(Unknown Source)
        at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:137)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
        at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:358)
        at org.mortbay.jetty.servlet.WebApplicationHandler$Chain.doFilter(WebApplicationHandler.java:342)
        at org.archive.crawler.admin.ui.RootFilter.doFilter(RootFilter.java:67)
        at org.mortbay.jetty.servlet.WebApplicationHandler$Chain.doFilter(WebApplicationHandler.java:334)
        at org.mortbay.jetty.servlet.WebApplicationHandler.dispatch(WebApplicationHandler.java:286)
        at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:567)
        at org.mortbay.http.HttpContext.handle(HttpContext.java:1807)
        at org.mortbay.jetty.servlet.WebApplicationContext.handle(WebApplicationContext.java:525)
        at org.mortbay.http.HttpContext.handle(HttpContext.java:1757)
        at org.mortbay.http.HttpServer.service(HttpServer.java:879)
        at org.mortbay.http.HttpConnection.service(HttpConnection.java:789)
        at org.mortbay.http.HttpConnection.handleNext(HttpConnection.java:960)
        at org.mortbay.http.HttpConnection.handle(HttpConnection.java:806)
        at org.mortbay.http.SocketListener.handleConnection(SocketListener.java:218)
        at org.mortbay.util.ThreadedServer.handle(ThreadedServer.java:300)
        at org.mortbay.util.ThreadPool$PoolThread.run(ThreadPool.java:511)

    When I go to "Logs" they are displayed as empty, but they aren't (last entry in local-errors.log):

    2010-12-11T01:20:43.747Z -2 - http://www.kzone.com.au/js/prototype.js RRLLLLRE http://www.kzone.com.au/ no-type #049 - - - le:IOException@HTTP
    java.io.IOException: RIS already open for ToeThread #49: http://www.kzone.com.au/js/prototype.js
        at org.archive.io.RecordingInputStream.open(RecordingInputStream.java:88)
        at org.archive.util.HttpRecorder.inputWrap(HttpRecorder.java:148)
        at org.apache.commons.httpclient.HttpConnection.open(HttpConnection.java:756)
        at org.apache.commons.httpclient.HttpMethodDirector.executeWithRetry(HttpMethodDirector.java:387)
        at org.apache.commons.httpclient.HttpMethodDirector.executeMethod(HttpMethodDirector.java:171)
        at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:397)
        at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:346)
        at org.archive.crawler.fetcher.FetchHTTP.innerProcess(FetchHTTP.java:500)
        at org.archive.crawler.framework.Processor.process(Processor.java:109)
        at org.archive.crawler.framework.ToeThread.processCrawlUri(ToeThread.java:306)
        at org.archive.crawler.framework.ToeThread.run(ToeThread.java:154)

    I'm not sure what's going on, but I think it is one bug that leads to all these problems. I will test it with Heritrix 3 soon.
    by oliver z.
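    The "Too many open files" error in the log above is the likely root cause of this report: the crawl stops once the process exhausts its file-descriptor limit, and the empty thread-pool set then crashes the "Reports" page. As a hedged sketch (the username oli is taken from the paths in the log, and the limit value is an arbitrary example), the limits.conf change mentioned typically looks like this; note that it only applies to sessions started after the edit, so an already-running Heritrix keeps its old limit until restarted:

```
# /etc/security/limits.conf  (example values; "oli" is the user from the log paths)
oli  soft  nofile  32768
oli  hard  nofile  32768

# Verify what the running JVM actually has (Linux; <heritrix-pid> is a placeholder):
#   cat /proc/<heritrix-pid>/limits
```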
  • One of our modules had an empty source code file (actually, someone had just commented out everything in that file). Maven was able to build the project, but Sonar crashed with the following error:

    [INFO] ------------------------------------------------------------------------
    [ERROR] BUILD ERROR
    [INFO] ------------------------------------------------------------------------
    [INFO] Can not execute Sonar
    [INFO] ------------------------------------------------------------------------
    [DEBUG] Trace
    org.apache.maven.lifecycle.LifecycleExecutionException: Can not execute Sonar
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:564)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeStandaloneGoal(DefaultLifecycleExecutor.java:493)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:463)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:311)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:224)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:143)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:333)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:126)
        at org.apache.maven.cli.MavenCli.main(MavenCli.java:282)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315)
        at org.codehaus.classworlds.Launcher.launch(Launcher.java:255)
        at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430)
        at org.codehaus.classworlds.Launcher.main(Launcher.java:375)
    Caused by: org.apache.maven.plugin.MojoExecutionException: Can not execute Sonar
        at org.codehaus.mojo.sonar.Bootstraper.executeMojo(Bootstraper.java:87)
        at org.codehaus.mojo.sonar.Bootstraper.start(Bootstraper.java:65)
        at org.codehaus.mojo.sonar.SonarMojo.execute(SonarMojo.java:117)
        at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:447)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:539)
        ... 16 more
    Caused by: java.util.NoSuchElementException
        at java.util.TreeMap.key(TreeMap.java:433)
        at java.util.TreeMap.firstKey(TreeMap.java:287)
        at java.util.TreeSet.first(TreeSet.java:407)
        at org.sonar.squid.entities.Resource.getFirstChild(Resource.java:65)
        at org.sonar.plugins.squid.SquidMavenCollector.collectFileMeasures(SquidMavenCollector.java:141)
        at org.sonar.plugins.squid.SquidMavenCollector.collect(SquidMavenCollector.java:89)
        at org.sonar.mojo.InternalMojo.executeCollectors(InternalMojo.java:294)
        at org.sonar.mojo.InternalMojo.processModules(InternalMojo.java:176)
        at org.sonar.mojo.InternalMojo.execute(InternalMojo.java:160)
        at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:447)
        at org.codehaus.mojo.sonar.Bootstraper.executeMojo(Bootstraper.java:82)
        ... 20 more

    The workaround is to remove the file.
    by Rakesh Arora
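    The failing frame above (`Resource.getFirstChild` calling `TreeSet.first()`) suggests the collector assumed every source file has at least one child element, which a fully commented-out file does not. A hypothetical sketch of that pattern and a defensive variant (the `Resource` class here is an illustration, not the actual Sonar code):

```java
import java.util.Optional;
import java.util.TreeSet;

// Hypothetical model of the failing pattern: calling TreeSet.first() on a
// resource with no children (e.g. a fully commented-out source file)
// throws NoSuchElementException.
class Resource implements Comparable<Resource> {
    private final String name;
    private final TreeSet<Resource> children = new TreeSet<>();

    Resource(String name) { this.name = name; }

    // Unsafe: throws NoSuchElementException when children is empty.
    Resource getFirstChildUnsafe() { return children.first(); }

    // Defensive variant: report absence explicitly instead of throwing.
    Optional<Resource> getFirstChild() {
        return children.isEmpty() ? Optional.empty() : Optional.of(children.first());
    }

    @Override public int compareTo(Resource other) { return name.compareTo(other.name); }
}

public class EmptyFileDemo {
    public static void main(String[] args) {
        Resource emptyFile = new Resource("Empty.java");
        System.out.println(emptyFile.getFirstChild().isPresent()); // prints "false"
    }
}
```

This is why deleting (or restoring content to) the empty file works around the crash: the child set is no longer empty when the collector walks it.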
  • GitHub comment 156#232200766
    via GitHub by tsember
    • java.util.NoSuchElementException
          at java.util.TreeMap.key(Unknown Source)
          at java.util.TreeMap.firstKey(Unknown Source)
          at java.util.TreeSet.first(Unknown Source)
          at org.bukkit.craftbukkit.v1_8_R3.util.HashTreeSet.first(HashTreeSet.java:114)
          at net.minecraft.server.v1_8_R3.WorldServer.a(WorldServer.java:639)
          at net.minecraft.server.v1_8_R3.WorldServer.doTick(WorldServer.java:249)
          at net.minecraft.server.v1_8_R3.MinecraftServer.B(MinecraftServer.java:849)
          at net.minecraft.server.v1_8_R3.DedicatedServer.B(DedicatedServer.java:381)
          at net.minecraft.server.v1_8_R3.MinecraftServer.A(MinecraftServer.java:730)
          at net.minecraft.server.v1_8_R3.MinecraftServer.run(MinecraftServer.java:633)
          at java.lang.Thread.run(Unknown Source)
