
Solutions on the web

via opensearchserve by a90a, 1 year ago
Circular redirect to 'http://.../robots.txt/'
via Jenkins JIRA by Jérémie Charest, 1 year ago
Circular redirect to 'https://artifacts.xxx.com/artifactory/engine-snapshot/com/xxx/iweditor/1.0.0/iweditor-1.0.0.zip'
via JFrog JIRA by Jérémie Charest, 1 year ago
Circular redirect to 'https://artifacts.xxx.com/artifactory/engine-snapshot/com/xxx/iweditor/1.0.0/iweditor-1.0.0.zip'
via GitHub by Guite, 1 year ago
Circular redirect to 'http://www.jnario.org/updates/releases/p2.index'
org.apache.http.client.CircularRedirectException: Circular redirect to 'http://.../robots.txt/'
	at org.apache.http.impl.client.DefaultRedirectStrategy.getLocationURI(DefaultRedirectStrategy.java:174)
	at org.apache.http.impl.client.DefaultRedirectStrategy.getRedirect(DefaultRedirectStrategy.java:217)
	at com.jaeksoft.searchlib.crawler.web.spider.HttpAbstract.getRedirectLocation(HttpAbstract.java:222)
	at com.jaeksoft.searchlib.crawler.web.spider.HttpDownloader.getDownloadItem(HttpDownloader.java:94)
	at com.jaeksoft.searchlib.crawler.web.spider.HttpDownloader.request(HttpDownloader.java:123)
	at com.jaeksoft.searchlib.crawler.web.spider.HttpDownloader.request(HttpDownloader.java:159)
	at com.jaeksoft.searchlib.crawler.web.spider.HttpDownloader.get(HttpDownloader.java:175)
	at com.jaeksoft.searchlib.crawler.web.spider.Crawl.download(Crawl.java:283)
	at com.jaeksoft.searchlib.crawler.web.robotstxt.RobotsTxtCache.getRobotsTxt(RobotsTxtCache.java:107)
	at com.jaeksoft.searchlib.crawler.web.spider.Crawl.checkRobotTxtAllow(Crawl.java:236)
	at com.jaeksoft.searchlib.crawler.web.process.WebCrawlThread.crawl(WebCrawlThread.java:181)
	at com.jaeksoft.searchlib.crawler.web.process.WebCrawlThread.runner(WebCrawlThread.java:126)
	at com.jaeksoft.searchlib.process.ThreadAbstract.run(ThreadAbstract.java:291)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.lang.Thread.run(Unknown Source)
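The exception is raised when a redirect chain revisits a URL the client has already seen (here, following `robots.txt` redirected back onto itself). A minimal, self-contained sketch of that detection logic, which is essentially what `DefaultRedirectStrategy` checks, using a visited-set over a hypothetical redirect table (the class and map below are illustrative, not part of HttpClient):

```java
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Illustrative sketch: follow Location redirects, failing once a URL repeats.
public class RedirectFollower {

    // Stand-in for org.apache.http.client.CircularRedirectException.
    static class CircularRedirectException extends RuntimeException {
        CircularRedirectException(String url) {
            super("Circular redirect to '" + url + "'");
        }
    }

    // redirects maps a URL to the Location it redirects to;
    // a URL absent from the map is a final (non-redirecting) page.
    static String follow(String start, Map<String, String> redirects) {
        Set<String> visited = new HashSet<>();
        String current = start;
        while (redirects.get(current) != null) {
            // add() returns false if we have been here before: a cycle.
            if (!visited.add(current)) {
                throw new CircularRedirectException(current);
            }
            current = redirects.get(current);
        }
        return current;
    }
}
```

If the loop is caused by a server quirk rather than a real cycle, Apache HttpClient 4.x can be told to tolerate it via `RequestConfig.custom().setCircularRedirectsAllowed(true)` on the client's request configuration; the safer fix is usually to correct the redirect on the server (here, the trailing slash on `robots.txt/` suggests a misconfigured rewrite rule).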