java.io.IOException: Job failed!

Apache's JIRA Issue Tracker | Francesco Capponi | 6 months ago
  1. 0

    I have been having this problem for a while and had to roll back to the old Solr clean instead of the newer version. Indexing inserts/updates every document in Nutch correctly, but when it tries to clean, it fails with error 255:

    {quote}
    2016-05-30 10:13:04,992 WARN output.FileOutputCommitter - Output Path is null in setupJob()
    2016-05-30 10:13:07,284 INFO indexer.IndexWriters - Adding org.apache.nutch.indexwriter.solr.SolrIndexWriter
    2016-05-30 10:13:08,114 INFO solr.SolrMappingReader - source: content dest: content
    2016-05-30 10:13:08,114 INFO solr.SolrMappingReader - source: title dest: title
    2016-05-30 10:13:08,114 INFO solr.SolrMappingReader - source: host dest: host
    2016-05-30 10:13:08,114 INFO solr.SolrMappingReader - source: segment dest: segment
    2016-05-30 10:13:08,114 INFO solr.SolrMappingReader - source: boost dest: boost
    2016-05-30 10:13:08,114 INFO solr.SolrMappingReader - source: digest dest: digest
    2016-05-30 10:13:08,114 INFO solr.SolrMappingReader - source: tstamp dest: tstamp
    2016-05-30 10:13:08,133 INFO solr.SolrIndexWriter - SolrIndexer: deleting 15/15 documents
    2016-05-30 10:13:08,919 WARN output.FileOutputCommitter - Output Path is null in cleanupJob()
    2016-05-30 10:13:08,937 WARN mapred.LocalJobRunner - job_local662730477_0001
    java.lang.Exception: java.lang.IllegalStateException: Connection pool shut down
        at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
    Caused by: java.lang.IllegalStateException: Connection pool shut down
        at org.apache.http.util.Asserts.check(Asserts.java:34)
        at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:169)
        at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:202)
        at org.apache.http.impl.conn.PoolingClientConnectionManager.requestConnection(PoolingClientConnectionManager.java:184)
        at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:415)
        at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
        at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
        at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106)
        at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
        at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:480)
        at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:241)
        at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:230)
        at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:150)
        at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:483)
        at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:464)
        at org.apache.nutch.indexwriter.solr.SolrIndexWriter.commit(SolrIndexWriter.java:190)
        at org.apache.nutch.indexwriter.solr.SolrIndexWriter.close(SolrIndexWriter.java:178)
        at org.apache.nutch.indexer.IndexWriters.close(IndexWriters.java:115)
        at org.apache.nutch.indexer.CleaningJob$DeleterReducer.close(CleaningJob.java:120)
        at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)
        at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
        at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
    2016-05-30 10:13:09,299 ERROR indexer.CleaningJob - CleaningJob: java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:836)
        at org.apache.nutch.indexer.CleaningJob.delete(CleaningJob.java:172)
        at org.apache.nutch.indexer.CleaningJob.run(CleaningJob.java:195)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.nutch.indexer.CleaningJob.main(CleaningJob.java:206)
    {quote}
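    The failure pattern in the trace is that the reducer's close path reaches SolrIndexWriter.commit() after the underlying HTTP connection pool has already been shut down, so the lease attempt raises IllegalStateException("Connection pool shut down"). A minimal self-contained sketch of that pattern (hypothetical ConnPool/IndexWriter stand-ins, not the real HttpClient or Nutch classes):

    ```java
    public class ConnectionPoolDemo {
        /** Stand-in for org.apache.http.pool.AbstractConnPool. */
        static class ConnPool {
            private boolean shutdown = false;
            void shutdown() { shutdown = true; }
            String lease() {
                // Mirrors the check that throws in the trace above.
                if (shutdown) throw new IllegalStateException("Connection pool shut down");
                return "connection";
            }
        }

        /** Stand-in for SolrIndexWriter: commit() needs a pooled connection. */
        static class IndexWriter {
            private final ConnPool pool;
            IndexWriter(ConnPool pool) { this.pool = pool; }
            void commit() { pool.lease(); }
            // close() issues a final commit first, like SolrIndexWriter.close().
            void close() { commit(); }
        }

        static String reproduce() {
            ConnPool pool = new ConnPool();
            IndexWriter writer = new IndexWriter(pool);
            pool.shutdown();          // pool torn down before the final close()
            try {
                writer.close();       // the commit inside close() now fails
                return "no error";
            } catch (IllegalStateException e) {
                return e.getMessage();
            }
        }

        public static void main(String[] args) {
            System.out.println(reproduce()); // prints "Connection pool shut down"
        }
    }
    ```

    The sketch only illustrates the ordering bug: once anything closes the shared pool first, every later commit on the same client fails the same way, which is why the error surfaces in close() rather than during the deletes themselves.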

    Apache's JIRA Issue Tracker | 6 months ago | Francesco Capponi
    java.io.IOException: Job failed!
  2. 0

    Apache Nutch 1.12 with Apache Solr 6.2.1 gives an error

    Stack Overflow | 1 month ago | btaek
    java.io.IOException: Job failed!
  3. 0

    avro error

    GitHub | 3 years ago | priyolahiri
    java.io.IOException: Job failed!


    Root Cause Analysis

    1. java.io.IOException

      Job failed!

      at org.apache.hadoop.mapred.JobClient.runJob()
    2. Hadoop
      JobClient.runJob
      1. org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:836)
      1 frame
    3. Apache Nutch
      CleaningJob.run
      1. org.apache.nutch.indexer.CleaningJob.delete(CleaningJob.java:172)
      2. org.apache.nutch.indexer.CleaningJob.run(CleaningJob.java:195)
      2 frames
    4. Hadoop
      ToolRunner.run
      1. org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
      1 frame
    5. Apache Nutch
      CleaningJob.main
      1. org.apache.nutch.indexer.CleaningJob.main(CleaningJob.java:206)
      1 frame