Searched on Google with the first line of a Java stack trace?

We can recommend more relevant solutions and speed up debugging when you paste your entire stack trace with the exception message. Try a sample exception.

Recommended solutions based on your search

Solutions on the web

via Stack Overflow by Keshav Rathi, 1 year ago
Found unrecoverable error [127.0.0.1:9200] returned Bad Request(400) - failed to parse; Bailing out..
via Stack Overflow by salvob, 5 months ago
Found unrecoverable error [127.0.0.1:9200] returned Bad Request(400) - failed to parse; Bailing out..
via GitHub by JustUse, 7 months ago
Found unrecoverable error [10.0.0.12:9200] returned Bad Request(400) - failed to parse, document is empty; Bailing out..
via Stack Overflow by severine, 8 months ago
Found unrecoverable error [127.0.0.1:9200] returned Bad Request(400) - failed to parse; Bailing out..
via Stack Overflow by Rahul, 1 year ago
Found unrecoverable error [xx.xxx.xx.xx:10200] returned Bad Request(400) - failed to parse; Bailing out..
org.elasticsearch.hadoop.rest.EsHadoopInvalidRequest: Found unrecoverable error [127.0.0.1:9200] returned Bad Request(400) - failed to parse; Bailing out..
	at org.elasticsearch.hadoop.rest.RestClient.processBulkResponse(RestClient.java:250)
	at org.elasticsearch.hadoop.rest.RestClient.bulk(RestClient.java:202)
	at org.elasticsearch.hadoop.rest.RestRepository.tryFlush(RestRepository.java:220)
	at org.elasticsearch.hadoop.rest.RestRepository.flush(RestRepository.java:242)
	at org.elasticsearch.hadoop.rest.RestRepository.close(RestRepository.java:267)
	at org.elasticsearch.hadoop.rest.RestService$PartitionWriter.close(RestService.java:120)
	at org.elasticsearch.spark.rdd.EsRDDWriter$$anonfun$write$1.apply(EsRDDWriter.scala:42)
	at org.elasticsearch.spark.rdd.EsRDDWriter$$anonfun$write$1.apply(EsRDDWriter.scala:42)
	at org.apache.spark.TaskContext$$anon$1.onTaskCompletion(TaskContext.scala:123)
	at org.apache.spark.TaskContextImpl$$anonfun$markTaskCompleted$1.apply(TaskContextImpl.scala:97)
	at org.apache.spark.TaskContextImpl$$anonfun$markTaskCompleted$1.apply(TaskContextImpl.scala:95)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.TaskContextImpl.markTaskCompleted(TaskContextImpl.scala:95)
	at org.apache.spark.scheduler.Task.run(Task.scala:99)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
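As the "document is empty" variants above suggest, a 400 "failed to parse" from the bulk endpoint usually means at least one document in the batch was empty or not valid JSON, and elasticsearch-hadoop bails out on the whole flush. A minimal sketch of a pre-flight filter for serialized documents (this helper is hypothetical, not part of elasticsearch-hadoop; a real check might parse each document with a JSON library):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical pre-flight filter: drops documents that would make the
// Elasticsearch bulk endpoint return "failed to parse, document is empty".
public class BulkDocFilter {

    // Keep only documents that are non-null, non-blank, not an empty
    // object, and at least superficially JSON (start with '{').
    static List<String> dropUnparseableDocs(List<String> docs) {
        return docs.stream()
                .filter(d -> d != null)
                .map(String::trim)
                .filter(d -> !d.isEmpty() && !d.equals("{}") && d.startsWith("{"))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> docs = Arrays.asList(
                "{\"id\":1,\"msg\":\"ok\"}",
                "",    // would trigger "document is empty"
                "{}",  // empty object, rejected by strict mappings
                null,
                "{\"id\":2}");
        // Only the two well-formed documents survive the filter
        System.out.println(dropUnparseableDocs(docs));
    }
}
```

In a Spark job this kind of filter would run on the RDD before `saveToEs`, so a single bad record cannot poison an entire bulk flush.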