java.lang.Exception: org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []

Recommended solutions based on your search

Solutions on the web

via Stack Overflow by Amit Valse, 1 year ago
org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
via Stack Overflow by SUDARSHAN, 5 months ago
java.io.IOException: Could not find a deserializer for the Value class: 'org.apache.hadoop.hbase.client.Result'. Please ensure that the configuration 'io.serializations' is properly configured, if you're using custom serialization.
via JIRA by Geoff Minerbo, 1 year ago
org.apache.pig.backend.executionengine.ExecException: ERROR 2108: Could not determine data type of field: [B@3982a033
via Google Groups by Dmitriy Morozov, 2 years ago
cascading.tuple.TupleException: given TupleEntry fields: [{?}:UNKNOWN] do not match the operation declaredFields: [{2}:'id', 'comments'], operations must emit tuples that match the fields they declare as output
via Stack Overflow by Green, 6 months ago
org.apache.pig.backend.executionengine.ExecException: ERROR 2055: Received Error while processing the map plan: 'testpy22.py ' failed with exit status: 1
org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
at org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:278)
at org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:197)
at org.elasticsearch.client.transport.support.InternalTransportClient.execute(InternalTransportClient.java:106)
at org.elasticsearch.client.support.AbstractClient.bulk(AbstractClient.java:163)
at org.elasticsearch.client.transport.TransportClient.bulk(TransportClient.java:364)
at org.elasticsearch.action.bulk.BulkRequestBuilder.doExecute(BulkRequestBuilder.java:164)
at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:91)
at org.apache.nutch.indexwriter.elastic.ElasticIndexWriter.commit(ElasticIndexWriter.java:208)
at org.apache.nutch.indexwriter.elastic.ElasticIndexWriter.close(ElasticIndexWriter.java:226)
at org.apache.nutch.indexer.IndexWriters.close(IndexWriters.java:114)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:650)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:767)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
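
A NoNodeAvailableException with an empty node list ("[]") from this 1.x-era TransportClient usually means the client never connected to any node: the transport address or port is wrong (the transport port is 9300, not the 9200 HTTP port), the node is unreachable, the cluster.name in the client settings does not match the server, or the client and server Elasticsearch versions differ. The sketch below shows how such a client is typically wired up so the connection settings can be checked in isolation; the host, port, and cluster name are placeholders, and the property names in the comments (elastic.host, elastic.port, elastic.cluster) are the ones Nutch's indexer-elastic plugin is commonly configured with in nutch-site.xml, so verify them against your own setup.

    import org.elasticsearch.client.transport.TransportClient;
    import org.elasticsearch.common.settings.ImmutableSettings;
    import org.elasticsearch.common.settings.Settings;
    import org.elasticsearch.common.transport.InetSocketTransportAddress;

    public class EsConnectionCheck {
        public static void main(String[] args) {
            // Placeholder: cluster.name must match the cluster.name set in
            // elasticsearch.yml on the server (Nutch takes this from elastic.cluster).
            Settings settings = ImmutableSettings.settingsBuilder()
                    .put("cluster.name", "my-cluster")
                    .build();

            // Placeholder host/port: the binary transport port, 9300 by default
            // (Nutch takes these from elastic.host / elastic.port).
            TransportClient client = new TransportClient(settings)
                    .addTransportAddress(new InetSocketTransportAddress("es-host", 9300));

            // If no node could be reached, this list is empty and any request
            // (such as the bulk call in the trace above) fails with
            // NoNodeAvailableException.
            System.out.println("Connected nodes: " + client.connectedNodes());
            client.close();
        }
    }

If the connected-node list prints empty even with correct settings, also check that the Elasticsearch jar on the Nutch classpath is the same version as the server: the binary transport protocol is not compatible across versions, and a mismatch surfaces as the same NoNodeAvailableException.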
