org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException

Connection error (check network and/or proxy settings)- all nodes failed; tried [[127.0.0.1:9200]]

Solutions on the web

  • via GitHub by bwf93, 9 months ago
    Connection error (check network and/or proxy settings)- all nodes failed; tried [[127.0.0.1:9200]]
Stack trace

org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed; tried [[127.0.0.1:9200]]
    at org.elasticsearch.hadoop.rest.NetworkClient.execute(NetworkClient.java:150)
    at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:444)
    at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:436)
    at org.elasticsearch.hadoop.rest.RestRepository.scroll(RestRepository.java:363)
    at org.elasticsearch.hadoop.rest.ScrollQuery.hasNext(ScrollQuery.java:92)
    at org.elasticsearch.spark.rdd.AbstractEsRDDIterator.hasNext(AbstractEsRDDIterator.scala:43)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
    at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
    at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:370)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
    at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:125)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
    at org.apache.spark.scheduler.Task.run(Task.scala:86)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
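
This exception is raised by elasticsearch-hadoop when none of the configured Elasticsearch nodes could be reached. The [[127.0.0.1:9200]] in the message shows the connector was still using its default localhost address, which typically fails as soon as the job runs on Spark executors that are not on the same host as Elasticsearch, or when the cluster is only reachable through a gateway. Below is a minimal Scala sketch of pointing the Spark connector at reachable nodes; the host names, index name, and the use of es.nodes.wan.only are illustrative assumptions, not values from the original report, and the elasticsearch-spark connector artifact must be on the classpath.

    import org.apache.spark.sql.SparkSession

    object EsReadExample {
      def main(args: Array[String]): Unit = {
        // Host names, port, and index name below are placeholders (assumptions).
        val spark = SparkSession.builder()
          .appName("es-read-example")
          // Point the connector at nodes reachable from every Spark executor,
          // instead of the default 127.0.0.1:9200 seen in the exception above.
          .config("es.nodes", "es-node-1.example.com,es-node-2.example.com")
          .config("es.port", "9200")
          // If the cluster sits behind a load balancer or in another network,
          // keep the connector from trying to contact data nodes directly.
          .config("es.nodes.wan.only", "true")
          .getOrCreate()

        // Read an index through the elasticsearch-spark SQL data source.
        val df = spark.read
          .format("org.elasticsearch.spark.sql")
          .load("my-index")

        df.show()
        spark.stop()
      }
    }

The same es.nodes / es.port / es.nodes.wan.only settings can also be passed per read or write via .option(...) or on the SparkConf if a global default is not wanted.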
