org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed; tried [[127.0.0.1:9200]]

GitHub | bwf93 | 4 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. GitHub comment 868#254860752

     GitHub | 4 months ago | bwf93
     org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed; tried [[127.0.0.1:9200]]

  2. Spark SQL support broken in rc1

     GitHub | 4 months ago | bwf93
     org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed; tried [[127.0.0.1:9200]]

  3. shard preference concatenation with | gives query error

     GitHub | 4 months ago | megri
     org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed; tried [[localhost:9200]]

    Root Cause Analysis

    1. org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException

      Connection error (check network and/or proxy settings)- all nodes failed; tried [[127.0.0.1:9200]]

      at org.elasticsearch.hadoop.rest.NetworkClient.execute()
    2. Elasticsearch Hadoop
      ScrollQuery.hasNext
      1. org.elasticsearch.hadoop.rest.NetworkClient.execute(NetworkClient.java:150)
      2. org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:444)
      3. org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:436)
      4. org.elasticsearch.hadoop.rest.RestRepository.scroll(RestRepository.java:363)
      5. org.elasticsearch.hadoop.rest.ScrollQuery.hasNext(ScrollQuery.java:92)
      5 frames
    3. Elasticsearch Spark
      AbstractEsRDDIterator.hasNext
      1. org.elasticsearch.spark.rdd.AbstractEsRDDIterator.hasNext(AbstractEsRDDIterator.scala:43)
      1 frame
    4. Scala
      Iterator$$anon$11.hasNext
      1. scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
      1 frame
    5. Spark Project Catalyst
      GeneratedClass$GeneratedIterator.processNext
      1. org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
      1 frame
    6. Spark Project SQL
      WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext
      1. org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
      2. org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:370)
      2 frames
    7. Scala
      Iterator$$anon$11.hasNext
      1. scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
      1 frame
    8. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:125)
      2. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
      3. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
      4. org.apache.spark.scheduler.Task.run(Task.scala:86)
      5. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
      5 frames
    9. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
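
    The [[127.0.0.1:9200]] in the message means the connector fell back to its default node address, so every scroll request went to localhost and failed. A minimal sketch, assuming Spark 2.x with the elasticsearch-spark connector on the classpath (the host names and the my-index/my-type resource below are placeholders, not taken from the report):

        import org.apache.spark.sql.SparkSession

        // Sketch only: point es-hadoop at the actual cluster instead of the
        // default 127.0.0.1:9200. Replace the placeholder hosts and index.
        object EsConnectionSketch {
          def main(args: Array[String]): Unit = {
            val spark = SparkSession.builder()
              .appName("es-connection-check")
              .master("local[*]")                         // for a local test run
              .config("es.nodes", "es-host-1,es-host-2")  // reachable ES hosts
              .config("es.port", "9200")
              // Helpful when Elasticsearch sits behind Docker/cloud networking and
              // its published node addresses are not routable from the executors.
              .config("es.nodes.wan.only", "true")
              .getOrCreate()

            // Reading through the Spark SQL integration exercises the same
            // RestRepository.scroll / ScrollQuery.hasNext path shown in the trace.
            val df = spark.read
              .format("org.elasticsearch.spark.sql")
              .load("my-index/my-type")

            df.show()
          }
        }

    If the nodes are configured correctly and still unreachable, the same exception can also surface when a proxy or firewall blocks the HTTP port, which is why the message asks to check network and proxy settings first.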