java.net.ConnectException: Call From ip-10-0-1-215.us-west-1.compute.internal/10.0.1.215 to 52.8.16.250:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused

Stack Overflow | dbl001 | 7 months ago
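
"Connection refused" here means the TCP handshake itself was rejected: nothing is accepting connections at 52.8.16.250:8020. That typically means the NameNode is down, is bound to a different interface (on EC2 it usually binds to the private address, not the public one), or the port is blocked by a firewall or security group. A minimal sketch (not from the original post) for checking reachability of the NameNode RPC port before digging into Spark, in plain Scala:

    import java.net.{InetSocketAddress, Socket}

    object NameNodeProbe {
      def main(args: Array[String]): Unit = {
        val socket = new Socket()
        try {
          // 5-second timeout; succeeds only if a process is listening on the RPC port
          socket.connect(new InetSocketAddress("52.8.16.250", 8020), 5000)
          println("NameNode RPC port is reachable")
        } catch {
          case e: java.io.IOException => println(s"cannot reach NameNode: ${e.getMessage}")
        } finally {
          socket.close()
        }
      }
    }
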
Here are the best solutions we found on the Internet.
  1. Spark EC2 Connection refused

    Stack Overflow | 7 months ago | dbl001
    java.net.ConnectException: Call From ip-10-0-1-215.us-west-1.compute.internal/10.0.1.215 to 52.8.16.250:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
  2. Unable to create hive table from spark

    Stack Overflow | 1 year ago | Koushik Chandra
    java.net.ConnectException: Call From ubuntu/127.0.0.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
  3. Spark 1.6.3 configuration on Hadoop 2.7.3 in fully distributed mode

    Stack Overflow | 3 weeks ago | Mahmud
    java.net.ConnectException: Call From master/10.0.0.1 to master:8021 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
  4. Spark cluster computing framework

    gmane.org | 1 year ago
    java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
  5. After namenode failed, some regions stuck in Closed state

    Google Groups | 1 year ago | Heng Chen
    java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
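
All of the threads above share the same root cause: the HDFS client resolves fs.defaultFS (or a hard-coded hdfs:// URI) to a host:port where no NameNode is listening. In the EC2 case this is the public IP, while the NameNode binds only to the private address. A minimal sketch of the fix, assuming a Spark 1.x shell session like the one in the trace (`sc` is the shell's SparkContext); the private hostname below is a placeholder for whatever `hdfs getconf -confKey fs.defaultFS` reports on the master:

    // Point the Hadoop client at the address the NameNode actually binds to
    // (placeholder hostname -- substitute your master's private DNS name):
    sc.hadoopConfiguration.set("fs.defaultFS",
      "hdfs://ip-10-0-1-215.us-west-1.compute.internal:8020")

    // Or spell the authority out in the path itself (placeholder path):
    val docs = sc.textFile("hdfs://ip-10-0-1-215.us-west-1.compute.internal:8020/path/to/input")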

    Root Cause Analysis

    1. java.net.ConnectException

      Call From ip-10-0-1-215.us-west-1.compute.internal/10.0.1.215 to 52.8.16.250:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused

      at sun.reflect.NativeConstructorAccessorImpl.newInstance0()
    2. Java RT
      Constructor.newInstance
      1. sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      2. sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      3. sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      4. java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      4 frames
    3. Hadoop
      ProtobufRpcEngine$Invoker.invoke
      1. org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
      2. org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:731)
      3. org.apache.hadoop.ipc.Client.call(Client.java:1472)
      4. org.apache.hadoop.ipc.Client.call(Client.java:1399)
      5. org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
      5 frames
    4. com.sun.proxy
      $Proxy32.getFileInfo
      1. com.sun.proxy.$Proxy32.getFileInfo(Unknown Source)
      1 frame
    5. Apache Hadoop HDFS
      ClientNamenodeProtocolTranslatorPB.getFileInfo
      1. org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
      1 frame
    6. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:498)
      4 frames
    7. Hadoop
      RetryInvocationHandler.invoke
      1. org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
      2. org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
      2 frames
    8. com.sun.proxy
      $Proxy33.getFileInfo
      1. com.sun.proxy.$Proxy33.getFileInfo(Unknown Source)
      1 frame
    9. Apache Hadoop HDFS
      DistributedFileSystem$18.doCall
      1. org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1988)
      2. org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1118)
      3. org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
      3 frames
    10. Hadoop
      FileSystemLinkResolver.resolve
      1. org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
      1 frame
    11. Apache Hadoop HDFS
      DistributedFileSystem.getFileStatus
      1. org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
      1 frame
    12. Hadoop
      FileSystem.globStatus
      1. org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
      2. org.apache.hadoop.fs.Globber.glob(Globber.java:252)
      3. org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1644)
      3 frames
    13. Hadoop
      FileInputFormat.getSplits
      1. org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:292)
      2. org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:264)
      3. org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:385)
      3 frames
    14. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:120)
      2. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
      3 frames
    15. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    16. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
      2. org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
      4 frames
    17. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    18. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
      2. org.apache.spark.rdd.PartitionwiseSampledRDD.getPartitions(PartitionwiseSampledRDD.scala:58)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
      4 frames
    19. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    20. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
      2. org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
      4 frames
    21. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    22. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
      2. org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
      4 frames
    23. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    24. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
      2. org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
      4 frames
    25. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    26. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
      2. org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
      4 frames
    27. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    28. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
      2. org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
      4 frames
    29. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    30. Spark
      RDD$$anonfun$partitions$2.apply
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
      2. org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
      3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
      4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
      4 frames
    31. Scala
      Option.getOrElse
      1. scala.Option.getOrElse(Option.scala:120)
      1 frame
    32. Spark
      RDD.zipWithUniqueId
      1. org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
      2. org.apache.spark.rdd.RDD$$anonfun$zipWithUniqueId$1.apply(RDD.scala:1286)
      3. org.apache.spark.rdd.RDD$$anonfun$zipWithUniqueId$1.apply(RDD.scala:1285)
      4. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
      5. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
      6. org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
      7. org.apache.spark.rdd.RDD.zipWithUniqueId(RDD.scala:1285)
      7 frames
    33. com.cloudera.datascience
      ParseWikipedia$.documentTermMatrix
      1. com.cloudera.datascience.lsa.ParseWikipedia$.documentTermMatrix(ParseWikipedia.scala:48)
      1 frame
    34. Unknown
      $iwC.<init>
      1. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$RunLSA$.preprocessing(<console>:111)
      2. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:71)
      3. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:80)
      4. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:82)
      5. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:84)
      6. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:86)
      7. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:88)
      8. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:90)
      9. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:92)
      10. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:94)
      11. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:96)
      12. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:98)
      13. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:100)
      14. $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:102)
      15. $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:104)
      16. $iwC$$iwC$$iwC$$iwC.<init>(<console>:106)
      17. $iwC$$iwC$$iwC.<init>(<console>:108)
      18. $iwC$$iwC.<init>(<console>:110)
      19. $iwC.<init>(<console>:112)
      19 frames
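
Note where the connection is actually attempted: not when the input RDD is defined, but inside RDD.zipWithUniqueId (frame group 32), which needs the partition count up front and therefore walks the whole lineage (the repeated RDD.partitions / Option.getOrElse groups) down to NewHadoopRDD.getPartitions, FileInputFormat.getSplits, and finally the NameNode getFileInfo RPC. A minimal sketch of that ordering, using the same new-Hadoop-API input path as the trace and a placeholder file name:

    import org.apache.hadoop.io.{LongWritable, Text}
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat

    // Defining the RDD performs no I/O; the hdfs:// URI is only recorded in the lineage.
    val lines = sc.newAPIHadoopFile(
      "hdfs://52.8.16.250:8020/some/input.xml",   // placeholder path
      classOf[TextInputFormat], classOf[LongWritable], classOf[Text])

    // zipWithUniqueId computes RDD.partitions eagerly, which triggers getSplits and
    // the NameNode RPC -- this line, not the one above, throws the ConnectException
    // when nothing is listening on 52.8.16.250:8020.
    val withIds = lines.zipWithUniqueId()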