java.lang.IllegalArgumentException: java.net.UnknownHostException: namenode1.hdfs.mesos

GitHub | martinstuder | 4 months ago
  1. GitHub comment 613#239890055

     GitHub | 4 months ago | martinstuder
     java.lang.IllegalArgumentException: java.net.UnknownHostException: namenode1.hdfs.mesos

  2. When reading text file from file system, Spark still tries to connect to HDFS

     Stack Overflow | 5 months ago | ktaube
     java.lang.IllegalArgumentException: java.net.UnknownHostException: namenode1.hdfs.mesos

  3. Do I need to install hdfs to run my Spark jobs

     GitHub | 10 months ago | yogeshnath
     java.lang.IllegalArgumentException: java.net.UnknownHostException: namenode1.hdfs.mesos
  4. GitHub comment 250#219496747

     GitHub | 7 months ago | radek1st
     java.lang.IllegalArgumentException: java.net.UnknownHostException: namenode1.hdfs.mesos
  5. Hbase - hadoop handling DNS blips

     Google Groups | 2 years ago | Arun Mishra
     java.io.IOException: cannot get log writer
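
The reports above share a single root cause: the driver's Hadoop configuration (fs.defaultFS, or the HA nameservice in hdfs-site.xml) points at a namenode host, namenode1.hdfs.mesos, that the local resolver cannot look up (typically because Mesos-DNS is not in the node's resolver chain). Note that an explicit file:// scheme on the input path is not enough on its own: as the trace below shows, FileInputFormat.setInputPaths calls JobConf.getWorkingDirectory, which instantiates the default filesystem regardless of the input path's scheme. A minimal workaround sketch, assuming the Spark 1.x shell from the trace and a hypothetical local file /tmp/sample.txt:

    // Minimal sketch, assuming a Spark 1.x spark-shell; /tmp/sample.txt is a
    // hypothetical local path, not taken from any of the reports above.

    // Point the default filesystem away from the unresolvable HDFS nameservice
    // *before* creating the RDD (textFile broadcasts the Hadoop conf at call time).
    sc.hadoopConfiguration.set("fs.defaultFS", "file:///")

    // With the default FS local, JobConf.getWorkingDirectory no longer needs to
    // resolve namenode1.hdfs.mesos, and the read succeeds.
    val firstLine = sc.textFile("file:///tmp/sample.txt").first()
    println(firstLine)

When the job really does need HDFS, the right fix is to make the name resolvable instead, e.g. by pointing the node's resolver at the Mesos-DNS server.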

Root Cause Analysis

java.lang.IllegalArgumentException: java.net.UnknownHostException: namenode1.hdfs.mesos
    at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:377)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:240)
    at org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider.getProxy(ConfiguredFailoverProxyProvider.java:124)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.<init>(RetryInvocationHandler.java:74)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.<init>(RetryInvocationHandler.java:65)
    at org.apache.hadoop.io.retry.RetryProxy.create(RetryProxy.java:58)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:152)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:579)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:524)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:146)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2397)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:167)
    at org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:653)
    at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:427)
    at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:400)
    at org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$33.apply(SparkContext.scala:1015)
    at org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$33.apply(SparkContext.scala:1015)
    at org.apache.spark.rdd.HadoopRDD$$anonfun$getJobConf$6.apply(HadoopRDD.scala:176)
    at org.apache.spark.rdd.HadoopRDD$$anonfun$getJobConf$6.apply(HadoopRDD.scala:176)
    at scala.Option.map(Option.scala:145)
    at org.apache.spark.rdd.HadoopRDD.getJobConf(HadoopRDD.scala:176)
    at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:195)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1307)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    at org.apache.spark.rdd.RDD.take(RDD.scala:1302)
    at org.apache.spark.rdd.RDD$$anonfun$first$1.apply(RDD.scala:1342)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    at org.apache.spark.rdd.RDD.first(RDD.scala:1341)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:41)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:43)
    at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:45)
    at $iwC$$iwC$$iwC$$iwC.<init>(<console>:47)
    at $iwC$$iwC$$iwC.<init>(<console>:49)
    at $iwC$$iwC.<init>(<console>:51)
    at $iwC.<init>(<console>:53)
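
Reading the trace bottom-up: first() in the spark-shell forces partition computation for the RDD chain; HadoopRDD.getJobConf then calls FileInputFormat.setInputPaths, which instantiates the default FileSystem; because the default filesystem is an HA HDFS nameservice, ConfiguredFailoverProxyProvider asks SecurityUtil.buildTokenService for a token-service address, and the failed DNS lookup of namenode1.hdfs.mesos surfaces as the IllegalArgumentException at the top. A hedged diagnostic sketch (only the hostname is taken from the exception; everything else is illustrative):

    // Diagnostic sketch, runnable in the same spark-shell. Only the hostname
    // comes from the exception; the rest is illustrative.
    import java.net.InetAddress

    val host = "namenode1.hdfs.mesos"
    try {
      // If this throws, the problem is DNS visibility from this node,
      // not Spark or HDFS configuration.
      println(s"$host -> ${InetAddress.getByName(host).getHostAddress}")
    } catch {
      case _: java.net.UnknownHostException =>
        println(s"$host does not resolve; check the Mesos-DNS resolver on this node")
    }

    // FileSystem.get walks exactly the path shown in the trace:
    // DistributedFileSystem.initialize -> ConfiguredFailoverProxyProvider.getProxy
    //   -> SecurityUtil.buildTokenService
    println("fs.defaultFS = " + sc.hadoopConfiguration.get("fs.defaultFS"))
    val fs = org.apache.hadoop.fs.FileSystem.get(sc.hadoopConfiguration)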