java.lang.IllegalArgumentException: java.net.UnknownHostException: user

Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. java.lang.IllegalArgumentException: java.net.Unkno... - Cloudera Community
     cloudera.com | 8 months ago
     java.lang.IllegalArgumentException: java.net.UnknownHostException: user
  2. java.lang.IllegalArgumentException: java.net.Unkno... - Cloudera Community
     cloudera.com | 1 month ago
     java.lang.IllegalArgumentException: java.net.UnknownHostException: user
  3. Apache Spark Sql issue in multi node hadoop cluster
     Stack Overflow | 2 years ago | wazza
     java.lang.IllegalArgumentException: java.net.UnknownHostException: hadoopcluster
  4. Solved: Can "show tables" but don't "SELECT FROM" Hive tab... - Cloudera Community
     cloudera.com | 8 months ago
     java.lang.IllegalArgumentException: java.net.UnknownHostException: quickstart.cloudera
  5. Doesn't work with HA Namenodes (CDH 4.3)
     GitHub | 3 years ago | nikore
     java.lang.RuntimeException: java.io.IOException: java.lang.IllegalArgumentException: java.net.UnknownHostException: sf-cluster
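A common thread in the reports above is that the "host" Hadoop fails to resolve (`user`, `hadoopcluster`, `sf-cluster`) is not a real machine name at all, but a path segment or logical cluster name that ended up in the authority slot of an `hdfs://` URI. A minimal stdlib-only sketch (the path is illustrative) shows how writing two slashes instead of three turns the first path segment into a hostname:

```java
import java.net.URI;

public class HdfsUriCheck {
    public static void main(String[] args) {
        // Meant as the path "/user/hive/warehouse" on the default filesystem,
        // but with only two slashes after the scheme, java.net.URI parses
        // "user" as the authority (host) component.
        URI wrong = URI.create("hdfs://user/hive/warehouse");
        System.out.println(wrong.getHost());   // prints "user"

        // With three slashes the authority is empty, so Hadoop falls back to
        // fs.defaultFS instead of trying to resolve "user" via DNS.
        URI right = URI.create("hdfs:///user/hive/warehouse");
        System.out.println(right.getHost());   // prints "null"
    }
}
```

If the bogus host matches the start of a path you pass to Spark or Hive (as with `user` here), checking the slashes in that path is usually faster than debugging DNS.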


Root Cause Analysis

  1. java.lang.IllegalArgumentException

    java.net.UnknownHostException: user

    at org.apache.hadoop.security.SecurityUtil.buildTokenService()
  2. Hadoop
    SecurityUtil.buildTokenService
    1. org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:377)
    1 frame
  3. Apache Hadoop HDFS
    DistributedFileSystem.initialize
    1. org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:237)
    2. org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:141)
    3. org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:576)
    4. org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:521)
    5. org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:146)
    5 frames
  4. Hadoop
    Path.getFileSystem
    1. org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2397)
    2. org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
    3. org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
    4. org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
    5. org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
    6. org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
    6 frames
  5. Hadoop
    FileInputFormat.getSplits
    1. org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:221)
    2. org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:270)
    2 frames
  6. Spark
    RDD$$anonfun$partitions$2.apply
    1. org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:140)
    2. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
    3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
    3 frames
  7. Scala
    Option.getOrElse
    1. scala.Option.getOrElse(Option.scala:120)
    1 frame
  8. Spark
    RDD$$anonfun$partitions$2.apply
    1. org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
    2. org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
    3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
    4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
    4 frames
  9. Scala
    Option.getOrElse
    1. scala.Option.getOrElse(Option.scala:120)
    1 frame
  10. Spark
    RDD$$anonfun$partitions$2.apply
    1. org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
    2. org.apache.spark.rdd.FlatMappedRDD.getPartitions(FlatMappedRDD.scala:30)
    3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
    4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
    4 frames
  11. Scala
    Option.getOrElse
    1. scala.Option.getOrElse(Option.scala:120)
    1 frame
  12. Spark
    RDD$$anonfun$partitions$2.apply
    1. org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
    2. org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
    3. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
    4. org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
    4 frames
  13. Scala
    Option.getOrElse
    1. scala.Option.getOrElse(Option.scala:120)
    1 frame
  14. Spark
    PairRDDFunctions.reduceByKey
    1. org.apache.spark.rdd.RDD.partitions(RDD.scala:205)
    2. org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:58)
    3. org.apache.spark.rdd.PairRDDFunctions.reduceByKey(PairRDDFunctions.scala:355)
    3 frames
  15. Unknown
    $iwC.<init>
    1. $iwC$$iwC$$iwC$$iwC.<init>(<console>:14)
    2. $iwC$$iwC$$iwC.<init>(<console>:19)
    3. $iwC$$iwC.<init>(<console>:21)
    4. $iwC.<init>(<console>:23)
    4 frames
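The `NameNodeProxies.createNonHAProxy` frame above shows the client treating the URI authority as a plain hostname to connect to. For the HA case reported on GitHub, `sf-cluster` is a logical nameservice, so this path only fails because the client configuration never declared it. A hedged sketch of the client-side `hdfs-site.xml` that makes such a logical name resolvable (the nameservice name matches the report; hostnames and ports are placeholders):

```xml
<property>
  <name>dfs.nameservices</name>
  <value>sf-cluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.sf-cluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.sf-cluster.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.sf-cluster.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.sf-cluster</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

With these properties present, the client takes the HA proxy path instead of `createNonHAProxy`, and no DNS lookup of the logical name is attempted.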