java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs

Apache's JIRA Issue Tracker | Thomas Graves | 3 years ago
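
The headline exception is Hadoop plumbing rather than the real failure: UserGroupInformation.doAs runs its action through Subject.doAs, and when the action dies with a checked exception it does not know how to rethrow (i.e. not an IOException, InterruptedException, Error, or RuntimeException), it wraps it as UndeclaredThrowableException("Unknown exception in doAs") with the actual failure as the cause. A minimal Scala sketch of that wrapping; the TimeoutException here is just a stand-in for any such checked exception:

    import java.lang.reflect.UndeclaredThrowableException
    import java.security.PrivilegedExceptionAction
    import java.util.concurrent.TimeoutException
    import org.apache.hadoop.security.UserGroupInformation

    object DoAsWrapping {
      def main(args: Array[String]): Unit = {
        val ugi = UserGroupInformation.getCurrentUser
        try {
          ugi.doAs(new PrivilegedExceptionAction[Unit] {
            // Checked, and not one of the types UGI rethrows directly.
            override def run(): Unit = throw new TimeoutException("real failure")
          })
        } catch {
          case e: UndeclaredThrowableException =>
            // Prints "Unknown exception in doAs"; the real error is in getCause.
            println(s"${e.getMessage} caused by ${e.getCause}")
        }
      }
    }

So when this exception shows up, the diagnostic value is almost entirely in the cause chain, which is what the Root Cause Analysis below reconstructs.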
  1. [jira] [Commented] (SPARK-1407) EventLogging to HDFS doesn't work properly on yarn

     apache.org | 1 year ago
     java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
  2. [SPARK-1407] EventLogging to HDFS doesn't work properly on yarn - ASF JIRA

     apache.org | 1 year ago
     java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
  3. When running Spark on YARN and accessing an HDFS file (as in the SparkHdfsLR example) while event logging is configured to write its logs to HDFS, an exception is thrown at the end of the application (a workaround sketch follows this list):

     SPARK_JAVA_OPTS=-Dspark.eventLog.enabled=true -Dspark.eventLog.dir=hdfs:///history/spark/

     14/04/03 13:41:31 INFO yarn.ApplicationMaster$$anon$1: Invoking sc stop from shutdown hook
     Exception in thread "Thread-41" java.io.IOException: Filesystem closed
         at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:398)
         at org.apache.hadoop.hdfs.DFSOutputStream.hflush(DFSOutputStream.java:1465)
         at org.apache.hadoop.hdfs.DFSOutputStream.sync(DFSOutputStream.java:1450)
         at org.apache.hadoop.fs.FSDataOutputStream.sync(FSDataOutputStream.java:116)
         at org.apache.spark.util.FileLogger$$anonfun$flush$2.apply(FileLogger.scala:137)
         at org.apache.spark.util.FileLogger$$anonfun$flush$2.apply(FileLogger.scala:137)
         at scala.Option.foreach(Option.scala:236)
         at org.apache.spark.util.FileLogger.flush(FileLogger.scala:137)
         at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:69)
         at org.apache.spark.scheduler.EventLoggingListener.onApplicationEnd(EventLoggingListener.scala:101)
         at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$13.apply(SparkListenerBus.scala:67)
         at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$13.apply(SparkListenerBus.scala:67)
         at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
         at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
         at org.apache.spark.scheduler.SparkListenerBus$class.postToAll(SparkListenerBus.scala:67)
         at org.apache.spark.scheduler.LiveListenerBus.postToAll(LiveListenerBus.scala:31)
         at org.apache.spark.scheduler.LiveListenerBus.post(LiveListenerBus.scala:78)
         at org.apache.spark.SparkContext.postApplicationEnd(SparkContext.scala:1081)
         at org.apache.spark.SparkContext.stop(SparkContext.scala:828)
         at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$1.run(ApplicationMaster.scala:460)

     Apache's JIRA Issue Tracker | 3 years ago | Thomas Graves
     java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
  4. [HADOOP-9593] stack trace printed at ERROR for all yarn clients without hadoop.home set - ASF JIRA

     apache.org | 11 months ago
     java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
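
The "Filesystem closed" failure in result 3 above is the classic Hadoop FileSystem-cache pitfall: FileSystem.get() returns a shared instance cached per scheme, authority, and user, so when one component closes it during shutdown, every other holder (here Spark's EventLoggingListener flushing from the shutdown hook) hits a dead DFSClient. Below is a minimal Scala sketch of the pitfall and two common workarounds; this is not the SPARK-1407 fix itself, and the URI assumes fs.defaultFS points at a reachable NameNode:

    import java.net.URI
    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.FileSystem

    object CachedFsPitfall {
      def main(args: Array[String]): Unit = {
        val conf = new Configuration()
        val uri  = URI.create("hdfs:///") // assumes fs.defaultFS names a reachable NameNode

        // Both calls return the SAME cached instance.
        val fsA = FileSystem.get(uri, conf)
        val fsB = FileSystem.get(uri, conf)
        assert(fsA eq fsB)

        // Closing one handle closes the shared client; a later flush/sync through
        // the other handle fails with java.io.IOException: Filesystem closed.
        fsA.close()

        // Workaround 1: a private, uncached instance the logger can own outright.
        val privateFs = FileSystem.newInstance(uri, conf)

        // Workaround 2: disable caching for the hdfs scheme entirely, so each
        // FileSystem.get() hands back an independent instance.
        conf.setBoolean("fs.hdfs.impl.disable.cache", true)
        val uncachedFs = FileSystem.get(uri, conf)

        privateFs.close()
        uncachedFs.close()
      }
    }

Both workarounds trade the cache's connection sharing for isolation; fs.hdfs.impl.disable.cache is just the general Hadoop fs.<scheme>.impl.disable.cache pattern, not anything Spark-specific.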


    Root Cause Analysis

    1. java.io.IOException
       Can't replace _HOST pattern since client address is null
       at org.apache.hadoop.security.SecurityUtil.getServerPrincipal()
    2. Hadoop
      RPC.getProxy
      1. org.apache.hadoop.security.SecurityUtil.getServerPrincipal(SecurityUtil.java:255)
      2. org.apache.hadoop.ipc.Client$ConnectionId.getRemotePrincipal(Client.java:1326)
      3. org.apache.hadoop.ipc.Client$ConnectionId.getConnectionId(Client.java:1298)
      4. org.apache.hadoop.ipc.WritableRpcEngine$Invoker.<init>(WritableRpcEngine.java:183)
      5. org.apache.hadoop.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:236)
      6. org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:441)
      7. org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:387)
      8. org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
      8 frames
    3. Apache Hadoop HDFS
      DistributedFileSystem.initialize
      1. org.apache.hadoop.hdfs.DFSUtil.createRPCNamenode(DFSUtil.java:642)
      2. org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:346)
      3. org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:319)
      4. org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:110)
      4 frames
    4. Hadoop
      FileSystem.get
      1. org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2160)
      2. org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:85)
      3. org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2194)
      4. org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2176)
      5. org.apache.hadoop.fs.FileSystem.get(FileSystem.java:306)
      5 frames
    5. Spark
      SparkContext.<init>
      1. org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1022)
      2. org.apache.spark.util.FileLogger.<init>(FileLogger.scala:51)
      3. org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:49)
      4. org.apache.spark.SparkContext.<init>(SparkContext.scala:172)
      5. org.apache.spark.SparkContext.<init>(SparkContext.scala:96)
      5 frames
    6. Spark Examples
      SparkPi.main
      1. org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
      2. org.apache.spark.examples.SparkPi.main(SparkPi.scala)
      2 frames
    7. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:601)
      4 frames
    8. Spark Project YARN Stable API
      ApplicationMaster$$anon$2$$anonfun$run$1.apply$mcV$sp
      1. org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2$$anonfun$run$1.apply$mcV$sp(ApplicationMaster.scala:198)
      1 frame
    9. Spark
      SparkHadoopUtil$$anon$1.run
      1. org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:43)
      2. org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:42)
      2 frames
    10. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:415)
      2 frames
    11. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1262)
      1 frame
    12. Spark
      SparkHadoopUtil.runAsUser
      1. org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:42)
      1 frame
    13. Spark Project YARN Stable API
      ApplicationMaster$$anon$2.run
      1. org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:192)
      1 frame
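
On root-cause frame 1: SecurityUtil.getServerPrincipal expands the literal _HOST token in a configured Kerberos principal (e.g. nn/_HOST@REALM) into a concrete hostname, and its InetAddress overload throws the "Can't replace _HOST pattern since client address is null" IOException when handed a null address. A hedged Scala sketch of the expansion, with a made-up principal and hostname:

    import org.apache.hadoop.security.SecurityUtil

    object HostPatternDemo {
      def main(args: Array[String]): Unit = {
        // _HOST is replaced by the supplied hostname when building the server principal.
        val principal =
          SecurityUtil.getServerPrincipal("nn/_HOST@EXAMPLE.COM", "nn01.example.com")
        println(principal) // nn/nn01.example.com@EXAMPLE.COM

        // The overload taking a java.net.InetAddress instead of a String throws
        // "Can't replace _HOST pattern since client address is null" when the
        // address is null, which is the failure recorded in this trace.
      }
    }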