java.lang.reflect.UndeclaredThrowableException

Unknown exception in doAs


Solutions on the web

  • Stack trace

    java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1275)
        at org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:42)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:192)
    Caused by: java.security.PrivilegedActionException: java.lang.reflect.InvocationTargetException
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1262)
        ... 2 more
    Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2$$anonfun$run$1.apply$mcV$sp(ApplicationMaster.scala:198)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:43)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:42)
        ... 5 more
    Caused by: java.io.IOException: Can't replace _HOST pattern since client address is null
        at org.apache.hadoop.security.SecurityUtil.getServerPrincipal(SecurityUtil.java:255)
        at org.apache.hadoop.ipc.Client$ConnectionId.getRemotePrincipal(Client.java:1326)
        at org.apache.hadoop.ipc.Client$ConnectionId.getConnectionId(Client.java:1298)
        at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.<init>(WritableRpcEngine.java:183)
        at org.apache.hadoop.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:236)
        at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:441)
        at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:387)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
        at org.apache.hadoop.hdfs.DFSUtil.createRPCNamenode(DFSUtil.java:642)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:346)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:319)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:110)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2160)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:85)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2194)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2176)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:306)
        at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1022)
        at org.apache.spark.util.FileLogger.<init>(FileLogger.scala:51)
        at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:49)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:172)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:96)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
        ... 12 more
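    The root cause at the bottom of the trace is the Kerberos principal substitution step: Hadoop's SecurityUtil.getServerPrincipal replaces the literal _HOST token in a configured principal (e.g. nn/_HOST@REALM) with the server's hostname, and it fails here because the hostname it was handed is null. As a rough illustration of that mechanism (the class and method below are a hedged sketch, not Hadoop's actual implementation):

    ```java
    import java.io.IOException;

    public class HostPatternDemo {
        static final String HOST_PATTERN = "_HOST";

        // Sketch of Hadoop-style principal substitution: swap the _HOST token
        // for the (lower-cased) server hostname. When the hostname is null --
        // as in the stack trace above -- substitution cannot proceed and an
        // IOException with the familiar message is thrown.
        static String getServerPrincipal(String principal, String hostname) throws IOException {
            if (principal == null || !principal.contains(HOST_PATTERN)) {
                return principal; // nothing to substitute
            }
            if (hostname == null || hostname.isEmpty()) {
                throw new IOException("Can't replace _HOST pattern since client address is null");
            }
            return principal.replace(HOST_PATTERN, hostname.toLowerCase());
        }

        public static void main(String[] args) throws IOException {
            // prints nn/namenode.example.com@EXAMPLE.COM
            System.out.println(getServerPrincipal("nn/_HOST@EXAMPLE.COM", "NameNode.Example.COM"));
        }
    }
    ```

    In other words, the exception is usually a symptom that the address the HDFS client is connecting with could not be resolved to a hostname, so checking the filesystem URI and hostname resolution on the node running the ApplicationMaster is a reasonable first step.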
