java.lang.reflect.InvocationTargetException

GitHub | MumoNobert | 3 weeks ago
Similar reports from Samebug's stack trace search:

  1. Running SparkR through RStudio
     Stack Overflow | 6 months ago | Nick Knauer
     java.lang.reflect.InvocationTargetException
  2. GitHub comment 277#255768229
     GitHub | 4 months ago | trinker
     java.lang.reflect.InvocationTargetException
  3. copy_to Error
     GitHub | 4 months ago | yarrick19
     java.lang.reflect.InvocationTargetException
  4. GitHub comment 263#255374445
     GitHub | 4 months ago | trinker
     java.lang.reflect.InvocationTargetException

    Root Cause Analysis

    The grouped trace below runs from Hadoop's shell-based permission check on C:\tmp\hive, through Hive session startup and Spark SQL, up to the sparklyr/Netty backend that issued the call; two sketches that reproduce this path follow the trace.

    1. java.lang.RuntimeException

      java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: (null) entry in command string: null ls -F C:\tmp\hive

      at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute()
    2. Hadoop
      RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission
      1. org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:770)
      2. org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
      3. org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
      4. org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
      5. org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:659)
      6. org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:634)
      6 frames
    3. Hive Query Language
      SessionState.start
      1. org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:599)
      2. org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
      3. org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
      3 frames
    4. org.apache.spark
      HiveClientImpl.<init>
      1. org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:189)
      1 frame
    5. Java RT
      Constructor.newInstance
      1. sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      2. sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
      3. sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
      4. java.lang.reflect.Constructor.newInstance(Unknown Source)
      4 frames
    6. org.apache.spark
      IsolatedClientLoader.createClient
      1. org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
      1 frame
    7. Spark Project Hive
      HiveSessionState.analyzer
      1. org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359)
      2. org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263)
      3. org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
      4. org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
      5. org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
      6. org.apache.spark.sql.hive.HiveSharedState.externalCatalog(HiveSharedState.scala:45)
      7. org.apache.spark.sql.hive.HiveSessionState.catalog$lzycompute(HiveSessionState.scala:50)
      8. org.apache.spark.sql.hive.HiveSessionState.catalog(HiveSessionState.scala:48)
      9. org.apache.spark.sql.hive.HiveSessionState$$anon$1.<init>(HiveSessionState.scala:63)
      10. org.apache.spark.sql.hive.HiveSessionState.analyzer$lzycompute(HiveSessionState.scala:63)
      11. org.apache.spark.sql.hive.HiveSessionState.analyzer(HiveSessionState.scala:62)
      11 frames
    8. Spark Project SQL
      SparkSession.sql
      1. org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
      2. org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
      3. org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
      3 frames
    9. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
      4. java.lang.reflect.Method.invoke(Unknown Source)
      4 frames
    10. sparklyr
      BackendHandler.channelRead0
      1. sparklyr.Invoke$.invoke(invoke.scala:94)
      2. sparklyr.StreamHandler$.handleMethodCall(stream.scala:89)
      3. sparklyr.StreamHandler$.read(stream.scala:55)
      4. sparklyr.BackendHandler.channelRead0(handler.scala:49)
      5. sparklyr.BackendHandler.channelRead0(handler.scala:14)
      5 frames
    11. Netty
      AbstractChannelHandlerContext.invokeChannelRead
      1. io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
      2. io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
      3. io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
      4. io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
      5. io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
      6. io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
      7. io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
      8. io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
      8 frames
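
    A minimal sketch of the permission lookup that fails in frame groups 1-2 above, assuming Spark's bundled Hadoop client is on the classpath; the object name and the forward-slash spelling of the path are illustrative. On Windows, RawLocalFileSystem reads POSIX-style permissions by shelling out to winutils.exe ("ls -F <dir>"); when winutils.exe cannot be resolved (HADOOP_HOME unset or the binary missing), the executable slot of the command array is null, which is exactly the "(null) entry in command string: null ls -F C:\tmp\hive" message in the root cause.

      import org.apache.hadoop.conf.Configuration
      import org.apache.hadoop.fs.{FileSystem, Path}

      object TmpHivePermissionProbe {
        def main(args: Array[String]): Unit = {
          // Local (non-HDFS) filesystem, the same one Hive's scratch dir lives on here.
          val localFs = FileSystem.getLocal(new Configuration())
          val scratchDir = new Path("C:/tmp/hive")

          // getFileStatus(...).getPermission triggers
          // DeprecatedRawLocalFileStatus.loadPermissionInfo, which shells out via
          // Shell.execCommand -- the call that fails in frame group 2 above.
          val permission = localFs.getFileStatus(scratchDir).getPermission
          println(s"$scratchDir permissions: $permission")
        }
      }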
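
    A minimal sketch of the session startup that reaches that check, matching frame groups 3-8 above; sparklyr arrives at the same call through the reflective Invoke/BackendHandler frames (9-11). It assumes a Spark 2.0-era spark-sql and spark-hive on the classpath (the HiveSharedState/HiveSessionState classes in the trace suggest that line); the object name and query are illustrative.

      import org.apache.spark.sql.SparkSession

      object HiveSessionStartup {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder()
            .master("local[*]")
            .appName("hive-session-startup")
            .enableHiveSupport() // pulls in HiveSessionState / HiveClientImpl
            .getOrCreate()

          // The first SQL call forces HiveSessionState.analyzer, which lazily builds
          // the metadata Hive client; HiveClientImpl's constructor runs
          // SessionState.start, and createRootHDFSDir verifies the permissions of the
          // scratch directory (C:\tmp\hive in this report) -- the point where the
          // root-cause IOException surfaces when winutils.exe is unavailable.
          spark.sql("SHOW TABLES").show()

          spark.stop()
        }
      }

    The commonly cited workaround on Windows, offered here as an assumption rather than something stated in this report, is to install a winutils.exe built for the Hadoop version Spark bundles, point HADOOP_HOME at its parent directory, and relax the scratch directory's permissions (for example winutils.exe chmod -R 777 C:\tmp\hive) before starting the sparklyr connection.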