org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

Stack Overflow | theScalaGuy | 6 months ago
Here are the best solutions we found on the Internet.
  1. Concat in SparkSQL
     Stack Overflow | 6 months ago | theScalaGuy
     org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

  2. Cannot connect to hive through scala in spark sql
     Stack Overflow | 3 months ago | Coinnigh
     org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

  3. Error in importing table from teradata to hive using teradata-connector
     Stack Overflow | 4 months ago | Kaustubh Deshpande
     java.lang.NoSuchMethodError: org.apache.hadoop.hive.shims.HadoopShims.getUGIForConf(Lorg/apache/hadoop/conf/Configuration;)Lorg/apache/hadoop/security/UserGroupInformation;
    Root Cause Analysis

    1. java.lang.NoSuchMethodError: org.apache.thrift.EncodingUtils.setBit(BIZ)B

      at org.apache.hadoop.hive.metastore.api.PrivilegeGrantInfo.setCreateTimeIsSet()
      This signature usually goes missing when an incompatible libthrift jar on the classpath shadows the version Hive was built against (typically 0.9.x for Hive 1.x); see the classpath probe sketched after the trace.
    2. Hive Metastore
      HiveMetaStoreClient.<init>
      1. org.apache.hadoop.hive.metastore.api.PrivilegeGrantInfo.setCreateTimeIsSet(PrivilegeGrantInfo.java:245)
      2. org.apache.hadoop.hive.metastore.api.PrivilegeGrantInfo.<init>(PrivilegeGrantInfo.java:163)
      3. org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:675)
      4. org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:645)
      5. org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:462)
      6. org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
      7. org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
      8. org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
      9. org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
      9 frames
    3. Hive Query Language
      SessionHiveMetaStoreClient.<init>
      1. org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
      1 frame
    4. Java RT
      Constructor.newInstance
      1. sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      2. sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
      3. sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      4. java.lang.reflect.Constructor.newInstance(Constructor.java:526)
      4 frames
    5. Hive Metastore
      RetryingMetaStoreClient.getProxy
      1. org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
      2. org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
      3. org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
      4. org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
      4 frames
    6. Hive Query Language
      SessionState.start
      1. org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
      2. org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
      3. org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
      4. org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
      5. org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
      6. org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
      6 frames
    7. org.apache.spark
      ClientWrapper.<init>
      1. org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
      1 frame
    8. Java RT
      Constructor.newInstance
      1. sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      2. sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
      3. sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      4. java.lang.reflect.Constructor.newInstance(Constructor.java:526)
      4 frames
    9. org.apache.spark
      IsolatedClientLoader.createClient
      1. org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
      1 frame
    10. Spark Project Hive
      HiveContext.<init>
      1. org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:327)
      2. org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:237)
      3. org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:441)
      4. org.apache.spark.sql.hive.HiveContext.defaultOverrides(HiveContext.scala:226)
      5. org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:229)
      6. org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
      6 frames
    11. Unknown
      SimpleApp.main
      1. SimpleApp$.<init>(SimpleApp.scala:50)
      2. SimpleApp$.<clinit>(SimpleApp.scala)
      3. SimpleApp.main(SimpleApp.scala)
      3 frames
    12. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:606)
      4 frames
    13. Spark
      SparkSubmit.main
      1. org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
      2. org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
      3. org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
      4. org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
      5. org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      5 frames
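
    The trace bottoms out in a NoSuchMethodError on org.apache.thrift.EncodingUtils.setBit(BIZ)B, which typically means a conflicting libthrift jar on the application classpath is being loaded ahead of the version the Hive metastore client was compiled against. A minimal probe, sketched below in Scala under that assumption (the ThriftProbe name is made up, not from the report), prints where the class actually comes from and whether the missing overload exists:

    // ThriftProbe.scala (sketch): identify which jar provides org.apache.thrift.EncodingUtils
    // and whether it has the setBit(byte, int, boolean) overload the Hive metastore client
    // expects. The object name is hypothetical.
    object ThriftProbe {
      def main(args: Array[String]): Unit = {
        val cls = Class.forName("org.apache.thrift.EncodingUtils")

        // Jar (or directory) the class was actually loaded from; anything other than a
        // Hive-compatible libthrift is the likely culprit.
        val location = Option(cls.getProtectionDomain.getCodeSource).map(_.getLocation)
        println(s"EncodingUtils loaded from: ${location.getOrElse("<bootstrap/unknown>")}")

        // The missing method from the trace: setBit(BIZ)B == setBit(byte, int, boolean): byte
        val hasSetBit = cls.getMethods.exists { m =>
          m.getName == "setBit" &&
            m.getParameterTypes.toSeq == Seq(classOf[Byte], classOf[Int], classOf[Boolean])
        }
        println(s"setBit(byte, int, boolean) present: $hasSetBit")
      }
    }

    Running the same checks on the spark-submit classpath used for SimpleApp (for example, by calling them at the top of SimpleApp.main) shows which jar wins; the usual remedies are excluding or shading the conflicting libthrift in the application's build so that the version bundled with Spark's Hive support is the one that gets loaded.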