Searched Google with the first line of a Java stack trace?

We can recommend more relevant solutions and speed up debugging when you paste your entire stack trace with the exception message. Try a sample exception.

Recommended solutions based on your search

Samebug tips

  1. java.lang.reflect.InvocationTargetException

    This is a checked exception that wraps an exception thrown by an invoked method or constructor. Use the getCause() method to retrieve the original exception.
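A minimal sketch of that unwrapping pattern. The `Fragile` class below is a hypothetical stand-in for any constructor that fails when invoked reflectively (as `MetaStoreUtils.newInstance` does in the trace further down); the point is that the reflective call surfaces as `InvocationTargetException` and the real error is recovered with `getCause()`:

```java
import java.lang.reflect.Constructor;
import java.lang.reflect.InvocationTargetException;

public class CauseDemo {
    // Hypothetical class whose constructor always throws, to simulate
    // an instantiation failure behind a reflective call.
    static class Fragile {
        Fragile() {
            throw new IllegalStateException("boom");
        }
    }

    public static void main(String[] args) throws Exception {
        Constructor<Fragile> ctor = Fragile.class.getDeclaredConstructor();
        try {
            ctor.newInstance();
        } catch (InvocationTargetException e) {
            // The reflection layer wraps the constructor's exception;
            // getCause() returns the original IllegalStateException.
            Throwable cause = e.getCause();
            System.out.println(cause.getClass().getName() + ": " + cause.getMessage());
        }
    }
}
```

Running this prints the wrapped exception rather than the reflective wrapper, which is usually the line worth searching for.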

Solutions on the web

via spark-user by Cheng, Hao, 1 year ago
via nabble.com by Unknown author, 1 year ago
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
via Stack Overflow by Unknown author, 2 years ago
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
via rssing.com by Unknown author, 2 years ago
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
via Stack Overflow by Shubzumt, 3 months ago
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
via Stack Overflow by user2159301, 6 months ago
org.apache.hadoop.security.AccessControlException: User <user_name>(user id 50005586) has been denied access to create <user_name>
java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
	at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:235)
	at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:231)
	at scala.Option.orElse(Option.scala:257)
	at org.apache.spark.sql.hive.HiveContext.x$3$lzycompute(HiveContext.scala:231)
	at org.apache.spark.sql.hive.HiveContext.x$3(HiveContext.scala:229)
	at org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:229)
	at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:229)
	at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:292)
	at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
	at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:248)
	at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:91)
	at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:90)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
	at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:90)
	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:72)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:51)
	at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:56)
	at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)