Searched on Google with the first line of a Java stack trace?

We can recommend more relevant solutions and speed up debugging when you paste your entire stack trace with the exception message. Try a sample exception.

Recommended solutions based on your search

Samebug tips

  1. Expert tip

    This might be an issue with the file location in the spark-submit command. Try it with the file location specified, as below (a fuller sketch follows after these tips):

    spark-submit --master spark://master:7077 \
         hello_world_from_pyspark.py {file location}
    
    
  2. Expert tip

    Check whether you've set a name under Application -> Run. If you haven't, the generated XML will be missing that information and this exception will be thrown.
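
A fuller form of the spark-submit command from tip 1, with the {file location} placeholder replaced by a concrete path, might look like the sketch below. The path is hypothetical; use wherever the .py file actually lives:

    spark-submit --master spark://master:7077 \
         /home/user/jobs/hello_world_from_pyspark.py

Passing the script by its absolute path avoids the case where spark-submit cannot resolve a bare file name against the current working directory, which is one way the file-location problem described in the tip can arise.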

Solutions on the web

via Stack Overflow by prometheus2305, 1 year ago
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

via Stack Overflow by Sahil Sareen, 1 year ago
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

via Stack Overflow by parag dharmadhikari, 7 months ago
java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------

via GitHub by bertomartin, 2 years ago
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

via Stack Overflow by SHC, 1 year ago
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

via Stack Overflow by Balakrishna D, 8 months ago
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
java.lang.ClassNotFoundException: org.datanucleus.api.jdo.JDOPersistenceManagerFactory
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:348)
	at javax.jdo.JDOHelper$18.run(JDOHelper.java:2018)
	at javax.jdo.JDOHelper$18.run(JDOHelper.java:2016)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.jdo.JDOHelper.forName(JDOHelper.java:2015)
	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1162)
	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)
	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)
	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)
	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:497)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:475)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
	at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:109)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
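
The root cause in the trace above is a ClassNotFoundException for org.datanucleus.api.jdo.JDOPersistenceManagerFactory, i.e. the DataNucleus JDO classes are not on the classpath when Spark tries to instantiate the Hive metastore client. As a rough sketch of one commonly suggested workaround (the jar paths below are placeholders, and the exact set of jars depends on how Spark and Hive are deployed), the DataNucleus jars can be passed to spark-submit explicitly:

    spark-submit --master spark://master:7077 \
         --jars /path/to/datanucleus-api-jdo.jar,/path/to/datanucleus-core.jar,/path/to/datanucleus-rdbms.jar \
         hello_world_from_pyspark.py

The --jars option takes a comma-separated list of jars to include on the driver and executor classpaths, which is where the missing JDOPersistenceManagerFactory class needs to be visible.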