java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes

Stack Overflow | user6608138 | 3 months ago
  1. SparkSQL+Hive+Hbase+HbaseIntegration doesn't work

     Stack Overflow | 3 months ago | user6608138
     java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes
  2. Confusion Installing Phoenix Spark Plugin / Various Errors

     phoenix-user | 1 year ago | Cox, Jonathan A
     java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
  3. Hadoop HBase user's mailing list

     gmane.org | 11 months ago
     java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration

    Root Cause Analysis

    1. java.lang.NoClassDefFoundError

      org/apache/hadoop/hbase/util/Bytes

      at org.apache.hadoop.hive.hbase.HBaseSerDe.parseColumnsMapping()
    2. org.apache.hadoop
      HBaseSerDe.initialize
      1. org.apache.hadoop.hive.hbase.HBaseSerDe.parseColumnsMapping(HBaseSerDe.java:184)
      2. org.apache.hadoop.hive.hbase.HBaseSerDeParameters.<init>(HBaseSerDeParameters.java:73)
      3. org.apache.hadoop.hive.hbase.HBaseSerDe.initialize(HBaseSerDe.java:117)
      3 frames
    3. Hive Serde
      SerDeUtils.initializeSerDe
      1. org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:53)
      2. org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:521)
      2 frames
    4. Hive Metastore
      MetaStoreUtils.getDeserializer
      1. org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:391)
      1 frame
    5. Hive Query Language
      Table.getCols
      1. org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
      2. org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
      3. org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
      3 frames
    6. org.apache.spark
      ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply
      1. org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:331)
      2. org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:326)
      2 frames
    7. Scala
      Option.map
      1. scala.Option.map(Option.scala:145)
      1 frame
    8. org.apache.spark
      ClientWrapper.getTable
      1. org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1.apply(ClientWrapper.scala:326)
      2. org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1.apply(ClientWrapper.scala:321)
      3. org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:279)
      4. org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:226)
      5. org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:225)
      6. org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:268)
      7. org.apache.spark.sql.hive.client.ClientWrapper.getTableOption(ClientWrapper.scala:321)
      8. org.apache.spark.sql.hive.client.ClientInterface$class.getTable(ClientInterface.scala:122)
      9. org.apache.spark.sql.hive.client.ClientWrapper.getTable(ClientWrapper.scala:60)
      9 frames
    9. Spark Project Hive
      HiveContext$$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$super$lookupRelation
      1. org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:384)
      2. org.apache.spark.sql.hive.HiveContext$$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$super$lookupRelation(HiveContext.scala:457)
      2 frames
    10. Spark Project Catalyst
      OverrideCatalog$class.lookupRelation
      1. org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:161)
      1 frame
    11. Spark Project Hive
      HiveContext$$anon$2.lookupRelation
      1. org.apache.spark.sql.hive.HiveContext$$anon$2.lookupRelation(HiveContext.scala:457)
      1 frame
    12. Spark Project Catalyst
      Analyzer$ResolveRelations$.getTable
      1. org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:303)
      1 frame