java.io.IOException: java.lang.reflect.InvocationTargetException

Apache's JIRA Issue Tracker | Raymond Lau | 2 years ago
  1.

    [HIVE-7787] Reading Parquet file with enum in Thrift Encoding throws NoSuchFieldError - ASF JIRA

    apache.org | 11 months ago
    java.io.IOException: java.lang.reflect.InvocationTargetException
  2.

    When reading a Parquet file whose original Thrift schema contains a struct with an enum, the following error is thrown (full stack trace below):
    {code}
    java.lang.NoSuchFieldError: DECIMAL
    {code}
    Example Thrift schema:
    {code}
    enum MyEnumType {
      EnumOne,
      EnumTwo,
      EnumThree
    }

    struct MyStruct {
      1: optional MyEnumType myEnumType;
      2: optional string field2;
      3: optional string field3;
    }

    struct outerStruct {
      1: optional list<MyStruct> myStructs
    }
    {code}
    Hive table:
    {code}
    CREATE EXTERNAL TABLE mytable (
      mystructs array<struct<myenumtype: string, field2: string, field3: string>>
    )
    ROW FORMAT SERDE 'parquet.hive.serde.ParquetHiveSerDe'
    STORED AS
      INPUTFORMAT 'parquet.hive.DeprecatedParquetInputFormat'
      OUTPUTFORMAT 'parquet.hive.DeprecatedParquetOutputFormat';
    {code}
    Error stack trace (Hive 0.12):
    {code}
    Caused by: java.lang.NoSuchFieldError: DECIMAL
      at org.apache.hadoop.hive.ql.io.parquet.convert.ETypeConverter.getNewConverter(ETypeConverter.java:146)
      at org.apache.hadoop.hive.ql.io.parquet.convert.HiveGroupConverter.getConverterFromDescription(HiveGroupConverter.java:31)
      at org.apache.hadoop.hive.ql.io.parquet.convert.ArrayWritableGroupConverter.<init>(ArrayWritableGroupConverter.java:45)
      at org.apache.hadoop.hive.ql.io.parquet.convert.HiveGroupConverter.getConverterFromDescription(HiveGroupConverter.java:34)
      at org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableGroupConverter.<init>(DataWritableGroupConverter.java:64)
      at org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableGroupConverter.<init>(DataWritableGroupConverter.java:47)
      at org.apache.hadoop.hive.ql.io.parquet.convert.HiveGroupConverter.getConverterFromDescription(HiveGroupConverter.java:36)
      at org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableGroupConverter.<init>(DataWritableGroupConverter.java:64)
      at org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableGroupConverter.<init>(DataWritableGroupConverter.java:40)
      at org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableRecordConverter.<init>(DataWritableRecordConverter.java:32)
      at org.apache.hadoop.hive.ql.io.parquet.read.DataWritableReadSupport.prepareForRead(DataWritableReadSupport.java:128)
      at parquet.hadoop.InternalParquetRecordReader.initialize(InternalParquetRecordReader.java:142)
      at parquet.hadoop.ParquetRecordReader.initializeInternalReader(ParquetRecordReader.java:118)
      at parquet.hadoop.ParquetRecordReader.initialize(ParquetRecordReader.java:107)
      at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:92)
      at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:66)
      at org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.getRecordReader(MapredParquetInputFormat.java:51)
      at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:65)
      ... 16 more
    {code}
    (A minimal sketch of this NoSuchFieldError failure mode follows the result list below.)

    Apache's JIRA Issue Tracker | 2 years ago | Raymond Lau
    java.io.IOException: java.lang.reflect.InvocationTargetException
  3.

    MnistExample causes an IllegalStateException

    GitHub | 2 years ago | dims12
    java.lang.RuntimeException: java.lang.IllegalStateException: Hidden unit type must either be rectified linear or binary
  4.

    IllegalStateException: Either 'userTransaction' or... must be specified - Spring Forum

    spring.io | 6 months ago
    java.lang.IllegalStateException: Either 'userTransaction' or 'userTransactionName' or 'transactionManager' or 'transactionManagerName' must be specified
  5.

    Solr schema "Already closed" and related errors after base DSE/Solr Node setup

    Stack Overflow | 2 years ago | kaiyzen
    org.apache.solr.common.SolrException: Unable to create core: snet_data.location_test1
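
    A java.lang.NoSuchFieldError such as the DECIMAL one in the HIVE-7787 result above typically means that code compiled against one version of a class (here, a type whose enum defines a DECIMAL constant) is running against an older build of that class which lacks the constant, i.e. a version mismatch between Hive's Parquet bindings and the Parquet jars actually on the classpath. The sketch below reproduces that failure mode in isolation; ColumnType and converterFor are hypothetical names for illustration, not the actual Parquet or Hive API.

      // Hypothetical "new" version of a dependency's enum, as seen at compile time.
      // If an older build of this enum (without DECIMAL) is loaded at run time,
      // the reference to ColumnType.DECIMAL below can no longer be resolved and
      // the JVM throws java.lang.NoSuchFieldError: DECIMAL.
      enum ColumnType {
          UTF8, MAP, LIST, ENUM, DECIMAL
      }

      public class ConverterDemo {
          // Chooses a converter for a column type; the DECIMAL branch is only
          // usable if the enum class loaded at run time actually has that constant.
          static String converterFor(ColumnType type) {
              switch (type) {
                  case DECIMAL:
                      return "decimal converter";
                  default:
                      return "primitive converter";
              }
          }

          public static void main(String[] args) {
              System.out.println(converterFor(ColumnType.ENUM)); // prints "primitive converter"
          }
      }

    Such mismatches are usually resolved by rebuilding against matching jar versions or removing the stale copy of the dependency from the classpath, rather than by changing the code above.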

    Root Cause Analysis

    1. java.lang.IllegalStateException

      Field count must be either 1 or 2: 3

      at org.apache.hadoop.hive.ql.io.parquet.convert.ArrayWritableGroupConverter.<init>()
    2. Hive Query Language
      DataWritableReadSupport.prepareForRead
      1. org.apache.hadoop.hive.ql.io.parquet.convert.ArrayWritableGroupConverter.<init>(ArrayWritableGroupConverter.java:38)
      2. org.apache.hadoop.hive.ql.io.parquet.convert.HiveGroupConverter.getConverterFromDescription(HiveGroupConverter.java:34)
      3. org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableGroupConverter.<init>(DataWritableGroupConverter.java:64)
      4. org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableGroupConverter.<init>(DataWritableGroupConverter.java:47)
      5. org.apache.hadoop.hive.ql.io.parquet.convert.HiveGroupConverter.getConverterFromDescription(HiveGroupConverter.java:36)
      6. org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableGroupConverter.<init>(DataWritableGroupConverter.java:64)
      7. org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableGroupConverter.<init>(DataWritableGroupConverter.java:40)
      8. org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableRecordConverter.<init>(DataWritableRecordConverter.java:35)
      9. org.apache.hadoop.hive.ql.io.parquet.read.DataWritableReadSupport.prepareForRead(DataWritableReadSupport.java:152)
      9 frames
    3. Parquet
      ParquetRecordReader.initialize
      1. parquet.hadoop.InternalParquetRecordReader.initialize(InternalParquetRecordReader.java:142)
      2. parquet.hadoop.ParquetRecordReader.initializeInternalReader(ParquetRecordReader.java:118)
      3. parquet.hadoop.ParquetRecordReader.initialize(ParquetRecordReader.java:107)
      3 frames
    4. Hive Query Language
      CombineHiveRecordReader.<init>
      1. org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:92)
      2. org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:66)
      3. org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.getRecordReader(MapredParquetInputFormat.java:71)
      4. org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:65)
      4 frames
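
    The root-cause trace above fails in the constructor of ArrayWritableGroupConverter with "Field count must be either 1 or 2: 3": judging by the message and the throw site, the converter for a Hive array<...> rejects a repeated Parquet group unless it has exactly one or two fields, and the three-field MyStruct element from the Thrift schema in the JIRA description trips that check. Below is a minimal sketch of that kind of up-front field-count check, using hypothetical names rather than the real Hive class.

      import java.util.List;

      // Illustrative stand-in for the list-element converter in the trace above.
      // The real class is org.apache.hadoop.hive.ql.io.parquet.convert.ArrayWritableGroupConverter;
      // this class name and its constructor argument are hypothetical.
      public class ListElementConverterSketch {

          // A repeated group backing an array is accepted only with one field
          // (a bare element) or two; any other count is rejected before data
          // is read, which is how a three-field struct inside a list can fail
          // while the Parquet record reader is being initialized.
          public ListElementConverterSketch(List<String> groupFields) {
              int fieldCount = groupFields.size();
              if (fieldCount < 1 || fieldCount > 2) {
                  throw new IllegalStateException(
                      "Field count must be either 1 or 2: " + fieldCount);
              }
          }

          public static void main(String[] args) {
              // Mirrors MyStruct { myEnumType, field2, field3 }: three fields,
              // so this constructor throws the IllegalStateException seen above.
              new ListElementConverterSketch(List.of("myEnumType", "field2", "field3"));
          }
      }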