java.lang.RuntimeException: Parquet record is malformed: empty fields are illegal, the field should be ommited completely instead

GitHub | nezihyigitbasi | 8 months ago
Tip: Your exception is missing from the Samebug knowledge base. Here are the best solutions we found on the Internet.
  1. GitHub comment 5934#242461851

    GitHub | 8 months ago | nezihyigitbasi
    java.lang.RuntimeException: Parquet record is malformed: empty fields are illegal, the field should be ommited completely instead
  2. Avro schema with empty arrays and maps

    GitHub | 4 years ago | davidzchen
    parquet.io.ParquetEncodingException: empty fields are illegal, the field should be ommited completely instead
  3. The insert will fail with the following stack:

     Caused by: parquet.io.ParquetEncodingException: empty fields are illegal, the field should be ommited completely instead
       at parquet.io.MessageColumnIO$MessageColumnIORecordConsumer.endField(MessageColumnIO.java:271)
       at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriter$ListDataWriter.write(DataWritableWriter.java:271)
       at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriter$GroupDataWriter.write(DataWritableWriter.java:199)
       at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriter$MessageDataWriter.write(DataWritableWriter.java:215)
       at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriter.write(DataWritableWriter.java:88)
       at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriteSupport.write(DataWritableWriteSupport.java:59)
       at org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriteSupport.write(DataWritableWriteSupport.java:31)
       at parquet.hadoop.InternalParquetRecordWriter.write(InternalParquetRecordWriter.java:116)
       at parquet.hadoop.ParquetRecordWriter.write(ParquetRecordWriter.java:123)
       at parquet.hadoop.ParquetRecordWriter.write(ParquetRecordWriter.java:42)
       at org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.write(ParquetRecordWriterWrapper.java:111)
       at org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.write(ParquetRecordWriterWrapper.java:124)
       at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:697)

     Reproduce:

     create table test_small (key string, arrayValues array<string>) stored as parquet;
     insert into table test_small select 'abcd', array() from src limit 1;

    Apache's JIRA Issue Tracker | 1 year ago | Yongzhi Chen
    parquet.io.ParquetEncodingException: empty fields are illegal, the field should be ommited completely instead
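
    The common thread in these reports is a parquet-mr contract: a writer must not call startField()/endField() on a RecordConsumer without adding at least one value in between, so an empty (or null) collection has to be skipped rather than written. Below is a minimal sketch of that rule in Java. It is not Hive's actual DataWritableWriter code: the method name writeStringList and the flat repeated-binary layout are illustrative, and the classes are parquet-mr's RecordConsumer API (org.apache.parquet.io.api in current releases, parquet.io.api in the older versions shown in these traces).

        import java.util.List;

        import org.apache.parquet.io.api.Binary;
        import org.apache.parquet.io.api.RecordConsumer;

        public class ListWriteSketch {

            // Calling startField()/endField() with nothing added in between is what
            // makes MessageColumnIORecordConsumer.endField() throw "empty fields are
            // illegal". The legal encoding of an empty or null list is to omit the
            // field entirely.
            static void writeStringList(RecordConsumer consumer, String field, int index, List<String> values) {
                if (values == null || values.isEmpty()) {
                    return; // omit the field completely instead of writing an empty one
                }
                consumer.startField(field, index);
                for (String value : values) {
                    consumer.addBinary(Binary.fromString(value));
                }
                consumer.endField(field, index);
            }
        }

    On writer versions that do not yet apply this check themselves, a commonly reported workaround for the reproduce case above is to insert NULL instead of an empty array() or map().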

    Root Cause Analysis

    1. parquet.io.ParquetEncodingException

      empty fields are illegal, the field should be ommited completely instead

      at parquet.io.MessageColumnIO$MessageColumnIORecordConsumer.endField()
    2. Parquet
      MessageColumnIO$MessageColumnIORecordConsumer.endField
      1. parquet.io.MessageColumnIO$MessageColumnIORecordConsumer.endField(MessageColumnIO.java:244)
      1 frame
    3. Hive Query Language
      DataWritableWriteSupport.write
      1. org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriter.writeMap(DataWritableWriter.java:241)
      2. org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriter.writeValue(DataWritableWriter.java:116)
      3. org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriter.writeGroupFields(DataWritableWriter.java:89)
      4. org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriter.write(DataWritableWriter.java:60)
      5. org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriteSupport.write(DataWritableWriteSupport.java:59)
      6. org.apache.hadoop.hive.ql.io.parquet.write.DataWritableWriteSupport.write(DataWritableWriteSupport.java:31)
      6 frames
    4. Parquet
      ParquetRecordWriter.write
      1. parquet.hadoop.InternalParquetRecordWriter.write(InternalParquetRecordWriter.java:121)
      2. parquet.hadoop.ParquetRecordWriter.write(ParquetRecordWriter.java:123)
      3. parquet.hadoop.ParquetRecordWriter.write(ParquetRecordWriter.java:42)
      3 frames
    5. Hive Query Language
      ParquetRecordWriterWrapper.write
      1. org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.write(ParquetRecordWriterWrapper.java:111)
      2. org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.write(ParquetRecordWriterWrapper.java:124)
      2 frames
    6. com.facebook.presto
      HivePageSink.appendPage
      1. com.facebook.presto.hive.HivePageSink$HiveRecordWriter.addRow(HivePageSink.java:747)
      2. com.facebook.presto.hive.HivePageSink.doAppend(HivePageSink.java:411)
      3. com.facebook.presto.hive.HivePageSink.lambda$appendPage$2(HivePageSink.java:390)
      4. com.facebook.presto.hive.authentication.NoHdfsAuthentication.doAs(NoHdfsAuthentication.java:23)
      5. com.facebook.presto.hive.HdfsEnvironment.doAs(HdfsEnvironment.java:76)
      6. com.facebook.presto.hive.HivePageSink.appendPage(HivePageSink.java:390)
      6 frames
    7. presto-spi
      ClassLoaderSafeConnectorPageSink.appendPage
      1. com.facebook.presto.spi.classloader.ClassLoaderSafeConnectorPageSink.appendPage(ClassLoaderSafeConnectorPageSink.java:42)
      1 frame
    8. presto-main
      TaskExecutor$Runner.run
      1. com.facebook.presto.operator.TableWriterOperator.addInput(TableWriterOperator.java:207)
      2. com.facebook.presto.operator.Driver.processInternal(Driver.java:384)
      3. com.facebook.presto.operator.Driver.processFor(Driver.java:301)
      4. com.facebook.presto.execution.SqlTaskExecution$DriverSplitRunner.processFor(SqlTaskExecution.java:622)
      5. com.facebook.presto.execution.TaskExecutor$PrioritizedSplitRunner.process(TaskExecutor.java:529)
      6. com.facebook.presto.execution.TaskExecutor$Runner.run(TaskExecutor.java:665)
      6 frames
    9. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
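
    Reading the trace bottom-up: Presto's HivePageSink hands each row to Hive's Parquet writer (ParquetRecordWriterWrapper → DataWritableWriter), which reaches a map value that turns out to be empty (writeMap at DataWritableWriter.java:241) and ends the field without having written anything into it, so parquet-mr's MessageColumnIORecordConsumer.endField() throws. Below is a minimal stand-alone sketch of the legal behaviour, written against parquet-mr's example API (org.apache.parquet.hadoop.example.ExampleParquetWriter) rather than the Hive/Presto path above; the flat repeated-field schema, the class name, and the /tmp output path are simplifications for illustration, not Hive's actual list encoding.

        import org.apache.hadoop.fs.Path;
        import org.apache.parquet.example.data.Group;
        import org.apache.parquet.example.data.simple.SimpleGroupFactory;
        import org.apache.parquet.hadoop.ParquetWriter;
        import org.apache.parquet.hadoop.example.ExampleParquetWriter;
        import org.apache.parquet.schema.MessageType;
        import org.apache.parquet.schema.MessageTypeParser;

        public class EmptyListWriteSketch {
            public static void main(String[] args) throws Exception {
                // Same shape as the Hive reproduce case: a string key plus a list column,
                // modelled here as a plain repeated field for brevity.
                MessageType schema = MessageTypeParser.parseMessageType(
                        "message test_small { required binary key (UTF8); repeated binary arrayValues (UTF8); }");

                try (ParquetWriter<Group> writer = ExampleParquetWriter
                        .builder(new Path("/tmp/test_small.parquet")) // illustrative output path
                        .withType(schema)
                        .build()) {
                    Group row = new SimpleGroupFactory(schema).newGroup();
                    row.add("key", "abcd");
                    // arrayValues is never added: the field is omitted, which is the legal
                    // way to encode "no elements" and avoids the endField() check that
                    // fails in the trace above.
                    writer.write(row);
                }
            }
        }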