java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"reducesinkkey0":{"transactionid":0,"bucketid":-1,"rowid":0}},"value":null}

Stack Overflow | Jack Wenger | 8 months ago
Here are the best solutions we found on the Internet.
  1. ACID transactions on data added from Spark not working (a configuration sketch for this case follows the list)

     Stack Overflow | 8 months ago | Jack Wenger
     java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"reducesinkkey0":{"transactionid":0,"bucketid":-1,"rowid":0}},"value":null}
  2. Hive Runtime Error while Writing into ES table

     GitHub | 2 years ago | maha543
     java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{},"value":{"_col0":"10-04-2015 15:00:00","_col1":"Beijing"}}
  3. How Many Hive Dynamic Partitions are Needed? (see the partition-limit sketch after this list)

     Stack Overflow | 2 years ago | Mike Wise
     java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveFatalException: [Error 20004]: Fatal error occurred when node tried to create too many dynamic partitions. The maximum number of dynamic partitions is controlled by hive.exec.max.dynamic.partitions and hive.exec.max.dynamic.partitions.pernode. Maximum was set to: 20000
  4. Hive insert query with dynamic partitioning

     Stack Overflow | 1 year ago | Anil Ekambram
     java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{},"value":{"_col0":15,"_col1":"user4","_col2":"2016-05-06 06:31:48","_col3":"B"}}
  5. We hit a regression when upgrading from CDH 5.3.3 to CDH 5.4.2. Queries which use CLUSTERED BY tables are failing with a null pointer exception:

     {code}
     2015-05-26 01:26:35,729 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{},"value":{"_col0":235015151,"_col1":10,"_col2":3}}
         at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:265)
         at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
         at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
     Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{},"value":{"_col0":235015151,"_col1":10,"_col2":3}}
         at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:253)
         ... 7 more
     Caused by: java.lang.NullPointerException
         at org.apache.hadoop.hive.ql.exec.FileSinkOperator.findWriterOffset(FileSinkOperator.java:761)
         at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:689)
         at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
         at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
         at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
         at org.apache.hadoop.hive.ql.exec.ExtractOperator.processOp(ExtractOperator.java:45)
         at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:244)
         ... 7 more
     {code}

     We tracked the issue down to https://issues.apache.org/jira/browse/HIVE-10538. Applying that patch in a local build of the CDH Hive RPMs fixed the issue for us. Any chance of this being rolled into the next CDH update?

     Cloudera Open Source | 2 years ago | Stephen Veiss
     java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{},"value":{"_col0":235015151,"_col1":10,"_col2":3}}
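
    The headline exception (note the bucketid of -1 in the row key) and entries 1 and 5 both involve Hive ACID / bucketed tables. As a minimal, hedged sketch only: the snippet below shows, over the Hive JDBC driver (org.apache.hive.jdbc.HiveDriver on the classpath), the session settings and table layout that Hive documents as prerequisites for transactional writes. The host, credentials, table name, and bucket count are placeholders, not values taken from the reports above.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class HiveAcidSetupSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder HiveServer2 endpoint and credentials.
            String url = "jdbc:hive2://hiveserver2.example.com:10000/default";
            try (Connection conn = DriverManager.getConnection(url, "hive", "");
                 Statement stmt = conn.createStatement()) {

                // Session settings Hive requires for ACID (transactional) tables.
                stmt.execute("SET hive.support.concurrency=true");
                stmt.execute("SET hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager");
                stmt.execute("SET hive.enforce.bucketing=true");

                // ACID tables must be bucketed, stored as ORC, and marked transactional.
                stmt.execute("CREATE TABLE IF NOT EXISTS demo_acid (id INT, payload STRING) "
                        + "CLUSTERED BY (id) INTO 4 BUCKETS "
                        + "STORED AS ORC TBLPROPERTIES ('transactional'='true')");

                // Writing through Hive's own INSERT path is what populates the
                // transactionid/bucketid/rowid metadata shown in the error key above.
                stmt.execute("INSERT INTO demo_acid VALUES (1, 'example')");
            }
        }
    }

    Rows placed into the table's storage by an external writer that bypasses Hive's ACID output format (for example a Spark job, as in entry 1) will not carry this metadata, which is one plausible reading of the bucketid of -1 in the headline key.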

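    Entry 3 (and the dynamic-partitioning insert in entry 4) hits the two limits named in the HiveFatalException itself. As an illustrative sketch under assumed values (placeholder host, table names, and limit numbers), the limits can be raised per session before running the insert; reducing the number of partitions the query produces is usually the sounder fix.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class DynamicPartitionLimitsSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder HiveServer2 endpoint; assumes the Hive JDBC driver is on the classpath.
            String url = "jdbc:hive2://hiveserver2.example.com:10000/default";
            try (Connection conn = DriverManager.getConnection(url, "hive", "");
                 Statement stmt = conn.createStatement()) {

                // Enable dynamic partitioning for this session.
                stmt.execute("SET hive.exec.dynamic.partition=true");
                stmt.execute("SET hive.exec.dynamic.partition.mode=nonstrict");

                // The two limits named in the error message; the values are illustrative.
                stmt.execute("SET hive.exec.max.dynamic.partitions=50000");
                stmt.execute("SET hive.exec.max.dynamic.partitions.pernode=20000");

                // Hypothetical dynamic-partition insert: the last selected column (dt)
                // feeds the partition spec.
                stmt.execute("INSERT OVERWRITE TABLE target_table PARTITION (dt) "
                        + "SELECT col_a, col_b, dt FROM source_table");
            }
        }
    }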

    Root Cause Analysis

    java.lang.ArrayIndexOutOfBoundsException: -1
        at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:723)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
        at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
        at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:244)
        at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)