org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"reducesinkkey0":129390185139228,"reducesinkkey1":"00008AF10000000063CA6F"},"value":{"_col0":"00008AF10000000063CA6F","_col1":"2011-07-27 22:48:52","_col2":129390185139228,"_col3":2006,"_col4":4100,"_col5":"10017388=6","_col6":1063,"_col7":"NULL","_col8":"address.com","_col9":"NULL","_col10":"NULL"},"alias":0}

Apache's JIRA Issue Tracker | Sergey Tryuber | 5 years ago
  1.

    I execute a huge query on a table with a lot of two-level partitions. There is a Perl reducer in my query. The map tasks finish fine, but every reducer fails with the following exception:

    2011-08-11 04:58:29,865 INFO org.apache.hadoop.hive.ql.exec.ScriptOperator: Executing [/usr/bin/perl, <reducer.pl>, <my_argument>]
    2011-08-11 04:58:29,866 INFO org.apache.hadoop.hive.ql.exec.ScriptOperator: tablename=null
    2011-08-11 04:58:29,866 INFO org.apache.hadoop.hive.ql.exec.ScriptOperator: partname=null
    2011-08-11 04:58:29,866 INFO org.apache.hadoop.hive.ql.exec.ScriptOperator: alias=null
    2011-08-11 04:58:29,935 FATAL ExecReducer: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"reducesinkkey0":129390185139228,"reducesinkkey1":"00008AF10000000063CA6F"},"value":{"_col0":"00008AF10000000063CA6F","_col1":"2011-07-27 22:48:52","_col2":129390185139228,"_col3":2006,"_col4":4100,"_col5":"10017388=6","_col6":1063,"_col7":"NULL","_col8":"address.com","_col9":"NULL","_col10":"NULL"},"alias":0}
        at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:256)
        at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:468)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:416)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
        at org.apache.hadoop.mapred.Child.main(Child.java:262)
    Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Cannot initialize ScriptOperator
        at org.apache.hadoop.hive.ql.exec.ScriptOperator.processOp(ScriptOperator.java:320)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:744)
        at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:744)
        at org.apache.hadoop.hive.ql.exec.ExtractOperator.processOp(ExtractOperator.java:45)
        at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
        at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:247)
        ... 7 more
    Caused by: java.io.IOException: Cannot run program "/usr/bin/perl": java.io.IOException: error=7, Argument list too long
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
        at org.apache.hadoop.hive.ql.exec.ScriptOperator.processOp(ScriptOperator.java:279)
        ... 15 more
    Caused by: java.io.IOException: java.io.IOException: error=7, Argument list too long
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
        at java.lang.ProcessImpl.start(ProcessImpl.java:65)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
        ... 16 more

    I believe I have found the cause. ScriptOperator.java passes a large number of configuration properties as environment variables to the child reduce process. One of those variables is mapred.input.dir, which in my case is more than 150KB because the query reads a huge number of input directories. In short, the problem is that Linux (up to kernel 2.6.23) limits the total size of the environment passed to a child process to 132KB. Upgrading the kernel lifts the total limit, but the limit of 132KB per individual environment string still applies, so such a huge variable fails even on my home machine (kernel 2.6.32).
    More information is in the execve(2) man page (http://www.kernel.org/doc/man-pages/online/pages/man2/execve.2.html). For now all of our work is blocked by this problem and I cannot find a workaround. The only solution that seems reasonable to me is to stop passing this variable to the reducers. (See the reproduction sketch after the results list and the workaround sketch after the root cause analysis below.)

    Apache's JIRA Issue Tracker | 5 years ago | Sergey Tryuber
    org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"reducesinkkey0":129390185139228,"reducesinkkey1":"00008AF10000000063CA6F"},"value":{"_col0":"00008AF10000000063CA6F","_col1":"2011-07-27 22:48:52","_col2":129390185139228,"_col3":2006,"_col4":4100,"_col5":"10017388=6","_col6":1063,"_col7":"NULL","_col8":"address.com","_col9":"NULL","_col10":"NULL"},"alias":0}
  2.

    [Hive-dev] [jira] [Created] (HIVE-2372) java.io.IOException: error=7, Argument list too long - Grokbase

    grokbase.com | 8 months ago
    org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"reducesinkkey0":129390185139228,"reducesinkkey1":"00008AF10000000063CA6F"},"value":{"_col0":"00008AF10000000063CA6F","_col1":"2011-07-27 22:48:52","_col2":129390185139228,"_col3":2006,"_col4":4100,"_col5":"10017388=6","_col6":1063,"_col7":"NULL","_col8":"address.com","_col9":"NULL","_col10":"NULL"},"alias":0}
  3.

    bulk size text parameters in jenkins report error

    Server Fault | 3 years ago | valpa
    java.io.IOException: Cannot run program "/bin/sh" (in directory "/var/lib/jenkins/workspace/TEST-save_txt_to_file"): java.io.IOException: error=7, Argument list too long
  4.

    rmr2 failed because Rscript error

    GitHub | 4 years ago | zhanxw
    java.lang.RuntimeException: Error in configuring object
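
The failing call in the trace above is an ordinary exec of /usr/bin/perl: Linux rejects the exec with E2BIG when the combined arguments and environment (or, on newer kernels, any single environment string) exceed the kernel limits, and the JDK surfaces that as java.io.IOException: error=7, Argument list too long. Below is a minimal sketch that reproduces the error with plain ProcessBuilder, assuming a Linux machine; the variable name HUGE_VAR, the /bin/true target, and the 200,000-character value are illustrative stand-ins for the oversized mapred.input.dir.

    import java.io.IOException;
    import java.util.Arrays;

    public class ArgListTooLong {
        public static void main(String[] args) throws IOException {
            // Build one environment string well above the ~128-132KB
            // per-string limit described in the report.
            char[] filler = new char[200_000];
            Arrays.fill(filler, 'x');

            ProcessBuilder pb = new ProcessBuilder("/bin/true");
            pb.environment().put("HUGE_VAR", new String(filler));

            // On Linux this fails with:
            //   java.io.IOException: Cannot run program "/bin/true":
            //   error=7, Argument list too long
            pb.start();
        }
    }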

Root Cause Analysis

  1. java.io.IOException

    java.io.IOException: error=7, Argument list too long

    at java.lang.UNIXProcess.<init>()
  2. Java RT
    ProcessBuilder.start
    1. java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
    2. java.lang.ProcessImpl.start(ProcessImpl.java:65)
    3. java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
    3 frames
  3. Hive Query Language
    ExecReducer.reduce
    1. org.apache.hadoop.hive.ql.exec.ScriptOperator.processOp(ScriptOperator.java:279)
    2. org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
    3. org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:744)
    4. org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
    5. org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
    6. org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:744)
    7. org.apache.hadoop.hive.ql.exec.ExtractOperator.processOp(ExtractOperator.java:45)
    8. org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
    9. org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:247)
    9 frames
  4. Hadoop
    Child$4.run
    1. org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:468)
    2. org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:416)
    3. org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    3 frames
  5. Java RT
    Subject.doAs
    1. java.security.AccessController.doPrivileged(Native Method)
    2. javax.security.auth.Subject.doAs(Subject.java:396)
    2 frames
  6. Hadoop
    UserGroupInformation.doAs
    1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
    1 frame
  7. Hadoop
    Child.main
    1. org.apache.hadoop.mapred.Child.main(Child.java:262)
    1 frame
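
The remedy proposed in the report, not passing the oversized variable to the reducer's script, amounts to sanitizing the child environment before ProcessBuilder.start(). The sketch below illustrates the idea outside of Hive; it is not Hive's actual ScriptOperator code, and the SafeScriptLauncher class, the sanitizeEnv() helper, the 20,000-character cap, and the reducer.pl argument are illustrative. Later Hive releases added a setting in this spirit, hive.script.operator.truncate.env, which truncates each environment variable passed to script operators; check whether your version has it.

    import java.io.IOException;
    import java.util.Map;

    public class SafeScriptLauncher {
        // Illustrative cap, chosen to stay far below the kernel limit.
        private static final int MAX_ENV_VALUE_CHARS = 20_000;

        // Truncate oversized entries so the exec stays under the kernel's
        // per-string environment limit. The report's own suggestion is to
        // drop the offending variable (mapred.input.dir) entirely.
        static void sanitizeEnv(Map<String, String> env) {
            for (Map.Entry<String, String> e : env.entrySet()) {
                String value = e.getValue();
                if (value != null && value.length() > MAX_ENV_VALUE_CHARS) {
                    e.setValue(value.substring(0, MAX_ENV_VALUE_CHARS));
                }
            }
        }

        public static void main(String[] args) throws IOException {
            // Hypothetical stand-in for the ScriptOperator launching the reducer.
            ProcessBuilder pb = new ProcessBuilder("/usr/bin/perl", "reducer.pl");
            sanitizeEnv(pb.environment());
            pb.inheritIO();
            pb.start();
        }
    }

The underlying assumption, which also motivates the Hive-side fix, is that a streaming reducer consumes its rows from stdin, so it does not need the full mapred.input.dir value in its environment.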