
Recommended solutions based on your search

Samebug tips

  1.

    Thrown by the methods of the String class to indicate that an index is either negative or greater than the length of the string. You are probably using the wrong index when taking a substring.

  2.
    via GitHub by Omertron

    You need to make sure that the capitalisation of the plugin name in the properties file is exactly "AllocinePlugin" and not "allocineplugin".
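The first tip above can be illustrated with a minimal sketch: taking a substring with an index derived from `indexOf`/`lastIndexOf` without checking for -1. The node string below is a hypothetical input chosen for illustration, not data from the report:

```java
public class SubstringRangeDemo {
    public static void main(String[] args) {
        // Hypothetical input: an address missing the expected ":port" suffix.
        String node = "192.168.1.10";
        int colon = node.lastIndexOf(':'); // returns -1 when ':' is absent
        try {
            // substring(0, -1) is out of range and throws
            String host = node.substring(0, colon);
            System.out.println(host);
        } catch (StringIndexOutOfBoundsException e) {
            System.out.println("caught: " + e);
        }
    }
}
```

The exact exception message varies between JDK versions, but the failure mode is the same: the unchecked -1 reaches `substring`.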

Solutions on the web

via elastic.co by Unknown author, 1 year ago
org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"givename":"kumar","surname":"krishnan","city":"adurai","state":"amilNadu"}
via Stack Overflow by Zack
, 1 year ago
org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
via Stack Overflow by Anuj jain
, 3 months ago
org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
via GitHub by pricecarl
, 2 years ago
org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row <data removed>
via Stack Overflow by Vijetha
, 1 month ago
org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"line":"<a>"}
via gmane.org by Unknown author, 2 years ago
org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row { - Removed -}
java.lang.StringIndexOutOfBoundsException: String index out of range: -1
	at java.lang.String.substring(String.java:1911)
	at org.elasticsearch.hadoop.rest.RestClient.discoverNodes(RestClient.java:110)
	at org.elasticsearch.hadoop.rest.InitializationUtils.discoverNodesIfNeeded(InitializationUtils.java:58)
	at org.elasticsearch.hadoop.rest.RestService.createWriter(RestService.java:374)
	at org.elasticsearch.hadoop.mr.EsOutputFormat$EsRecordWriter.init(EsOutputFormat.java:173)
	at org.elasticsearch.hadoop.hive.EsHiveOutputFormat$EsHiveRecordWriter.write(EsHiveOutputFormat.java:58)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:695)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
	at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
	at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:95)
	at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:157)
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:497)
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:170)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
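Judging from the top frames, the exception originates in a `substring` call inside `RestClient.discoverNodes`, which is consistent with a node address that does not match the expected `host:port` form. A defensive sketch of the general pattern follows; the helper name and inputs are assumptions for illustration, not the actual elasticsearch-hadoop code:

```java
public class NodeAddressParser {
    // Hypothetical guard: fall back to the whole string when no ':' is
    // present, instead of passing -1 straight to substring.
    static String hostOf(String node) {
        int colon = node.lastIndexOf(':');
        return colon >= 0 ? node.substring(0, colon) : node;
    }

    public static void main(String[] args) {
        System.out.println(hostOf("192.168.1.10:9200")); // prints 192.168.1.10
        System.out.println(hostOf("192.168.1.10"));      // prints 192.168.1.10
    }
}
```

In practice, checking that the configured node addresses (e.g. the `es.nodes` setting) are in the format the connector expects is the first thing to verify when this trace appears.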