java.lang.IllegalArgumentException: offset (0) + length (4) exceed the capacity of the array: 2

GitHub | finch0001 | 7 months ago
  1. hbase-indexer can't convert numeric column to Solr numeric field?

    GitHub | 7 months ago | finch0001
    java.lang.IllegalArgumentException: offset (0) + length (4) exceed the capacity of the array: 2
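    The message itself pins down the failure: decoding an int consumes 4 bytes, but the stored cell holds only 2 (for example, a value written as a Java short). A plain-Java sketch that mirrors the bounds check in org.apache.hadoop.hbase.util.Bytes.toInt (this is an illustration, not the library code) reproduces the message:

    ```java
    import java.nio.ByteBuffer;

    public class ToIntCheck {
        // Mirrors the bounds check performed by org.apache.hadoop.hbase.util.Bytes.toInt:
        // decoding an int always consumes 4 bytes starting at the offset.
        static int toInt(byte[] bytes, int offset) {
            if (offset + 4 > bytes.length) {
                throw new IllegalArgumentException("offset (" + offset
                        + ") + length (4) exceed the capacity of the array: " + bytes.length);
            }
            return ByteBuffer.wrap(bytes, offset, 4).getInt();
        }

        public static void main(String[] args) {
            // A value written with Bytes.toBytes((short) 7) occupies only 2 bytes.
            byte[] cell = ByteBuffer.allocate(2).putShort((short) 7).array();
            try {
                toInt(cell, 0); // what an "int" field mapping asks the indexer to do
            } catch (IllegalArgumentException e) {
                System.out.println(e.getMessage());
                // offset (0) + length (4) exceed the capacity of the array: 2
            }
        }
    }
    ```

    If the column really holds 2-byte values, the straightforward fix is declaring the matching type in the indexer configuration rather than widening the decode.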
  2. I ran into an exception in the tracing code during my test setup of Phoenix, in the following code:

    {code}
    58062 [defaultRpcServer.handler=2,queue=0,port=53950] WARN org.apache.hadoop.ipc.RpcServer - defaultRpcServer.handler=2,queue=0,port=53950: caught: java.lang.IllegalArgumentException: offset (0) + length (4) exceed the capacity of the array: 3
        at org.apache.hadoop.hbase.util.Bytes.explainWrongLengthOrOffset()
        at org.apache.hadoop.hbase.util.Bytes.toInt()
        at org.apache.hadoop.hbase.util.Bytes.toInt()
        at org.apache.phoenix.trace.TracingCompat.readAnnotation()
        at org.apache.phoenix.trace.TraceMetricSource.receiveSpan()
        at org.cloudera.htrace.Tracer.deliver()
        at org.cloudera.htrace.impl.MilliSpan.stop()
        at org.cloudera.htrace.TraceScope.close()
        at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop()
        at org.apache.hadoop.hbase.ipc.RpcExecutor$
    {code}

    It is related to the following line of code, where we interpret all KV annotation values as byte-wise integers here: Here is where HBase is adding a non-integer KV annotation: The fix should be simple, but I am not aware of all the related issues in changing this. cc [~jesse_yates], [], [~giacomotaylor]

    Apache's JIRA Issue Tracker | 2 years ago | Dan Di Spaltro
    java.lang.IllegalArgumentException: offset (0) + length (4) exceed the capacity of the array: 3
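    The report shows the annotation value has 3 bytes while readAnnotation decodes every value as a 4-byte integer. One hedged sketch of the "simple fix" the reporter alludes to (the method name and string fallback are assumptions for illustration, not Phoenix code):

    ```java
    import java.nio.ByteBuffer;
    import java.nio.charset.StandardCharsets;

    public class AnnotationValue {
        // Hypothetical defensive variant of reading a span annotation value:
        // only decode as a 4-byte int when the value actually has 4 bytes,
        // otherwise fall back to UTF-8 text (covering HBase's 3-byte annotation).
        static String decode(byte[] value) {
            if (value.length == 4) {
                return Integer.toString(ByteBuffer.wrap(value).getInt());
            }
            return new String(value, StandardCharsets.UTF_8);
        }

        public static void main(String[] args) {
            System.out.println(decode(ByteBuffer.allocate(4).putInt(42).array())); // 42
            System.out.println(decode("abc".getBytes(StandardCharsets.UTF_8)));    // abc
        }
    }
    ```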
  3. I have created a table via Hive in HBase. When I insert integer data into the table, it can be retrieved by Hive or HBase, but it cannot be retrieved correctly via Phoenix. The error is:

    java.lang.IllegalArgumentException: offset (715) + length (8) exceed the capacity of the array: 720
        at org.apache.hadoop.hbase.util.Bytes.explainWrongLengthOrOffset()
        at org.apache.hadoop.hbase.util.Bytes.toLong()
        at org.apache.hadoop.hbase.util.Bytes.toDouble()
        at com.salesforce.phoenix.schema.PDataType$UnsignedDoubleCodec.decodeDouble()
        at com.salesforce.phoenix.schema.PDataType$18.toObject()
        at com.salesforce.phoenix.schema.PDataType.toObject()
        at com.salesforce.phoenix.schema.PDataType.toObject()
        at com.salesforce.phoenix.schema.PDataType.toObject()
        at com.salesforce.phoenix.compile.ExpressionProjector.getValue()
        at com.salesforce.phoenix.jdbc.PhoenixResultSet.getObject()
        at sqlline.SqlLine$Rows$Row.<init>()
        at sqlline.SqlLine$IncrementalRows.hasNext()
        at sqlline.SqlLine$TableOutputFormat.print()
        at sqlline.SqlLine.print()
        at sqlline.SqlLine$Commands.execute()
        at sqlline.SqlLine$Commands.sql()
        at sqlline.SqlLine.dispatch()
        at sqlline.SqlLine.begin()
        at sqlline.SqlLine.mainWithInputRedirection()
        at sqlline.SqlLine.main()

    The data type in Hive is integer and the data type in Phoenix is unsigned_int. Also, if data is inserted via Phoenix, it is displayed correctly in Phoenix but as null in Hive or HBase. How can I resolve this problem? Any help will be appreciated.

    Apache's JIRA Issue Tracker | 3 years ago | liuziliang
    java.lang.IllegalArgumentException: offset (75) + length (4) exceed the capacity of the array: 76
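    The two systems disagree on the on-disk width, not just the logical type: the stack trace above shows the decode going through Bytes.toDouble and Bytes.toLong, which consume 8 bytes, while Hive's binary storage of an INT is 4 bytes. A minimal width check (plain Java, not Phoenix internals) illustrates the mismatch:

    ```java
    import java.nio.ByteBuffer;

    public class CellWidth {
        // True when the stored cell is at least as wide as the declared type needs.
        static boolean wideEnough(byte[] cell, int requiredBytes) {
            return cell.length >= requiredBytes;
        }

        public static void main(String[] args) {
            byte[] hiveInt = ByteBuffer.allocate(4).putInt(720).array(); // 4-byte INT cell
            System.out.println(wideEnough(hiveInt, Integer.BYTES)); // true: a 4-byte type fits
            System.out.println(wideEnough(hiveInt, Double.BYTES));  // false: an 8-byte decode overruns
        }
    }
    ```

    The reverse symptom (rows written by Phoenix reading as null in Hive) follows from the same mismatch, so the cure is declaring byte-compatible types on both sides.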
  5. Universal Image Loader: IllegalArgumentException when using FileNameGenerator with extension

    Stack Overflow | 2 years ago
    java.lang.IllegalArgumentException: keys must match regex [a-z0-9_-]{1,64}: "1828294.jpg"
        at com.nostra13.universalimageloader.cache.disc.impl.ext.DiskLruCache.validateKey(
        at com.nostra13.universalimageloader.cache.disc.impl.ext.DiskLruCache.get(
        at com.nostra13.universalimageloader.cache.disc.impl.ext.LruDiscCache.get(
        at com.nostra13.universalimageloader.core.ImageLoaderEngine$
        at java.util.concurrent.ThreadPoolExecutor.runWorker(
        at java.util.concurrent.ThreadPoolExecutor$
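    DiskLruCache rejects any key that fails [a-z0-9_-]{1,64}, and "1828294.jpg" fails on the dot. The usual remedy is hashing the source name into a conforming key; this sketch is plain Java (Universal Image Loader ships hash-based name generators for the same purpose):

    ```java
    import java.math.BigInteger;
    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;

    public class CacheKeys {
        // Derives a disk-cache key that always matches [a-z0-9_-]{1,64}
        // by hashing the original name to lowercase hex.
        static String toKey(String name) {
            try {
                byte[] digest = MessageDigest.getInstance("MD5")
                        .digest(name.getBytes(StandardCharsets.UTF_8));
                return new BigInteger(1, digest).toString(16); // at most 32 lowercase hex chars
            } catch (NoSuchAlgorithmException e) {
                throw new IllegalStateException(e); // MD5 is a mandatory JDK algorithm
            }
        }

        public static void main(String[] args) {
            String key = toKey("1828294.jpg");
            System.out.println(key.matches("[a-z0-9_-]{1,64}")); // true
        }
    }
    ```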
  6. [elasticsearch] elasticsearch couchdb-river startup issues

    Grokbase | 7 months ago
    java.lang.IllegalArgumentException: URI can't be null.
        at org.elasticsearch.river.couchdb.CouchdbRiver$
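    Here the river builds its CouchDB URL from configuration that was never set, so a null URI reaches the HTTP client. A guard with an explicit default avoids the crash (the default address is an assumption about a typical local CouchDB setup, not the plugin's actual code):

    ```java
    import java.net.URI;

    public class RiverConfig {
        // Falls back to a conventional local CouchDB address when the
        // configured value is missing, instead of passing null downstream.
        static URI couchUri(String configured) {
            String value = (configured == null || configured.isEmpty())
                    ? ""   // assumed default: CouchDB's standard port
                    : configured;
            return URI.create(value);
        }

        public static void main(String[] args) {
            System.out.println(couchUri(null));             //
            System.out.println(couchUri(""));  //
        }
    }
    ```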


    Root Cause Analysis

    1. java.lang.IllegalArgumentException

      offset (0) + length (4) exceed the capacity of the array: 2

      at org.apache.hadoop.hbase.util.Bytes.explainWrongLengthOrOffset()
    2. HBase
      1. org.apache.hadoop.hbase.util.Bytes.explainWrongLengthOrOffset()
      2. org.apache.hadoop.hbase.util.Bytes.toInt()
      3. org.apache.hadoop.hbase.util.Bytes.toInt()
      3 frames
    3. com.ngdata.hbaseindexer
      1. com.ngdata.hbaseindexer.parse.ByteArrayValueMappers$1.mapInternal()
      2. com.ngdata.hbaseindexer.parse.ByteArrayValueMappers$
      3. com.ngdata.hbaseindexer.morphline.ExtractHBaseCellsBuilder$Mapping.extractWithSingleOutputField()
      4. com.ngdata.hbaseindexer.morphline.ExtractHBaseCellsBuilder$Mapping.apply()
      5. com.ngdata.hbaseindexer.morphline.ExtractHBaseCellsBuilder$ExtractHBaseCells.doProcess()
      5 frames
    4. Kite Morphlines Core
      1. org.kitesdk.morphline.base.AbstractCommand.process()
      2. org.kitesdk.morphline.base.AbstractCommand.doProcess()
      3. org.kitesdk.morphline.base.AbstractCommand.process()
      3 frames
    5. com.ngdata.hbaseindexer
      3. com.ngdata.hbaseindexer.indexer.Indexer$RowBasedIndexer.calculateIndexUpdates()
      4. com.ngdata.hbaseindexer.indexer.Indexer.indexRowData()
      5. com.ngdata.hbaseindexer.indexer.IndexingEventListener.processEvents()
      5 frames
    6. com.ngdata.sep
      1. com.ngdata.sep.impl.SepEventExecutor$
      1 frame
    7. Java RT
      1. java.util.concurrent.Executors$
      3. java.util.concurrent.ThreadPoolExecutor.runWorker()
      4. java.util.concurrent.ThreadPoolExecutor$
      5 frames
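    Putting the frames together: ExtractHBaseCellsBuilder hands each cell to the "int" mapper in ByteArrayValueMappers, which calls Bytes.toInt and therefore insists on exactly 4 bytes, while the stored cell has only 2. One conceivable workaround, sketched here in plain Java (not the indexer's actual API), is a mapper that chooses the decoder from the cell's actual length:

    ```java
    import java.nio.ByteBuffer;

    public class LengthAwareIntMapper {
        // Hypothetical length-aware numeric mapper: selects the decoder from the
        // actual cell width instead of assuming the 4 bytes Bytes.toInt requires.
        static long map(byte[] cell) {
            switch (cell.length) {
                case 2: return ByteBuffer.wrap(cell).getShort();
                case 4: return ByteBuffer.wrap(cell).getInt();
                case 8: return ByteBuffer.wrap(cell).getLong();
                default:
                    throw new IllegalArgumentException(
                            "unsupported numeric cell width: " + cell.length);
            }
        }

        public static void main(String[] args) {
            byte[] twoByteCell = ByteBuffer.allocate(2).putShort((short) 300).array();
            System.out.println(map(twoByteCell)); // 300, where Bytes.toInt would throw
        }
    }
    ```

    In practice, the simpler fix is usually to write the cell with the width the mapping declares, or declare the type the cell actually has; the sketch only shows why the lengths must agree.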