java.lang.Integer

Google Groups | Poorna Chandra | 6 months ago
  1. 0

    Variable arguments are normally converted to a WrappedArray with the proper primitive component type. In the case below, however, the argument passed through ArrayWrapper.updateOne is converted to WrappedArray$ofRef (it should be WrappedArray$ofDouble), discarding the declared type parameter [T] (Double in this example). The code below is similar to Scala's collections library code and works under Scala 2.9.x.

{code}
import scala.collection.mutable.WrappedArray
import scala.reflect.ClassTag

object Main {
  def main(args: Array[String]) {
    val test = new ArrayWrapper[Double]
    test.update(1.0)    // ok
    test.updateOne(1.0) // fails with ArrayStoreException
  }
}

class ArrayWrapper[T: ClassTag] {
  val array = new Array[T](2)

  // elem is boxed to java.lang.Object (java.lang.Double), then wrapped as
  // Array[Object] and passed to scala.LowPriorityImplicits.genericWrapArray[T](xs: Array[T])
  // because of update(elems: T*), which causes the ArrayStoreException.
  def updateOne(elem: T) { update(elem) }

  // ok when called directly from outside
  def update(elems: T*) { updateAll(elems) }

  def updateAll(elems: scala.collection.Traversable[T]) {
    println(elems.getClass)
    // class scala.collection.mutable.WrappedArray$ofDouble -- when via update
    // class scala.collection.mutable.WrappedArray$ofRef    -- when via updateOne
    elems match {
      case xs: WrappedArray[T] =>
        println(xs.array)
        // [D@504ad009                  --- when via update
        // [Ljava.lang.Object;@3b87bd31 --- when via updateOne
        array(0) = xs.array(0) // ok
        System.arraycopy(xs.array, 0, array, 0, xs.length)
        // ok                                       --- when via update
        // fails with java.lang.ArrayStoreException --- when via updateOne
        println("== ok ==")
      case _ =>
    }
  }
}
{code}

    Scala JIRA | 4 years ago | Caoyuan Deng
    java.lang.Integer
  2. 0

    better error messages > smaller stack traces

    Google Groups | 6 years ago | stuart....@gmail.com
    java.lang.Integer
  3. 0

    PostgreSQL auto-increment with a primary key of TYPE serial

    Google Groups | 6 years ago | Cristofer Sousa
    java.lang.Integer
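    The WrappedArray report in the first result above admits a straightforward workaround: instead of forwarding a single element through the varargs overload (which boxes it into an Array[Object]), build a properly typed one-element array with the in-scope ClassTag and copy from that. The sketch below illustrates the idea only; SafeArrayWrapper is a hypothetical name, not the fix that shipped in Scala.

```scala
import scala.reflect.ClassTag

// Sketch of a workaround for the varargs boxing problem reported above.
class SafeArrayWrapper[T: ClassTag] {
  val array = new Array[T](2)

  // Forwarding `elem` through a T* parameter would box it into an
  // Array[Object]. Building the one-element array explicitly lets the
  // ClassTag pick the primitive component type (Array[Double] here),
  // so System.arraycopy sees matching array types and succeeds.
  def updateOne(elem: T): Unit = {
    val one = Array[T](elem)
    System.arraycopy(one, 0, array, 0, one.length)
  }
}

object Workaround {
  def main(args: Array[String]): Unit = {
    val w = new SafeArrayWrapper[Double]
    w.updateOne(1.0) // no ArrayStoreException
    println(w.array(0))
  }
}
```

The key difference from the original updateOne is that the element never travels through a T* parameter, so the implicit genericWrapArray conversion (and the resulting WrappedArray$ofRef) is never triggered.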


    Root Cause Analysis

    1. java.lang.Integer

      No message provided

      at co.cask.cdap.format.RecordPutTransformer.setField()
    2. co.cask.cdap
      RecordPutTransformer.toPut
      1. co.cask.cdap.format.RecordPutTransformer.setField(RecordPutTransformer.java:104)[cdap-formats-3.4.1.jar:na]
      2. co.cask.cdap.format.RecordPutTransformer.toPut(RecordPutTransformer.java:83)[cdap-formats-3.4.1.jar:na]
      2 frames
    3. co.cask.hydrator
      HBaseSink.transform
      1. co.cask.hydrator.plugin.sink.HBaseSink.transform(HBaseSink.java:150)[1465530168450-0/:na]
      2. co.cask.hydrator.plugin.sink.HBaseSink.transform(HBaseSink.java:57)[1465530168450-0/:na]
      2 frames
    4. co.cask.cdap
      ETLMapReduce$ETLMapper.map
      1. co.cask.cdap.etl.common.TrackedTransform.transform(TrackedTransform.java:59)[cdap-etl-core-3.4.1.jar:na]
      2. co.cask.cdap.etl.common.TransformExecutor.executeTransformation(TransformExecutor.java:86)[cdap-etl-core-3.4.1.jar:na]
      3. co.cask.cdap.etl.common.TransformExecutor.executeTransformation(TransformExecutor.java:90)[cdap-etl-core-3.4.1.jar:na]
      4. co.cask.cdap.etl.common.TransformExecutor.runOneIteration(TransformExecutor.java:49)[cdap-etl-core-3.4.1.jar:na]
      5. co.cask.cdap.etl.batch.mapreduce.TransformRunner.transform(TransformRunner.java:154)[cdap-etl-batch-3.4.1.jar:na]
      6. co.cask.cdap.etl.batch.mapreduce.ETLMapReduce$ETLMapper.map(ETLMapReduce.java:299)[cdap-etl-batch-3.4.1.jar:na]
      6 frames
    5. Hadoop
      Mapper.run
      1. org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)[org.apache.hadoop.hadoop-mapreduce-client-core-2.3.0.jar:na]
      1 frame
    6. co.cask.cdap
      MapperWrapper.run
      1. co.cask.cdap.internal.app.runtime.batch.MapperWrapper.run(MapperWrapper.java:117)[co.cask.cdap.cdap-app-fabric-3.4.1.jar:na]
      1 frame
    7. Hadoop
      LocalJobRunnerWithFix$Job$MapTaskRunnable.run
      1. org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)[org.apache.hadoop.hadoop-mapreduce-client-core-2.3.0.jar:na]
      2. org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)[org.apache.hadoop.hadoop-mapreduce-client-core-2.3.0.jar:na]
      3. org.apache.hadoop.mapred.LocalJobRunnerWithFix$Job$MapTaskRunnable.run(LocalJobRunnerWithFix.java:243)[co.cask.cdap.cdap-app-fabric-3.4.1.jar:na]
      3 frames
    8. Java RT
      Thread.run
      1. java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)[na:1.7.0_75]
      2. java.util.concurrent.FutureTask.run(FutureTask.java:262)[na:1.7.0_75]
      3. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)[na:1.7.0_75]
      4. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)[na:1.7.0_75]
      5. java.lang.Thread.run(Thread.java:745)[na:1.7.0_75]
      5 frames
    9. Hadoop
      LocalJobRunnerWithFix$Job.run
      1. org.apache.hadoop.mapred.LocalJobRunnerWithFix$Job.runTasks(LocalJobRunnerWithFix.java:465)
      2. org.apache.hadoop.mapred.LocalJobRunnerWithFix$Job.run(LocalJobRunnerWithFix.java:524)
      2 frames