java.lang.ArrayIndexOutOfBoundsException: 1840

Solutions on the web

  • via Eclipse Bugzilla by mikekucera, 1 year ago: java.lang.ArrayIndexOutOfBoundsException: 1840
  • via GitHub by lossyrob, 7 months ago: java.lang.ArrayIndexOutOfBoundsException: 1840


Stack trace

java.lang.ArrayIndexOutOfBoundsException: 1840
    at geotrellis.raster.reproject.RowTransform$.geotrellis$raster$reproject$RowTransform$$computeApprox(RowTransform.scala:76)
    at geotrellis.raster.reproject.RowTransform$.geotrellis$raster$reproject$RowTransform$$computeApprox(RowTransform.scala:68)
    at geotrellis.raster.reproject.RowTransform$.geotrellis$raster$reproject$RowTransform$$computeApprox(RowTransform.scala:68)
    at geotrellis.raster.reproject.RowTransform$.geotrellis$raster$reproject$RowTransform$$computeApprox(RowTransform.scala:68)
    at geotrellis.raster.reproject.RowTransform$$anonfun$approximate$1.apply(RowTransform.scala:34)
    at geotrellis.raster.reproject.RowTransform$$anonfun$approximate$1.apply(RowTransform.scala:24)
    at geotrellis.raster.reproject.Reproject$.apply(Reproject.scala:70)
    at geotrellis.raster.reproject.package$ReprojectExtentsion.reproject(package.scala:22)
    at geotrellis.raster.reproject.package$ReprojectExtentsion.reproject(package.scala:16)
    at geotrellis.spark.ingest.Ingest$$anonfun$geotrellis$spark$ingest$Ingest$$_reproject$1$1.apply(Ingest.scala:63)
    at geotrellis.spark.ingest.Ingest$$anonfun$geotrellis$spark$ingest$Ingest$$_reproject$1$1.apply(Ingest.scala:60)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.TraversableOnce$class.reduceLeft(TraversableOnce.scala:172)
    at scala.collection.AbstractIterator.reduceLeft(Iterator.scala:1157)
    at org.apache.spark.rdd.RDD$$anonfun$17.apply(RDD.scala:790)
    at org.apache.spark.rdd.RDD$$anonfun$17.apply(RDD.scala:788)
    at org.apache.spark.SparkContext$$anonfun$23.apply(SparkContext.scala:1116)
    at org.apache.spark.SparkContext$$anonfun$23.apply(SparkContext.scala:1116)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:111)
    at org.apache.spark.scheduler.Task.run(Task.scala:51)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:187)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
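The top of the trace places the failure in GeoTrellis's row-transform approximation (RowTransform.computeApprox), reached through the reproject extension method during a Spark ingest. As a rough, hypothetical illustration only, the sketch below shows the kind of single-raster reprojection call that sits at the bottom of that path; the coordinate systems, tile size, and extent are assumptions, and the API shown follows later GeoTrellis releases rather than the exact version in this report.

import geotrellis.proj4.{LatLng, WebMercator}
import geotrellis.raster.{IntArrayTile, Raster, Tile}
import geotrellis.raster.reproject._
import geotrellis.vector.Extent

object ReprojectSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical source raster: a 256x256 constant tile over an assumed extent.
    val tile: Tile = IntArrayTile.fill(1, 256, 256)
    val raster = Raster(tile, Extent(0.0, 0.0, 10.0, 10.0))

    // The reproject extension method named in the trace; LatLng -> WebMercator is assumed.
    // In the version from this trace, this path went through RowTransform.approximate,
    // where index 1840 went out of bounds in this report.
    val reprojected = raster.reproject(LatLng, WebMercator)

    println(reprojected.extent)
  }
}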
