java.util.NoSuchElementException

tip

Check whether the field you are trying to read actually exists in the database. If the field is optional, use com.mongodb.casbah.commons.MongoDBObject#getAs, which returns an Option instead of throwing.
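The failure mode can be reproduced with a plain Scala Map standing in for the document object: calling apply on a missing key throws exactly this exception, while an Option-returning lookup (the pattern behind MongoDBObject#getAs) stays safe. A minimal sketch, where "node" is the optional field named in the trace and the map contents are hypothetical:

```scala
// A plain Scala Map stands in for the fetched document; "node" is an
// optional field, as in the trace below ("key not found: node").
object SafeFieldAccess {
  // Option-returning lookup: the same idea as MongoDBObject#getAs.
  def readNode(doc: Map[String, Any]): String =
    doc.get("node").map(_.toString).getOrElse("<missing>")

  def main(args: Array[String]): Unit = {
    val doc = Map[String, Any]("host" -> "es-01", "port" -> 9200)
    // doc("node") would throw:
    //   java.util.NoSuchElementException: key not found: node
    println(readNode(doc))   // prints "<missing>"
  }
}
```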


rp


java.util.NoSuchElementException: key not found: node
    at scala.collection.MapLike$class.default(MapLike.scala:228)
    at scala.collection.AbstractMap.default(Map.scala:58)
    at scala.collection.MapLike$class.apply(MapLike.scala:141)
    at scala.collection.AbstractMap.apply(Map.scala:58)
    at org.elasticsearch.spark.sql.RowValueReader$class.addToBuffer(RowValueReader.scala:32)
    at org.elasticsearch.spark.sql.ScalaRowValueReader.addToBuffer(ScalaRowValueReader.scala:9)
    at org.elasticsearch.spark.sql.ScalaRowValueReader.addToMap(ScalaRowValueReader.scala:16)
    at org.elasticsearch.hadoop.serialization.ScrollReader.map(ScrollReader.java:596)
    at org.elasticsearch.hadoop.serialization.ScrollReader.read(ScrollReader.java:519)
    at org.elasticsearch.hadoop.serialization.ScrollReader.list(ScrollReader.java:560)
    at org.elasticsearch.hadoop.serialization.ScrollReader.read(ScrollReader.java:522)
    at org.elasticsearch.hadoop.serialization.ScrollReader.map(ScrollReader.java:596)
    at org.elasticsearch.hadoop.serialization.ScrollReader.read(ScrollReader.java:519)
    at org.elasticsearch.hadoop.serialization.ScrollReader.readHitAsMap(ScrollReader.java:339)
    at org.elasticsearch.hadoop.serialization.ScrollReader.readHit(ScrollReader.java:290)
    at org.elasticsearch.hadoop.serialization.ScrollReader.read(ScrollReader.java:186)
    at org.elasticsearch.hadoop.serialization.ScrollReader.read(ScrollReader.java:165)
    at org.elasticsearch.hadoop.rest.RestRepository.scroll(RestRepository.java:403)
    at org.elasticsearch.hadoop.rest.ScrollQuery.hasNext(ScrollQuery.java:76)
    at org.elasticsearch.spark.rdd.AbstractEsRDDIterator.hasNext(AbstractEsRDDIterator.scala:46)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
    at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
    at scala.collection.AbstractIterator.to(Iterator.scala:1157)
    at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
    at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
    at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
    at org.apache.spark.rdd.RDD$$anonfun$17.apply(RDD.scala:797)
    at org.apache.spark.rdd.RDD$$anonfun$17.apply(RDD.scala:797)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1353)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1353)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
    at org.apache.spark.scheduler.Task.run(Task.scala:56)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:200)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
