java.util.NoSuchElementException

key not found: hqapi1-integration-tests/src/test/java/org/hyperic/hq/hqapi1/test/AgentTransferPlugin_test.java

Samebug tips (1)

Check whether the field you are trying to read actually exists in the database. If it is optional, use com.mongodb.casbah.commons.MongoDBObject#getAs, which returns an Option instead of throwing.

Tip by rprp
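
A minimal sketch of that pattern in Scala, assuming the casbah-commons library is on the classpath (the document contents and the transferPlugin field below are hypothetical):

    import com.mongodb.casbah.commons.MongoDBObject

    object GetAsTip extends App {
      val doc = MongoDBObject("name" -> "hq-agent")

      // Reading a missing key directly fails at runtime; getAs returns
      // an Option, so an absent optional field can be handled explicitly.
      val plugin: Option[String] = doc.getAs[String]("transferPlugin")

      plugin match {
        case Some(p) => println(s"transferPlugin = $p")
        case None    => println("transferPlugin is not set on this document")
      }
    }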


Solutions on the web (281)

  • Stack trace

    java.util.NoSuchElementException: key not found: hqapi1-integration-tests/src/test/java/org/hyperic/hq/hqapi1/test/AgentTransferPlugin_test.java
        at scala.collection.MapLike$class.default(MapLike.scala:228)
        at scala.collection.AbstractMap.default(Map.scala:58)
        at scala.collection.mutable.HashMap.apply(HashMap.scala:64)
        at com.kodebeagle.model.JavaFileInfo.fileDetails(JavaRepo.scala:126)
        at com.kodebeagle.spark.RepoAnalyzerJob$.handleJavaRepoFiles(RepoAnalyzerJob.scala:189)
        at com.kodebeagle.spark.RepoAnalyzerJob$.com$kodebeagle$spark$RepoAnalyzerJob$$handleJavaRepos(RepoAnalyzerJob.scala:145)
        at com.kodebeagle.spark.RepoAnalyzerJob$$anonfun$handleJavaIndices$1$$anonfun$apply$4.apply(RepoAnalyzerJob.scala:122)
        at com.kodebeagle.spark.RepoAnalyzerJob$$anonfun$handleJavaIndices$1$$anonfun$apply$4.apply(RepoAnalyzerJob.scala:122)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at com.kodebeagle.spark.RepoAnalyzerJob$$anonfun$handleJavaIndices$1.apply(RepoAnalyzerJob.scala:122)
        at com.kodebeagle.spark.RepoAnalyzerJob$$anonfun$handleJavaIndices$1.apply(RepoAnalyzerJob.scala:120)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$33.apply(RDD.scala:920)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

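    The top frames show scala.collection.mutable.HashMap.apply raising the exception inside JavaFileInfo.fileDetails: apply on a Scala map throws java.util.NoSuchElementException when the key is absent. A minimal sketch of the safer lookup pattern (the map contents and keys below are hypothetical, not taken from JavaRepo.scala):

        import scala.collection.mutable

        object SafeMapLookup extends App {
          val fileDetails = mutable.HashMap("Known_test.java" -> "parsed details")

          // fileDetails("Missing_test.java") would throw:
          //   java.util.NoSuchElementException: key not found: Missing_test.java

          // get returns an Option instead of throwing:
          fileDetails.get("Missing_test.java") match {
            case Some(details) => println(details)
            case None          => println("no details recorded for this file")
          }

          // getOrElse supplies a fallback value for absent keys:
          println(fileDetails.getOrElse("Missing_test.java", "not analyzed"))
        }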
