java.lang.NullPointerException

Exception #2939965

Samebug tips (0)

We couldn't find tips for this exception.


Solutions on the web (2)

  • via GitHub by nabeelxy
  • via GitHub by nabeelxy

Stack trace

java.lang.NullPointerException
    at sparkle.graph.BeliefPropagation$$anonfun$2.apply(BeliefPropagation.scala:48)
    at sparkle.graph.BeliefPropagation$$anonfun$2.apply(BeliefPropagation.scala:47)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at org.apache.spark.graphx.impl.EdgePartition.map(EdgePartition.scala:185)
    at org.apache.spark.graphx.impl.GraphImpl$$anonfun$7.apply(GraphImpl.scala:156)
    at org.apache.spark.graphx.impl.GraphImpl$$anonfun$7.apply(GraphImpl.scala:155)
    at org.apache.spark.graphx.impl.EdgeRDDImpl$$anonfun$mapEdgePartitions$1.apply(EdgeRDDImpl.scala:121)
    at org.apache.spark.graphx.impl.EdgeRDDImpl$$anonfun$mapEdgePartitions$1.apply(EdgeRDDImpl.scala:118)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$17.apply(RDD.scala:710)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$17.apply(RDD.scala:710)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
    at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:262)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
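A common cause of an NPE at this point in a GraphX pipeline (a hedged guess from the trace alone, not confirmed against the BeliefPropagation source) is a null vertex attribute: when Graph(vertices, edges) is built and an edge references a vertex ID absent from the vertex RDD, GraphX fills that vertex's attribute with the default, which is null unless defaultVertexAttr is supplied. Dereferencing srcAttr or dstAttr inside the map function then throws at the anonymous function's line, as at BeliefPropagation.scala:48 here. Below is a minimal, self-contained Scala sketch of that failure mode and its fix; the object name, app name, and Array[Double] attribute type are illustrative assumptions, not taken from the project.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.graphx.{Edge, Graph, VertexId}
    import org.apache.spark.rdd.RDD

    // Hypothetical demo of the suspected cause: a vertex missing from the
    // vertex RDD whose attribute defaults to null, dereferenced in mapTriplets.
    object NullVertexAttrDemo {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("npe-demo").setMaster("local[*]"))

        // Vertex 2L is deliberately absent from the vertex RDD.
        val vertices: RDD[(VertexId, Array[Double])] =
          sc.parallelize(Seq((1L, Array(0.5, 0.5))))
        val edges: RDD[Edge[Int]] =
          sc.parallelize(Seq(Edge(1L, 2L, 0)))

        // No defaultVertexAttr: GraphX fills vertex 2L's attribute with null.
        val unsafe = Graph(vertices, edges)
        // unsafe.mapTriplets(t => t.srcAttr.length + t.dstAttr.length).edges.count()
        // ^ uncommenting this throws java.lang.NullPointerException inside the
        //   anonymous map function, mirroring the trace above.

        // Fix: supply a default attribute so no vertex attribute is ever null.
        val safe = Graph(vertices, edges, defaultVertexAttr = Array(1.0, 1.0))
        safe.mapTriplets(t => t.srcAttr.length + t.dstAttr.length).edges.count()

        sc.stop()
      }
    }

If a default attribute makes no sense for the algorithm, the usual alternatives are to filter out edges whose endpoints are missing (e.g. via subgraph) or to join the edge endpoints against the vertex RDD before the message-passing step.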


Users with the same issue

We couldn't find other users who have seen this exception.