org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0: java.lang.NullPointerException

Stack Overflow | Dan Wild | 7 months ago
Accessing a Kerberized remote HBase cluster from Spark

Related: NullPointerException during connection creation. (GitHub | 8 months ago | AbhiMadav)
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, cdh52.vm.com): java.lang.NullPointerException

    Root Cause Analysis

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0: java.lang.NullPointerException
	at org.apache.hadoop.hbase.security.UserProvider.instantiate(UserProvider.java:43)
	at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:214)
	at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
	at org.apache.spark.sql.execution.datasources.hbase.TableResource.init(HBaseResources.scala:125)
	at org.apache.spark.sql.execution.datasources.hbase.ReferencedResource$class.liftedTree1$1(HBaseResources.scala:57)
	at org.apache.spark.sql.execution.datasources.hbase.ReferencedResource$class.acquire(HBaseResources.scala:54)
	at org.apache.spark.sql.execution.datasources.hbase.TableResource.acquire(HBaseResources.scala:120)
	at org.apache.spark.sql.execution.datasources.hbase.ReferencedResource$class.releaseOnException(HBaseResources.scala:74)
	at org.apache.spark.sql.execution.datasources.hbase.TableResource.releaseOnException(HBaseResources.scala:120)
	at org.apache.spark.sql.execution.datasources.hbase.TableResource.getScanner(HBaseResources.scala:144)
	at org.apache.spark.sql.execution.datasources.hbase.HBaseTableScanRDD$$anonfun$7.apply(HBaseTableScan.scala:267)
	at org.apache.spark.sql.execution.datasources.hbase.HBaseTableScanRDD$$anonfun$7.apply(HBaseTableScan.scala:266)
	at scala.collection.parallel.mutable.ParArray$Map.leaf(ParArray.scala:658)
	at scala.collection.parallel.Task$$anonfun$tryLeaf$1.apply$mcV$sp(Tasks.scala:54)
	at scala.collection.parallel.Task$$anonfun$tryLeaf$1.apply(Tasks.scala:53)
	at scala.collection.parallel.Task$$anonfun$tryLeaf$1.apply(Tasks.scala:53)
	at scala.collection.parallel.Task$class.tryLeaf(Tasks.scala:56)
	at scala.collection.parallel.mutable.ParArray$Map.tryLeaf(ParArray.scala:650)
	at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask$class.compute(Tasks.scala:165)
	at scala.collection.parallel.AdaptiveWorkStealingForkJoinTasks$WrappedTask.compute(Tasks.scala:514)
	at scala.concurrent.forkjoin.RecursiveAction.exec(RecursiveAction.java:160)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
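The trace shows the NullPointerException thrown from UserProvider.instantiate while the HBase connector creates a connection on an executor. A common cause of this pattern is that hbase-site.xml (and the Kerberos configuration it points at) is visible on the driver but never reaches the executors. A hedged sketch of a spark-submit invocation that ships both; the keytab path, principal, class name, and jar name are placeholders, not values from the original question:

```shell
# Ship hbase-site.xml to every container so executors can instantiate a
# secure HBase connection, and let Spark handle the Kerberos login and
# token renewal via --principal/--keytab.
# All paths, the principal, and the application jar are hypothetical examples.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files /etc/hbase/conf/hbase-site.xml \
  --principal user@EXAMPLE.COM \
  --keytab /home/user/user.keytab \
  --class com.example.HBaseScanJob \
  my-hbase-job.jar
```

With --files, hbase-site.xml lands in each container's working directory (which is on the classpath in YARN cluster mode), while --principal and --keytab let Spark obtain and renew the credentials the HBase client needs on the executors.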