org.apache.spark.storage.BlockFetchException: Failed to fetch block from 2 locations. Most recent failure cause:

spark-user | أنس الليثي | 7 months ago
  1. Re: Spark Streaming Job get killed after running for about 1 hour
     spark-user | 7 months ago | أنس الليثي
  2. Spark, mail # user - Spark Streaming Job get killed after running for about 1 hour - 2016-04-24, 09:34
     search-hadoop.com | 7 months ago
  3. Spark Streaming job get killed after running for about 1 hour | Web Development
     bighow.org | 7 months ago

    Root Cause Analysis

    1. org.apache.spark.storage.BlockFetchException

      Failed to fetch block from 2 locations. Most recent failure cause:

      at org.apache.spark.storage.BlockManager$$anonfun$doGetRemote$2.apply()
    2. Spark
      BlockManager$$anonfun$doGetRemote$2.apply
      1. org.apache.spark.storage.BlockManager$$anonfun$doGetRemote$2.apply(BlockManager.scala:595)
      2. org.apache.spark.storage.BlockManager$$anonfun$doGetRemote$2.apply(BlockManager.scala:585)
      2 frames
    3. Scala
      ArrayBuffer.foreach
      1. scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
      2. scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
      2 frames
    4. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.storage.BlockManager.doGetRemote(BlockManager.scala:585)
      2. org.apache.spark.storage.BlockManager.getRemote(BlockManager.scala:570)
      3. org.apache.spark.storage.BlockManager.get(BlockManager.scala:630)
      4. org.apache.spark.rdd.BlockRDD.compute(BlockRDD.scala:48)
      5. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      6. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      7. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
      8. org.apache.spark.scheduler.Task.run(Task.scala:89)
      9. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
      9 frames
    5. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
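    The trace above shows the failure surfacing in BlockRDD.compute: a streaming task asks the BlockManager for a receiver-generated block, the block is found on neither of its 2 remote locations, and the task fails with BlockFetchException. A commonly suggested mitigation for this class of failure is to make the received data recoverable, by replicating receiver blocks and enabling the streaming write-ahead log. The sketch below is a minimal, hypothetical example of that setup, not the original poster's job; the application name, checkpoint path, and socket host/port are assumptions.

    import org.apache.spark.SparkConf
    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object ResilientStreamingJob {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("resilient-streaming-sketch")
          // Keep a write-ahead log of received data so a block can be
          // recovered even if the executor that stored it is lost.
          .set("spark.streaming.receiver.writeAheadLog.enable", "true")

        val ssc = new StreamingContext(conf, Seconds(10))
        // The write-ahead log requires a checkpoint directory (path is hypothetical).
        ssc.checkpoint("hdfs:///tmp/streaming-checkpoint")

        // Store each receiver block serialized on two executors instead of one
        // (host and port are hypothetical).
        val lines = ssc.socketTextStream(
          "localhost", 9999, StorageLevel.MEMORY_AND_DISK_SER_2)

        lines.map(_.length).reduce(_ + _).print()

        ssc.start()
        ssc.awaitTermination()
      }
    }

    With a replicated storage level a block survives the loss of a single executor; once the write-ahead log is enabled, the Spark documentation notes the replicated level can usually be relaxed to MEMORY_AND_DISK_SER, since the log itself already preserves the received data.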