java.lang.OutOfMemoryError: Java heap space

mahout-user | Jaume Galí | 1 year ago
Here are the best matching threads we found on the Internet:
  1. Exception in task 0.0 in stage 13.0 (TID 13) java.lang.OutOfMemoryError: Java heap space

     mahout-user | 1 year ago | jgali@konodrac.com
     java.lang.OutOfMemoryError: Java heap space
  2. Re: Exception in task 0.0 in stage 13.0 (TID 13) java.lang.OutOfMemoryError: Java heap space

     mahout-user | 1 year ago | Dmitriy Lyubimov
     java.lang.OutOfMemoryError: Java heap space
  3. Re: Exception in task 0.0 in stage 13.0 (TID 13) java.lang.OutOfMemoryError: Java heap space

     mahout-user | 1 year ago | Dmitriy Lyubimov
     java.lang.OutOfMemoryError: Java heap space


    Root Cause Analysis

    java.lang.OutOfMemoryError: Java heap space
        at org.apache.mahout.math.DenseMatrix.<init>(DenseMatrix.java:66)
        at org.apache.mahout.sparkbindings.drm.package$$anonfun$blockify$1.apply(package.scala:70)
        at org.apache.mahout.sparkbindings.drm.package$$anonfun$blockify$1.apply(package.scala:59)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$20.apply(RDD.scala:710)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
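
    The trace above shows the error being raised while Mahout's Spark bindings blockify a DRM: inside mapPartitions, the rows of a partition appear to be collected into a single in-memory DenseMatrix, so each executor's heap must hold an entire partition's worth of dense rows at once. Below is a minimal sketch of the usual mitigations, assuming the failure is simply that the per-partition blocks do not fit in the executor heap; the application name and the concrete values are hypothetical and need tuning to the data and cluster.

      import org.apache.spark.{SparkConf, SparkContext}

      // Sketch only: give each executor more heap, and spread the data over
      // more (and therefore smaller) partitions so each blockified DenseMatrix
      // stays within that heap. Values below are illustrative.
      val conf = new SparkConf()
        .setAppName("mahout-drm-job")             // hypothetical application name
        .set("spark.executor.memory", "4g")       // larger executor heap
        .set("spark.default.parallelism", "400")  // more partitions => smaller dense blocks
      val sc = new SparkContext(conf)

    Raising spark.executor.memory is the blunt fix; increasing the partition count attacks the actual allocation shown in the trace, since the size of each DenseMatrix block scales with the number of rows per partition.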