java.lang.InterruptedException

GitHub | xinghaihu | 3 months ago

Similar reports:

  1. [jvm-packages] xgboost training fails on spark
     GitHub | 3 months ago | xinghaihu
     java.lang.InterruptedException
  2. GitHub comment 940#193866537
     GitHub | 9 months ago | ypopkov
     java.lang.InterruptedException
  3. GitHub comment 966#197290945
     GitHub | 9 months ago | astrude
     java.lang.InterruptedException
  4. GitHub comment 966#197385951
     GitHub | 9 months ago | astrude
     java.lang.InterruptedException

Users who encountered this exception:

  1. treefolk: 1 time, last 1 week ago
  2. danleyb2Interintel: 1 time, last 2 weeks ago
  3. filpgame: 1 time, last 1 month ago
  4. Nikolay Rybak: 4 times, last 4 months ago
  5. Handemelindo: 1 time, last 4 months ago
  Plus 6 more registered users and 18 unregistered visitors.

Root Cause Analysis

  1. java.lang.InterruptedException

    No message provided

    at java.lang.Object.wait()
  2. Java RT
    Object.wait
    1. java.lang.Object.wait(Native Method)
    2. java.lang.Object.wait(Object.java:502)
    2 frames
  3. Spark
    RDD.foreachPartition
    1. org.apache.spark.scheduler.JobWaiter.awaitResult(JobWaiter.scala:102)
    2. org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:612)
    3. org.apache.spark.SparkContext.runJob(SparkContext.scala:1839)
    4. org.apache.spark.SparkContext.runJob(SparkContext.scala:1852)
    5. org.apache.spark.SparkContext.runJob(SparkContext.scala:1865)
    6. org.apache.spark.SparkContext.runJob(SparkContext.scala:1936)
    7. org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:920)
    8. org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:918)
    9. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    10. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    11. org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    12. org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:918)
    12 frames
  4. ml.dmlc.xgboost4j
    XGBoost$$anon$2.run
    1. ml.dmlc.xgboost4j.scala.spark.XGBoost$$anon$2.run(XGBoost.scala:152)
    1 frame