Searched on Google with the first line of a Java stack trace?

We can recommend more relevant solutions and speed up debugging when you paste your entire stack trace with the exception message.

Recommended solutions based on your search

Solutions on the web

via GitHub by tcarette, 1 year ago: This exception has no message.
via GitHub by geoHeil, 7 months ago: This exception has no message.
via GitHub by hendrik-weiler, 1 year ago: This exception has no message.
via GitHub by lucifer6642, 2 years ago: This exception has no message.
via gossamer-threads.com by Unknown author, 1 year ago: This exception has no message.
java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1302)
	at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:202)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:623)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1871)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1884)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1897)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1911)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:883)
	at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:881)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
	at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:881)
	at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anon$2.run(XGBoost.scala:205)
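
For context, the top frames show a thread blocked on a Scala Promise while DAGScheduler.runJob waits for a Spark job to finish: the InterruptedException means that the waiting thread (here, the anonymous worker thread started in XGBoost.scala:205 by XGBoost4J-Spark) was interrupted while blocked, not that the Spark job itself failed. The snippet below is a minimal, self-contained sketch of that mechanism using only the Scala standard library, with no Spark involved; the object and variable names (InterruptedAwaitDemo, neverCompleted, waiter) are illustrative and do not come from the XGBoost4J or Spark code.

import scala.concurrent.{Await, Promise}
import scala.concurrent.duration.Duration

// Hypothetical demo: reproduces the top frames of the trace above by
// interrupting a thread that is blocked waiting on a Future.
object InterruptedAwaitDemo {
  def main(args: Array[String]): Unit = {
    // A promise that is never completed, standing in for the job-completion
    // future that DAGScheduler.runJob waits on.
    val neverCompleted = Promise[Unit]()

    val waiter = new Thread(new Runnable {
      override def run(): Unit =
        try {
          // Blocks inside AbstractQueuedSynchronizer.acquireSharedInterruptibly,
          // the same frame at the top of the stack trace above.
          Await.ready(neverCompleted.future, Duration.Inf)
        } catch {
          case e: InterruptedException =>
            // The exception reported above: the waiting thread was interrupted.
            e.printStackTrace()
        }
    })

    waiter.start()
    Thread.sleep(1000)  // give the waiter time to block on the future
    waiter.interrupt()  // interrupting the blocked thread raises InterruptedException
    waiter.join()
  }
}

In the Spark/XGBoost case the same pattern occurs one level up: the training thread calls rdd.foreachPartition, which blocks in SparkContext.runJob until the job completes, and something interrupts that thread while it waits (for example, training being aborted or the job being cancelled). So the useful signal is usually whatever triggered the interrupt, not this exception itself.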