Searched Google with the first line of a Java stack trace?

We can recommend more relevant solutions and speed up your debugging if you paste the entire stack trace, including the exception message. Try a sample exception.

Recommended solutions based on your search

Solutions on the web

via Stack Overflow by Decula, 2 years ago
via GitHub by VincenzoFerme, 1 year ago (this exception has no message)
via GitHub by yxzf, 1 year ago
via hortonworks.com by Unknown author, 1 year ago
via GitHub by moxious, 2 months ago (this exception has no message)
via https://bugzilla.redhat.com/bugzilla/ by Jitka Kozana, 1 year ago
java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.tryAcquireSharedNanos(AbstractQueuedSynchronizer.java:1325)
	at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:208)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218)
	at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
	at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
	at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
	at scala.concurrent.Await$.result(package.scala:107)
	at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
	at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:101)
	at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:77)
	at org.apache.spark.MapOutputTracker.askTracker(MapOutputTracker.scala:110)
	at org.apache.spark.MapOutputTracker.sendTracker(MapOutputTracker.scala:120)
	at org.apache.spark.MapOutputTrackerMaster.stop(MapOutputTracker.scala:462)
	at org.apache.spark.SparkEnv.stop(SparkEnv.scala:93)
	at org.apache.spark.SparkContext$$anonfun$stop$12.apply$mcV$sp(SparkContext.scala:1756)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1229)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1755)
	at org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend.dead(SparkDeploySchedulerBackend.scala:127)
	at org.apache.spark.deploy.client.AppClient$ClientEndpoint.markDead(AppClient.scala:264)
	at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anon$2$$anonfun$run$1.apply$mcV$sp(AppClient.scala:134)
	at org.apache.spark.util.Utils$.tryOrExit(Utils.scala:1163)
	at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anon$2.run(AppClient.scala:129)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
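The trace above shows a thread blocked inside `scala.concurrent.Await.result` (which parks on `AbstractQueuedSynchronizer`) being interrupted while `SparkContext.stop` is tearing down the RPC machinery. The pattern is generic: any thread blocked on an interruptible wait receives `InterruptedException` when another thread calls `interrupt()` on it during shutdown. A minimal sketch of that mechanism, using a `CountDownLatch` stand-in for the Spark RPC await (the class name `InterruptDemo` and the latch are illustrative, not from the trace):

```java
import java.util.concurrent.CountDownLatch;

public class InterruptDemo {
    public static void main(String[] args) throws Exception {
        // A latch that is never counted down, so await() blocks forever,
        // much like Await.result waiting on an RPC reply that never arrives.
        CountDownLatch latch = new CountDownLatch(1);

        Thread waiter = new Thread(() -> {
            try {
                latch.await(); // parks via AbstractQueuedSynchronizer, as in the trace
                System.out.println("reply received");
            } catch (InterruptedException e) {
                // The shutdown path lands here; restore the interrupt flag
                // so callers further up can observe it.
                Thread.currentThread().interrupt();
                System.out.println("interrupted while waiting");
            }
        });

        waiter.start();
        Thread.sleep(100);     // let the waiter block
        waiter.interrupt();    // simulate the stop()/shutdown path
        waiter.join();
    }
}
```

If this is the only symptom and it appears while the application is already stopping, it is usually benign noise from the shutdown sequence rather than the root failure; the interesting error is typically whatever triggered `markDead`/`stop` in the first place.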