java.lang.InterruptedException

Apache's JIRA Issue Tracker | Mike Beyer | 2 years ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. Apache Spark stopping JVM when master not available
     Stack Overflow | 12 months ago | era
     java.lang.InterruptedException
  2. Spark, unable to connect to master using submit script (see the connection sketch after this list)
     Stack Overflow | 5 months ago | dashQ
     java.lang.InterruptedException
  3. Is it necessary to submit spark application jar?
     Stack Overflow | 1 year ago | Marcin Lagowski
     java.lang.InterruptedException
  4. GitHub comment 38#195912077
     GitHub | 12 months ago | VincenzoFerme
     java.lang.InterruptedException
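
Several of the linked questions above come down to the driver being unable to reach the Spark master when the job is submitted. As a rough illustration only, the sketch below pins the master URL explicitly when building the context so that a wrong or unreachable master fails fast; the host, port, and application name are placeholders, not values taken from any of the linked reports.

    import org.apache.spark.{SparkConf, SparkContext}

    object MasterConnectionSketch {
      def main(args: Array[String]): Unit = {
        // Placeholder standalone master URL; replace with the real master host and port.
        val conf = new SparkConf()
          .setAppName("master-connection-check")
          .setMaster("spark://master-host:7077")

        val sc = new SparkContext(conf)
        try {
          // A trivial job forces a real round trip to the master and executors,
          // so connectivity problems surface here instead of later in the application.
          println(sc.parallelize(1 to 10).sum())
        } finally {
          sc.stop()
        }
      }
    }

When spark-submit is used instead, the equivalent check is passing the same URL through --master and watching the driver log for repeated reconnection attempts.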

Root Cause Analysis

  1. java.lang.InterruptedException

    No message provided

    at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedNanos()
  2. Java RT
    AbstractQueuedSynchronizer.tryAcquireSharedNanos
    1. java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedNanos(AbstractQueuedSynchronizer.java:1039)
    2. java.util.concurrent.locks.AbstractQueuedSynchronizer.tryAcquireSharedNanos(AbstractQueuedSynchronizer.java:1328)
    2 frames
  3. Scala
    Await$.result
    1. scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:208)
    2. scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218)
    3. scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
    4. scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
    5. scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
    6. scala.concurrent.Await$.result(package.scala:107)
    6 frames
  4. Spark
    ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1$$anonfun$apply$mcV$sp$2.apply
    1. org.apache.spark.storage.BlockManagerMaster.removeBroadcast(BlockManagerMaster.scala:137)
    2. org.apache.spark.broadcast.TorrentBroadcast$.unpersist(TorrentBroadcast.scala:227)
    3. org.apache.spark.broadcast.TorrentBroadcastFactory.unbroadcast(TorrentBroadcastFactory.scala:45)
    4. org.apache.spark.broadcast.BroadcastManager.unbroadcast(BroadcastManager.scala:66)
    5. org.apache.spark.ContextCleaner.doCleanupBroadcast(ContextCleaner.scala:185)
    6. org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1$$anonfun$apply$mcV$sp$2.apply(ContextCleaner.scala:147)
    7. org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1$$anonfun$apply$mcV$sp$2.apply(ContextCleaner.scala:138)
    7 frames
  5. Scala
    Option.foreach
    1. scala.Option.foreach(Option.scala:236)
    1 frame
  6. Spark
    ContextCleaner$$anon$3.run
    1. org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:138)
    2. org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply(ContextCleaner.scala:134)
    3. org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply(ContextCleaner.scala:134)
    4. org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1550)
    5. org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:133)
    6. org.apache.spark.ContextCleaner$$anon$3.run(ContextCleaner.scala:65)
    6 frames
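
Reading the frames bottom-up: the ContextCleaner's keepCleaning thread asks the BlockManagerMaster to remove an unused broadcast, and that call blocks in scala.concurrent.Await.result while waiting for the reply. The InterruptedException means the cleaner thread was interrupted while parked in that wait, which typically happens when the SparkContext is being stopped, for example after the driver loses its master as in the linked questions. A minimal sketch of the same blocking pattern, outside Spark and with invented names, assuming Scala 2.11-era concurrency APIs:

    import scala.concurrent.{Await, Promise}
    import scala.concurrent.duration._

    object InterruptedAwaitSketch {
      def main(args: Array[String]): Unit = {
        // A reply that never arrives, standing in for the master's answer to removeBroadcast.
        val neverCompleted = Promise[Unit]().future

        val cleaner = new Thread(new Runnable {
          override def run(): Unit = {
            try {
              // Same shape as the frames above: park the thread in Await.result.
              Await.result(neverCompleted, 2.minutes)
            } catch {
              case _: InterruptedException =>
                println("cleaner thread interrupted while blocked, as in the trace")
            }
          }
        }, "context-cleaner-sketch")

        cleaner.start()
        Thread.sleep(500)
        cleaner.interrupt() // roughly what shutting down the context does to its cleaner thread
        cleaner.join()
      }
    }

In Spark releases from this period the cleaner's blocking behaviour is governed by the spark.cleaner.referenceTracking.blocking setting, but the exception above is normally a symptom of the context shutting down rather than the fault itself, so the earlier connection failure is the thing to fix.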