java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]

Stack Overflow | solarenqu | 5 months ago
  1. Spark 1.6.2 fails on launch, Windows 7 32-bit
     Stack Overflow | 5 months ago | solarenqu
     java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
  2. Apache Spark User List - Futures timed out after 10000 milliseconds
     nabble.com | 1 year ago
     java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
  3. RE: SparkSQL 1.2.0 sources API error
     spark-user | 2 years ago | Cheng, Hao
     java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
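
Every report above is the same low-level failure: a blocking scala.concurrent.Await.result call whose future does not complete within its limit. A minimal, self-contained sketch (plain Scala futures, no Spark involved; the object and value names are illustrative) that produces this exact message:

    import scala.concurrent.{Await, Future, Promise}
    import scala.concurrent.duration._

    // Awaiting a future that is never completed, with a 10000 ms limit, throws:
    //   java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
    object AwaitTimeoutDemo {
      def main(args: Array[String]): Unit = {
        val neverCompleted: Future[Unit] = Promise[Unit]().future
        Await.result(neverCompleted, 10000.milliseconds) // blocks ~10 s, then throws
      }
    }

In the root cause analysis further down, the future that fails to complete in time is Akka remoting's startup inside SparkContext initialization.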

Registered users who reported this exception:

  1. Nikolay Rybak: 1 time, last 1 month ago
  2. poroszd: 12 times, last 2 months ago
  3. tyson925: 6 times, last 2 months ago
  4. kid: 1 time, last 4 months ago
  5. Handemelindo: 1 time, last 4 months ago

Plus 4 more registered users and 28 unregistered visitors.
Root Cause Analysis

  1. java.util.concurrent.TimeoutException

    Futures timed out after [10000 milliseconds]

    at scala.concurrent.impl.Promise$DefaultPromise.ready()
  2. Scala
    Await$.result
    1. scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
    2. scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
    3. scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
    4. scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
    5. scala.concurrent.Await$.result(package.scala:107)
    5 frames
  3. Akka Remote
    RemoteActorRefProvider.init
    1. akka.remote.Remoting.start(Remoting.scala:179)
    2. akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
    2 frames
  4. Akka Actor
    ActorSystem$.apply
    1. akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:620)
    2. akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:617)
    3. akka.actor.ActorSystemImpl._start(ActorSystem.scala:617)
    4. akka.actor.ActorSystemImpl.start(ActorSystem.scala:634)
    5. akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
    6. akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
    6 frames
  5. Spark
    Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp
    1. org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
    2. org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
    3. org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52)
    4. org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2024)
    4 frames
  6. Scala
    Range.foreach$mVc$sp
    1. scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    1 frame
  7. Spark
    SparkContext.<init>
    1. org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2015)
    2. org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55)
    3. org.apache.spark.SparkEnv$.create(SparkEnv.scala:266)
    4. org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
    5. org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288)
    6. org.apache.spark.SparkContext.<init>(SparkContext.scala:457)
    6 frames
  8. Spark REPL
    SparkILoop.createSparkContext
    1. org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
    1 frame
  9. $line3
    $eval.$print
    1. $line3.$read$$iwC$$iwC.<init>(<console>:15)
    2. $line3.$read$$iwC.<init>(<console>:24)
    3. $line3.$read.<init>(<console>:26)
    4. $line3.$read$.<init>(<console>:30)
    5. $line3.$read$.<clinit>(<console>)
    6. $line3.$eval$.<init>(<console>:7)
    7. $line3.$eval$.<clinit>(<console>)
    8. $line3.$eval.$print(<console>)
    8 frames
  10. Java RT
    Method.invoke
    1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    4. java.lang.reflect.Method.invoke(Method.java:606)
    4 frames
  11. Spark REPL
    SparkILoop.initializeSpark
    1. org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    2. org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
    3. org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    4. org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    5. org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    6. org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    7. org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    8. org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    9. org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
    10. org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
    11. org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    12. org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
    13. org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    13 frames
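
The frames above show where the blocked Await.result comes from: SparkContext.<init> calls Utils.startServiceOnPort, which calls AkkaUtils.createActorSystem, and Akka's Remoting.start then waits up to 10 seconds for the remote transport to come up. A frequently reported trigger on desktop setups (including the Windows 7 32-bit report above) is a driver hostname that does not resolve promptly, so the actor system never finishes binding before the timeout. The sketch below is a hedged workaround, assuming Spark 1.x in local mode; spark.driver.host is a standard Spark property, but whether forcing it to 127.0.0.1 resolves any particular environment's timeout is not confirmed by the reports above:

    import org.apache.spark.{SparkConf, SparkContext}

    // Illustrative only: bind the driver to an explicit, resolvable address so
    // actor-system startup does not stall on hostname resolution.
    object DriverStartupSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("akka-startup-timeout-sketch")
          .setMaster("local[*]")                 // no cluster manager involved
          .set("spark.driver.host", "127.0.0.1") // explicit driver address

        val sc = new SparkContext(conf)
        try {
          println(sc.parallelize(1 to 100).count()) // trivial job to confirm the context started
        } finally {
          sc.stop()
        }
      }
    }

Exporting SPARK_LOCAL_IP=127.0.0.1 before launching spark-shell is a commonly suggested equivalent that requires no code change.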