java.lang.RuntimeException: while asking for ForwardTaskOp(2016-09-27T09:02:01.392Z,instance [mypod2.instance-fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6],MesosUpdate(Instance(instance [mypod2.instance-fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6],AgentInfo(wrk.fritz.box,Some(9a01f2b3-0086-4386-8042-51ba3f811281-S0),List()),InstanceState(Running,2016-09-27T09:01:31.249Z,2016-09-27T08:45:52.570Z,Some(true)),Map(task [mypod2.instance-fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6.sleep1] -> LaunchedEphemeral(task [mypod2.instance-fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6.sleep1],AgentInfo(wrk.fritz.box,Some(9a01f2b3-0086-4386-8042-51ba3f811281-S0),List()),2016-09-27T08:45:52.570Z,Status(2016-09-27T09:01:31.076Z,Some(2016-09-27T09:01:31.241Z),Some(task_id { value: "mypod2.instance-fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6.sleep1" } state: TASK_RUNNING slave_id { value: "9a01f2b3-0086-4386-8042-51ba3f811281-S0" } timestamp: 1.4749668912414E9 executor_id { value: "instance-mypod2.fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6" } healthy: true source: SOURCE_EXECUTOR uuid: "\2373\215KW\251O\345\214pp\200w!\203\343" container_status { network_infos { ip_addresses { ip_address: "192.168.178.41" } } executor_pid: 66208 } ),Running),List()), task [mypod2.instance-fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6.sleep2] -> LaunchedEphemeral(task [mypod2.instance-fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6.sleep2],AgentInfo(wrk.fritz.box,Some(9a01f2b3-0086-4386-8042-51ba3f811281-S0),List()),2016-09-27T08:45:52.570Z,Status(2016-09-27T09:01:31.076Z,Some(2016-09-27T09:01:31.241Z),Some(task_id { value: "mypod2.instance-fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6.sleep2" } state: TASK_RUNNING slave_id { value: "9a01f2b3-0086-4386-8042-51ba3f811281-S0" } timestamp: 1.474966891241516E9 executor_id { value: "instance-mypod2.fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6" } healthy: true source: SOURCE_EXECUTOR uuid: "\200\365\230vj\231N\376\260\331\360W\2701\212\"" container_status { network_infos { ip_addresses { ip_address: "192.168.178.41" } } executor_pid: 
66208 } ),Running),List()))),Finished,task_id { value: "mypod2.instance-fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6.sleep2" } state: TASK_FINISHED message: "Command exited with status 0" slave_id { value: "9a01f2b3-0086-4386-8042-51ba3f811281-S0" } timestamp: 1.474966899411525E9 executor_id { value: "instance-mypod2.fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6" } healthy: true source: SOURCE_EXECUTOR uuid: "`\226\n\255\f\313K\244\275N\365\362\253+\031\211" container_status { network_infos { ip_addresses { ip_address: "192.168.178.41" } } executor_pid: 66208 } ,2016-09-27T09:01:51.360Z)) on runSpec [/mypod2] and instance [mypod2.instance-fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6]

GitHub | aquamatthias | 7 months ago
[Pods] Exception during ForwardTaskOp


Root Cause Analysis

java.lang.IllegalStateException: instance [mypod2.instance-fb7c1a2a-8490-11e6-862b-2a2bf4e4eca6] of app [/mypod2] does not exist
	at mesosphere.marathon.core.task.tracker.impl.InstanceOpProcessorImpl$InstanceUpdateOpResolver$$anonfun$updateExistingInstance$1.apply(InstanceOpProcessorImpl.scala:65)
	at mesosphere.marathon.core.task.tracker.impl.InstanceOpProcessorImpl$InstanceUpdateOpResolver$$anonfun$updateExistingInstance$1.apply(InstanceOpProcessorImpl.scala:59)
	at scala.util.Success$$anonfun$map$1.apply(Try.scala:237)
	at scala.util.Try$.apply(Try.scala:192)
	at scala.util.Success.map(Try.scala:237)
	at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:237)
	at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:237)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:91)
	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
	at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:90)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:409)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
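The root cause above is an IllegalStateException raised inside updateExistingInstance: a MesosUpdate (here, TASK_FINISHED for the pod's sleep2 task) is processed after the instance has already been removed from the instance tracker, so the lookup that the update path assumes will succeed comes back empty. The following is a minimal, hypothetical Scala sketch of that failing pattern; the names and types are illustrative stand-ins, not Marathon's actual internals:

```scala
// Hypothetical sketch of the race behind the stack trace above.
// InstanceTrackerSketch is NOT Marathon code; it only models the
// "update an instance that was already expunged" failure mode.
object InstanceTrackerSketch {
  final case class Instance(id: String, state: String)

  // In-memory stand-in for the tracker's instance map.
  private var instances: Map[String, Instance] = Map.empty

  def add(instance: Instance): Unit =
    instances += instance.id -> instance

  // Called when all tasks of the instance reached a terminal state.
  def expunge(id: String): Unit =
    instances -= id

  // Mirrors the failing check: an update for an instance that no
  // longer exists throws IllegalStateException, which the caller
  // wraps in the "while asking for ForwardTaskOp" RuntimeException.
  def updateExistingInstance(id: String, newState: String): Instance =
    instances.get(id) match {
      case Some(existing) =>
        val updated = existing.copy(state = newState)
        instances += id -> updated
        updated
      case None =>
        throw new IllegalStateException(s"instance [$id] does not exist")
    }

  def main(args: Array[String]): Unit = {
    add(Instance("mypod2.instance-x", "Running"))
    // A terminal status for one task expunges the whole instance ...
    expunge("mypod2.instance-x")
    // ... and a second, still-queued status update then fails.
    try updateExistingInstance("mypod2.instance-x", "Finished")
    catch {
      case e: IllegalStateException => println(e.getMessage)
    }
  }
}
```

Under this reading, the fix direction would be to make the update path tolerant of an already-expunged instance (treat the late terminal update as a no-op) rather than asserting existence; whether that matches the actual resolution in the Marathon issue is not shown in this report.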