java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@4e3e11b9 rejected from java.util.concurrent.ScheduledThreadPoolExecutor@2c829dbc[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]

Talend Open Integration Solution | lei ju | 10 months ago
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1.

    Talend Open Integration Solution | 10 months ago | lei ju
    java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@4e3e11b9 rejected from java.util.concurrent.ScheduledThreadPoolExecutor@2c829dbc[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
  2.

    This was found in the stress test on Bolt 80. I'm unclear as to whether this is expected behavior. Li has suggested adding --master=yarn-client to the spark-submit invocation as a possible fix.
{code}
[alex.leblang@e1102 PersonalScripts]$ spark-submit --class com.cloudera.recordservice.examples.spark.RecordCount --properties-file /etc/recordservice/conf/spark.conf /home/alex.leblang/recordservice-client-0.3.0-cdh5.7.x/lib/recordservice-examples-spark-0.3.0-cdh5.7.x.jar "select count(*) from tpcds_50_text.store_sales"
16/04/13 18:20:00 INFO spark.SparkContext: Running Spark version 1.6.0
16/04/13 18:20:13 INFO spark.SecurityManager: Changing view acls to: alex.leblang
16/04/13 18:20:13 INFO spark.SecurityManager: Changing modify acls to: alex.leblang
16/04/13 18:20:13 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(alex.leblang); users with modify permissions: Set(alex.leblang)
16/04/13 18:20:16 INFO util.Utils: Successfully started service 'sparkDriver' on port 55001.
16/04/13 18:20:20 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/04/13 18:20:21 INFO Remoting: Starting remoting
16/04/13 18:20:22 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@10.20.126.102:44466]
16/04/13 18:20:22 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriverActorSystem@10.20.126.102:44466]
16/04/13 18:20:22 INFO util.Utils: Successfully started service 'sparkDriverActorSystem' on port 44466.
16/04/13 18:20:22 INFO spark.SparkEnv: Registering MapOutputTracker
16/04/13 18:20:22 INFO spark.SparkEnv: Registering BlockManagerMaster
16/04/13 18:20:22 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-de8e5436-587d-4064-9ded-3b25cae306eb
16/04/13 18:20:22 INFO storage.MemoryStore: MemoryStore started with capacity 530.3 MB
16/04/13 18:20:23 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/04/13 18:20:25 WARN thread.QueuedThreadPool: 1 threads could not be stopped
16/04/13 18:20:26 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
16/04/13 18:20:26 WARN util.Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
16/04/13 18:20:26 WARN util.Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
16/04/13 18:20:27 WARN util.Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
16/04/13 18:20:27 WARN util.Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
16/04/13 18:20:27 WARN util.Utils: Service 'SparkUI' could not bind on port 4045. Attempting port 4046.
16/04/13 18:20:27 WARN thread.QueuedThreadPool: 3 threads could not be stopped
16/04/13 18:20:27 WARN util.Utils: Service 'SparkUI' could not bind on port 4046. Attempting port 4047.
16/04/13 18:20:27 WARN util.Utils: Service 'SparkUI' could not bind on port 4047. Attempting port 4048.
16/04/13 18:20:28 WARN util.Utils: Service 'SparkUI' could not bind on port 4048. Attempting port 4049.
16/04/13 18:20:28 WARN util.Utils: Service 'SparkUI' could not bind on port 4049. Attempting port 4050.
16/04/13 18:20:28 WARN util.Utils: Service 'SparkUI' could not bind on port 4050. Attempting port 4051.
16/04/13 18:20:28 WARN util.Utils: Service 'SparkUI' could not bind on port 4051. Attempting port 4052.
16/04/13 18:20:29 WARN util.Utils: Service 'SparkUI' could not bind on port 4052. Attempting port 4053.
16/04/13 18:20:29 WARN util.Utils: Service 'SparkUI' could not bind on port 4053. Attempting port 4054.
16/04/13 18:20:29 WARN util.Utils: Service 'SparkUI' could not bind on port 4054. Attempting port 4055.
16/04/13 18:20:29 WARN util.Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.
16/04/13 18:20:30 ERROR ui.SparkUI: Failed to bind SparkUI
java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries!
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
	at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
	at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
	at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
	at org.spark-project.jetty.server.Server.doStart(Server.java:293)
	at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:283)
	at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:293)
	at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:293)
	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1989)
	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1980)
	at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:293)
	at org.apache.spark.ui.WebUI.bind(WebUI.scala:137)
	at org.apache.spark.SparkContext$$anonfun$14.apply(SparkContext.scala:492)
	at org.apache.spark.SparkContext$$anonfun$14.apply(SparkContext.scala:492)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:492)
	at com.cloudera.recordservice.examples.spark.RecordCount$.main(RecordCount.scala:51)
	at com.cloudera.recordservice.examples.spark.RecordCount.main(RecordCount.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/04/13 18:20:30 INFO storage.DiskBlockManager: Shutdown hook called
16/04/13 18:20:30 INFO util.ShutdownHookManager: Shutdown hook called
16/04/13 18:20:30 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-df684e69-7a81-4a0f-917a-6f214dffb79c
16/04/13 18:20:30 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-df684e69-7a81-4a0f-917a-6f214dffb79c/userFiles-d5524d6b-79c4-44b4-9438-c10630711404
[alex.leblang@e1102 PersonalScripts]$
{code}

    Cloudera Open Source | 11 months ago | Alex Leblang
    java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5f9c03e7 rejected from java.util.concurrent.ScheduledThreadPoolExecutor@7bc463ba[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
  5. [0.9.0] Too fast leadership changes in overlord causes multiple leaders and service failure

    GitHub | 10 months ago | drcrallen
    java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
  6. solr - java.util.concurrent.RejectedExecutionException

    Stack Overflow | 8 months ago | Aaron Rumery
    org.apache.solr.common.SolrException: Exception writing document id 305380266306546_343677655810140 to the index; possible analysis error.
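Separately from the rejected task itself, the Cloudera log above shows the driver exhausting its port-retry budget ("Service 'SparkUI' failed after 16 retries!") while probing ports 4040-4055, which points to many concurrent drivers on one host during the stress test. Two commonly used mitigations, sketched here with illustrative values (the jar name is abbreviated, not the real path from the report), are to widen the retry budget or to disable the UI for headless batch runs:

```
# Move the UI off the crowded default range and widen the port search
spark-submit \
  --conf spark.ui.port=4060 \
  --conf spark.port.maxRetries=32 \
  --class com.cloudera.recordservice.examples.spark.RecordCount \
  recordservice-examples-spark.jar

# Or skip the UI entirely for batch jobs that never need it
spark-submit --conf spark.ui.enabled=false ...
```

`spark.port.maxRetries` defaults to 16, which matches the "failed after 16 retries" message in the trace.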
Root Cause Analysis

  1. java.util.concurrent.RejectedExecutionException

    Task java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@4e3e11b9 rejected from java.util.concurrent.ScheduledThreadPoolExecutor@2c829dbc[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]

    at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution()
  2. Java RT
    ScheduledThreadPoolExecutor.schedule
    1. java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2047)
    2. java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:823)
    3. java.util.concurrent.ScheduledThreadPoolExecutor.delayedExecute(ScheduledThreadPoolExecutor.java:326)
    4. java.util.concurrent.ScheduledThreadPoolExecutor.schedule(ScheduledThreadPoolExecutor.java:533)
    4 frames
  3. org.apache.spark
    RpcEndpointRef.askWithRetry
    1. org.apache.spark.rpc.netty.NettyRpcEnv.ask(NettyRpcEnv.scala:238)
    2. org.apache.spark.rpc.netty.NettyRpcEndpointRef.ask(NettyRpcEnv.scala:509)
    3. org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:100)
    4. org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:77)
    4 frames
  4. Spark Project Streaming
    JavaStreamingContext.stop
    1. org.apache.spark.streaming.scheduler.ReceiverTracker.stop(ReceiverTracker.scala:170)
    2. org.apache.spark.streaming.scheduler.JobScheduler.stop(JobScheduler.scala:93)
    3. org.apache.spark.streaming.StreamingContext.stop(StreamingContext.scala:709)
    4. org.apache.spark.streaming.StreamingContext.stop(StreamingContext.scala:682)
    5. org.apache.spark.streaming.api.java.JavaStreamingContext.stop(JavaStreamingContext.scala:662)
    5 frames
  5. bigdata.spark_0_1
    spark.main
    1. bigdata.spark_0_1.spark.runJobInTOS(spark.java:898)
    2. bigdata.spark_0_1.spark.main(spark.java:773)
    2 frames
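The breakdown above shows `ReceiverTracker.stop()` asking Spark's RPC layer to schedule work on a `ScheduledThreadPoolExecutor` that has already reached the Terminated state, where the default `AbortPolicy` rejects every new task. The failure mode is easy to reproduce in isolation; the sketch below (class and variable names are mine, not from any of the traces) triggers the same `RejectedExecutionException` and shows an `isShutdown()` guard that avoids it:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class RejectedDemo {
    public static void main(String[] args) {
        ScheduledExecutorService pool = Executors.newScheduledThreadPool(1);
        pool.shutdown(); // pool drains and moves toward the Terminated state

        try {
            // The default AbortPolicy rejects any task submitted after shutdown,
            // throwing the same RejectedExecutionException seen in the trace.
            pool.schedule(() -> System.out.println("never runs"), 1, TimeUnit.MILLISECONDS);
        } catch (RejectedExecutionException e) {
            System.out.println("schedule rejected after shutdown");
        }

        // Defensive variant: check the executor's state before scheduling.
        if (!pool.isShutdown()) {
            pool.schedule(() -> {}, 1, TimeUnit.MILLISECONDS);
        } else {
            System.out.println("skipped scheduling on shut-down pool");
        }
    }
}
```

In the Spark Streaming case this corresponds to calling `JavaStreamingContext.stop()` while the RPC layer still has an `askWithRetry` in flight; stopping the context only once, after receivers have quiesced, avoids the race.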