Searched on Google with the first line of a Java stack trace?

We can recommend more relevant solutions and speed up debugging when you paste your entire stack trace with the exception message.

Recommended solutions based on your search

Solutions on the web

via Stack Overflow by jsosnowski, 1 year ago
com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
via Stack Overflow by Dhinesh, 9 months ago
com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
via Stack Overflow by Raghu K Nair, 1 year ago
com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
via Stack Overflow by theMadKing, 1 year ago
com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
via GitHub by boeboe, 1 year ago
com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
via GitHub by gvdm90, 10 months ago
com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
	at org.elasticsearch.threadpool.ThreadPool.<init>(ThreadPool.java:190)
	at org.elasticsearch.client.transport.TransportClient$Builder.build(TransportClient.java:131)
	at com.abc.App.main(App.java:44)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
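This NoSuchMethodError usually means an older Guava (earlier than 18.0, which introduced MoreExecutors.directExecutor()) sits ahead of the version the Elasticsearch transport client was compiled against; when the application runs under spark-submit, Spark's own bundled Guava commonly wins the classpath race. Below is a minimal diagnostic sketch, not a fix, assuming you can add a small check near the TransportClient setup (the class name GuavaCheck is hypothetical); it prints which jar MoreExecutors is actually loaded from and whether directExecutor() exists there.

import com.google.common.util.concurrent.MoreExecutors;

public class GuavaCheck {
    public static void main(String[] args) throws Exception {
        // Which jar is Guava's MoreExecutors actually loaded from at runtime?
        System.out.println(
            MoreExecutors.class.getProtectionDomain().getCodeSource().getLocation());
        try {
            // directExecutor() exists only in Guava 18.0+; if this lookup fails,
            // an older Guava is shadowing the one the transport client needs.
            MoreExecutors.class.getMethod("directExecutor");
            System.out.println("directExecutor() is available");
        } catch (NoSuchMethodException e) {
            System.out.println("directExecutor() is missing: an older Guava is on the classpath");
        }
    }
}

If the older copy turns out to come from Spark's distribution, common workarounds are relocating Guava inside your application jar with the maven-shade-plugin or setting spark.driver.userClassPathFirst=true (and spark.executor.userClassPathFirst=true) so your bundled Guava takes precedence.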