scheduler.TaskSetManager: Lost task 0.0 in stage 1.0 (TID 16, hdp115-yarn.prod.ne.lan): java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;

GitHub | arturekbb | 7 months ago

    Guava version conflict when using with Apache Spark

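The `NoSuchMethodError` here is the classic symptom of two Guava versions on the executor classpath: `MoreExecutors.directExecutor()` only exists in Guava 18+, which the Elasticsearch transport client needs, while an older Guava bundled by the cluster wins class loading. One quick way to confirm which jar is actually supplying Guava is to ask the classloader where it finds the class file. This is a minimal diagnostic sketch; `WhichGuava` and `classpathLocation` are hypothetical names, not part of any library:

```scala
// Diagnostic sketch: ask the classloader where a class file comes from.
// Run this on the executor (e.g. inside rdd.foreachPartition) to see
// which jar supplies Guava's MoreExecutors, and therefore whether an
// older Guava is shadowing the one Elasticsearch requires.
object WhichGuava {
  // Returns the URL of the resource if the classloader can see it,
  // or None if it is absent from this classpath.
  def classpathLocation(resource: String): Option[java.net.URL] =
    Option(getClass.getClassLoader.getResource(resource))

  def main(args: Array[String]): Unit = {
    // The class whose method is missing in the stack trace above:
    val guava = classpathLocation(
      "com/google/common/util/concurrent/MoreExecutors.class")
    println(guava.getOrElse("Guava not on this classpath"))
  }
}
```

If the printed URL points at a Spark or Hadoop assembly jar rather than your own Guava dependency, the conflict is confirmed.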

    Root Cause Analysis

    1. scheduler.TaskSetManager

      Lost task 0.0 in stage 1.0 (TID 16, hdp115-yarn.prod.ne.lan): java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;

      at org.elasticsearch.threadpool.ThreadPool.<clinit>()
    2. ElasticSearch
      TransportClient$Builder.build
      1. org.elasticsearch.threadpool.ThreadPool.<clinit>(ThreadPool.java:190)
      2. org.elasticsearch.client.transport.TransportClient$Builder.build(TransportClient.java:131)
      2 frames
    3. com.sksamuel.elastic4s
      ElasticClient$.remote
      1. com.sksamuel.elastic4s.ElasticClient$.transport(ElasticClient.scala:103)
      2. com.sksamuel.elastic4s.ElasticClient$.remote(ElasticClient.scala:111)
      2 frames
    4. com.hgintelligence
      MyJob$$anonfun$doMyJobOnRDD$3.apply
      1. com.hgintelligence.EsFetcher.<init>(EsFetcher.scala:18)
      2. com.hgintelligence.MyJob$$anonfun$doMyJobOnRDD$3.apply(MyJob.scala:47)
      3. com.hgintelligence.MyJob$$anonfun$doMyJobOnRDD$3.apply(MyJob.scala:46)
      3 frames
    5. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:878)
      2. org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:878)
      3. org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1767)
      4. org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1767)
      5. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
      6. org.apache.spark.scheduler.Task.run(Task.scala:70)
      7. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
      7 frames
    6. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
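The usual fix is to relocate (shade) Guava inside the application's fat jar so the job's newer Guava cannot be shadowed by the older copy Spark or Hadoop puts on the executor classpath. A build.sbt sketch, assuming the sbt-assembly plugin is in use (the `shadeguava` prefix is an arbitrary choice):

```scala
// build.sbt sketch (assumes sbt-assembly): rewrite Guava's package name
// inside the assembled jar so the job always resolves its own Guava 18+,
// regardless of what older Guava the cluster classpath provides.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "shadeguava.com.google.common.@1").inAll
)
```

An alternative, if rebuilding the jar is not an option, is to submit the job with `spark.executor.userClassPathFirst=true` (and `spark.driver.userClassPathFirst=true`), which makes the executor prefer the application's jars over Spark's own; note this setting is marked experimental and can surface other conflicts.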