util.Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
2015-11-26 09:39:41,441 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569)) - java.lang.AbstractMethodError
2015-11-26 09:39:41,441 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569)) - at org.apache.spark.scheduler.SparkListenerBus$class.onPostEvent(SparkListenerBus.scala:62)

Apache's JIRA Issue Tracker | Amithsha | 2 years ago

    Hi all,

    I recently configured Spark 1.2.0; my environment is Hadoop 2.6.0 and Hive 1.1.0. When I try Hive on Spark, executing an INSERT INTO fails with the following error:

    Query ID = hadoop2_20150313162828_8764adad-a8e4-49da-9ef5-35e4ebd6bc63
    Total jobs = 1
    Launching Job 1 out of 1
    In order to change the average load for a reducer (in bytes): set hive.exec.reducers.bytes.per.reducer=<number>
    In order to limit the maximum number of reducers: set hive.exec.reducers.max=<number>
    In order to set a constant number of reducers: set mapreduce.job.reduces=<number>
    Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
    FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask

    I have added the spark-assembly jar to Hive's lib directory, and also added it in the Hive console with the add jar command, followed by these settings:

    set spark.home=/opt/spark-1.2.1/;
    add jar /opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar;
    set hive.execution.engine=spark;
    set spark.master=spark://xxxxxxx:7077;
    set spark.eventLog.enabled=true;
    set spark.executor.memory=512m;
    set spark.serializer=org.apache.spark.serializer.KryoSerializer;

    Can anyone suggest a fix?

    Thanks & Regards,
    Amithsha
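    For reference, the post's session settings consolidated into one annotated Hive script. This is a sketch built only from the values quoted above; nothing here is a confirmed fix. Two version mismatches are visible in the post itself and are worth ruling out, since java.lang.AbstractMethodError is the JVM's signal that a class was compiled against one version of an interface but run against an incompatible one: the assembly jar name says it was built for Hadoop 2.4.0 while the reported cluster runs Hadoop 2.6.0, and spark.home points at Spark 1.2.1 while the post says Spark 1.2.0 was configured.

    ```sql
    -- Hive CLI session sketch; all values are copied verbatim from the post.
    -- NOTE: spark.home is 1.2.1 but the post reports configuring Spark 1.2.0.
    set spark.home=/opt/spark-1.2.1/;
    -- NOTE: this assembly is built for Hadoop 2.4.0; the cluster is Hadoop 2.6.0.
    add jar /opt/spark-1.2.1/assembly/target/scala-2.10/spark-assembly-1.2.1-hadoop2.4.0.jar;
    set hive.execution.engine=spark;
    -- The master host placeholder (xxxxxxx) is left as in the original post.
    set spark.master=spark://xxxxxxx:7077;
    set spark.eventLog.enabled=true;
    set spark.executor.memory=512m;
    set spark.serializer=org.apache.spark.serializer.KryoSerializer;
    ```

    Aligning all three components (Hive's compiled-against Spark version, the spark-assembly jar, and the running cluster) on matching versions is the usual first step for this class of error.
    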


    Root Cause Analysis

    1. util.Utils

      uncaught error in thread SparkListenerBus, stopping SparkContext
      2015-11-26 09:39:41,441 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569)) - java.lang.AbstractMethodError
      2015-11-26 09:39:41,441 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(569)) - at org.apache.spark.scheduler.SparkListenerBus$class.onPostEvent(SparkListenerBus.scala:62)

      at org.apache.spark.scheduler.LiveListenerBus.onPostEvent()
    2. Spark, in AsynchronousListenerBus$$anon$1.run (7 frames):

      at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
      at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
      at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:56)
      at org.apache.spark.util.AsynchronousListenerBus.postToAll(AsynchronousListenerBus.scala:37)
      at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:79)
      at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1136)
      at org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:63)