storage.DiskBlockManager: Shutdown hook called
16/06/07 23:16:02 INFO util.ShutdownHookManager: Shutdown hook called
16/06/07 23:16:02 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-488f385b-c163-40dc-a41f-56c9361a9aea/executor-2ca7dcbd-49b0-4377-ac27-017b5b0c8ea2/spark-70302962-396b-41b2-815f-b5b0dd9b886f
16/06/07 23:16:02 INFO executor.CoarseGrainedExecutorBackend: Got assigned task 1
16/06/07 23:16:02 INFO executor.Executor: Running task 1.0 in stage 0.0 (TID 1)
16/06/07 23:16:02 INFO rdd.HadoopRDD: Input split: hdfs://master:8020/usr/data/miniData.txt:129+129
16/06/07 23:16:02 ERROR executor.Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;

aboutyun.com | 4 months ago

    Problem while running Spark - Hadoop2|YARN - aboutyun.com forum


    Root Cause Analysis

    1. storage.DiskBlockManager

      16/06/07 23:16:02 ERROR executor.Executor: Exception in task 1.0 in stage 0.0 (TID 1)
      java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;

      at com.ghost.scala.DataTranslate$.AnalyData()
    2. com.ghost.scala
      DataTranslate$$anonfun$1.apply
      1. com.ghost.scala.DataTranslate$.AnalyData(DataTranslate.scala:90)
      2. com.ghost.scala.DataTranslate$$anonfun$1.apply(DataTranslate.scala:29)
      3. com.ghost.scala.DataTranslate$$anonfun$1.apply(DataTranslate.scala:29)
      3 frames
    3. Scala
      Iterator$$anon$11.hasNext
      1. scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
      2. scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
      3. scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
      4. scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
      5. scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
      6. scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
      6 frames
    4. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:203)
      2. org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:73)
      3. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
      4. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
      5. org.apache.spark.scheduler.Task.run(Task.scala:88)
      6. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
      6 frames
    5. Java RT
      ThreadPoolExecutor$Worker.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      2 frames
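The `NoSuchMethodError` above is the classic symptom of a Scala binary-version mismatch: `scala.runtime.ObjectRef.create` was added in Scala 2.11, so a jar compiled against Scala 2.11 fails with this error when it runs on a Scala 2.10 runtime (the default for Spark 1.x builds). The compiler wraps any reference-type `var` that is captured by a closure in an `ObjectRef`, which is why the failure surfaces inside the user's `DataTranslate.AnalyData`. A minimal sketch of the pattern that triggers this bytecode (the method name and parsing logic are hypothetical, not taken from the original code):

```scala
// Hypothetical illustration: a reference-type `var` captured by a closure
// is desugared by scalac into a scala.runtime.ObjectRef wrapper. Under
// Scala 2.11 the generated bytecode calls ObjectRef.create(...), a factory
// method that does not exist in the Scala 2.10 runtime shipped with Spark 1.x.
object CapturedVarExample {
  def analyData(line: String): String = {
    var result = ""                     // captured var -> ObjectRef under the hood
    line.split(",").foreach { field =>  // closure mutates the captured var
      result += field.trim
    }
    result
  }

  def main(args: Array[String]): Unit =
    println(analyData("a, b, c"))       // prints "abc"
}
```

The usual fix is to set the project's `scalaVersion` to match the Scala version the cluster's Spark was built against (2.10.x for stock Spark 1.x, as in this trace) and rebuild the application jar, marking the Spark dependencies as `"provided"` so the jar does not ship its own conflicting Scala or Spark classes.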