storage.DiskBlockManager: Shutdown hook called
16/06/07 23:16:02 INFO util.ShutdownHookManager: Shutdown hook called
16/06/07 23:16:02 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-488f385b-c163-40dc-a41f-56c9361a9aea/executor-2ca7dcbd-49b0-4377-ac27-017b5b0c8ea2/spark-70302962-396b-41b2-815f-b5b0dd9b886f
16/06/07 23:16:02 INFO executor.CoarseGrainedExecutorBackend: Got assigned task 1
16/06/07 23:16:02 INFO executor.Executor: Running task 1.0 in stage 0.0 (TID 1)
16/06/07 23:16:02 INFO rdd.HadoopRDD: Input split: hdfs://master:8020/usr/data/miniData.txt:129+129
16/06/07 23:16:02 ERROR executor.Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;

Solutions on the web

  • via aboutyun.com by Unknown author, 11 months ago
split: hdfs://master:8020/usr/data/miniData.txt:129+129
    16/06/07 23:16:02 ERROR executor.Executor: Exception in task 1.0 in stage 0.0 (TID 1)
    java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;
  • Shutdown hook called Exception in Hive Logs
    2015-07-28 16:05:04,543 ERROR [pool-3-thread-5]: metastore.RetryingHMSHandler (RetryingHMSHandler.java:invoke(155)) - MetaException(message:java.lang.NullPointerException)
  • Stack trace

    • storage.DiskBlockManager: Shutdown hook called
      16/06/07 23:16:02 INFO util.ShutdownHookManager: Shutdown hook called
      16/06/07 23:16:02 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-488f385b-c163-40dc-a41f-56c9361a9aea/executor-2ca7dcbd-49b0-4377-ac27-017b5b0c8ea2/spark-70302962-396b-41b2-815f-b5b0dd9b886f
      16/06/07 23:16:02 INFO executor.CoarseGrainedExecutorBackend: Got assigned task 1
      16/06/07 23:16:02 INFO executor.Executor: Running task 1.0 in stage 0.0 (TID 1)
      16/06/07 23:16:02 INFO rdd.HadoopRDD: Input split: hdfs://master:8020/usr/data/miniData.txt:129+129
      16/06/07 23:16:02 ERROR executor.Executor: Exception in task 1.0 in stage 0.0 (TID 1)
      java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;
          at com.ghost.scala.DataTranslate$.AnalyData(DataTranslate.scala:90)
          at com.ghost.scala.DataTranslate$$anonfun$1.apply(DataTranslate.scala:29)
          at com.ghost.scala.DataTranslate$$anonfun$1.apply(DataTranslate.scala:29)
          at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
          at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
          at scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
          at scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
          at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
          at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
          at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:203)
          at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:73)
          at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
          at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
          at org.apache.spark.scheduler.Task.run(Task.scala:88)
          at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
          at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
          at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
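    A `NoSuchMethodError` on `scala.runtime.ObjectRef.create` is a classic Scala binary-version mismatch: that factory method was only added to the `scala-library` runtime in Scala 2.11, so a job compiled against Scala 2.11 will fail exactly this way when it runs on a Spark distribution built with Scala 2.10 (the default for Spark 1.x prebuilt binaries). A minimal `build.sbt` sketch of the usual fix is below; the Spark and Scala version numbers are assumptions and must be replaced with whatever your cluster actually runs.

    ```scala
    // build.sbt -- hypothetical project settings, not taken from the report above.
    // The key point: the Scala version used to compile the job must match the
    // Scala binary version Spark itself was built with, or runtime methods such
    // as scala.runtime.ObjectRef.create will be missing from the classpath.

    scalaVersion := "2.10.5"  // assumed: a Spark 1.x cluster built against Scala 2.10

    // %% appends the Scala binary version (_2.10) to the artifact name, so the
    // fetched spark-core jar matches scalaVersion automatically.
    // "provided" keeps the cluster's own Spark jars authoritative at runtime.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1" % "provided"
    ```

    Equivalently, if the application must stay on Scala 2.11, Spark itself has to be rebuilt for 2.11 (Spark 1.x supports this via its `-Dscala-2.11` build profile); mixing the two binary versions in either direction produces this error.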
