scheduler.TaskSetManager: Lost task 1.0 in stage 5.0 (TID 11, rz-data-hdp-dn2113.rz.***.com): java.lang.UnsatisfiedLinkError: /data8/hadoop/yarn/nm-local-dir/usercache/hadoop-pay-dev/appcache/application_1468926081807_765036/container_1468926081807_765036_01_000003/tmp/libxgboost4j4481460359283600466.so: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /data8/hadoop/yarn/nm-local-dir/usercache/hadoop-pay-dev/appcache/application_1468926081807_765036/container_1468926081807_765036_01_000003/tmp/libxgboost4j4481460359283600466.so)

GitHub | yxzf | 4 months ago
    Similar exceptions:

  1. [jvm-packages] Running xgboost on the yarn-cluster with the error [/lib64/libc.so.6: version `GLIBC_2.14']

    GitHub | 4 months ago | yxzf
    scheduler.TaskSetManager: Lost task 1.0 in stage 5.0 (TID 11, rz-data-hdp-dn2113.rz.***.com): java.lang.UnsatisfiedLinkError: /data8/hadoop/yarn/nm-local-dir/usercache/hadoop-pay-dev/appcache/application_1468926081807_765036/container_1468926081807_765036_01_000003/tmp/libxgboost4j4481460359283600466.so: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /data8/hadoop/yarn/nm-local-dir/usercache/hadoop-pay-dev/appcache/application_1468926081807_765036/container_1468926081807_765036_01_000003/tmp/libxgboost4j4481460359283600466.so)
  2. [jvm-packages] Running xgboost on the yarn-cluster with the error [/lib64/libc.so.6: version `GLIBC_2.14']

    GitHub | 4 months ago | yxzf
    scheduler.TaskSetManager: Lost task 0.0 in stage 5.0 (TID 10, rz-data-hdp-dn2113.rz.***.com): java.lang.NoClassDefFoundError: Could not initialize class ml.dmlc.xgboost4j.java.Rabit
  3. Py-Spark Unknown OpCode Error

    Stack Overflow | 2 years ago
    scheduler.TaskSetManager: Lost task 15.0 in stage 12.0 (TID 28, ip-2): org.apache.spark.api.python.PythonException: Traceback (most recent call last): File "/home/hadoop/spark/python/pyspark/worker.py", line 101, in main process() File "/home/hadoop/spark/python/pyspark/worker.py", line 96, in process serializer.dump_stream(func(split_index, iterator), outfile) File "/home/hadoop/spark/python/pyspark/rdd.py", line 2252, in pipeline_func return func(split, prev_func(split, iterator)) File "/home/hadoop/spark/python/pyspark/rdd.py", line 2252, in pipeline_func return func(split, prev_func(split, iterator)) File "/home/hadoop/spark/python/pyspark/rdd.py", line 282, in func return f(iterator) File "/home/hadoop/spark/python/pyspark/rdd.py", line 1704, in combineLocally if spill else InMemoryMerger(agg) SystemError: unknown opcode
  4. Guava version conflict when using with Apache Spark

    GitHub | 5 months ago | arturekbb
    scheduler.TaskSetManager: Lost task 1.0 in stage 1.0 (TID 17, hdp115-yarn.prod.ne.lan): java.lang.NoClassDefFoundError: Could not initialize class org.elasticsearch.threadpool.ThreadPool
  5. Null pointer with Streaming RDD to Spark SQL DataFrame conversion

    Stack Overflow | 9 months ago | New coder
    scheduler.TaskSetManager: Lost task 0.0 in stage 5.0 (TID 4, xrdcldbda010001.unix.medcity.net): java.lang.NullPointerException

    Root Cause Analysis

    1. scheduler.TaskSetManager

      Lost task 1.0 in stage 5.0 (TID 11, rz-data-hdp-dn2113.rz.***.com): java.lang.UnsatisfiedLinkError: /data8/hadoop/yarn/nm-local-dir/usercache/hadoop-pay-dev/appcache/application_1468926081807_765036/container_1468926081807_765036_01_000003/tmp/libxgboost4j4481460359283600466.so: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /data8/hadoop/yarn/nm-local-dir/usercache/hadoop-pay-dev/appcache/application_1468926081807_765036/container_1468926081807_765036_01_000003/tmp/libxgboost4j4481460359283600466.so)

      at java.lang.ClassLoader$NativeLibrary.load()
    2. Java RT
      System.load
      1. java.lang.ClassLoader$NativeLibrary.load(Native Method)
      2. java.lang.ClassLoader.loadLibrary1(ClassLoader.java:1965)
      3. java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1890)
      4. java.lang.ClassLoader.loadLibrary(ClassLoader.java:1851)
      5. java.lang.Runtime.load0(Runtime.java:795)
      6. java.lang.System.load(System.java:1062)
      6 frames
    3. ml.dmlc.xgboost4j
      XGBoost$$anonfun$buildDistributedBoosters$1.apply
      1. ml.dmlc.xgboost4j.java.NativeLibLoader.loadLibraryFromJar(NativeLibLoader.java:67)
      2. ml.dmlc.xgboost4j.java.NativeLibLoader.smartLoad(NativeLibLoader.java:153)
      3. ml.dmlc.xgboost4j.java.NativeLibLoader.initXGBoost(NativeLibLoader.java:41)
      4. ml.dmlc.xgboost4j.java.Rabit.<clinit>(Rabit.java:18)
      5. ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:63)
      6. ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:61)
      6 frames
    4. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$17.apply(RDD.scala:706)
      2. org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$17.apply(RDD.scala:706)
      3. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      4. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:297)
      5. org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
      6. org.apache.spark.rdd.RDD.iterator(RDD.scala:262)
      7. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
      8. org.apache.spark.scheduler.Task.run(Task.scala:88)
      9. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
      9 frames
    5. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
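
    What the frame groups show is the native-library loading path: xgboost4j's NativeLibLoader extracts the bundled libxgboost4j .so from the jar into the container's tmp directory (the /tmp/libxgboost4j...so file in the message) and passes it to System.load, and the UnsatisfiedLinkError is raised when the dynamic linker finds that the node's /lib64/libc.so.6 does not provide GLIBC_2.14. The sketch below illustrates that extract-and-load path under stated assumptions: the class name, resource path, and temp-file prefix are illustrative, not the actual xgboost4j source.

      import java.io.IOException;
      import java.io.InputStream;
      import java.nio.file.Files;
      import java.nio.file.Path;
      import java.nio.file.StandardCopyOption;

      // Minimal sketch of loading a native library bundled inside a jar,
      // mirroring the loadLibraryFromJar -> System.load chain in the trace.
      public class NativeLibFromJarSketch {

          static void loadFromJar(String resourceName) throws IOException {
              try (InputStream in = NativeLibFromJarSketch.class.getResourceAsStream(resourceName)) {
                  if (in == null) {
                      throw new IOException("Native library not found on classpath: " + resourceName);
                  }
                  // Extract to a temp file, analogous to the
                  // libxgboost4j4481460359283600466.so seen in the container's tmp dir.
                  Path tmp = Files.createTempFile("libxgboost4j", ".so");
                  tmp.toFile().deleteOnExit();
                  Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
                  // System.load ends up in ClassLoader$NativeLibrary.load (frame group 2);
                  // the dynamic linker resolves the .so's glibc symbol versions here, so a
                  // node whose /lib64/libc.so.6 predates GLIBC_2.14 fails with the
                  // UnsatisfiedLinkError above. The later NoClassDefFoundError for
                  // ml.dmlc.xgboost4j.java.Rabit (similar exception 2) is the follow-on
                  // symptom of that failed static initializer.
                  System.load(tmp.toAbsolutePath().toString());
              }
          }

          public static void main(String[] args) throws IOException {
              // Assumed resource path; the real location inside the xgboost4j jar may differ.
              loadFromJar("/lib/libxgboost4j.so");
          }
      }

    If this is the error you are hitting, the usual ways out are to check the glibc version on the failing nodes (for example with ldd --version) and either upgrade glibc there or rebuild libxgboost4j on a machine whose glibc matches the cluster's.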