util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/11 04:24:14 INFO client.RMProxy: Connecting to ResourceManager at /192.168.85.129:8032
17/01/11 04:24:15 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
17/01/11 04:24:15 WARN mapreduce.JobSubmitter: No job jar file set. User classes may not be found. See Job or Job#setJar(String).
17/01/11 04:24:15 INFO input.FileInputFormat: Total input paths to process : 1
17/01/11 04:24:15 INFO mapreduce.JobSubmitter: number of splits:1
17/01/11 04:24:16 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1484121450974_0004
17/01/11 04:24:16 INFO mapred.YARNRunner: Job jar is not present. Not adding any jar to the list of resources.
17/01/11 04:24:16 INFO impl.YarnClientImpl: Submitted application application_1484121450974_0004
17/01/11 04:24:16 INFO mapreduce.Job: The url to track the job: http://hadoop-senior:8088/proxy/application_1484121450974_0004/
17/01/11 04:24:16 INFO mapreduce.Job: Running job: job_1484121450974_0004
17/01/11 04:24:23 INFO mapreduce.Job: Job job_1484121450974_0004 running in uber mode : false
17/01/11 04:24:23 INFO mapreduce.Job: map 0% reduce 0%
17/01/11 04:24:26 INFO mapreduce.Job: Task Id : attempt_1484121450974_0004_m_000000_0, Status : FAILED
Exception from container-launch: ExitCodeException exitCode=1:
ExitCodeException exitCode=1:

Stack Overflow | 茅华峰 | 2 months ago
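
The two WARN lines in the log above point at how the job was launched: without ToolRunner, generic Hadoop options are not parsed, and without a job jar (see also the later line "Job jar is not present. Not adding any jar to the list of resources.") the user's map and reduce classes may not be on the classpath inside the YARN container, which is one common way to end up with an ExitCodeException exitCode=1 at container launch. Below is a minimal driver sketch of the remedy those warnings suggest (implement Tool, run through ToolRunner, and call Job#setJarByClass); WordCountDriver and TokenMapper are placeholder names, not code from the question.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    // Hypothetical driver, not taken from the question. Implementing Tool and running
    // through ToolRunner addresses the "Hadoop command-line option parsing not performed"
    // warning; setJarByClass() addresses "No job jar file set" / "Job jar is not present",
    // which otherwise leaves user classes unresolvable inside the YARN container.
    public class WordCountDriver extends Configured implements Tool {

        public static class TokenMapper extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                // Emit (token, 1) for every whitespace-separated token in the line.
                for (String token : value.toString().split("\\s+")) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }

        @Override
        public int run(String[] args) throws Exception {
            Job job = Job.getInstance(getConf(), "word count");
            job.setJarByClass(WordCountDriver.class);   // or job.setJar("/path/to/job.jar")
            job.setMapperClass(TokenMapper.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            return job.waitForCompletion(true) ? 0 : 1;
        }

        public static void main(String[] args) throws Exception {
            // ToolRunner strips generic options (-D, -libjars, ...) before calling run().
            System.exit(ToolRunner.run(new Configuration(), new WordCountDriver(), args));
        }
    }

Packaged into a jar and launched with hadoop jar (for example: hadoop jar wordcount.jar WordCountDriver /input /output), both warnings should disappear; whether that alone clears the exitCode=1 failure can only be confirmed from the container logs behind the tracking URL shown in the job output.
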
Your exception is missing from the Samebug knowledge base.
Here are the best solutions we found on the Internet.
  1. INFO org.apache.hadoop.service.AbstractService: Service org.apache.hadoop.mapreduce.v2.hs.server.HSAdminServer failed in state INITED;
     Stack Overflow | 2 months ago | 茅华峰
  2. HDFS write resulting in "CreateSymbolicLink error (1314): A required privilege is not held by the client."
     Stack Overflow | 2 years ago | Sylvester Daniel
     mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
     15/03/10 13:13:10 WARN mapreduce.JobSubmitter: No job jar file set. User classes may not be found. See Job or Job#setJar(String).
     15/03/10 13:13:10 INFO input.FileInputFormat: Total input paths to process : 2
     15/03/10 13:13:11 INFO mapreduce.JobSubmitter: number of splits:2
     15/03/10 13:13:11 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1425973278169_0001
     15/03/10 13:13:12 INFO mapred.YARNRunner: Job jar is not present. Not adding any jar to the list of resources.
     15/03/10 13:13:12 INFO impl.YarnClientImpl: Submitted application application_1425973278169_0001
     15/03/10 13:13:12 INFO mapreduce.Job: The url to track the job: http://B2ML10803:8088/proxy/application_1425973278169_0001/
     15/03/10 13:13:12 INFO mapreduce.Job: Running job: job_1425973278169_0001
     15/03/10 13:13:18 INFO mapreduce.Job: Job job_1425973278169_0001 running in uber mode : false
     15/03/10 13:13:18 INFO mapreduce.Job: map 0% reduce 0%
     15/03/10 13:13:18 INFO mapreduce.Job: Job job_1425973278169_0001 failed with state FAILED due to: Application application_1425973278169_0001 failed 2 times due to AM Container for appattempt_1425973278169_0001_000002 exited with exitCode: 1
     For more detailed output, check application tracking page: http://B2ML10803:8088/proxy/application_1425973278169_0001/ Then, click on links to logs of each attempt.
     Diagnostics: Exception from container-launch.
     Container id: container_1425973278169_0001_02_000001
     Exit code: 1
     Exception message: CreateSymbolicLink error (1314): A required privilege is not held by the client.
     Stack trace: ExitCodeException exitCode=1: CreateSymbolicLink error (1314): A required privilege is not held by the client.
       at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
  3. HIPI 2.1.0 and Hadoop 2.6.0 are compatible
     Google Groups | 1 year ago | Ankur Dubey
     client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
     15/10/27 03:22:37 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
     15/10/27 03:22:40 INFO input.FileInputFormat: Total input paths to process : 1
     Spawned 1 map tasks
     15/10/27 03:22:48 INFO mapreduce.JobSubmitter: number of splits:1
     15/10/27 03:22:50 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1445938202522_0003
     15/10/27 03:23:00 INFO impl.YarnClientImpl: Submitted application application_1445938202522_0003
     15/10/27 03:23:00 INFO mapreduce.Job: The url to track the job: http://quickstart.cloudera:8088/proxy/application_1445938202522_0003/
     15/10/27 03:23:00 INFO mapreduce.Job: Running job: job_1445938202522_0003
     15/10/27 03:23:43 INFO mapreduce.Job: Job job_1445938202522_0003 running in uber mode : false
     15/10/27 03:23:43 INFO mapreduce.Job: map 0% reduce 0%
     15/10/27 03:23:43 INFO mapreduce.Job: Job job_1445938202522_0003 failed with state FAILED due to: Application application_1445938202522_0003 failed 2 times due to AM Container for appattempt_1445938202522_0003_000002 exited with exitCode: 1
     For more detailed output, check application tracking page: http://quickstart.cloudera:8088/proxy/application_1445938202522_0003/ Then, click on links to logs of each attempt.
     Diagnostics: Exception from container-launch.
     Container id: container_1445938202522_0003_02_000001
     Exit code: 1
     Stack trace: ExitCodeException exitCode=1:
  4. sqoop import is failing with mysql
     Stack Overflow | 3 months ago | user3540695
     db.DBInputFormat: Using read commited transaction isolation
     17/01/02 07:25:27 INFO mapreduce.JobSubmitter: number of splits:1
     17/01/02 07:25:27 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1483355577531_0006
     17/01/02 07:25:28 INFO impl.YarnClientImpl: Submitted application application_1483355577531_0006
     17/01/02 07:25:28 INFO mapreduce.Job: The url to track the job: http://ubuntu:8088/proxy/application_1483355577531_0006/
     17/01/02 07:25:28 INFO mapreduce.Job: Running job: job_1483355577531_0006
     17/01/02 07:25:38 INFO mapreduce.Job: Job job_1483355577531_0006 running in uber mode : false
     17/01/02 07:25:38 INFO mapreduce.Job: map 0% reduce 0%
     17/01/02 07:25:39 INFO mapreduce.Job: Task Id : attempt_1483355577531_0006_m_000000_0, Status : FAILED
     Container launch failed for container_1483355577531_0006_01_000002 : org.apache.hadoop.yarn.exceptions.InvalidAuxServiceException: The auxService:mapreduce_shuffle does not exist
  5. Sqoop import the data but replication issue even after changing hdfs-site.xml properties
     Stack Overflow | 11 months ago | Md Rasool
     sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.0.0-169
     16/04/22 08:50:40 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
     16/04/22 08:50:40 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
     16/04/22 08:50:40 INFO manager.SqlManager: Using default fetchSize of 1000
     16/04/22 08:50:40 INFO tool.CodeGenTool: Beginning code generation
     16/04/22 08:50:41 INFO manager.OracleManager: Time zone has been set to GMT
     16/04/22 08:50:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM PWRLINE_COPY.DATAAGGRUN t WHERE 1=0
     16/04/22 08:50:42 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is c:\hadoop\hdp\hadoop-2.7.1.2.4.0.0-169
     Note: \tmp\sqoop-sahus\compile\f1f5245c3a8fbf8c7782e696f3662575\PWRLINE_COPY_DATAAGGRUN.java uses or overrides a deprecated API.
     Note: Recompile with -Xlint:deprecation for details.
     16/04/22 08:50:45 INFO orm.CompilationManager: Writing jar file: \tmp\sqoop-sahus\compile\f1f5245c3a8fbf8c7782e696f3662575\PWRLINE_COPY.DATAAGGRUN.jar
     16/04/22 08:50:45 INFO manager.OracleManager: Time zone has been set to GMT
     16/04/22 08:50:46 INFO manager.OracleManager: Time zone has been set to GMT
     16/04/22 08:50:46 INFO mapreduce.ImportJobBase: Beginning import of PWRLINE_COPY.DATAAGGRUN
     16/04/22 08:50:46 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
     16/04/22 08:50:46 INFO manager.OracleManager: Time zone has been set to GMT
     16/04/22 08:50:47 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
     16/04/22 08:50:48 INFO impl.TimelineClientImpl: Timeline service address: http://cc-wvd-ap161.pepcoholdings.biz:8188/ws/v1/timeline/
     16/04/22 08:50:49 INFO client.RMProxy: Connecting to ResourceManager at cc-wvd-ap161.pepcoholdings.biz/161.186.159.156:8032
     16/04/22 08:50:50 INFO mapreduce.JobSubmitter: Cleaning up the staging area /user/sahus/.staging/job_1461298205218_0003
     16/04/22 08:50:50 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.ipc.RemoteException(java.io.IOException): file /user/sahus/.staging/job_1461298205218_0003/libjars/xz-1.0.jar. Requested replication 10 exceeds maximum 3
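
     For the "Requested replication 10 exceeds maximum 3" error in the last entry, the usual reading is that the job client stages its submission files (libjars, job.xml, the job jar) with a replication factor of 10, the default for mapreduce.client.submit.file.replication, while the cluster caps replication at 3. The following is a minimal client-side sketch, assuming that property is indeed what the NameNode is rejecting; the class name LowSubmitReplication is illustrative only, and with Sqoop the equivalent setting would normally be passed on the command line as -D mapreduce.client.submit.file.replication=3 right after the tool name.

         import org.apache.hadoop.conf.Configuration;
         import org.apache.hadoop.mapreduce.Job;

         // Sketch only, not taken from the entry above. The MapReduce client replicates
         // staged submission files to mapreduce.client.submit.file.replication copies
         // (default 10); lowering it below the cluster's maximum replication should let
         // job submission proceed.
         public class LowSubmitReplication {
             public static void main(String[] args) throws Exception {
                 Configuration conf = new Configuration();
                 conf.setInt("mapreduce.client.submit.file.replication", 3);

                 Job job = Job.getInstance(conf, "import with lowered submit replication");
                 // ...configure input, output, mapper, etc. as usual, then submit the job.
             }
         }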

    Root Cause Analysis

    1. ExitCodeException

      Exception from container-launch: ExitCodeException exitCode=1:

      at org.apache.hadoop.util.Shell.runCommand()
    2. Hadoop
      Shell$ShellCommandExecutor.execute
      1. org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
      2. org.apache.hadoop.util.Shell.run(Shell.java:455)
      3. org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
      3 frames
    3. hadoop-yarn-server-nodemanager
      ContainerLaunch.call
      1. org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
      2. org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
      3. org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
      3 frames
    4. Java RT
      Thread.run
      1. java.util.concurrent.FutureTask.run(FutureTask.java:266)
      2. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      3. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      4. java.lang.Thread.run(Thread.java:745)
      4 frames