org.apache.hadoop.tools.DistCp: Exception encountered org.apache.hadoop.ipc.RemoteException(java.io.IOException): Delegation Token can be issued only with kerberos or web authentication

Stack Overflow | Kumar | 4 months ago
  1. Oozie distcp failed in secure cluster

     Stack Overflow | 4 months ago | Kumar
     org.apache.hadoop.tools.DistCp: Exception encountered org.apache.hadoop.ipc.RemoteException(java.io.IOException): Delegation Token can be issued only with kerberos or web authentication
  2. Hadoop 0.22.0 Release Notes

     apache.org | 3 months ago
     org.apache.hadoop.tools.DistCp: FAIL README.txt : java.io.IOException: Server returned HTTP response code: 400 for URL: http://namenode:50070/data/user/tsz/README.txt?ugi=tsz,users
  3. Hadoop 0.22.0 Release Notes

     apache.org | 3 months ago
     org.apache.hadoop.tools.DistCp: FAIL 2010/0/part-00032 : java.io.IOException: File size not matched: copied 193855488 bytes (184.9m) to tmpfile (=hdfs://omehost.com:8020/somepath/part-00032) but expected 1710327403 bytes (1.6g) from hftp://someotherhost/somepath/part-00032
  4. Copying folders containing + & ! characters between hdfs (using hftp) does not work in distcp

     Apache's JIRA Issue Tracker | 8 years ago | Viraj Bhat
     org.apache.hadoop.tools.DistCp: FAIL string1+string2/myjobtrackermachine.com-joblog.tar.gz : java.io.IOException: Server returned HTTP response code: 500 for URL: http://mymachine.com:myport/streamFile?filename=/myhome/dir/string1+string2/myjobtrackermachine.com-joblog.tar.gz&ugi=myid,mygroup

     For example, copying folder "string1+string2" from "namenode.address.com" (hftp port myport) to "/myotherhome/folder" on "myothermachine" does not work (see the encoding sketch after this list):

     myothermachine prompt>>> hadoop --config ~/mycluster/ distcp "hftp://namenode.address.com:myport/myhome/dir/string1+string2" /myotherhome/folder/

     Error results for hadoop job1:

     08/07/16 00:27:39 INFO tools.DistCp: srcPaths=[hftp://namenode.address.com:myport/myhome/dir/string1+string2]
     08/07/16 00:27:39 INFO tools.DistCp: destPath=/myotherhome/folder/
     08/07/16 00:27:41 INFO tools.DistCp: srcCount=2
     08/07/16 00:27:42 INFO mapred.JobClient: Running job: job1
     08/07/16 00:27:43 INFO mapred.JobClient: map 0% reduce 0%
     08/07/16 00:27:58 INFO mapred.JobClient: Task Id : attempt_1_m_000000_0, Status : FAILED
     java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
         at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:538)
         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:226)
         at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2208)
     08/07/16 00:28:14 INFO mapred.JobClient: Task Id : attempt_1_m_000000_1, Status : FAILED
     java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
         at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:538)
         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:226)
         at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2208)
     08/07/16 00:28:28 INFO mapred.JobClient: Task Id : attempt_1_m_000000_2, Status : FAILED
     java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
         at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:538)
         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:226)
         at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2208)
     With failures, global counters are inaccurate; consider running with -i
     Copy failed: java.io.IOException: Job failed!
         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1053)
         at org.apache.hadoop.tools.DistCp.copy(DistCp.java:615)
         at org.apache.hadoop.tools.DistCp.run(DistCp.java:764)
         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
         at org.apache.hadoop.tools.DistCp.main(DistCp.java:784)

     Error log for the map task which failed:

     INFO org.apache.hadoop.tools.DistCp: FAIL string1+string2/myjobtrackermachine.com-joblog.tar.gz : java.io.IOException: Server returned HTTP response code: 500 for URL: http://mymachine.com:myport/streamFile?filename=/myhome/dir/string1+string2/myjobtrackermachine.com-joblog.tar.gz&ugi=myid,mygroup
         at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1241)
         at org.apache.hadoop.dfs.HftpFileSystem.open(HftpFileSystem.java:117)
         at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:371)
         at org.apache.hadoop.tools.DistCp$CopyFilesMapper.copy(DistCp.java:377)
         at org.apache.hadoop.tools.DistCp$CopyFilesMapper.map(DistCp.java:504)
         at org.apache.hadoop.tools.DistCp$CopyFilesMapper.map(DistCp.java:279)
         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:47)
         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:226)
         at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2208)
  5. hadoop distcp fail over hftp protocol

     Stack Overflow | 3 years ago | user1573269
     org.apache.hadoop.tools.DistCp: FAIL test1.dat : java.io.IOException: HTTP_OK expected, received 503
         at org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderUrlOpener.connect(HftpFileSystem.java:376)
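
    About the hftp failure in item 4: a likely explanation (an inference, not stated in the JIRA excerpt above) is URL encoding. The streamFile URL carries the raw path in its query string, and a literal '+' in a query string is decoded to a space on the server side, so the servlet looks up a path that does not exist and returns 500. The Java sketch below only illustrates that decoding behaviour; it is not the HftpFileSystem source, and the path is the anonymized one from the report.

    import java.net.URLDecoder;
    import java.net.URLEncoder;

    public class PlusInQueryStringSketch {
        public static void main(String[] args) throws Exception {
            String path = "/myhome/dir/string1+string2/myjobtrackermachine.com-joblog.tar.gz";

            // Query string built from the raw path, as in the failing URL above:
            String raw = "filename=" + path;
            // The server-side decode turns '+' into a space, so the wrong path is requested.
            System.out.println(URLDecoder.decode(raw, "UTF-8"));
            // -> filename=/myhome/dir/string1 string2/myjobtrackermachine.com-joblog.tar.gz

            // Percent-encoding the path first survives the decode intact.
            String encoded = "filename=" + URLEncoder.encode(path, "UTF-8");
            System.out.println(URLDecoder.decode(encoded, "UTF-8"));
            // -> filename=/myhome/dir/string1+string2/myjobtrackermachine.com-joblog.tar.gz
        }
    }

    If that is indeed the cause, percent-encoding the path before it is placed in the query string, rather than passing it through verbatim, is what keeps '+' and similar characters intact.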


    Root Cause Analysis

    1. org.apache.hadoop.tools.DistCp

      Exception encountered org.apache.hadoop.ipc.RemoteException(java.io.IOException): Delegation Token can be issued only with kerberos or web authentication

      at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getDelegationToken()
    2. Apache Hadoop HDFS
      ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod
      1. org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getDelegationToken(FSNamesystem.java:6635)
      2. org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getDelegationToken(NameNodeRpcServer.java:563)
      3. org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getDelegationToken(ClientNamenodeProtocolServerSideTranslatorPB.java:987)
      4. org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
      4 frames
    3. Hadoop
      Server$Handler$1.run
      1. org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
      2. org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
      3. org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
      4. org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
      4 frames
    4. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:415)
      2 frames
    5. Hadoop
      ProtobufRpcEngine$Invoker.invoke
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
      2. org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
      3. org.apache.hadoop.ipc.Client.call(Client.java:1475)
      4. org.apache.hadoop.ipc.Client.call(Client.java:1412)
      5. org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
      5 frames
    6. com.sun.proxy
      $Proxy14.getDelegationToken
      1. com.sun.proxy.$Proxy14.getDelegationToken(Unknown Source)
      1 frame
    7. Apache Hadoop HDFS
      ClientNamenodeProtocolTranslatorPB.getDelegationToken
      1. org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getDelegationToken(ClientNamenodeProtocolTranslatorPB.java:933)
      1 frame
    8. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:606)
      4 frames
    9. Hadoop
      RetryInvocationHandler.invoke
      1. org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
      2. org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
      2 frames
    10. com.sun.proxy
      $Proxy15.getDelegationToken
      1. com.sun.proxy.$Proxy15.getDelegationToken(Unknown Source)
      1 frame
    11. Apache Hadoop HDFS
      DistributedFileSystem.getDelegationToken
      1. org.apache.hadoop.hdfs.DFSClient.getDelegationToken(DFSClient.java:1029)
      2. org.apache.hadoop.hdfs.DistributedFileSystem.getDelegationToken(DistributedFileSystem.java:1542)
      2 frames
    12. Hadoop
      FileSystem.addDelegationTokens
      1. org.apache.hadoop.fs.FileSystem.collectDelegationTokens(FileSystem.java:530)
      2. org.apache.hadoop.fs.FileSystem.addDelegationTokens(FileSystem.java:508)
      2 frames
    13. Apache Hadoop HDFS
      DistributedFileSystem.addDelegationTokens
      1. org.apache.hadoop.hdfs.DistributedFileSystem.addDelegationTokens(DistributedFileSystem.java:2228)
      1 frame
    14. Hadoop
      TokenCache.obtainTokensForNamenodes
      1. org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:121)
      2. org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:100)
      3. org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodes(TokenCache.java:80)
      3 frames
    15. Apache Hadoop Distributed Copy
      CopyOutputFormat.checkOutputSpecs
      1. org.apache.hadoop.tools.mapred.CopyOutputFormat.checkOutputSpecs(CopyOutputFormat.java:121)
      1 frame
    16. Hadoop
      Job$10.run
      1. org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
      2. org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
      3. org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
      4. org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
      4 frames
    17. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:415)
      2 frames
    18. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
      1 frame
    19. Hadoop
      Job.submit
      1. org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
      1 frame
    20. Hadoop
      ToolRunner.run
      1. org.apache.hadoop.tools.DistCp.createAndSubmitJob(DistCp.java:183)
      2. org.apache.hadoop.tools.DistCp.execute(DistCp.java:153)
      3. org.apache.hadoop.tools.DistCp.run(DistCp.java:126)
      4. org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
      5. org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
      5 frames
    21. org.apache.oozie
      DistcpMain.main
      1. org.apache.oozie.action.hadoop.DistcpMain.run(DistcpMain.java:64)
      2. org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
      3. org.apache.oozie.action.hadoop.DistcpMain.main(DistcpMain.java:34)
      3 frames
    22. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:606)
      4 frames
    23. org.apache.oozie
      LauncherMapper.map
      1. org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
      1 frame
    24. Hadoop
      LocalContainerLauncher$EventHandler$1.run
      1. org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
      2. org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
      3. org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
      4. org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:380)
      5. org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:301)
      6. org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
      7. org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
      7 frames
    25. Java RT
      Thread.run
      1. java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
      2. java.util.concurrent.FutureTask.run(FutureTask.java:262)
      3. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      4. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      5. java.lang.Thread.run(Thread.java:744)
      5 frames
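
    What the grouped trace shows: at job-submission time DistCp's CopyOutputFormat asks TokenCache to collect HDFS delegation tokens for the NameNodes the job will touch (TokenCache.obtainTokensForNamenodes -> FileSystem.addDelegationTokens -> DFSClient.getDelegationToken), and FSNamesystem rejects the request because the calling connection is not Kerberos-authenticated. In the Oozie case at the top, a frequent cause is that the launcher task is itself authenticated to HDFS with a delegation token rather than a Kerberos ticket, and a token-authenticated connection is not permitted to request further tokens; copies involving a non-Kerberized (hftp) cluster hit the same wall because that side cannot issue tokens at all. The standalone Java sketch below reproduces just that token-collection call; it assumes a Hadoop 2.x client on the classpath, and the renewer name "yarn" is illustrative only.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.security.Credentials;
    import org.apache.hadoop.security.UserGroupInformation;
    import org.apache.hadoop.security.token.Token;

    public class DelegationTokenSketch {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration();
            UserGroupInformation.setConfiguration(conf);

            // The caller must already hold Kerberos credentials (kinit, or
            // UserGroupInformation.loginUserFromKeytab(...)); otherwise the NameNode
            // answers with the exact message above: "Delegation Token can be issued
            // only with kerberos or web authentication".
            FileSystem fs = FileSystem.get(conf);

            Credentials creds = new Credentials();
            // Same call that CopyOutputFormat reaches through TokenCache in the trace.
            // "yarn" as the token renewer is an assumption made for this sketch.
            Token<?>[] tokens = fs.addDelegationTokens("yarn", creds);
            System.out.println("Obtained " + (tokens == null ? 0 : tokens.length) + " delegation token(s)");
        }
    }

    Running this check as the same user (and against the same fs.defaultFS) the Oozie launcher uses is one way to confirm whether the failure is the missing Kerberos login rather than the DistCp job itself.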