java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "secure-gateway/xxx.xxx.xxx.xxx"; destination host is: "insecure-namenode":9000;

Apache's JIRA Issue Tracker | Yongjun Zhang | 3 years ago
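
This error means the Kerberos-secured client opened an ordinary Hadoop RPC connection (hdfs://) to a NameNode running without security: the server offered SIMPLE auth, and by default a secure client refuses to downgrade. A commonly used client-side workaround is Hadoop's ipc.client.fallback-to-simple-auth-allowed setting. The sketch below is a minimal illustration, assuming a Hadoop 2.x client on the classpath; the host and port are the placeholders from the message above.
{code}
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FallbackToSimpleAuth {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Opt in to falling back to SIMPLE auth when the server is insecure.
        // Without this, connection setup fails with the message quoted above.
        conf.setBoolean("ipc.client.fallback-to-simple-auth-allowed", true);
        // "insecure-namenode":9000 is the placeholder destination from the error.
        FileSystem fs = FileSystem.get(
            URI.create("hdfs://insecure-namenode:9000"), conf);
        System.out.println(fs.getFileStatus(new Path("/tmp")));
    }
}
{code}
An alternative that avoids the secure RPC handshake entirely is to address the insecure cluster over webhdfs://, as the distcp invocation in issue 3 below does.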
  1.

    hadoop TOKEN + kerberos

    Stack Overflow | 11 months ago | heap
    java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "xxx"; destination host is: "yyy":8020;
  2.

    Re: InvalidProtocolBufferException Exception

    apache.org | 1 year ago
    java.io.IOException: Failed on local exception: java.io.EOFException; Host Details : local host is: "PC/192.168.3.58"; destination host is: "cloud2.com":8020;
  3.

    Issuing a distcp command on the secure cluster side, trying to copy data from the insecure cluster to the secure cluster, we see the following problem:
    {code}
    [hadoopuser@yjc5u-1 ~]$ hadoop distcp webhdfs://<insecure-cluster>:<port>/tmp hdfs://<secure-cluster>:8020/tmp/tmptgt
    14/07/30 20:06:19 INFO tools.DistCp: Input Options: DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, ignoreFailures=false, maxMaps=20, sslConfigurationFile='null', copyStrategy='uniformsize', sourceFileListing=null, sourcePaths=[webhdfs://<insecure-cluster>:<port>/tmp], targetPath=hdfs://<secure-cluster>:8020/tmp/tmptgt, targetPathExists=true}
    14/07/30 20:06:19 INFO client.RMProxy: Connecting to ResourceManager at <secure-cluster>:8032
    14/07/30 20:06:20 WARN ssl.FileBasedKeyStoresFactory: The property 'ssl.client.truststore.location' has not been set, no TrustStore will be loaded
    14/07/30 20:06:20 WARN security.UserGroupInformation: PriviledgedActionException as:hadoopuser@xyz.COM (auth:KERBEROS) cause:java.io.IOException: Failed to get the token for hadoopuser, user=hadoopuser
    14/07/30 20:06:20 WARN security.UserGroupInformation: PriviledgedActionException as:hadoopuser@xyz.COM (auth:KERBEROS) cause:java.io.IOException: Failed to get the token for hadoopuser, user=hadoopuser
    14/07/30 20:06:20 ERROR tools.DistCp: Exception encountered
    java.io.IOException: Failed to get the token for hadoopuser, user=hadoopuser
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:365)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$600(WebHdfsFileSystem.java:84)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.shouldRetry(WebHdfsFileSystem.java:618)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:584)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:438)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:466)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:462)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getDelegationToken(WebHdfsFileSystem.java:1132)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getDelegationToken(WebHdfsFileSystem.java:218)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getAuthParameters(WebHdfsFileSystem.java:403)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toUrl(WebHdfsFileSystem.java:424)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractFsPathRunner.getUrl(WebHdfsFileSystem.java:640)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:565)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:438)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:466)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:462)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getHdfsFileStatus(WebHdfsFileSystem.java:781)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getFileStatus(WebHdfsFileSystem.java:796)
        at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
        at org.apache.hadoop.fs.Globber.glob(Globber.java:248)
        at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1623)
        at org.apache.hadoop.tools.GlobbedCopyListing.doBuildListing(GlobbedCopyListing.java:77)
        at org.apache.hadoop.tools.CopyListing.buildListing(CopyListing.java:81)
        at org.apache.hadoop.tools.DistCp.createInputFileListing(DistCp.java:342)
        at org.apache.hadoop.tools.DistCp.execute(DistCp.java:154)
        at org.apache.hadoop.tools.DistCp.run(DistCp.java:121)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.tools.DistCp.main(DistCp.java:390)
    Caused by: org.apache.hadoop.ipc.RemoteException(java.io.IOException): Failed to get the token for hadoopuser, user=hadoopuser
        at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:159)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:334)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:84)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:570)
        ... 30 more
    [hadoopuser@yjc5u-1 ~]$
    {code}

    Apache's JIRA Issue Tracker | 3 years ago | Yongjun Zhang
    java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "secure-gateway/xxx.xxx.xxx.xxx"; destination host is: "insecure-namenode":9000;
  4.

    Running in Kerberized Hadoop clusters doesn't seem to be supported...

    GitHub | 1 year ago | BrianGallew
    java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "redacted/redacted"; destination host is: "redactedt":8020;
  5.

    Connecting to Kerberized HDFS, java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name (see the Kerberos login sketch after this list)

    Stack Overflow | 1 year ago | avinash patil
    java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name; Host Details : local host is: "Securonix-int3.local/10.0.4.36"; destination host is: "sobd189.securonix.com":8020;
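
Items 1, 4, and 5 above fail one step earlier, at the Kerberos login itself: either the client has no TGT, or it does not know the server's principal. Below is a minimal sketch of an explicit keytab login, again assuming a Hadoop 2.x client; the realm, principal names, and keytab path are hypothetical placeholders.
{code}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLogin {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Tell the client the cluster expects Kerberos.
        conf.set("hadoop.security.authentication", "kerberos");
        // Addresses "Failed to specify server's Kerberos principal name" (item 5);
        // the service name and realm here are placeholders.
        conf.set("dfs.namenode.kerberos.principal", "nn/_HOST@EXAMPLE.COM");
        UserGroupInformation.setConfiguration(conf);
        // An explicit keytab login avoids "Failed to find any Kerberos tgt"
        // (item 4) when no ticket cache is present. Placeholder credentials:
        UserGroupInformation.loginUserFromKeytab(
            "hadoopuser@EXAMPLE.COM", "/etc/security/keytabs/hadoopuser.keytab");
        System.out.println("Logged in as " + UserGroupInformation.getLoginUser());
    }
}
{code}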

    Root Cause Analysis

    1. java.io.IOException

      Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.

      at org.apache.hadoop.ipc.Client$Connection.setupIOstreams()
    2. Hadoop
      ProtobufRpcEngine$Invoker.invoke
      1. org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:734)
      2. org.apache.hadoop.ipc.Client$Connection.access$2700(Client.java:367)
      3. org.apache.hadoop.ipc.Client.getConnection(Client.java:1458)
      4. org.apache.hadoop.ipc.Client.call(Client.java:1377)
      5. org.apache.hadoop.ipc.Client.call(Client.java:1359)
      6. org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    3. com.sun.proxy
      $Proxy11.getFileInfo
      1. com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
    4. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:606)
    5. Hadoop
      RetryInvocationHandler.invoke
      1. org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
      2. org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    6. com.sun.proxy
      $Proxy11.getFileInfo
      1. com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
    7. Apache Hadoop HDFS
      DistributedFileSystem$17.doCall
      1. org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:671)
      2. org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1746)
      3. org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1112)
      4. org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1108)
    8. Hadoop
      FileSystemLinkResolver.resolve
      1. org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    9. Apache Hadoop HDFS
      DistributedFileSystem.getFileStatus
      1. org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1108)
    10. Hadoop
      DistCp.main
      1. org.apache.hadoop.fs.FileSystem.isFile(FileSystem.java:1425)
      2. org.apache.hadoop.tools.SimpleCopyListing.validatePaths(SimpleCopyListing.java:69)
      3. org.apache.hadoop.tools.CopyListing.buildListing(CopyListing.java:79)
      4. org.apache.hadoop.tools.GlobbedCopyListing.doBuildListing(GlobbedCopyListing.java:90)
      5. org.apache.hadoop.tools.CopyListing.buildListing(CopyListing.java:80)
      6. org.apache.hadoop.tools.DistCp.createInputFileListing(DistCp.java:327)
      7. org.apache.hadoop.tools.DistCp.execute(DistCp.java:151)
      8. org.apache.hadoop.tools.DistCp.run(DistCp.java:118)
      9. org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
      10. org.apache.hadoop.tools.DistCp.main(DistCp.java:375)
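
The first frame in the analysis, Client$Connection.setupIOstreams, is where the client compares the server's advertised auth method against its own configuration. The sketch below is a simplified illustration of that decision, not the actual Hadoop source; only the property name is the real setting.
{code}
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;

public class SimpleFallbackCheck {
    // Simplified stand-in for the check made during IPC connection setup.
    static void checkFallback(Configuration conf, boolean serverOffersSimple)
            throws IOException {
        boolean fallbackAllowed =
            conf.getBoolean("ipc.client.fallback-to-simple-auth-allowed", false);
        if (serverOffersSimple && !fallbackAllowed) {
            throw new IOException("Server asks us to fall back to SIMPLE auth, "
                + "but this client is configured to only allow secure connections.");
        }
        // Otherwise the connection proceeds with SIMPLE auth.
    }

    public static void main(String[] args) throws IOException {
        // With the default (false), this reproduces the message quoted above.
        checkFallback(new Configuration(), true);
    }
}
{code}
This is why setting ipc.client.fallback-to-simple-auth-allowed to true on the secure-side client (see the first sketch above) lets the distcp run proceed against the insecure NameNode.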