master.HMaster: Unhandled exception. Starting shutdown. org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.UnknownCryptoProtocolVersionException): No crypto protocol versions provided by the client are supported. Client provided: [] NameNode supports: [CryptoProtocolVersion{description='Unknown', version=1, unknownValue=null}, CryptoProtocolVersion{description='Encryption zones', version=2, unknownValue=null}] at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.chooseProtocolVersion(FSNamesystem.java:2470)

Stack Overflow | Tinku | 2 months ago
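
A likely cause, given the empty "Client provided: []" list, is a version mismatch between the HDFS client jars on HBase's classpath and the NameNode: client libraries older than Hadoop 2.6 do not know about CryptoProtocolVersion, so they advertise no crypto protocol versions, and a Hadoop 2.6+ NameNode with transparent encryption configured rejects the create. A minimal client-side check (the fs.defaultFS value is a placeholder for your cluster):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.util.VersionInfo;

public class ClientVersionCheck {
    public static void main(String[] args) throws Exception {
        // Version of the Hadoop client jars actually on this classpath.
        // Anything older than 2.6.0 predates encryption zones and sends an
        // empty crypto-protocol-version list to the NameNode.
        System.out.println("Hadoop client version: " + VersionInfo.getVersion());

        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020"); // placeholder: your NameNode
        FileSystem fs = FileSystem.get(conf);
        System.out.println("FileSystem impl: " + fs.getClass().getName());
        fs.close();
    }
}
```

Run this with the same classpath the master uses (for example via `hbase classpath`). If it reports a pre-2.6 client against a 2.6+ NameNode, pointing HBASE_CLASSPATH at the cluster's Hadoop jars, or replacing the hadoop-* jars bundled with HBase, is the usual fix.
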
  1. 0

    Fail to start hbase with encrypted hdfs

    Stack Overflow | 2 months ago | Tinku
    master.HMaster: Unhandled exception. Starting shutdown. org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.UnknownCryptoProtocolVersionException): No crypto protocol versions provided by the client are supported. Client provided: [] NameNode supports: [CryptoProtocolVersion{description='Unknown', version=1, unknownValue=null}, CryptoProtocolVersion{description='Encryption zones', version=2, unknownValue=null}] at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.chooseProtocolVersion(FSNamesystem.java:2470)
  2. 0

    How to configure HBase in a HA mode?

    Google Groups | 4 months ago | Alexandr Porunov
    master.HMaster: Failed to become active master org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /hbase/.tmp/hbase.version could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.
    (see the DataNode-availability sketch after this list)
  3. 0

    Hbase HDFS integration - Hbase Master not starting

    Stack Overflow | 2 years ago | Raju Chal
    master.HMaster: Unhandled exception. Starting shutdown. org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /hbase/.tmp/hbase.version could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and no node(s) are excluded in this operation. …
  4. 0

    HBase keeps doing SIMPLE authentication

    Stack Overflow | 3 years ago | Nam Pham
    master.HMaster: Unhandled exception. Starting shutdown. org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
    (see the Kerberos login sketch after this list)
  5. 0

    Hbase 0.96 and Hadoop 2.2

    Google Groups | 3 years ago | Paul Honig
    master.HMaster: Unhandled exception. Starting shutdown. org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.RpcServerException): Unknown out of band call #-2147483647
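
The "could only be replicated to 0 nodes" entries above fail one step later in the same code path (writing /hbase/.tmp/hbase.version): the NameNode accepts the create but cannot place a block, either because no DataNodes are registered at all or because the registered ones are unusable (out of space, or unreachable from the client). A small diagnostic sketch, assuming the HDFS client jars are on the classpath and using a placeholder fs.defaultFS, that lists the live DataNodes the client can see:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;
import org.apache.hadoop.hdfs.protocol.HdfsConstants.DatanodeReportType;

public class DataNodeCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020"); // placeholder: your NameNode
        FileSystem fs = FileSystem.get(conf);
        if (!(fs instanceof DistributedFileSystem)) {
            System.err.println("Not an HDFS filesystem: " + fs.getClass().getName());
            return;
        }
        DatanodeInfo[] live = ((DistributedFileSystem) fs).getDataNodeStats(DatanodeReportType.LIVE);
        System.out.println("Live DataNodes: " + live.length);
        for (DatanodeInfo dn : live) {
            // A DataNode with no remaining capacity cannot accept the block either.
            System.out.println(dn.getXferAddr() + " remaining=" + dn.getRemaining() + " bytes");
        }
        fs.close();
    }
}
```

Zero live DataNodes points at DataNode startup or registration problems; two live DataNodes with the same error usually points at full disks or at DataNodes advertising addresses the client cannot reach.
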
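The "SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]" entry is a different problem: the cluster requires Kerberos while the HBase-side client still attempts simple authentication. A minimal sketch of the client-side login (the principal and keytab path are placeholders); for HBase itself the equivalent settings live in hbase-site.xml (hbase.security.authentication plus the master/regionserver principal and keytab properties):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLoginCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // The client must agree with the cluster on Kerberos, otherwise the
        // server answers with "SIMPLE authentication is not enabled".
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Placeholder principal and keytab -- substitute your own.
        UserGroupInformation.loginUserFromKeytab(
                "hbase/host.example.com@EXAMPLE.COM",
                "/etc/security/keytabs/hbase.keytab");
        System.out.println("Logged in as " + UserGroupInformation.getCurrentUser());
    }
}
```
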

    Root Cause Analysis

    1. master.HMaster

      Unhandled exception. Starting shutdown. org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.UnknownCryptoProtocolVersionException): No crypto protocol versions provided by the client are supported. Client provided: [] NameNode supports: [CryptoProtocolVersion{description='Unknown', version=1, unknownValue=null}, CryptoProtocolVersion{description='Encryption zones', version=2, unknownValue=null}] at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.chooseProtocolVersion(FSNamesystem.java:2470)

      at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt()
    2. Apache Hadoop HDFS
      ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod
      1. org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2600)
      2. org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2519)
      3. org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:566)
      4. org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:394)
      5. org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
      5 frames
    3. Hadoop
      Server$Handler$1.run
      1. org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
      2. org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
      3. org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
      4. org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
      4 frames
    4. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:422)
      2 frames
    5. Hadoop
      ProtobufRpcEngine$Invoker.invoke
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
      2. org.apache.hadoop.ipc.Server$Handler.run(Server.java:2035)
      3. org.apache.hadoop.ipc.Client.call(Client.java:1411)
      4. org.apache.hadoop.ipc.Client.call(Client.java:1364)
      5. org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
      5 frames
    6. com.sun.proxy
      $Proxy16.create
      1. com.sun.proxy.$Proxy16.create(Unknown Source)
      1 frame
    7. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:483)
      4 frames
    8. Hadoop
      RetryInvocationHandler.invoke
      1. org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
      2. org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
      2 frames
    9. com.sun.proxy
      $Proxy16.create
      1. com.sun.proxy.$Proxy16.create(Unknown Source)
      1 frame
    10. Apache Hadoop HDFS
      ClientNamenodeProtocolTranslatorPB.create
      1. org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:264)
      1 frame
    11. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:483)
      4 frames
    12. HBase
      HFileSystem$1.invoke
      1. org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:279)
      1 frame
    13. com.sun.proxy
      $Proxy17.create
      1. com.sun.proxy.$Proxy17.create(Unknown Source)
      1 frame
    14. Apache Hadoop HDFS
      DistributedFileSystem$6.doCall
      1. org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1612)
      2. org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1488)
      3. org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1413)
      4. org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:387)
      5. org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:383)
      5 frames
    15. Hadoop
      FileSystemLinkResolver.resolve
      1. org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
      1 frame
    16. Apache Hadoop HDFS
      DistributedFileSystem.create
      1. org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:383)
      2. org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:327)
      2 frames
    17. Hadoop
      FileSystem.create
      1. org.apache.hadoop.fs.FileSystem.create(FileSystem.java:906)
      2. org.apache.hadoop.fs.FileSystem.create(FileSystem.java:887)
      3. org.apache.hadoop.fs.FileSystem.create(FileSystem.java:784)
      4. org.apache.hadoop.fs.FileSystem.create(FileSystem.java:773)
      4 frames
    18. HBase
      FSUtils.checkVersion
      1. org.apache.hadoop.hbase.util.FSUtils.setVersion(FSUtils.java:650)
      2. org.apache.hadoop.hbase.util.FSUtils.setVersion(FSUtils.java:628)
      3. org.apache.hadoop.hbase.util.FSUtils.checkVersion(FSUtils.java:585)
      3 frames
    19. HBase - Client
      HMaster$1.run
      1. org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:436)
      2. org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:145)
      3. org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:125)
      4. org.apache.hadoop.hbase.master.HMaster.finishActiveMasterInitialization(HMaster.java:591)
      5. org.apache.hadoop.hbase.master.HMaster.access$500(HMaster.java:165)
      6. org.apache.hadoop.hbase.master.HMaster$1.run(HMaster.java:1425)
      6 frames
    20. Java RT
      Thread.run
      1. java.lang.Thread.run(Thread.java:745)
      1 frame
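
Per the frames above, the trace bottoms out in FSUtils.setVersion, which creates the hbase.version marker file under the HBase root directory through FileSystem.create, so any client-side create under that directory exercises the same NameNode check. A minimal reproduction sketch outside HBase (the fs.defaultFS value and the /hbase root directory are assumptions; adjust to your hbase.rootdir):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HBaseRootCreateProbe {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020");   // placeholder: your NameNode
        Path probe = new Path("/hbase/.tmp/version-probe"); // assumes the default hbase.rootdir

        FileSystem fs = FileSystem.get(conf);
        // With a client that predates encryption zones, this create fails with the same
        // RemoteException(UnknownCryptoProtocolVersionException) once the NameNode
        // enforces crypto protocol versions for the path (transparent encryption configured).
        try (FSDataOutputStream out = fs.create(probe, true)) {
            out.writeUTF("probe");
        }
        System.out.println("create succeeded, cleaning up");
        fs.delete(probe, false);
        fs.close();
    }
}
```

If the probe succeeds when run with the cluster's own Hadoop jars but fails against HBase's bundled ones, the problem is the classpath rather than the HBase configuration.
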