javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

Cloudera Open Source | ping | 8 months ago
  1.

    Parquet support in Sqoop (SQOOP-1390) is built on the Kite SDK. When I ran the following Sqoop command in a Kerberos environment, the MapReduce job failed:

    sqoop import --connect jdbc:db2://xxx:50000/testdb --username xxx --password xxx --table users --hive-import -hive-table users3 --as-parquetfile -m 1

    The import job failed:

    ......
    2016-02-26 04:20:07,020 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Using mapred newApiCommitter.
    2016-02-26 04:20:08,088 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter set in config null
    2016-02-26 04:20:08,918 INFO [main] hive.metastore: Trying to connect to metastore with URI thrift://xxx:9083
    2016-02-26 04:30:09,207 WARN [main] hive.metastore: set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
    org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
        at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
        at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3688)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3674)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:448)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:237)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:182)
        at org.kitesdk.data.spi.hive.MetaStoreUtil.<init>(MetaStoreUtil.java:82)
        at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.getMetaStoreUtil(HiveAbstractMetadataProvider.java:63)
        at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:270)
        at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:255)
        at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.load(HiveAbstractMetadataProvider.java:102)
        at org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository.load(FileSystemDatasetRepository.java:192)
        at org.kitesdk.data.Datasets.load(Datasets.java:108)
        at org.kitesdk.data.Datasets.load(Datasets.java:165)
        at org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.load(DatasetKeyOutputFormat.java:510)
        at org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.getOutputCommitter(DatasetKeyOutputFormat.java:473)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:476)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:458)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.callWithJobClassLoader(MRAppMaster.java:1560)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:458)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:377)
        at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1518)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1515)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1448)
    .......

    I found a Kite bug, KITE-1014 ("Fix support for Hive datasets on Kerberos enabled clusters"), fixed in version 1.1.0. After Sqoop picked up the new Kite version, it still failed with the same error. I then tried adding the Hive configuration on the Sqoop side and passing it to kitesdk.data.spi.hive.MetaStoreUtil. That made the error above go away, but a new problem occurred; it seems the Kite Hive support still does not handle Kerberos correctly (compare the working-client sketch after this report). Please take a look, thanks!

    ... ...
    2016-03-24 21:21:12,647 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: hadoop login
    2016-03-24 21:21:12,649 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: hadoop login commit
    2016-03-24 21:21:12,650 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: using kerberos user:ambari-qa@XXX.COM
    2016-03-24 21:21:12,650 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: Using user: "ambari-qa@XXX.COM" with name ambari-qa@XXX.COM
    2016-03-24 21:21:12,650 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: User entry: "ambari-qa@XXX.COM"
    2016-03-24 21:21:12,657 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: UGI loginUser:ambari-qa@XXX.COM (auth:KERBEROS)
    2016-03-24 21:21:12,657 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Executing with tokens:
    2016-03-24 21:21:12,657 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind: YARN_AM_RM_TOKEN, Service: , Ident: (appAttemptId { application_id { id: 6 cluster_timestamp: 1458832712880 } attemptId: 1 } keyId: 1898169565)
    2016-03-24 21:21:12,752 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind: HDFS_DELEGATION_TOKEN, Service: 9.30.151.107:8020, Ident: (HDFS_DELEGATION_TOKEN token 38 for ambari-qa)
    2016-03-24 21:21:12,753 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind: TIMELINE_DELEGATION_TOKEN, Service: 9.30.151.107:8188, Ident: (owner=ambari-qa, renewer=yarn, realUser=, issueDate=1458879665842, maxDate=1459484465842, sequenceNumber=32, masterKeyId=37)
    2016-03-24 21:21:12,672 DEBUG [TGT Renewer for ambari-qa@XXX.COM] org.apache.hadoop.security.UserGroupInformation: Found tgt Ticket (hex) = ... ...
    2016-03-24 21:21:13,728 DEBUG [main] org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository: Loading dataset: hhh555
    2016-03-24 21:21:13,823 INFO [main] hive.metastore: Trying to connect to metastore with URI thrift://xxx:9083
    2016-03-24 21:21:13,830 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:ambari-qa (auth:SIMPLE) from:org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:403)
    2016-03-24 21:21:13,858 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:ambari-qa (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
    2016-03-24 21:21:13,858 DEBUG [main] org.apache.thrift.transport.TSaslTransport: opening transport org.apache.thrift.transport.TSaslClientTransport@25f7391e
    2016-03-24 21:21:13,864 ERROR [main] org.apache.thrift.transport.TSaslTransport: SASL negotiation failure
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:432)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:237)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:182)
        at org.kitesdk.data.spi.hive.MetaStoreUtil.<init>(MetaStoreUtil.java:89)
        at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.getMetaStoreUtil(HiveAbstractMetadataProvider.java:63)
        at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:270)
        at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:255)
        at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.load(HiveAbstractMetadataProvider.java:102)
        at org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository.load(FileSystemDatasetRepository.java:197)
        at org.kitesdk.data.Datasets.load(Datasets.java:108)
        at org.kitesdk.data.Datasets.load(Datasets.java:165)
        at org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.load(DatasetKeyOutputFormat.java:542)
        at org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.getOutputCommitter(DatasetKeyOutputFormat.java:505)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:476)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:458)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.callWithJobClassLoader(MRAppMaster.java:1560)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:458)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:377)
        at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1518)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1515)
        at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1448)
    Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
        ... 34 more

    Cloudera Open Source | 8 months ago | ping
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
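
    The tell-tale detail in the second log is auth:SIMPLE on the PrivilegedAction lines: by the time Kite opens the metastore client inside the MR AppMaster, the current UGI carries no Kerberos credentials, so the GSS handshake has no TGT to present. For comparison, here is a minimal, self-contained sketch (not the reporter's patch; the principal, keytab path, metastore URI, and server principal are placeholders) of a client setup that does satisfy the SASL/GSSAPI handshake, assuming Hadoop 2.x / Hive 1.x APIs:

    {code}
    // Minimal sketch only -- NOT the fix discussed above. All credential and
    // connection values are placeholders.
    import java.security.PrivilegedExceptionAction;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hive.conf.HiveConf;
    import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KerberizedMetastoreProbe {

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Without this, UGI falls back to SIMPLE auth -- exactly what the
            // "PrivilegedAction as:ambari-qa (auth:SIMPLE)" lines show.
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);

            // Placeholder credentials; in the failing AppMaster no such login
            // was established, hence "Failed to find any Kerberos tgt".
            UserGroupInformation.loginUserFromKeytab(
                    "ambari-qa@XXX.COM",
                    "/etc/security/keytabs/smokeuser.headless.keytab");

            HiveConf hiveConf = new HiveConf(conf, KerberizedMetastoreProbe.class);
            hiveConf.setVar(HiveConf.ConfVars.METASTOREURIS, "thrift://xxx:9083");
            // hive.metastore.sasl.enabled plus the metastore's own principal
            // are required for a kerberized metastore (placeholder realm).
            hiveConf.setBoolVar(HiveConf.ConfVars.METASTORE_USE_THRIFT_SASL, true);
            hiveConf.setVar(HiveConf.ConfVars.METASTORE_KERBEROS_PRINCIPAL,
                    "hive/_HOST@XXX.COM");

            // Open the client inside doAs() so the Kerberos Subject is on the
            // call stack when TSaslClientTransport starts the GSS handshake.
            UserGroupInformation.getLoginUser().doAs(
                    (PrivilegedExceptionAction<Void>) () -> {
                        HiveMetaStoreClient client = new HiveMetaStoreClient(hiveConf);
                        System.out.println("databases: " + client.getAllDatabases());
                        client.close();
                        return null;
                    });
        }
    }
    {code}

    The essential points are the explicit keytab login and opening the client inside doAs(), so that a Kerberos Subject is associated with the thread when TSaslClientTransport.open() runs.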
  2.

    [jira] [Commented] (ATLAS-381) HiveMetaStoreBridge will not connect to a kerberized hive metastore

    atlas-dev | 12 months ago | Tom Beerbower (JIRA)
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
  3.

    Queries occasionally fail targeting impalad when running against a secure environment. This seems to be because the instance is unable to talk to other impalads in the cluster due to its ticket expiring. These are short-running queries (they should complete in < 10 seconds):

    {code}
    Tuple(id=0 size=36 slots=[Slot(id=0 type=INT col=0 offset=4 null=(offset=0 mask=1)), Slot(id=2 type=INT col=2 offset=8 null=(offset=0 mask=2)), Slot(id=4 type=INT col=4 offset=12 null=(offset=0 mask=4)), Slot(id=6 type=INT col=7 offset=16 null=(offset=0 mask=8)), Slot(id=10 type=INT col=10 offset=20 null=(offset=0 mask=10)), Slot(id=11 type=FLOAT col=12 offset=24 null=(offset=0 mask=20)), Slot(id=12 type=FLOAT col=19 offset=28 null=(offset=0 mask=40)), Slot(id=13 type=FLOAT col=13 offset=32 null=(offset=0 mask=80))])
    Tuple(id=1 size=12 slots=[Slot(id=1 type=INT col=0 offset=4 null=(offset=0 mask=1)), Slot(id=17 type=INT col=6 offset=8 null=(offset=0 mask=2))])
    Tuple(id=2 size=24 slots=[Slot(id=3 type=INT col=0 offset=4 null=(offset=0 mask=1)), Slot(id=8 type=STRING col=1 offset=8 null=(offset=0 mask=2))])
    Tuple(id=3 size=56 slots=[Slot(id=5 type=INT col=0 offset=4 null=(offset=0 mask=1)), Slot(id=14 type=STRING col=1 offset=8 null=(offset=0 mask=2)), Slot(id=15 type=STRING col=2 offset=24 null=(offset=0 mask=4)), Slot(id=16 type=STRING col=3 offset=40 null=(offset=0 mask=8))])
    Tuple(id=4 size=24 slots=[Slot(id=7 type=INT col=0 offset=4 null=(offset=0 mask=1)), Slot(id=9 type=STRING col=24 offset=8 null=(offset=0 mask=2))])
    Tuple(id=5 size=104 slots=[Slot(id=18 type=STRING col=-1 offset=72 null=(offset=0 mask=10)), Slot(id=19 type=STRING col=-1 offset=88 null=(offset=0 mask=20)), Slot(id=20 type=BIGINT col=-1 offset=8 null=(offset=0 mask=1)), Slot(id=21 type=BIGINT col=-1 offset=16 null=(offset=0 mask=0)), Slot(id=22 type=DOUBLE col=-1 offset=24 null=(offset=0 mask=2)), Slot(id=23 type=BIGINT col=-1 offset=32 null=(offset=0 mask=0)), Slot(id=24 type=DOUBLE col=-1 offset=40 null=(offset=0 mask=4)), Slot(id=25 type=BIGINT col=-1 offset=48 null=(offset=0 mask=0)), Slot(id=26 type=DOUBLE col=-1 offset=56 null=(offset=0 mask=8)), Slot(id=27 type=BIGINT col=-1 offset=64 null=(offset=0 mask=0))])
    E0214 10:00:07.121000 24251 authorization.cc:72] Kerberos: GSSAPI Error: Unspecified GSS failure.  Minor code may provide more information (Ticket expired)
    I0214 10:00:07.159731 24251 status.cc:40] Couldn't open transport for 10.20.80.123:22000(SASL(-1): generic failure: GSSAPI Error: Unspecified GSS failure.  Minor code may provide more information (Ticket expired))
        @ 0x852262 impala::Status::Status()
        @ 0x818315 impala::ThriftClientImpl::Open()
        @ 0x7b6284 impala::BackendClientCache::GetClient()
        @ 0x9b5cca impala::DataStreamSender::Channel::Init()
        @ 0x9b7cbf impala::DataStreamSender::Init()
        @ 0x7fa8e5 impala::PlanFragmentExecutor::Prepare()
        @ 0x68bf4b impala::ImpalaServer::FragmentExecState::Prepare()
        @ 0x69c89f impala::ImpalaServer::StartPlanFragmentExecution()
        @ 0x69d876 impala::ImpalaServer::ExecPlanFragment()
        @ 0x858cb0 impala::ImpalaInternalServiceProcessor::process_ExecPlanFragment()
        @ 0x852f09 impala::ImpalaInternalServiceProcessor::dispatchCall()
        @ 0x69e73b apache::thrift::TDispatchProcessor::process()
        @ 0x12c977a apache::thrift::server::TThreadedServer::Task::run()
        @ 0x12cbe72 apache::thrift::concurrency::PthreadThread::threadMain()
        @ 0x7ffd037bf7b6 start_thread
        @ 0x7ffd02d829cd clone
    {code}

    I have also seen this cause failures when communicating with the Hive Metastore Service in secure mode. It appears impalad is not kinit'ing frequently enough (see the relogin sketch after this report):

    {code}
    13/02/20 11:52:42 INFO hive.metastore: Trying to connect to metastore with URI thrift://impala-centos57-3.ent.cloudera.com:9083
    13/02/20 11:52:42 DEBUG security.UserGroupInformation: PrivilegedAction as:root (auth:KERBEROS) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
    13/02/20 11:52:42 DEBUG transport.TSaslTransport: opening transport org.apache.thrift.transport.TSaslClientTransport@8dc1f04
    13/02/20 11:52:42 ERROR transport.TSaslTransport: SASL negotiation failure
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:194)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:277)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:163)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1082)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:51)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:61)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2140)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2151)
        at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
        at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
        at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:347)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:706)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:130)
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:106)
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:172)
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:209)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:195)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:162)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:175)
        ... 40 more
    {code}

    Version: impalad version 0.6 RELEASE (build ac205b9d3c02cbc82f306d68fa7633790fb7a6ad)

    Cloudera Open Source | 4 years ago | Lenni Kuff
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
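
    For JVM clients that hit the same "Ticket expired" failure, the usual remedy is to refresh a keytab-based login before issuing remote calls. The loop below is a generic sketch of that pattern (Impala's own fix lives in its C++ daemon, so this is only the Java analogue of "kinit'ing frequently enough"; the principal, keytab path, and workload are placeholders):

    {code}
    // Generic relogin sketch, assuming Hadoop 2.x UGI and placeholder
    // principal/keytab values. Not Impala's fix.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.UserGroupInformation;

    public class TgtRefreshExample {

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);
            UserGroupInformation.loginUserFromKeytab(
                    "svc-user@EXAMPLE.COM", "/etc/security/keytabs/svc-user.keytab");

            UserGroupInformation ugi = UserGroupInformation.getLoginUser();
            while (true) {
                // No-op while the TGT is fresh; re-logins from the keytab when
                // the ticket nears expiry, so RPCs never present a stale TGT.
                ugi.checkTGTAndReloginFromKeytab();
                doSomeRemoteCall(); // hypothetical workload under the login user
                Thread.sleep(60_000L);
            }
        }

        private static void doSomeRemoteCall() {
            // Placeholder for metastore or other RPC work.
        }
    }
    {code}

    UserGroupInformation.checkTGTAndReloginFromKeytab() is cheap when the ticket is still fresh, which is why it is safe to call before every batch of remote calls.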

  4.

    Hue + Sentry : Bad status: 3 (Unsupported mechanism type PLAIN) - Grokbase

    grokbase.com | 6 months ago
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]


    Root Cause Analysis

    1. javax.security.sasl.SaslException

      GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

      at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge()
    2. Java RT
      GssKrb5Client.evaluateChallenge
      1. com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
      1 frame
    3. Apache Thrift
      TSaslClientTransport.open
      1. org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
      2. org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
      3. org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
      3 frames
    4. Hive Shims
      TUGIAssumingTransport$1.run
      1. org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
      2. org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
      2 frames
    5. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:422)
      2 frames
    6. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
      1 frame
    7. Hive Shims
      TUGIAssumingTransport.open
      1. org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
      1 frame
    8. Hive Metastore
      HiveMetaStoreClient.<init>
      1. org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:432)
      2. org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:237)
      3. org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:182)
      3 frames
    9. org.kitesdk.data
      HiveAbstractMetadataProvider.load
      1. org.kitesdk.data.spi.hive.MetaStoreUtil.<init>(MetaStoreUtil.java:89)
      2. org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.getMetaStoreUtil(HiveAbstractMetadataProvider.java:63)
      3. org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:270)
      4. org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:255)
      5. org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.load(HiveAbstractMetadataProvider.java:102)
      5 frames
    10. Kite Data Core Module
      Datasets.load
      1. org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository.load(FileSystemDatasetRepository.java:197)
      2. org.kitesdk.data.Datasets.load(Datasets.java:108)
      3. org.kitesdk.data.Datasets.load(Datasets.java:165)
      3 frames
    11. org.kitesdk.data
      DatasetKeyOutputFormat.getOutputCommitter
      1. org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.load(DatasetKeyOutputFormat.java:542)
      2. org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.getOutputCommitter(DatasetKeyOutputFormat.java:505)
      2 frames
    12. hadoop-mapreduce-client-app
      MRAppMaster.serviceInit
      1. org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:476)
      2. org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:458)
      3. org.apache.hadoop.mapreduce.v2.app.MRAppMaster.callWithJobClassLoader(MRAppMaster.java:1560)
      4. org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:458)
      5. org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:377)
      5 frames
    13. Hadoop
      AbstractService.init
      1. org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
      1 frame
    14. hadoop-mapreduce-client-app
      MRAppMaster$4.run
      1. org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1518)
      1 frame
    15. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:422)
      2 frames
    16. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
      1 frame
    17. hadoop-mapreduce-client-app
      MRAppMaster.main
      1. org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1515)
      2. org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1448)
      2 frames
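
    Every report on this page bottoms out in the same two frames: GssKrb5Client.evaluateChallenge() fails because Krb5InitCredential finds no TGT in the current Subject. A quick way to check which side of that line a process is on, using standard Hadoop UGI calls (a diagnostic sketch, not part of any fix above), is shown below; running the JVM with -Dsun.security.krb5.debug=true additionally prints a verbose trace of the GSS negotiation.

    {code}
    // Diagnostic sketch: inspect the process's Kerberos state before any
    // Thrift transport is opened.
    import org.apache.hadoop.security.UserGroupInformation;

    public class CredentialCheck {

        public static void main(String[] args) throws Exception {
            UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
            System.out.println("user = " + ugi.getUserName());
            // KERBEROS is required here; SIMPLE reproduces the failures above.
            System.out.println("auth = " + ugi.getAuthenticationMethod());
            // false means the SASL/GSSAPI handshake has no TGT to present.
            System.out.println("hasKerberosCredentials = " + ugi.hasKerberosCredentials());
        }
    }
    {code}

    If auth prints SIMPLE, or hasKerberosCredentials prints false, the SASL negotiation will fail the same way the traces above do, before any bytes reach the metastore.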