javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

Cloudera Open Source | Lenni Kuff | 4 years ago
  1. 0

    Queries occasionally fail targeting impalad when running against a secure environment. This seems to be because the instance is unable to talk to other impalads in the cluster due to its ticket expiring. These are short-running queries (they should complete in under 10 seconds).
    {code}
    Tuple(id=0 size=36 slots=[Slot(id=0 type=INT col=0 offset=4 null=(offset=0 mask=1)), Slot(id=2 type=INT col=2 offset=8 null=(offset=0 mask=2)), Slot(id=4 type=INT col=4 offset=12 null=(offset=0 mask=4)), Slot(id=6 type=INT col=7 offset=16 null=(offset=0 mask=8)), Slot(id=10 type=INT col=10 offset=20 null=(offset=0 mask=10)), Slot(id=11 type=FLOAT col=12 offset=24 null=(offset=0 mask=20)), Slot(id=12 type=FLOAT col=19 offset=28 null=(offset=0 mask=40)), Slot(id=13 type=FLOAT col=13 offset=32 null=(offset=0 mask=80))])
    Tuple(id=1 size=12 slots=[Slot(id=1 type=INT col=0 offset=4 null=(offset=0 mask=1)), Slot(id=17 type=INT col=6 offset=8 null=(offset=0 mask=2))])
    Tuple(id=2 size=24 slots=[Slot(id=3 type=INT col=0 offset=4 null=(offset=0 mask=1)), Slot(id=8 type=STRING col=1 offset=8 null=(offset=0 mask=2))])
    Tuple(id=3 size=56 slots=[Slot(id=5 type=INT col=0 offset=4 null=(offset=0 mask=1)), Slot(id=14 type=STRING col=1 offset=8 null=(offset=0 mask=2)), Slot(id=15 type=STRING col=2 offset=24 null=(offset=0 mask=4)), Slot(id=16 type=STRING col=3 offset=40 null=(offset=0 mask=8))])
    Tuple(id=4 size=24 slots=[Slot(id=7 type=INT col=0 offset=4 null=(offset=0 mask=1)), Slot(id=9 type=STRING col=24 offset=8 null=(offset=0 mask=2))])
    Tuple(id=5 size=104 slots=[Slot(id=18 type=STRING col=-1 offset=72 null=(offset=0 mask=10)), Slot(id=19 type=STRING col=-1 offset=88 null=(offset=0 mask=20)), Slot(id=20 type=BIGINT col=-1 offset=8 null=(offset=0 mask=1)), Slot(id=21 type=BIGINT col=-1 offset=16 null=(offset=0 mask=0)), Slot(id=22 type=DOUBLE col=-1 offset=24 null=(offset=0 mask=2)), Slot(id=23 type=BIGINT col=-1 offset=32 null=(offset=0 mask=0)), Slot(id=24 type=DOUBLE col=-1 offset=40 null=(offset=0 mask=4)), Slot(id=25 type=BIGINT col=-1 offset=48 null=(offset=0 mask=0)), Slot(id=26 type=DOUBLE col=-1 offset=56 null=(offset=0 mask=8)), Slot(id=27 type=BIGINT col=-1 offset=64 null=(offset=0 mask=0))])
    E0214 10:00:07.121000 24251 authorization.cc:72] Kerberos: GSSAPI Error: Unspecified GSS failure. Minor code may provide more information (Ticket expired)
    I0214 10:00:07.159731 24251 status.cc:40] Couldn't open transport for 10.20.80.123:22000(SASL(-1): generic failure: GSSAPI Error: Unspecified GSS failure. Minor code may provide more information (Ticket expired))
        @ 0x852262 impala::Status::Status()
        @ 0x818315 impala::ThriftClientImpl::Open()
        @ 0x7b6284 impala::BackendClientCache::GetClient()
        @ 0x9b5cca impala::DataStreamSender::Channel::Init()
        @ 0x9b7cbf impala::DataStreamSender::Init()
        @ 0x7fa8e5 impala::PlanFragmentExecutor::Prepare()
        @ 0x68bf4b impala::ImpalaServer::FragmentExecState::Prepare()
        @ 0x69c89f impala::ImpalaServer::StartPlanFragmentExecution()
        @ 0x69d876 impala::ImpalaServer::ExecPlanFragment()
        @ 0x858cb0 impala::ImpalaInternalServiceProcessor::process_ExecPlanFragment()
        @ 0x852f09 impala::ImpalaInternalServiceProcessor::dispatchCall()
        @ 0x69e73b apache::thrift::TDispatchProcessor::process()
        @ 0x12c977a apache::thrift::server::TThreadedServer::Task::run()
        @ 0x12cbe72 apache::thrift::concurrency::PthreadThread::threadMain()
        @ 0x7ffd037bf7b6 start_thread
        @ 0x7ffd02d829cd clone
    {code}
    I have also seen failures when communicating with the Hive Metastore Service in secure mode. It appears impalad is not kinit'ing frequently enough:
    {code}
    13/02/20 11:52:42 INFO hive.metastore: Trying to connect to metastore with URI thrift://impala-centos57-3.ent.cloudera.com:9083
    13/02/20 11:52:42 DEBUG security.UserGroupInformation: PrivilegedAction as:root (auth:KERBEROS) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
    13/02/20 11:52:42 DEBUG transport.TSaslTransport: opening transport org.apache.thrift.transport.TSaslClientTransport@8dc1f04
    13/02/20 11:52:42 ERROR transport.TSaslTransport: SASL negotiation failure
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:194)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:277)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:163)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1082)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:51)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:61)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2140)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2151)
        at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
        at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
        at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:347)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:706)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
    Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:130)
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:106)
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:172)
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:209)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:195)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:162)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:175)
        ... 40 more
    {code}
    Version: impalad version 0.6 RELEASE (build ac205b9d3c02cbc82f306d68fa7633790fb7a6ad)

    Cloudera Open Source | 4 years ago | Lenni Kuff
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
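
    The "Ticket expired" / not-kinit'ing-frequently-enough failure reported above is usually mitigated by having the long-running process log in from a keytab and periodically renew its TGT rather than relying on a one-time kinit. Below is a minimal sketch using Hadoop's UserGroupInformation API; the principal, keytab path, and renewal interval are placeholders, not values taken from the report.
    {code}
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.UserGroupInformation;

    import java.io.IOException;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class KeytabRelogin {
        public static void main(String[] args) throws IOException {
            // Enable Kerberos authentication for the Hadoop security layer.
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);

            // Hypothetical principal and keytab path; substitute your cluster's values.
            UserGroupInformation.loginUserFromKeytab(
                    "impala/host.example.com@EXAMPLE.COM", "/etc/impala/conf/impala.keytab");

            // Re-check the TGT well before the ticket lifetime expires so that
            // outgoing SASL/GSSAPI connections always find valid credentials.
            ScheduledExecutorService renewer = Executors.newSingleThreadScheduledExecutor();
            renewer.scheduleAtFixedRate(() -> {
                try {
                    UserGroupInformation.getLoginUser().checkTGTAndReloginFromKeytab();
                } catch (IOException e) {
                    // Log and retry on the next tick; a missed renewal surfaces as
                    // "Failed to find any Kerberos tgt" on the next RPC.
                    e.printStackTrace();
                }
            }, 1, 1, TimeUnit.HOURS);
        }
    }
    {code}
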
  2. 0

    The explore service is unable to start on a secure 5.x cluster. On CDH 5.2 and 5.3, we see the following exception:
    {code}
    2015-03-21 01:31:50,059 - ERROR [ExploreExecutorService STARTING:c.c.c.c.t.AbstractMasterTwillRunnable$1@144] - Service co.cask.cdap.explore.executor.ExploreExecutorService failed
    com.google.common.util.concurrent.UncheckedExecutionException: org.apache.hive.service.ServiceException: Unable to login to kerberos with given principal/keytab
        at com.google.common.util.concurrent.Futures.wrapAndThrowUnchecked(Futures.java:1015) ~[com.google.guava.guava-13.0.1.jar:na]
        at com.google.common.util.concurrent.Futures.getUnchecked(Futures.java:1001) ~[com.google.guava.guava-13.0.1.jar:na]
        at com.google.common.util.concurrent.AbstractService.startAndWait(AbstractService.java:220) ~[com.google.guava.guava-13.0.1.jar:na]
        at com.google.common.util.concurrent.AbstractIdleService.startAndWait(AbstractIdleService.java:106) ~[com.google.guava.guava-13.0.1.jar:na]
        at co.cask.cdap.explore.executor.ExploreExecutorService.startUp(ExploreExecutorService.java:90) ~[co.cask.cdap.cdap-explore-2.8.0-SNAPSHOT.jar:na]
        at com.google.common.util.concurrent.AbstractIdleService$1$1.run(AbstractIdleService.java:43) ~[com.google.guava.guava-13.0.1.jar:na]
        at java.lang.Thread.run(Thread.java:701) ~[na:1.6.0_34]
    Caused by: org.apache.hive.service.ServiceException: Unable to login to kerberos with given principal/keytab
        at org.apache.hive.service.cli.CLIService.init(CLIService.java:88) ~[hive-service-0.13.1-cdh5.3.2.jar:0.13.1-cdh5.3.2]
        at co.cask.cdap.explore.service.hive.BaseHiveExploreService.startUp(BaseHiveExploreService.java:275) ~[co.cask.cdap.cdap-explore-2.8.0-SNAPSHOT.jar:na]
        ... 2 common frames omitted
    Caused by: java.io.IOException: HiveServer2 kerberos principal or keytab is not correctly configured
        at org.apache.hive.service.auth.HiveAuthFactory.loginFromKeytab(HiveAuthFactory.java:183) ~[hive-service-0.13.1-cdh5.3.2.jar:0.13.1-cdh5.3.2]
        at org.apache.hive.service.cli.CLIService.init(CLIService.java:85) ~[hive-service-0.13.1-cdh5.3.2.jar:0.13.1-cdh5.3.2]
        ... 3 common frames omitted
    {code}
    With CDH 5.0, the service starts up but queries don't go through. For example, a 'show tables' query results in the following exception:
    {code}
    2015-03-21 02:22:21,423 - ERROR [pool-27-thread-1:o.a.t.t.TSaslTransport@296] - SASL negotiation failure
    javax.security.sasl.SaslException: GSS initiate failed
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212) ~[na:1.6.0_34]
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[org.apache.thrift.libthrift-0.9.0.jar:0.9.0]
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253) ~[org.apache.thrift.libthrift-0.9.0.jar:0.9.0]
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) ~[org.apache.thrift.libthrift-0.9.0.jar:0.9.0]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at java.security.AccessController.doPrivileged(Native Method) ~[na:1.6.0_34]
        at javax.security.auth.Subject.doAs(Subject.java:416) ~[na:1.6.0_34]
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548) ~[hadoop-common-2.3.0-cdh5.0.0.jar:na]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:288) ~[hive-metastore-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:169) ~[hive-metastore-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.6.0_34]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) ~[na:1.6.0_34]
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.6.0_34]
        at java.lang.reflect.Constructor.newInstance(Constructor.java:534) ~[na:1.6.0_34]
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1161) ~[hive-metastore-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62) ~[hive-metastore-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72) ~[hive-metastore-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2407) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2418) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1141) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1130) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2250) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1485) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1263) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1091) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:926) ~[hive-exec-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:144) ~[hive-service-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:64) ~[hive-service-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:177) ~[hive-service-0.12.0-cdh5.0.0.jar:0.12.0-cdh5.0.0]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) ~[na:1.6.0_34]
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334) ~[na:1.6.0_34]
        at java.util.concurrent.FutureTask.run(FutureTask.java:166) ~[na:1.6.0_34]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146) ~[na:1.6.0_34]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) ~[na:1.6.0_34]
        at java.lang.Thread.run(Thread.java:701) ~[na:1.6.0_34]
    Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) ~[na:1.6.0_34]
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121) ~[na:1.6.0_34]
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[na:1.6.0_34]
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:218) ~[na:1.6.0_34]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:213) ~[na:1.6.0_34]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:180) ~[na:1.6.0_34]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193) ~[na:1.6.0_34]
        ... 40 common frames omitted
    {code}

    Cask Community Issue Tracker | 2 years ago | Albert Shau
    javax.security.sasl.SaslException: GSS initiate failed
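
    The first failure above comes from HiveAuthFactory.loginFromKeytab, which throws when either hive.server2.authentication.kerberos.principal or hive.server2.authentication.kerberos.keytab resolves to an empty or unusable value. A small standalone check along those lines can confirm the principal/keytab pair is usable before starting the service; this is only a sketch, and the principal, realm, and keytab path below are placeholders, not values from the report.
    {code}
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.SecurityUtil;
    import org.apache.hadoop.security.UserGroupInformation;

    import java.io.IOException;
    import java.net.InetAddress;

    public class HiveKerberosConfigCheck {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);

            // The same two settings HiveServer2 reads; empty values trigger
            // "HiveServer2 kerberos principal or keytab is not correctly configured".
            String principalConf = "hive/_HOST@EXAMPLE.COM";   // placeholder
            String keytab = "/etc/hive/conf/hive.keytab";      // placeholder

            // Expand the _HOST token the same way Hadoop services do.
            String principal = SecurityUtil.getServerPrincipal(
                    principalConf, InetAddress.getLocalHost().getCanonicalHostName());

            // If this throws, the keytab does not contain a key for the principal
            // (or the file is unreadable), which matches the startup failure above.
            UserGroupInformation ugi =
                    UserGroupInformation.loginUserFromKeytabAndReturnUGI(principal, keytab);
            System.out.println("Logged in as " + ugi.getUserName()
                    + ", hasKerberosCredentials=" + ugi.hasKerberosCredentials());
        }
    }
    {code}
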
  3. 0

    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

    Stack Overflow | 1 year ago | Shankar
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
  4. 0

    Cascading flow with Hive Partition Tap getting failed with Kerberos Issue

    Google Groups | 1 year ago | Ganesh Auti
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]


    Root Cause Analysis

    1. javax.security.sasl.SaslException

      GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

      at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge()
    2. Java RT
      GssKrb5Client.evaluateChallenge
      1. com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:194)
      1 frame
    3. Apache Thrift
      TSaslClientTransport.open
      1. org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
      2. org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
      3. org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
      3 frames
    4. Hive Shims
      TUGIAssumingTransport$1.run
      1. org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
      2. org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
      2 frames
    5. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:396)
      2 frames
    6. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
      1 frame
    7. Hive Shims
      TUGIAssumingTransport.open
      1. org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
      1 frame
    8. Hive Metastore
      HiveMetaStoreClient.<init>
      1. org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:277)
      2. org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:163)
      2 frames
    9. Java RT
      Constructor.newInstance
      1. sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      2. sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
      3. sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
      4. java.lang.reflect.Constructor.newInstance(Constructor.java:513)
      4 frames
    10. Hive Metastore
      RetryingMetaStoreClient.getProxy
      1. org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1082)
      2. org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:51)
      3. org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:61)
      3 frames
    11. Hive Query Language
      Driver.run
      1. org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2140)
      2. org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2151)
      3. org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
      4. org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
      5. org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
      6. org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
      7. org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
      8. org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
      9. org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
      10. org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
      11. org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
      11 frames
    12. org.apache.hadoop
      CliDriver.main
      1. org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
      2. org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
      3. org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
      4. org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:347)
      5. org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:706)
      6. org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
      6 frames
    13. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      4. java.lang.reflect.Method.invoke(Method.java:597)
      4 frames
    14. Hadoop
      RunJar.main
      1. org.apache.hadoop.util.RunJar.main(RunJar.java:208)
      1 frame
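
    Reading the chain bottom-up: the Hive CLI asks the metastore whether a database exists, the metastore client opens a Kerberized Thrift transport inside UserGroupInformation.doAs, and the JGSS layer then fails because the Subject performing the doAs carries no TGT. A hedged sketch of the pattern the client side relies on follows; the principal, keytab path, and the body of the doAs block are placeholders, not part of the traces above.
    {code}
    import org.apache.hadoop.security.UserGroupInformation;

    import java.io.IOException;
    import java.security.PrivilegedExceptionAction;

    public class DoAsWithTgt {
        public static void main(String[] args) throws Exception {
            // The doAs in the trace only succeeds if the UGI's Subject already holds
            // Kerberos credentials, either from a ticket cache (kinit) or a keytab login.
            UserGroupInformation ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                    "hive/host.example.com@EXAMPLE.COM",   // placeholder principal
                    "/etc/hive/conf/hive.keytab");         // placeholder keytab

            if (!ugi.hasKerberosCredentials()) {
                throw new IOException("No Kerberos TGT in the login Subject; "
                        + "GSSAPI negotiation would fail as in the trace above");
            }

            // Any Thrift/SASL call made inside doAs picks up the TGT from this Subject.
            ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
                // Placeholder for e.g. opening a HiveMetaStoreClient and listing databases.
                System.out.println("running with Kerberos credentials for " + ugi.getUserName());
                return null;
            });
        }
    }
    {code}
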