java.lang.reflect.InvocationTargetException

incubator-sentry-dev | Vivek Shrivastava | 2 years ago

  1. Re: Unable to run latest Sentry build with Hive 1.1.0

    incubator-sentry-dev | 2 years ago | Vivek Shrivastava
    java.lang.reflect.InvocationTargetException
  2. How to configure Apache NiFi for a Kerberized Hadoop Cluster

    Stack Overflow | 5 months ago | pavan
    org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Could not open client transport with JDBC Uri: jdbc:hive2://ddas1106a:10000/innovate: Peer indicated failure: Unsupported mechanism type PLAIN)
  3. GitHub comment 59#42850805

    GitHub | 3 years ago | prabhunkl
    java.lang.RuntimeException: java.sql.SQLException: Could not open connection to jdbc:hive2://168.69.200.211:10000/default: Peer indicated failure: Unsupported mechanism type PLAIN
  4. Beeline fails when transportMode is passed with "?" instead of ";"

    {noformat}
    beeline -u 'jdbc:hive2://localhost:10001/default?httpPath=/;transportMode=http' -n hdiuser
    scan complete in 15ms
    Connecting to jdbc:hive2://localhost:10001/default?httpPath=/;transportMode=http
    Java heap space
    Beeline version 0.14.0.2.2.4.1-1 by Apache Hive
    0: jdbc:hive2://localhost:10001/default (closed)> ^C
    hdiuser@headnode0:~$
    {noformat}
    But it works if I use the deprecated param:
    {noformat}
    hdiuser@headnode0:~$ beeline -u 'jdbc:hive2://localhost:10001/default?hive.server2.transport.mode=http;httpPath=/' -n hdiuser
    scan complete in 12ms
    Connecting to jdbc:hive2://localhost:10001/default?hive.server2.transport.mode=http;httpPath=/
    15/04/28 23:16:46 [main]: WARN jdbc.Utils: ***** JDBC param deprecation *****
    15/04/28 23:16:46 [main]: WARN jdbc.Utils: The use of hive.server2.transport.mode is deprecated.
    15/04/28 23:16:46 [main]: WARN jdbc.Utils: Please use transportMode like so: jdbc:hive2://<host>:<port>/dbName;transportMode=<transport_mode_value>
    Connected to: Apache Hive (version 0.14.0.2.2.4.1-1)
    Driver: Hive JDBC (version 0.14.0.2.2.4.1-1)
    Transaction isolation: TRANSACTION_REPEATABLE_READ
    Beeline version 0.14.0.2.2.4.1-1 by Apache Hive
    0: jdbc:hive2://localhost:10001/default> show tables;
    +------------------+--+
    |     tab_name     |
    +------------------+--+
    | hivesampletable  |
    +------------------+--+
    1 row selected (18.181 seconds)
    0: jdbc:hive2://localhost:10001/default> ^C
    hdiuser@headnode0:~$ ^C
    {noformat}
    The reason for the above message is that the URL is wrong. The correct one:
    {noformat}
    beeline -u 'jdbc:hive2://localhost:10001/default;httpPath=/;transportMode=http' -n hdiuser
    {noformat}
    Note the ";" instead of "?". The deprecation message prints the expected format as well:
    {noformat}
    Please use transportMode like so: jdbc:hive2://<host>:<port>/dbName;transportMode=<transport_mode_value>
    {noformat}
    A short JDBC sketch of the corrected ";" URL form follows this results list.

    Apache's JIRA Issue Tracker | 2 years ago | Hari Sankar Sivarama Subramaniyan
    java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10001/default?httpPath=/;transportMode=http: Invalid status 72
  5. hive

    solveseek.com | 2 years ago
    org.apache.thrift.transport.TTransportException: No common protection layer between client and server

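    The fix in result 4 comes down to separating hive2 JDBC session parameters with ";" after the
    database name instead of appending them as a "?" query string. Below is a minimal Java sketch of
    the same corrected URL form, assuming a HiveServer2 reachable at localhost:10001 in HTTP transport
    mode and the Hive JDBC driver on the classpath; host, port, database, and user are placeholders
    taken from the beeline session above, not from the original report.

    {noformat}
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveHttpJdbcSketch {
        public static void main(String[] args) throws Exception {
            // Session parameters follow the database name and are separated by ';',
            // matching the corrected beeline URL in result 4 (not a '?...' query string).
            String url = "jdbc:hive2://localhost:10001/default;transportMode=http;httpPath=/";

            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(url, "hdiuser", "");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("show tables")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }
    {noformat}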

    Root Cause Analysis

    1. org.apache.thrift.transport.TTransportException
      Peer indicated failure: Problem with callback handler
      at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage()
    2. Apache Thrift
      TSaslClientTransport.open
      1. org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
      2. org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:307)
      3. org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
      3 frames
    3. org.apache.sentry
      SentryPolicyServiceClientDefaultImpl$UgiSaslClientTransport$1.run
      1. org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClientDefaultImpl$UgiSaslClientTransport.baseOpen(SentryPolicyServiceClientDefaultImpl.java:120)
      2. org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClientDefaultImpl$UgiSaslClientTransport.access$000(SentryPolicyServiceClientDefaultImpl.java:79)
      3. org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClientDefaultImpl$UgiSaslClientTransport$1.run(SentryPolicyServiceClientDefaultImpl.java:106)
      4. org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClientDefaultImpl$UgiSaslClientTransport$1.run(SentryPolicyServiceClientDefaultImpl.java:104)
      4 frames
    4. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:415)
      2 frames
    5. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
      1 frame
    6. org.apache.sentry
      SimpleDBProviderBackend.<init>
      1. org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClientDefaultImpl$UgiSaslClientTransport.open(SentryPolicyServiceClientDefaultImpl.java:104)
      2. org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClientDefaultImpl.<init>(SentryPolicyServiceClientDefaultImpl.java:156)
      3. org.apache.sentry.service.thrift.SentryServiceClientFactory.create(SentryServiceClientFactory.java:42)
      4. org.apache.sentry.provider.db.SimpleDBProviderBackend.<init>(SimpleDBProviderBackend.java:53)
      5. org.apache.sentry.provider.db.SimpleDBProviderBackend.<init>(SimpleDBProviderBackend.java:49)
      5 frames
    7. Java RT
      Constructor.newInstance
      1. sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      2. sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
      3. sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      4. java.lang.reflect.Constructor.newInstance(Constructor.java:526)
      4 frames
    8. org.apache.sentry
      HiveAuthzBindingHook.<init>
      1. org.apache.sentry.binding.hive.authz.HiveAuthzBinding.getAuthProvider(HiveAuthzBinding.java:205)
      2. org.apache.sentry.binding.hive.authz.HiveAuthzBinding.<init>(HiveAuthzBinding.java:87)
      3. org.apache.sentry.binding.hive.authz.HiveAuthzBinding.<init>(HiveAuthzBinding.java:79)
      4. org.apache.sentry.binding.hive.HiveAuthzBindingHook.<init>(HiveAuthzBindingHook.java:97)
      4 frames
    9. Java RT
      Class.newInstance
      1. sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      2. sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
      3. sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      4. java.lang.reflect.Constructor.newInstance(Constructor.java:526)
      5. java.lang.Class.newInstance(Class.java:379)
      5 frames
    10. Hive Query Language
      Driver.compileAndRespond
      1. org.apache.hadoop.hive.ql.hooks.HookUtils.getHooks(HookUtils.java:60)
      2. org.apache.hadoop.hive.ql.Driver.getHooks(Driver.java:1297)
      3. org.apache.hadoop.hive.ql.Driver.compile(Driver.java:407)
      4. org.apache.hadoop.hive.ql.Driver.compile(Driver.java:307)
      5. org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1112)
      6. org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1106)
      6 frames
    11. org.apache.hive
      TCLIService$Processor$ExecuteStatement.getResult
      1. org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:101)
      2. org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:172)
      3. org.apache.hive.service.cli.operation.Operation.run(Operation.java:257)
      4. org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:379)
      5. org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:366)
      6. org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:271)
      7. org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:415)
      8. org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
      9. org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
      9 frames
    12. Apache Thrift
      TBaseProcessor.process
      1. org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
      2. org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
      2 frames
    13. Hive Shims
      HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process
      1. org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:692)
      1 frame
    14. Apache Thrift
      TThreadPoolServer$WorkerProcess.run
      1. org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
      1 frame
    15. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
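
    The chain above shows the pattern that fails: the Sentry client wraps a Thrift SASL transport and
    opens it inside UserGroupInformation.doAs (frames 2 through 6), and the server answers the GSSAPI
    (Kerberos) handshake with "Problem with callback handler". The sketch below only illustrates that
    doAs-plus-open pattern; the host, port, and service name are hypothetical placeholders and this is
    not Sentry's actual implementation.

    {noformat}
    import java.security.PrivilegedExceptionAction;

    import org.apache.hadoop.security.UserGroupInformation;
    import org.apache.thrift.transport.TSaslClientTransport;
    import org.apache.thrift.transport.TSocket;
    import org.apache.thrift.transport.TTransport;

    public class SaslOpenSketch {
        // Opens a Kerberos (GSSAPI) Thrift transport as the logged-in Kerberos user.
        static TTransport open(final String host, final int port, final String serviceName)
                throws Exception {
            UserGroupInformation ugi = UserGroupInformation.getLoginUser();
            return ugi.doAs(new PrivilegedExceptionAction<TTransport>() {
                @Override
                public TTransport run() throws Exception {
                    TTransport socket = new TSocket(host, port);
                    TTransport sasl = new TSaslClientTransport(
                            "GSSAPI",     // Kerberos SASL mechanism
                            null,         // authorization id
                            serviceName,  // service component of the server's Kerberos principal
                            host,         // host component of the server's Kerberos principal
                            null,         // no extra SASL properties
                            null,         // GSSAPI clients do not need a callback handler
                            socket);
                    // The handshake happens here; a server-side authentication problem surfaces
                    // as the TTransportException at the top of the trace.
                    sasl.open();
                    return sasl;
                }
            });
        }
    }
    {noformat}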