org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out

Cloudera Open Source | ping | 12 months ago
Here are the best solutions we found on the Internet.
  1. RE: Sqoop hive import with "as-parquetfile" failed in Kerberos enabled cluster

     sqoop-user | 1 year ago | Jordan Birdsell
     org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out
  2. Re: Sqoop hive import with "as-parquetfile" failed in Kerberos enabled cluster

     sqoop-user | 1 year ago | suraj shrestha
     org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out
  3. Parquet support in Sqoop, added via SQOOP-1390, uses the Kite SDK. When I ran the following Sqoop command in a Kerberos environment, the MapReduce job failed:

     sqoop import --connect jdbc:db2://xxx:50000/testdb --username xxx --password xxx --table users --hive-import -hive-table users3 --as-parquetfile -m 1

     The import job failed with:

     ......
     2016-02-26 04:20:07,020 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Using mapred newApiCommitter.
     2016-02-26 04:20:08,088 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter set in config null
     2016-02-26 04:20:08,918 INFO [main] hive.metastore: Trying to connect to metastore with URI thrift://xxx:9083
     2016-02-26 04:30:09,207 WARN [main] hive.metastore: set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
     org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out
         at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
         at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
         at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
         at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
         at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3688)
         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3674)
         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:448)
         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:237)
         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:182)
         at org.kitesdk.data.spi.hive.MetaStoreUtil.<init>(MetaStoreUtil.java:82)
         at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.getMetaStoreUtil(HiveAbstractMetadataProvider.java:63)
         at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:270)
         at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:255)
         at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.load(HiveAbstractMetadataProvider.java:102)
         at org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository.load(FileSystemDatasetRepository.java:192)
         at org.kitesdk.data.Datasets.load(Datasets.java:108)
         at org.kitesdk.data.Datasets.load(Datasets.java:165)
         at org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.load(DatasetKeyOutputFormat.java:510)
         at org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.getOutputCommitter(DatasetKeyOutputFormat.java:473)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:476)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:458)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.callWithJobClassLoader(MRAppMaster.java:1560)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:458)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:377)
         at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1518)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:422)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1515)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1448)
     .......

     I found Kite bug KITE-1014, "Fix support for Hive datasets on Kerberos enabled clusters", fixed in version 1.1.0. After Sqoop picked up the new Kite version, it still failed with the same error.
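The set_ugi() timeout above is the classic symptom of a non-SASL Thrift client talking to a SASL/Kerberos-secured metastore: the server never answers the unauthenticated set_ugi() call, so the read blocks until the socket timeout fires. As a hedged sketch (the property names are standard Hive metastore settings, but the principal and URI values below are placeholders, not values taken from this cluster), the client-side hive-site.xml that the Kite MetaStoreUtil would need looks roughly like:

```xml
<!-- Sketch of client-side hive-site.xml for a Kerberos-secured metastore.
     Principal and URI values are placeholders. -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://xxx:9083</value>
  </property>
  <property>
    <!-- Without this set to true, the client skips the SASL handshake
         and set_ugi() hangs against a secured metastore. -->
    <name>hive.metastore.sasl.enabled</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.metastore.kerberos.principal</name>
    <value>hive/_HOST@EXAMPLE.COM</value>
  </property>
</configuration>
```

If this configuration never reaches the code that constructs the HiveMetaStoreClient (as appears to be the case inside the Kite SDK here), the client falls back to the plain transport and the timeout above results.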
     Then I tried adding the Hive configuration on the Sqoop side and passing it to org.kitesdk.data.spi.hive.MetaStoreUtil. That makes the error above go away, but a new problem occurs; the Kite Hive layer still does not support Kerberos correctly. Please take a look at , thanks!

     ... ...
     2016-03-24 21:21:12,647 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: hadoop login
     2016-03-24 21:21:12,649 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: hadoop login commit
     2016-03-24 21:21:12,650 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: using kerberos user:ambari-qa@XXX.COM
     2016-03-24 21:21:12,650 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: Using user: "ambari-qa@XXX.COM" with name ambari-qa@XXX.COM
     2016-03-24 21:21:12,650 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: User entry: "ambari-qa@XXX.COM"
     2016-03-24 21:21:12,657 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: UGI loginUser:ambari-qa@XXX.COM (auth:KERBEROS)
     2016-03-24 21:21:12,657 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Executing with tokens:
     2016-03-24 21:21:12,657 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind: YARN_AM_RM_TOKEN, Service: , Ident: (appAttemptId { application_id { id: 6 cluster_timestamp: 1458832712880 } attemptId: 1 } keyId: 1898169565)
     2016-03-24 21:21:12,752 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind: HDFS_DELEGATION_TOKEN, Service: 9.30.151.107:8020, Ident: (HDFS_DELEGATION_TOKEN token 38 for ambari-qa)
     2016-03-24 21:21:12,753 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind: TIMELINE_DELEGATION_TOKEN, Service: 9.30.151.107:8188, Ident: (owner=ambari-qa, renewer=yarn, realUser=, issueDate=1458879665842, maxDate=1459484465842, sequenceNumber=32, masterKeyId=37)
     2016-03-24 21:21:12,672 DEBUG [TGT Renewer for ambari-qa@XXX.COM] org.apache.hadoop.security.UserGroupInformation: Found tgt Ticket (hex) = ... ...
     2016-03-24 21:21:13,728 DEBUG [main] org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository: Loading dataset: hhh555
     2016-03-24 21:21:13,823 INFO [main] hive.metastore: Trying to connect to metastore with URI thrift://xxx:9083
     2016-03-24 21:21:13,830 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:ambari-qa (auth:SIMPLE) from:org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:403)
     2016-03-24 21:21:13,858 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:ambari-qa (auth:SIMPLE) from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
     2016-03-24 21:21:13,858 DEBUG [main] org.apache.thrift.transport.TSaslTransport: opening transport org.apache.thrift.transport.TSaslClientTransport@25f7391e
     2016-03-24 21:21:13,864 ERROR [main] org.apache.thrift.transport.TSaslTransport: SASL negotiation failure
     javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
         at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
         at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
         at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
         at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
         at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
         at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:422)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
         at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:432)
         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:237)
         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:182)
         at org.kitesdk.data.spi.hive.MetaStoreUtil.<init>(MetaStoreUtil.java:89)
         at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.getMetaStoreUtil(HiveAbstractMetadataProvider.java:63)
         at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:270)
         at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:255)
         at org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.load(HiveAbstractMetadataProvider.java:102)
         at org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository.load(FileSystemDatasetRepository.java:197)
         at org.kitesdk.data.Datasets.load(Datasets.java:108)
         at org.kitesdk.data.Datasets.load(Datasets.java:165)
         at org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.load(DatasetKeyOutputFormat.java:542)
         at org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.getOutputCommitter(DatasetKeyOutputFormat.java:505)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:476)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:458)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.callWithJobClassLoader(MRAppMaster.java:1560)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:458)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:377)
         at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1518)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:422)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1515)
         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1448)
     Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
         at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
         at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
         at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
         at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
         at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
         at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
         at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
         ... 34 more
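The "Failed to find any Kerberos tgt" GSSException means the thread performing the SASL handshake has no ticket-granting ticket in its credential cache; note that the PrivilegedAction above runs as auth:SIMPLE rather than auth:KERBEROS. A quick way to check this from the shell before launching the job is sketched below (the keytab path and principal are placeholders, not values from this cluster):

```shell
# Check whether a valid Kerberos TGT is visible to this process.
# The keytab path and principal in the hint are placeholders.
if command -v klist >/dev/null 2>&1; then
  if klist -s; then
    # klist -s exits 0 only when the cache holds a non-expired TGT.
    echo "TGT present"
  else
    echo "No valid TGT; try: kinit -kt /etc/security/keytabs/user.keytab user@EXAMPLE.COM"
  fi
else
  echo "klist not installed; cannot check the ticket cache"
fi
```

In a YARN application master, however, a shell-level TGT is not enough: the process that opens the metastore connection must itself log in (or run inside a doAs of a Kerberos-authenticated UGI), which is exactly what the Kite code path above does not appear to do.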

    Cloudera Open Source | 12 months ago | ping
    org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out


Root Cause Analysis

  1. org.apache.thrift.transport.TTransportException

    java.net.SocketTimeoutException: Read timed out

    at org.apache.thrift.transport.TIOStreamTransport.read()
  2. Apache Thrift
    TServiceClient.receiveBase
    1. org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
    2. org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
    3. org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
    4. org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
    5. org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
    5 frames
  3. Hive Metastore
    HiveMetaStoreClient.<init>
    1. org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3688)
    2. org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3674)
    3. org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:448)
    4. org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:237)
    5. org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:182)
    5 frames
  4. org.kitesdk.data
    HiveAbstractMetadataProvider.load
    1. org.kitesdk.data.spi.hive.MetaStoreUtil.<init>(MetaStoreUtil.java:82)
    2. org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.getMetaStoreUtil(HiveAbstractMetadataProvider.java:63)
    3. org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:270)
    4. org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.resolveNamespace(HiveAbstractMetadataProvider.java:255)
    5. org.kitesdk.data.spi.hive.HiveAbstractMetadataProvider.load(HiveAbstractMetadataProvider.java:102)
    5 frames
  5. Kite Data Core Module
    Datasets.load
    1. org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository.load(FileSystemDatasetRepository.java:192)
    2. org.kitesdk.data.Datasets.load(Datasets.java:108)
    3. org.kitesdk.data.Datasets.load(Datasets.java:165)
    3 frames
  6. org.kitesdk.data
    DatasetKeyOutputFormat.getOutputCommitter
    1. org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.load(DatasetKeyOutputFormat.java:510)
    2. org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.getOutputCommitter(DatasetKeyOutputFormat.java:473)
    2 frames
  7. hadoop-mapreduce-client-app
    MRAppMaster.serviceInit
    1. org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:476)
    2. org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:458)
    3. org.apache.hadoop.mapreduce.v2.app.MRAppMaster.callWithJobClassLoader(MRAppMaster.java:1560)
    4. org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:458)
    5. org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:377)
    5 frames
  8. Hadoop
    AbstractService.init
    1. org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    1 frame
  9. hadoop-mapreduce-client-app
    MRAppMaster$4.run
    1. org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1518)
    1 frame
  10. Java RT
    Subject.doAs
    1. java.security.AccessController.doPrivileged(Native Method)
    2. javax.security.auth.Subject.doAs(Subject.java:422)
    2 frames
  11. Hadoop
    UserGroupInformation.doAs
    1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    1 frame
  12. hadoop-mapreduce-client-app
    MRAppMaster.main
    1. org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1515)
    2. org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1448)
    2 frames