java.net.SocketTimeoutException: Read timed out

Google Groups | Chanh Le | 4 months ago
  1. Do I need to switch to FT mode?

    Google Groups | 4 months ago | Chanh Le
    java.net.SocketTimeoutException: Read timed out
  2. H060 Unable to open Hive session - Hortonworks

    hortonworks.com | 1 year ago
    org.apache.ambari.view.hive.client.HiveClientException: H060 Unable to open Hive session: org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out
  3. 16/02/05 11:08:29 INFO : Tachyon client (version 0.8.2) is trying to connect with FileSystemMaster master @ /192.168.1.251:19998
     16/02/05 11:08:29 INFO : Client registered with FileSystemMaster master @ /192.168.1.251:19998
     16/02/05 11:08:59 ERROR : java.net.SocketTimeoutException: Read timed out
     tachyon.org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out
         at tachyon.org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
         at tachyon.org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
         at tachyon.org.apache.thrift.transport.TFramedTransport.readFrame(TFramedTransport.java:129)
         at tachyon.org.apache.thrift.transport.TFramedTransport.read(TFramedTransport.java:101)
         at tachyon.org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
         at tachyon.org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
         at tachyon.org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
         at tachyon.org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
         at tachyon.org.apache.thrift.protocol.TProtocolDecorator.readMessageBegin(TProtocolDecorator.java:135)
         at tachyon.org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
         at tachyon.thrift.FileSystemMasterService$Client.recv_create(FileSystemMasterService.java:238)
         at tachyon.thrift.FileSystemMasterService$Client.create(FileSystemMasterService.java:224)
         at tachyon.client.FileSystemMasterClient.create(FileSystemMasterClient.java:239)
         at tachyon.client.file.AbstractTachyonFileSystem.create(AbstractTachyonFileSystem.java:73)
         at tachyon.client.file.TachyonFileSystem.getOutStream(TachyonFileSystem.java:167)
         at tachyon.client.file.TachyonFileSystem.getOutStream(TachyonFileSystem.java:141)
         at com.asksunny.bigdata.tachyon.TachyonPocMain.main(TachyonPocMain.java:30)
     Caused by: java.net.SocketTimeoutException: Read timed out
         at java.net.SocketInputStream.socketRead0(Native Method)
         at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
         at java.net.SocketInputStream.read(SocketInputStream.java:170)
         at java.net.SocketInputStream.read(SocketInputStream.java:141)
         at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
         at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
         at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
         at tachyon.org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
         ... 16 more

    JIRA | 10 months ago | Developer Sunny
    tachyon.org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out
  4. Ambari Hive view timeout, why? - Hortonworks

    hortonworks.com | 3 months ago
    org.apache.ambari.view.hive.client.HiveClientException: H100 Unable to submit statement use pushopdb;: org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out
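Every entry above bottoms out in the same root cause: a blocking socket read exceeded its SO_TIMEOUT, so the JVM threw `java.net.SocketTimeoutException: Read timed out` inside `SocketInputStream.socketRead0`. A minimal, self-contained sketch that reproduces the exception (the class and method names are illustrative, not taken from any of the reports): a server accepts a connection but never writes, and the client's read times out.

```java
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class ReadTimeoutDemo {
    // Connects to a local server that accepts but never sends data, then
    // reads with a short SO_TIMEOUT. The read fails with the same
    // SocketTimeoutException ("Read timed out") seen in the traces above.
    public static boolean readTimesOut(int timeoutMs) throws IOException {
        try (ServerSocket server = new ServerSocket(0);                    // ephemeral port
             Socket client = new Socket("localhost", server.getLocalPort());
             Socket serverSide = server.accept()) {                        // accept, then stay silent
            client.setSoTimeout(timeoutMs);    // read timeout in milliseconds
            client.getInputStream().read();    // blocks until the timeout fires
            return false;                      // unreachable unless data arrives
        } catch (SocketTimeoutException expected) {
            return true;                       // "Read timed out"
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(readTimesOut(200)); // prints "true" after ~200 ms
    }
}
```

In the reports above the read happens inside a Thrift transport (HiveServer2 or the Tachyon/Alluxio master), so the timeout surfaces wrapped in a `TTransportException`; the fix is usually to raise the client's read timeout or shorten the server-side work, not to change the client code.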

Root Cause Analysis

  1. java.net.SocketTimeoutException

    Read timed out

    at java.net.SocketInputStream.socketRead0()
  2. Java RT
    BufferedInputStream.read
    1. java.net.SocketInputStream.socketRead0(Native Method)
    2. java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
    3. java.net.SocketInputStream.read(SocketInputStream.java:170)
    4. java.net.SocketInputStream.read(SocketInputStream.java:141)
    5. java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
    6. java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
    7. java.io.BufferedInputStream.read(BufferedInputStream.java:345)
    7 frames
  3. alluxio.org.apache
    TServiceClient.receiveBase
    1. alluxio.org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
    2. alluxio.org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
    3. alluxio.org.apache.thrift.transport.TFramedTransport.readFrame(TFramedTransport.java:129)
    4. alluxio.org.apache.thrift.transport.TFramedTransport.read(TFramedTransport.java:101)
    5. alluxio.org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
    6. alluxio.org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
    7. alluxio.org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
    8. alluxio.org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
    9. alluxio.org.apache.thrift.protocol.TProtocolDecorator.readMessageBegin(TProtocolDecorator.java:135)
    10. alluxio.org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
    10 frames
  4. alluxio.thrift
    BlockWorkerClientService$Client.cancelBlock
    1. alluxio.thrift.BlockWorkerClientService$Client.recv_cancelBlock(BlockWorkerClientService.java:282)
    2. alluxio.thrift.BlockWorkerClientService$Client.cancelBlock(BlockWorkerClientService.java:268)
    2 frames
  5. alluxio.client.block
    BlockWorkerClient$4.call
    1. alluxio.client.block.BlockWorkerClient$4.call(BlockWorkerClient.java:167)
    2. alluxio.client.block.BlockWorkerClient$4.call(BlockWorkerClient.java:164)
    2 frames
  6. alluxio
    AbstractClient.retryRPC
    1. alluxio.AbstractClient.retryRPC(AbstractClient.java:327)
    1 frame
  7. alluxio.client.block
    RemoteBlockOutStream.cancel
    1. alluxio.client.block.BlockWorkerClient.cancelBlock(BlockWorkerClient.java:164)
    2. alluxio.client.block.RemoteBlockOutStream.cancel(RemoteBlockOutStream.java:65)
    2 frames
  8. alluxio.client.file
    FileInStream.seek
    1. alluxio.client.file.FileInStream.closeOrCancelCacheStream(FileInStream.java:339)
    2. alluxio.client.file.FileInStream.handleCacheStreamIOException(FileInStream.java:397)
    3. alluxio.client.file.FileInStream.read(FileInStream.java:214)
    4. alluxio.client.file.FileInStream.readCurrentBlockToPos(FileInStream.java:617)
    5. alluxio.client.file.FileInStream.seekInternalWithCachingPartiallyReadBlock(FileInStream.java:562)
    6. alluxio.client.file.FileInStream.seek(FileInStream.java:247)
    6 frames
  9. alluxio.hadoop
    HdfsFileInputStream.seek
    1. alluxio.hadoop.HdfsFileInputStream.seek(HdfsFileInputStream.java:324)
    1 frame
  10. Hadoop
    FSDataInputStream.seek
    1. org.apache.hadoop.fs.FSDataInputStream.seek(FSDataInputStream.java:62)
    1 frame
  11. org.apache.parquet
    ParquetFileReader.readFooter
    1. org.apache.parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:417)
    2. org.apache.parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:385)
    2 frames
  12. org.apache.spark
    UnsafeRowParquetRecordReader.tryInitialize
    1. org.apache.spark.sql.execution.datasources.parquet.SpecificParquetRecordReaderBase.initialize(SpecificParquetRecordReaderBase.java:98)
    2. org.apache.spark.sql.execution.datasources.parquet.UnsafeRowParquetRecordReader.initialize(UnsafeRowParquetRecordReader.java:130)
    3. org.apache.spark.sql.execution.datasources.parquet.UnsafeRowParquetRecordReader.tryInitialize(UnsafeRowParquetRecordReader.java:117)
    3 frames
  13. Spark
    CoalescedRDD$$anonfun$compute$1.apply
    1. org.apache.spark.rdd.SqlNewHadoopRDD$$anon$1.<init>(SqlNewHadoopRDD.scala:169)
    2. org.apache.spark.rdd.SqlNewHadoopRDD.compute(SqlNewHadoopRDD.scala:126)
    3. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    4. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    5. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    6. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    7. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    8. org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:87)
    9. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    10. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    11. org.apache.spark.rdd.CoalescedRDD$$anonfun$compute$1.apply(CoalescedRDD.scala:96)
    12. org.apache.spark.rdd.CoalescedRDD$$anonfun$compute$1.apply(CoalescedRDD.scala:95)
    12 frames
  14. Scala
    Iterator$$anon$13.hasNext
    1. scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    1 frame
  15. org.apache.spark
    InsertIntoHadoopFsRelation$$anonfun$run$1$$anonfun$apply$mcV$sp$3.apply
    1. org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer.writeRows(WriterContainer.scala:376)
    2. org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1$$anonfun$apply$mcV$sp$3.apply(InsertIntoHadoopFsRelation.scala:150)
    3. org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1$$anonfun$apply$mcV$sp$3.apply(InsertIntoHadoopFsRelation.scala:150)
    3 frames
  16. Spark
    Executor$TaskRunner.run
    1. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    2. org.apache.spark.scheduler.Task.run(Task.scala:89)
    3. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    3 frames
  17. Java RT
    Thread.run
    1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    3. java.lang.Thread.run(Thread.java:745)
    3 frames