java.lang.RuntimeException: error communicating via Thrift


Solutions on the web (16,303)

  • via Stack Overflow by freeza, 11 months ago:
    error communicating via Thrift
  • Could not resolve error that occurred when launching map reduce job: java.lang.NoClassDefFoundError: org/apache/thrift/TBase
  • via Google Groups by ron cai, 9 months ago:
    java.lang.Exception: Thrift error for org.apache.tephra.distributed.TransactionServiceClient$2@273e2fa4: Unable to discover tx service.
  • Stack trace

    java.lang.RuntimeException: error communicating via Thrift
        at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$RowIterator.<init>(ColumnFamilyRecordReader.java:267)
        at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$RowIterator.<init>(ColumnFamilyRecordReader.java:215)
        at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$StaticRowIterator.<init>(ColumnFamilyRecordReader.java:331)
        at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$StaticRowIterator.<init>(ColumnFamilyRecordReader.java:331)
        at org.apache.cassandra.hadoop.ColumnFamilyRecordReader.initialize(ColumnFamilyRecordReader.java:171)
        at com.thinkaurelius.titan.hadoop.formats.cassandra.CassandraBinaryRecordReader.initialize(CassandraBinaryRecordReader.java:39)
        at com.thinkaurelius.titan.hadoop.formats.util.GiraphRecordReader.initialize(GiraphRecordReader.java:38)
        at org.apache.spark.rdd.NewHadoopRDD$$anon$1.<init>(NewHadoopRDD.scala:135)
        at org.apache.spark.rdd.NewHadoopRDD.compute(NewHadoopRDD.scala:107)
        at org.apache.spark.rdd.NewHadoopRDD.compute(NewHadoopRDD.scala:69)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:280)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:247)
        at org.apache.spark.rdd.MappedRDD.compute(MappedRDD.scala:31)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:280)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:247)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        at org.apache.spark.scheduler.Task.run(Task.scala:56)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:200)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
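
    The frames above show ColumnFamilyRecordReader opening its Thrift connection from the Hadoop job configuration when a Spark/Titan task initializes the input split. A common cause of "error communicating via Thrift" at this point is a Thrift host/port mismatch or a disabled Thrift server on the Cassandra side. Below is a minimal, hedged sketch of that Hadoop-side configuration; the class name, host, port, keyspace, and column family are placeholders, not values taken from this trace.

        import org.apache.cassandra.hadoop.ConfigHelper;
        import org.apache.hadoop.conf.Configuration;

        // Sketch only: builds the Hadoop Configuration that ColumnFamilyRecordReader
        // reads before opening its Thrift connection to Cassandra.
        public class CassandraThriftInputConfig {
            public static Configuration build() {
                Configuration conf = new Configuration();

                // Must match rpc_address / rpc_port in cassandra.yaml, and the Thrift
                // server (start_rpc) must be enabled; a mismatch here typically
                // surfaces as "error communicating via Thrift" during RowIterator init.
                ConfigHelper.setInputInitialAddress(conf, "127.0.0.1");   // placeholder host
                ConfigHelper.setInputRpcPort(conf, "9160");               // placeholder port

                // Keyspace / column family the job reads from (placeholders).
                ConfigHelper.setInputColumnFamily(conf, "my_keyspace", "my_cf");

                // Must match the partitioner of the target cluster.
                ConfigHelper.setInputPartitioner(conf, "org.apache.cassandra.dht.Murmur3Partitioner");

                return conf;
            }
        }

    This Configuration would then be the one handed to the job (or to the NewHadoopRDD seen in the trace); verifying that a plain Thrift client can reach the same host and port from the executor machines is a quick way to isolate network or configuration problems.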
