java.lang.RuntimeException: error communicating via Thrift

Stack Overflow | freeza | 5 months ago
  1. Implementing OLAP on titan/cassandra graph

     Stack Overflow | 5 months ago | freeza
     java.lang.RuntimeException: error communicating via Thrift

  2. No local connection available in map reduce job

     Google Groups | 5 years ago | Gabriel Ki
     java.lang.RuntimeException: java.lang.UnsupportedOperationException: no local connection available

  3. Exception in Hadoop Word Count sample

     Google Groups | 5 years ago | Tharindu Mathew
     java.lang.RuntimeException: java.lang.UnsupportedOperationException: no local connection available

  4. Faunus + ( Cassandra with Credentials )

     Google Groups | 3 years ago | Suhale MK
     java.lang.RuntimeException: unable to load keyspace titan

  5. Bundling jars when submitting map/reduce jobs via Pig?

     Stack Overflow | 6 years ago | cdecker
     java.lang.RuntimeException: Could not resolve error that occured when launching map reduce job: java.lang.NoClassDefFoundError: org/apache/thrift/TBase

    Root Cause Analysis

    java.lang.RuntimeException: error communicating via Thrift
        at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$RowIterator.<init>(ColumnFamilyRecordReader.java:267)
        at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$RowIterator.<init>(ColumnFamilyRecordReader.java:215)
        at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$StaticRowIterator.<init>(ColumnFamilyRecordReader.java:331)
        at org.apache.cassandra.hadoop.ColumnFamilyRecordReader$StaticRowIterator.<init>(ColumnFamilyRecordReader.java:331)
        at org.apache.cassandra.hadoop.ColumnFamilyRecordReader.initialize(ColumnFamilyRecordReader.java:171)
        at com.thinkaurelius.titan.hadoop.formats.cassandra.CassandraBinaryRecordReader.initialize(CassandraBinaryRecordReader.java:39)
        at com.thinkaurelius.titan.hadoop.formats.util.GiraphRecordReader.initialize(GiraphRecordReader.java:38)
        at org.apache.spark.rdd.NewHadoopRDD$$anon$1.<init>(NewHadoopRDD.scala:135)
        at org.apache.spark.rdd.NewHadoopRDD.compute(NewHadoopRDD.scala:107)
        at org.apache.spark.rdd.NewHadoopRDD.compute(NewHadoopRDD.scala:69)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:280)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:247)
        at org.apache.spark.rdd.MappedRDD.compute(MappedRDD.scala:31)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:280)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:247)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        at org.apache.spark.scheduler.Task.run(Task.scala:56)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:200)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
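    The trace shows the failure happens when a Spark executor's record reader tries to open a Thrift connection to Cassandra, so "error communicating via Thrift" typically means the Thrift RPC endpoint is unreachable (or mis-addressed) from the worker nodes, not just from the driver. Below is a minimal sketch of the kind of Hadoop-graph properties file used for Titan 1.0 OLAP over Cassandra; the hostname, port, and partitioner values are placeholder assumptions, not taken from the trace, and must match the actual cluster.

    ```properties
    # Sketch of a TinkerPop HadoopGraph configuration for Titan-on-Cassandra OLAP.
    # Hostname, port, and partitioner below are illustrative placeholders.
    gremlin.graph=org.apache.tinkerpop.gremlin.hadoop.structure.HadoopGraph
    gremlin.hadoop.graphInputFormat=com.thinkaurelius.titan.hadoop.formats.cassandra.CassandraInputFormat
    gremlin.hadoop.graphOutputFormat=org.apache.tinkerpop.gremlin.hadoop.structure.io.gryo.GryoOutputFormat

    # Storage settings forwarded to the input format. The Thrift RPC address
    # must be reachable from every Spark executor, not only the submitting host.
    titanmr.ioformat.conf.storage.backend=cassandrathrift
    titanmr.ioformat.conf.storage.hostname=127.0.0.1
    titanmr.ioformat.conf.storage.port=9160

    # Must agree with the partitioner configured in cassandra.yaml.
    cassandra.input.partitioner.class=org.apache.cassandra.dht.Murmur3Partitioner
    ```

    If the partitioner or RPC address here disagrees with the Cassandra cluster, the split enumeration or row iteration fails in ColumnFamilyRecordReader exactly as in the trace above; also verify that `start_rpc: true` is set in cassandra.yaml, since the Thrift server is disabled by default in later Cassandra releases.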