org.apache.spark.SparkException: Checkpoint RDD ReliableCheckpointRDD[232] at aggregate at ALS.scala:1182(0) has different number of partitions from original RDD itemFactors-10 MapPartitionsRDD[230] at mapValues at ALS.scala:1131(18)

Stack Overflow | Hsiang | 4 months ago
  1. Spark MLlib error on trainimplicit

    Stack Overflow | 4 months ago | Hsiang
    org.apache.spark.SparkException: Checkpoint RDD ReliableCheckpointRDD[232] at aggregate at ALS.scala:1182(0) has different number of partitions from original RDD itemFactors-10 MapPartitionsRDD[230] at mapValues at ALS.scala:1131(18)
  2. Checkpoint RDD ReliableCheckpointRDD has different number of partitions from original RDD

    Stack Overflow | 1 year ago | Soumitra
    org.apache.spark.SparkException: Checkpoint RDD ReliableCheckpointRDD[11] at print at StatefulNetworkWordCount.scala:78(1) has different number of partitions from original RDD MapPartitionsRDD[10] at updateStateByKey at StatefulNetworkWordCount.scala:76(2)
  3. SparkStreaming with Tachyon - I still get BlockNotFoundException !

    Google Groups | 2 years ago | Dibyendu Bhattacharya
    org.apache.spark.SparkException: Could not read data from write ahead log record FileBasedWriteAheadLogSegment(tachyon-ft://10.252.5.113:19998/tachyon/checkpoint/receivedData/2/log-1431341091711-1431341151711,645603894,10891919) at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDD.org$apache$spark$streaming$rdd$WriteAheadLogBackedBlockRDD$$getBlockFromWriteAheadLog$1(WriteAheadLogBackedBlockRDD.scala:144)
  4. SparkStreaming with Tachyon - I still get BlockNotFoundException !

    Google Groups | 2 years ago | Dibyendu Bhattacharya
    org.apache.spark.SparkException: Could not read data from write ahead log record FileBasedWriteAheadLogSegment(tachyon-ft://10.252.5.113:19998/tachyon/checkpoint/receivedData/2/log-1431341091711-1431341151711,645603894,10891919) at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDD.org$apache$spark$streaming$rdd$WriteAheadLogBackedBlockRDD$$getBlockFromWriteAheadLog$1(WriteAheadLogBackedBlockRDD.scala:144) at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDD$$anonfun$compute$1.apply(WriteAheadLogBackedBlockRDD.scala:168) at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDD$$anonfun$compute$1.apply(WriteAheadLogBackedBlockRDD.scala:168)
  5. SparkStreaming with Tachyon - I still get BlockNotFoundException !

    Google Groups | 2 years ago | Dibyendu Bhattacharya
    org.apache.spark.SparkException: Could not read data from write ahead log record FileBasedWriteAheadLogSegment(tachyon-ft://10.252.5.113:19998/tachyon/checkpoint/receivedData/2/log-1431341091711-1431341151711,645603894,10891919) at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDD.org$apache$spark$streaming$rdd$WriteAheadLogBackedBlockRDD$$getBlockFromWriteAheadLog$1(WriteAheadLogBackedBlockRDD.scala:144) at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDD$$anonfun$compute$1.apply(WriteAheadLogBackedBlockRDD.scala:168) at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDD$$anonfun$compute$1.apply(WriteAheadLogBackedBlockRDD.scala:168)
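    A common thread in the matches above is checkpoint data on unreliable storage. When Spark reloads a ReliableCheckpointRDD, it derives the partition count from the part-files it finds in the checkpoint directory, so files that were lost or never written surface later as a partition-count mismatch. A minimal Python sketch of that recovery step (hypothetical file names and helper; not Spark code):

    ```python
    import re

    def recovered_partition_count(checkpoint_files):
        """Count partitions the way a checkpoint reader might:
        one 'part-NNNNN' file per partition of the original RDD."""
        part = re.compile(r"^part-(\d{5})$")
        indices = sorted(int(m.group(1)) for f in checkpoint_files
                         if (m := part.match(f)))
        return len(indices)

    # All three part-files present: recovered count matches the original RDD.
    ok = ["part-00000", "part-00001", "part-00002", "_SUCCESS"]
    print(recovered_partition_count(ok))    # 3

    # One file lost on flaky storage: the checkpoint RDD now reports fewer
    # partitions than the original, which trips the SparkException later.
    lost = ["part-00000", "part-00002", "_SUCCESS"]
    print(recovered_partition_count(lost))  # 2
    ```

    Checking that the checkpoint directory actually contains one part-file per partition is therefore a reasonable first diagnostic step.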


    Root Cause Analysis

    1. org.apache.spark.SparkException

      Checkpoint RDD ReliableCheckpointRDD[232] at aggregate at ALS.scala:1182(0) has different number of partitions from original RDD itemFactors-10 MapPartitionsRDD[230] at mapValues at ALS.scala:1131(18)

      at org.apache.spark.rdd.ReliableRDDCheckpointData.doCheckpoint()
    2. Spark
      1. org.apache.spark.rdd.ReliableRDDCheckpointData.doCheckpoint(ReliableRDDCheckpointData.scala:73)
      2. org.apache.spark.rdd.RDDCheckpointData.checkpoint(RDDCheckpointData.scala:74)
      3. org.apache.spark.rdd.RDD$$anonfun$doCheckpoint$1.apply$mcV$sp(RDD.scala:1655)
      4. org.apache.spark.rdd.RDD$$anonfun$doCheckpoint$1.apply(RDD.scala:1652)
      4 frames
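
    The failing frame, ReliableRDDCheckpointData.doCheckpoint, enforces a simple invariant: the RDD read back from the checkpoint directory must have the same number of partitions as the RDD it replaces. A rough Python rendering of that guard (names and message format are illustrative, not Spark's API):

    ```python
    class SparkException(Exception):
        pass

    def do_checkpoint(original_partitions, checkpoint_partitions,
                      original_name, checkpoint_name):
        """Mimic the guard in ReliableRDDCheckpointData.doCheckpoint:
        refuse to swap in a checkpoint RDD whose partition count
        differs from the original RDD's."""
        if len(checkpoint_partitions) != len(original_partitions):
            raise SparkException(
                f"Checkpoint RDD {checkpoint_name}({len(checkpoint_partitions)}) "
                f"has different number of partitions from original RDD "
                f"{original_name}({len(original_partitions)})")
        return checkpoint_partitions

    # 18 original partitions but 0 recovered from the checkpoint
    # directory -- the same shape as the ALS failure above.
    try:
        do_checkpoint(list(range(18)), [],
                      "itemFactors-10", "ReliableCheckpointRDD[232]")
    except SparkException as e:
        print(e)
    ```

    In the ALS trace at the top of the page, the checkpoint RDD reports 0 partitions against the original's 18, which points at the checkpoint write rather than the ALS computation itself.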