java.lang.IllegalArgumentException: Can't zip RDDs with unequal numbers of partitions

github.com | 7 months ago
  1. eco-release-metadata/RELEASENOTES.1.2.0.md at master · aw-was-here/eco-release-metadata · GitHub
     github.com | 7 months ago
     java.lang.IllegalArgumentException: Can't zip RDDs with unequal numbers of partitions

  2. zipPartitions issue for reads data on cluster
     GitHub | 1 year ago | erictu
     java.lang.IllegalArgumentException: Can't zip RDDs with unequal numbers of partitions

  3. [SPARK-3561] Initial commit to provide pluggable strategy to facilitate access to nat... by olegz · Pull Request #2849 · apache/spark · GitHub
     github.com | 2 years ago
     java.lang.IllegalArgumentException: Can't zip RDDs with unequal numbers of partitions

    Root Cause Analysis

    java.lang.IllegalArgumentException: Can't zip RDDs with unequal numbers of partitions
      at org.apache.spark.rdd.ZippedPartitionsBaseRDD.getPartitions(ZippedPartitionsRDD.scala:56)
      at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
      at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
      at scala.Option.getOrElse(Option.scala:120)
      at org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
      at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
      at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
      at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
      at scala.Option.getOrElse(Option.scala:120)
      at org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
      at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
      at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
      at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
      at scala.Option.getOrElse(Option.scala:120)
      at org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
      at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
      at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
      at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
      at scala.Option.getOrElse(Option.scala:120)
      at org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
      at org.apache.spark.ShuffleDependency.<init>(Dependency.scala:79)
      at org.apache.spark.rdd.ShuffledRDD.getDependencies(ShuffledRDD.scala:80)
      at org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:191)
      at org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:189)
      at scala.Option.getOrElse(Option.scala:120)
      at org.apache.spark.rdd.RDD.dependencies(RDD.scala:189)
      at org.apache.spark.scheduler.DAGScheduler.visit$1(DAGScheduler.scala:298)
      at org.apache.spark.scheduler.DAGScheduler.getParentStages(DAGScheduler.scala:310)
      at org.apache.spark.scheduler.DAGScheduler.newStage(DAGScheduler.scala:246)
      at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:723)
      at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1333)
      at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
      at akka.actor.ActorCell.invoke(ActorCell.scala:456)
      at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
      at akka.dispatch.Mailbox.run(Mailbox.scala:219)
      at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
      at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
      at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
      at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
      at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
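    The trace shows the check firing in ZippedPartitionsBaseRDD.getPartitions: RDD.zip and zipPartitions pair partitions index-by-index, so every RDD involved must have exactly the same number of partitions (and, for zip, the same number of elements per partition). Transformations like filter preserve the partition count, but coalesce/repartition or reading from differently-split inputs can change it on one side only; the usual fix is to repartition one RDD to match the other, or to key the records and use join instead. A minimal Python model of the check (illustrative only, not Spark's actual code; zip_rdd_partitions is a hypothetical helper):

    ```python
    def zip_rdd_partitions(left_partitions, right_partitions):
        """Pair partition lists index-by-index, the way RDD.zip does.

        Raises ValueError with Spark's message when the counts differ.
        This is a toy model of the guard in
        ZippedPartitionsBaseRDD.getPartitions, not Spark itself.
        """
        if len(left_partitions) != len(right_partitions):
            raise ValueError(
                "Can't zip RDDs with unequal numbers of partitions")
        return list(zip(left_partitions, right_partitions))

    # Equal partition counts zip fine...
    ok = zip_rdd_partitions([[1, 2], [3]], [["a"], ["b", "c"]])

    # ...but anything that leaves one side with a different count
    # (e.g. a repartition applied to only one RDD) reproduces the error.
    try:
        zip_rdd_partitions([[1, 2], [3]], [["a", "b", "c"]])
    except ValueError as e:
        print(e)  # Can't zip RDDs with unequal numbers of partitions
    ```

    In real Spark code the same idea means checking rdd1.partitions.length against rdd2.partitions.length before zipping, and calling repartition on one of them when they disagree.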