java.io.IOException: org.apache.spark.SparkException: Failed to get broadcast_1_piece0 of broadcast_1

Stack Overflow | Jim Lee | 3 months ago
  1. ERROR IN Spark:java.lang.ClassCastException: org.apache.spark.SparkContext$$anonfun$36 cannot be cast to org.apache.spark.ShuffleDependency

     Stack Overflow | 3 months ago | Jim Lee
     java.io.IOException: org.apache.spark.SparkException: Failed to get broadcast_1_piece0 of broadcast_1
  2. Reading and writing to Cassandra from Spark worker throws error

     Stack Overflow | 5 months ago | Alex Punnen
     java.io.IOException: org.apache.spark.SparkException: Failed to get broadcast_5_piece0 of broadcast_5
  3. Spark, mail # user - java.io.IOException: org.apache.spark.SparkException: Failed to get broadcast_2_piece0 - 2015-05-06, 15:06

     search-hadoop.com | 3 months ago
     java.io.IOException: org.apache.spark.SparkException: Failed to get broadcast_2_piece0 of broadcast_2
  4. GitHub comment 19#222298002 (see the Parquet footer sketch after this list)

     GitHub | 6 months ago | skoppar
     java.io.IOException: Could not read footer: java.lang.RuntimeException: file:/Users/skoppar/workspace/pyspark-beacon/stream/allproto.log is not a Parquet file. expected magic number at tail [80, 65, 82, 49] but found [55, 73, 67, 10]
        at org.apache.parquet.hadoop.ParquetFileReader.readAllFootersInParallel(ParquetFileReader.java:248)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRelation$$anonfun$24.apply(ParquetRelation.scala:812)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRelation$$anonfun$24.apply(ParquetRelation.scala:801)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$22.apply(RDD.scala:756)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$22.apply(RDD.scala:756)
  5. Can't read parquet with spark2.0

     Google Groups | 4 months ago | Unknown author
     java.io.IOException: Could not read footer for file FileStatus{path=alluxio://master1:9000/tpctest/catalog_sales/_common_metadata; isDirectory=false; length=3654; replication=0; blocksize=0; modification_time=0; access_time=0; owner=; group=; permission=rw-rw-rw-; isSymlink=false}
        at org.apache.parquet.hadoop.ParquetFileReader.readAllFootersInParallel(ParquetFileReader.java:247)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRelation$$anonfun$24.apply(ParquetRelation.scala:812)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetRelation$$anonfun$24.apply(ParquetRelation.scala:801)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$22.apply(RDD.scala:756)
        at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$22.apply(RDD.scala:756)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
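
    Both Parquet reports above fail while reading footers, and in the GitHub one the reason is spelled out: a valid Parquet file must end with the 4-byte magic PAR1 (bytes 80, 65, 82, 49), but allproto.log ends with 55, 73, 67, 10, so Spark is being pointed at a file that is not Parquet at all. The sketch below shows that tail check in plain Scala; it is an illustration only, and the helper name is not taken from any of the reports' code.

        import java.io.RandomAccessFile
        import java.nio.charset.StandardCharsets

        // Illustrative helper (not from the reports): true when a local file ends with
        // the Parquet magic "PAR1", the same bytes ParquetFileReader expects at the tail.
        def looksLikeParquet(path: String): Boolean = {
          val raf = new RandomAccessFile(path, "r")
          try {
            if (raf.length() < 4) false
            else {
              raf.seek(raf.length() - 4)              // the magic sits in the last 4 bytes
              val tail = new Array[Byte](4)
              raf.readFully(tail)
              new String(tail, StandardCharsets.US_ASCII) == "PAR1"
            }
          } finally raf.close()
        }

        // A plain log file such as allproto.log returns false here, matching the
        // "expected magic number at tail [80, 65, 82, 49] but found [55, 73, 67, 10]" error.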


    Root Cause Analysis

    1. java.io.IOException

      org.apache.spark.SparkException: Failed to get broadcast_1_piece0 of broadcast_1

      at org.apache.spark.util.Utils$.tryOrIOException()
    2. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1177)
      2. org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:165)
      3. org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:64)
      4. org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:64)
      5. org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:88)
      6. org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
      7. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:65)
      8. org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
      9. org.apache.spark.scheduler.Task.run(Task.scala:88)
      10. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
      10 frames
    3. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
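
    The trace above dies inside TorrentBroadcast.readBroadcastBlock: a task calls Broadcast.value, the executor tries to fetch the pieces of broadcast_1, and no piece can be found. This is commonly reported when the broadcast's blocks are already gone by the time the task runs, for example because the SparkContext that created the broadcast was stopped, the broadcast was destroyed too early, or more than one SparkContext is in play. The sketch below only illustrates the expected lifecycle in plain Scala; the object name, data, and local master URL are invented for the example.

        import org.apache.spark.{SparkConf, SparkContext}

        object BroadcastLifecycle {
          def main(args: Array[String]): Unit = {
            val conf = new SparkConf().setAppName("broadcast-lifecycle").setMaster("local[2]")
            val sc   = new SparkContext(conf)

            // Registered with this SparkContext; executors fetch its pieces lazily
            // through TorrentBroadcast the first time a task reads .value.
            val lookup = sc.broadcast(Map("a" -> 1, "b" -> 2))

            val sum = sc.parallelize(Seq("a", "b", "a"))
              .map(k => lookup.value.getOrElse(k, 0))   // read .value inside the task
              .reduce(_ + _)
            println(s"sum = $sum")

            // Release resources only after every action that reads the broadcast has
            // finished; stopping the context (or destroying the broadcast) earlier is
            // one way to end up with "Failed to get broadcast_X_pieceY of broadcast_X".
            lookup.destroy()
            sc.stop()
          }
        }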