org.apache.spark.SparkException: Task not serializable

GitHub | jpdna | 9 months ago
  1. adam2vcf Fails with Sample not serializable

     GitHub | 9 months ago | jpdna
     org.apache.spark.SparkException: Task not serializable
  2. GitHub comment 1101#238033481

     GitHub | 9 months ago | jpdna
     org.apache.spark.SparkException: Task not serializable
  3. GitHub comment 1101#238034661

     GitHub | 9 months ago | jpdna
     org.apache.spark.SparkException: Task not serializable
  4. GitHub comment 394#196891797

     GitHub | 1 year ago | timodonnell
     org.apache.spark.SparkException: Task not serializable
  5. spark structured streaming (java): task not serializable

     Stack Overflow | 2 weeks ago | user2221654
     org.apache.spark.SparkException: Task not serializable

Root Cause Analysis

  1. java.io.NotSerializableException

    org.bdgenomics.formats.avro.Sample
    Serialization stack:
    - object not serializable (class: org.bdgenomics.formats.avro.Sample, value: {"sampleId": "HG00096", "name": null, "attributes": {}})
    - writeObject data (class: scala.collection.immutable.$colon$colon)
    - object (class scala.collection.immutable.$colon$colon, List({"sampleId": "HG00096", "name": null, "attributes": {}}))
    - field (class: org.bdgenomics.adam.rdd.variation.VariantContextRDD, name: samples, type: interface scala.collection.Seq)
    - object (class org.bdgenomics.adam.rdd.variation.VariantContextRDD, VariantContextRDD(MapPartitionsRDD[4] at map at GenotypeRDD.scala:62, SequenceDictionary{1->249250621, 2->243199373, 3->198022430, 4->191154276, 5->180915260, 6->171115067, 7->159138663, 8->146364022, 9->141213431, 10->135534747, 11->135006516, 12->133851895, 13->115169878, 14->107349540, 15->102531392, 16->90354753, 17->81195210, 18->78077248, 19->59128983, 20->63025520, 21->48129895, 22->51304566, GL000191.1->106433, GL000192.1->547496, GL000193.1->189789, GL000194.1->191469, GL000195.1->182896, GL000196.1->38914, GL000197.1->37175, GL000198.1->90085, GL000199.1->169874, GL000200.1->187035, GL000201.1->36148, GL000202.1->40103, GL000203.1->37498, GL000204.1->81310, GL000205.1->174588, GL000206.1->41001, GL000207.1->4262, GL000208.1->92689, GL000209.1->159169, GL000210.1->27682, GL000211.1->166566, GL000212.1->186858, GL000213.1->164239, GL000214.1->137718, GL000215.1->172545, GL000216.1->172294, GL000217.1->172149, GL000218.1->161147, GL000219.1->179198, GL000220.1->161802, GL000221.1->155397, GL000222.1->186861, GL000223.1->180455, GL000224.1->179693, GL000225.1->211173, GL000226.1->15008, GL000227.1->128374, GL000228.1->129120, GL000229.1->19913, GL000230.1->43691, GL000231.1->27386, GL000232.1->40652, GL000233.1->45941, GL000234.1->40531, GL000235.1->34474, GL000236.1->41934, GL000237.1->45867, GL000238.1->39939, GL000239.1->33824, GL000240.1->41933, GL000241.1->42152, GL000242.1->43523, GL000243.1->43341, GL000244.1->39929, GL000245.1->36651, GL000246.1->38154, GL000247.1->36422, GL000248.1->39786, GL000249.1->38502, MT->16569, NC_007605->171823, X->155270560, Y->59373566, hs37d5->35477943}, List({"sampleId": "HG00096", "name": null, "attributes": {}})))
    - field (class: org.bdgenomics.adam.rdd.variation.VariantContextRDD$$anonfun$4, name: $outer, type: class org.bdgenomics.adam.rdd.variation.VariantContextRDD)
    - object (class org.bdgenomics.adam.rdd.variation.VariantContextRDD$$anonfun$4, <function2>)

    at org.apache.spark.serializer.SerializationDebugger$.improveException()
  2. Spark
    RDD.mapPartitionsWithIndex
    1. org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
    2. org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
    3. org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:101)
    4. org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:301)
    5. org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:294)
    6. org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
    7. org.apache.spark.SparkContext.clean(SparkContext.scala:2055)
    8. org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1.apply(RDD.scala:742)
    9. org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1.apply(RDD.scala:741)
    10. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    11. org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    12. org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    13. org.apache.spark.rdd.RDD.mapPartitionsWithIndex(RDD.scala:741)
    13 frames
  3. org.bdgenomics.adam
    ADAM2Vcf.run
    1. org.bdgenomics.adam.rdd.variation.VariantContextRDD.saveAsVcf(VariantContextRDD.scala:117)
    2. org.bdgenomics.adam.cli.ADAM2Vcf.run(ADAM2Vcf.scala:83)
    2 frames
  4. org.bdgenomics.utils
    BDGSparkCommand$class.run
    1. org.bdgenomics.utils.cli.BDGSparkCommand$class.run(BDGCommand.scala:55)
    1 frame
  5. org.bdgenomics.adam
    ADAMMain.main
    1. org.bdgenomics.adam.cli.ADAM2Vcf.run(ADAM2Vcf.scala:59)
    2. org.bdgenomics.adam.cli.ADAMMain.apply(ADAMMain.scala:131)
    3. org.bdgenomics.adam.cli.ADAMMain$.main(ADAMMain.scala:71)
    4. org.bdgenomics.adam.cli.ADAMMain.main(ADAMMain.scala)
    4 frames
  6. Java RT
    Method.invoke
    1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    4. java.lang.reflect.Method.invoke(Method.java:497)
    4 frames
  7. Spark
    SparkSubmit.main
    1. org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    2. org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    3. org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    4. org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    5. org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    5 frames