java.lang.IllegalArgumentException: String tag does not have length() == 2:

GitHub | akiezun | 7 months ago

BwaSpark blows up on some reads

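In the SAM format, optional fields are TAG:TYPE:VALUE triples whose TAG must be exactly two characters; htsjdk.samtools.SAMTagUtil.makeBinaryTag rejects anything else, and the message ending in a bare colon suggests the tag pulled out of the offending read was empty. A minimal sketch of that check, assuming only a recent htsjdk on the classpath (the record and tag values are illustrative, not taken from the failing run):

    import htsjdk.samtools.SAMFileHeader;
    import htsjdk.samtools.SAMRecord;

    public class TagLengthDemo {
        public static void main(String[] args) {
            // An otherwise empty record is enough to exercise the tag-name check.
            SAMRecord record = new SAMRecord(new SAMFileHeader());

            // Valid SAM optional tags are exactly two characters, e.g. "NM".
            record.setAttribute("NM", 0);

            // Any other length is rejected inside SAMTagUtil.makeBinaryTag with
            // "String tag does not have length() == 2: <tag>".
            record.setAttribute("X", 1);   // throws IllegalArgumentException
        }
    }

A sketch that reproduces the same failure through the SAMLineParser path shown in the trace follows the frames below.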

    Root Cause Analysis

    1. java.lang.IllegalArgumentException

      String tag does not have length() == 2:

      at htsjdk.samtools.SAMTagUtil.makeBinaryTag()
    2. HTSJDK
      SAMLineParser.parseLine
      1. htsjdk.samtools.SAMTagUtil.makeBinaryTag(SAMTagUtil.java:100)
      2. htsjdk.samtools.SAMRecord.setAttribute(SAMRecord.java:1364)
      3. htsjdk.samtools.SAMLineParser.parseTag(SAMLineParser.java:436)
      4. htsjdk.samtools.SAMLineParser.parseLine(SAMLineParser.java:346)
      5. htsjdk.samtools.SAMLineParser.parseLine(SAMLineParser.java:213)
      5 frames
    3. org.broadinstitute.hellbender
      BwaSparkEngine.lambda$alignWithBWA$464b6154$1
      1. org.broadinstitute.hellbender.tools.spark.bwa.BwaSparkEngine.lambda$alignWithBWA$464b6154$1(BwaSparkEngine.java:75)
      1 frame
    4. Spark
      JavaPairRDD$$anonfun$toScalaFunction$1.apply
      1. org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:1015)
      1 frame
    5. Scala
      Iterator$$anon$11.next
      1. scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
      2. scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
      3. scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
      4. scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
      4 frames
    6. Spark
      Executor$TaskRunner.run
      1. org.apache.spark.util.random.SamplingUtils$.reservoirSampleAndCount(SamplingUtils.scala:42)
      2. org.apache.spark.RangePartitioner$$anonfun$9.apply(Partitioner.scala:261)
      3. org.apache.spark.RangePartitioner$$anonfun$9.apply(Partitioner.scala:259)
      4. org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$22.apply(RDD.scala:745)
      5. org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$22.apply(RDD.scala:745)
      6. org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
      7. org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
      8. org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
      9. org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
      10. org.apache.spark.scheduler.Task.run(Task.scala:89)
      11. org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
      11 frames
    7. Java RT
      Thread.run
      1. java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      2. java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      3. java.lang.Thread.run(Thread.java:745)
      3 frames
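
The frames above show the exception surfacing while BwaSparkEngine.alignWithBWA converts BWA's SAM output back into SAMRecords: SAMLineParser.parseLine splits each optional field on ':' and hands the tag to SAMRecord.setAttribute, so one truncated or malformed optional field on a single read is enough to fail the whole Spark task. A hedged reproduction of that path, assuming htsjdk's SAMLineParser(SAMFileHeader) constructor and hand-written SAM lines (the reads below are made up, not taken from the issue):

    import htsjdk.samtools.SAMFileHeader;
    import htsjdk.samtools.SAMLineParser;
    import htsjdk.samtools.SAMRecord;

    public class ParseLineDemo {
        public static void main(String[] args) {
            SAMLineParser parser = new SAMLineParser(new SAMFileHeader());

            // Eleven mandatory tab-separated SAM fields, then optional TAG:TYPE:VALUE fields.
            String ok = "read1\t4\t*\t0\t0\t*\t*\t0\t0\tACGT\tFFFF\tNM:i:0";
            SAMRecord record = parser.parseLine(ok);   // parses cleanly

            // The same read with a damaged optional field (nothing before the first
            // ':') walks the frames above: parseTag -> SAMRecord.setAttribute ->
            // SAMTagUtil.makeBinaryTag -> IllegalArgumentException.
            String bad = "read1\t4\t*\t0\t0\t*\t*\t0\t0\tACGT\tFFFF\t:i:0";
            parser.parseLine(bad);
        }
    }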