There are no available Samebug tips for this exception. Do you have an idea how to solve this issue? A short tip would help users who saw this issue last week.

  • GitHub comment 171#248970021
    via GitHub by lokm01
    • org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 123.0 failed 1 times, most recent failure: Lost task 1.0 in stage 123.0 (TID 131, localhost): java.lang.ArrayIndexOutOfBoundsException: 65536
          at com.databricks.spark.xml.XmlRecordReader.readUntilEndElement(XmlInputFormat.scala:194)
          at com.databricks.spark.xml.XmlRecordReader.nextKeyValue(XmlInputFormat.scala:128)
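One observation that may serve as a tip: the out-of-bounds index 65536 equals 0x10000, the first Unicode code point outside the Basic Multilingual Plane. Such characters occupy two `char`s (a surrogate pair) in a JVM `String`, so any lookup table sized for 65536 BMP characters overflows when indexed by a supplementary code point. Whether that is what happens inside `XmlRecordReader` is an assumption, not confirmed by this trace, but the sketch below shows why exactly this index value arises on the JVM:

```java
public class BmpIndexDemo {
    public static void main(String[] args) {
        // U+1F600 (an emoji) is the kind of supplementary character that
        // could plausibly trigger the failure above -- an assumption for
        // illustration, not taken from the original report.
        String emoji = new String(Character.toChars(0x1F600));

        // Supplementary code points need a surrogate pair: two chars.
        System.out.println(emoji.length());        // prints 2

        int cp = emoji.codePointAt(0);             // 128512 (0x1F600)

        // A lookup table sized for the BMP only holds indices 0..65535,
        // so indexing it with any code point >= 65536 throws
        // ArrayIndexOutOfBoundsException -- the smallest such code point
        // is exactly 65536, the value seen in the trace.
        int[] bmpTable = new int[65536];
        System.out.println(cp >= bmpTable.length); // prints true
    }
}
```

If this reading is right, a short-term workaround would be to strip or replace non-BMP characters in the input XML before parsing, or to upgrade to a spark-xml release where the character handling has been fixed; both are suggestions under the stated assumption, not a confirmed fix.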