org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 2.0 (TID 2, 172.26.28.101): scala.MatchError: UUIDType (of class org.apache.spark.sql.cassandra.types.UUIDType$)

Solutions on the web

via DataStax JIRA by Alexander Sedov, 1 year ago
Lost task 0.0 in stage 2.0 (TID 2, 172.26.28.101): scala.MatchError: UUIDType (of class org.apache.spark.sql.cassandra.types.UUIDType$)
via Stack Overflow by jguerra, 2 years ago
Lost task 3.0 in stage 0.0 (TID 6, 161.72.45.76): scala.MatchError: UUIDType (of class org.apache.spark.sql.cassandra.types.UUIDType$)
org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 2.0 (TID 2, 172.26.28.101): scala.MatchError: UUIDType (of class org.apache.spark.sql.cassandra.types.UUIDType$)
at org.apache.spark.sql.execution.SparkSqlSerializer2$$anonfun$createSerializationFunction$1.apply(SparkSqlSerializer2.scala:232)
at org.apache.spark.sql.execution.SparkSqlSerializer2$$anonfun$createSerializationFunction$1.apply(SparkSqlSerializer2.scala:227)
at org.apache.spark.storage.DiskBlockObjectWriter.write(BlockObjectWriter.scala:206)
at org.apache.spark.util.collection.WritablePartitionedIterator$$anon$3.writeNext(WritablePartitionedPairCollection.scala:104)
at org.apache.spark.util.collection.ExternalSorter.spillToPartitionFiles(ExternalSorter.scala:375)
at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:208)
at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:70)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
at org.apache.spark.scheduler.Task.run(Task.scala:70)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
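The MatchError is raised inside SparkSqlSerializer2, the shuffle serializer added in Spark 1.4, whose pattern match covers only the built-in Catalyst types and falls through on the Spark Cassandra Connector's custom UUIDType. Below is a minimal sketch of two commonly suggested workarounds, assuming Spark 1.4.x (where the spark.sql.useSerializer2 flag still exists) and a hypothetical ks.events table with a uuid column named id; adjust the names to your schema.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.functions.col

object UuidMatchErrorWorkaround {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("uuid-workaround")
      // Workaround 1: fall back to the generic SQL serializer so rows
      // containing the connector's UUIDType never reach
      // SparkSqlSerializer2 (flag exists in Spark 1.4.x only).
      .set("spark.sql.useSerializer2", "false")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    val df = sqlContext.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "ks", "table" -> "events")) // hypothetical names
      .load()

    // Workaround 2: cast the uuid column to a string before any
    // operation that shuffles (groupBy/join/sort), so only standard
    // Catalyst types cross the serializer.
    val result = df
      .withColumn("id", col("id").cast("string")) // "id" is a hypothetical uuid column
      .groupBy("id")
      .count()

    result.show()
  }
}

Casting the uuid column to a string before the shuffle keeps custom types out of the serialization path entirely, which also works on Spark versions where the useSerializer2 flag was removed.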
