Searched on Google with the first line of a Java stack trace?

We can recommend more relevant solutions and speed up debugging when you paste your entire stack trace with the exception message.

Recommended solutions based on your search

Solutions on the web

via DataStax JIRA by Dmytro Popovych, 1 year ago
scala.collection.immutable.Set$Set2 cannot be cast to scala.collection.Seq
java.lang.ClassCastException: scala.collection.immutable.Set$Set2 cannot be cast to scala.collection.Seq
	at org.apache.spark.sql.catalyst.expressions.Cast$$anonfun$castArray$1$$anonfun$apply$56.apply(Cast.scala:382)
	at org.apache.spark.sql.catalyst.expressions.Cast.org$apache$spark$sql$catalyst$expressions$Cast$$buildCast(Cast.scala:111)
	at org.apache.spark.sql.catalyst.expressions.Cast$$anonfun$castArray$1.apply(Cast.scala:382)
	at org.apache.spark.sql.catalyst.expressions.Cast.eval(Cast.scala:426)
	at org.apache.spark.sql.catalyst.expressions.Alias.eval(namedExpressions.scala:113)
	at org.apache.spark.sql.catalyst.expressions.InterpretedMutableProjection.apply(Projection.scala:68)
	at org.apache.spark.sql.catalyst.expressions.InterpretedMutableProjection.apply(Projection.scala:52)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:312)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at org.apache.spark.util.collection.ExternalSorter.spillToPartitionFiles(ExternalSorter.scala:371)
	at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:211)
	at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:63)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:64)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
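The failure above is a runtime downcast gone wrong: Spark's `Cast.castArray` expects the column value to be a `scala.collection.Seq`, but the value it receives is a `scala.collection.immutable.Set$Set2` (a two-element `Set`), which is not a `Seq`, so the checked cast throws. A minimal Java sketch of the same class of error (the class names here are standard JDK types chosen for illustration, not taken from the trace):

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class CastDemo {
    public static void main(String[] args) {
        // The static type is Object, so the compiler allows the cast below;
        // the runtime type is HashSet, which does not implement List,
        // so the JVM throws ClassCastException -- analogous to a Set
        // arriving where Spark's Cast expects a Seq.
        Object value = new HashSet<>(Set.of("a", "b"));
        try {
            List<?> asList = (List<?>) value; // fails at runtime
            System.out.println(asList);
        } catch (ClassCastException e) {
            System.out.println("caught ClassCastException");
        }
    }
}
```

In Scala code feeding such a value into Spark, the usual remedy is to convert rather than cast, e.g. calling `.toSeq` on the `Set` before it reaches the cast site, since `Set` and `Seq` are sibling branches of the Scala collections hierarchy.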