java.lang.IllegalArgumentException: Don't know how to save StructField(blah,StructType(StructField(blah,StructType(StructField(ApplicationNaStructField(foo,BooleanType,true)),true)),true) to JDBC

Solutions on the web

via GitHub by samelamin, 1 year ago:
Don't know how to save StructField(blah,StructType(StructField(blah,StructType(StructField(ApplicationNaStructField(foo,BooleanType,true)),true)),true) to JDBC

via Stack Overflow by jduff1075, 1 year ago:
Don't know how to save StructField(user_agent,ArrayType(StringType,true),true) to JDBC
java.lang.IllegalArgumentException: Don't know how to save StructField(blah,StructType(StructField(blah,StructType(StructField(ApplicationNaStructField(foo,BooleanType,true)),true)),true) to JDBC
at com.databricks.spark.redshift.JDBCWrapper$$anonfun$schemaString$1.apply(RedshiftJDBCWrapper.scala:262)
at com.databricks.spark.redshift.JDBCWrapper$$anonfun$schemaString$1.apply(RedshiftJDBCWrapper.scala:242)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at com.databricks.spark.redshift.JDBCWrapper.schemaString(RedshiftJDBCWrapper.scala:242)
at com.databricks.spark.redshift.RedshiftWriter.createTableSql(RedshiftWriter.scala:75)
at com.databricks.spark.redshift.RedshiftWriter.com$databricks$spark$redshift$RedshiftWriter$$doRedshiftLoad(RedshiftWriter.scala:160)
at com.databricks.spark.redshift.RedshiftWriter$$anonfun$saveToRedshift$1.apply(RedshiftWriter.scala:378)
at com.databricks.spark.redshift.RedshiftWriter$$anonfun$saveToRedshift$1.apply(RedshiftWriter.scala:376)
at com.databricks.spark.redshift.RedshiftWriter.withStagingTable(RedshiftWriter.scala:121)
at com.databricks.spark.redshift.RedshiftWriter.saveToRedshift(RedshiftWriter.scala:376)
at com.databricks.spark.redshift.DefaultSource.createRelation(DefaultSource.scala:106)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:222)
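
Why this happens: the spark-redshift writer builds a flat CREATE TABLE statement in JDBCWrapper.schemaString, so it can only map primitive Spark SQL types (strings, numbers, booleans, timestamps) to Redshift columns; a nested StructType or ArrayType column is rejected with exactly this IllegalArgumentException. A common workaround is to flatten the complex columns into top-level primitive columns before calling save(). The Scala sketch below is a minimal illustration only: the DataFrame df, the column path blah.blah.foo, and the connection, table and tempdir values are placeholders, not values taken from this report.

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

// df is assumed to be the DataFrame whose write failed, containing a nested
// struct column such as blah.blah.foo (Boolean). Pull each nested primitive
// up to a top-level column so the Redshift writer only sees types it can map.
def flattenForRedshift(df: DataFrame): DataFrame =
  df.select(
    col("blah.blah.foo").as("foo")
    // ...add one alias per nested field you actually need to persist
  )

flattenForRedshift(df).write
  .format("com.databricks.spark.redshift")
  .option("url", "jdbc:redshift://host:5439/db?user=USER&password=PASS") // placeholder
  .option("dbtable", "my_table")                                         // placeholder
  .option("tempdir", "s3n://my-bucket/tmp")                              // placeholder
  .mode("error")
  .save()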
