java.lang.IllegalArgumentException

Don't know how to save StructField(blah,StructType(StructField(blah,StructType(StructField(ApplicationNaStructField(foo,BooleanType,true)),true)),true) to JDBC

Solutions on the web

  • via GitHub by samelamin, 1 year ago
    Don't know how to save StructField(blah,StructType(StructField(blah,StructType(StructField(ApplicationNaStructField(foo,BooleanType,true)),true)),true) to JDBC
  • Don't know how to save StructField(user_agent,ArrayType(StringType,true),true) to JDBC
  • via Stack Overflow by raekwon.ha, 10 months ago
    Don't know how to save StructField(max_request_time,DecimalType(30,0),true) to JDBC
  • Stack trace

    java.lang.IllegalArgumentException: Don't know how to save StructField(blah,StructType(StructField(blah,StructType(StructField(ApplicationNaStructField(foo,BooleanType,true)),true)),true) to JDBC
        at com.databricks.spark.redshift.JDBCWrapper$$anonfun$schemaString$1.apply(RedshiftJDBCWrapper.scala:262)
        at com.databricks.spark.redshift.JDBCWrapper$$anonfun$schemaString$1.apply(RedshiftJDBCWrapper.scala:242)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
        at com.databricks.spark.redshift.JDBCWrapper.schemaString(RedshiftJDBCWrapper.scala:242)
        at com.databricks.spark.redshift.RedshiftWriter.createTableSql(RedshiftWriter.scala:75)
        at com.databricks.spark.redshift.RedshiftWriter.com$databricks$spark$redshift$RedshiftWriter$$doRedshiftLoad(RedshiftWriter.scala:160)
        at com.databricks.spark.redshift.RedshiftWriter$$anonfun$saveToRedshift$1.apply(RedshiftWriter.scala:378)
        at com.databricks.spark.redshift.RedshiftWriter$$anonfun$saveToRedshift$1.apply(RedshiftWriter.scala:376)
        at com.databricks.spark.redshift.RedshiftWriter.withStagingTable(RedshiftWriter.scala:121)
        at com.databricks.spark.redshift.RedshiftWriter.saveToRedshift(RedshiftWriter.scala:376)
        at com.databricks.spark.redshift.DefaultSource.createRelation(DefaultSource.scala:106)
        at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:222)
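    The common thread in these reports is that JDBC targets such as Redshift have no column type for nested StructType or ArrayType values, so spark-redshift's schemaString cannot map the column and throws. A frequently suggested workaround is to serialize the nested column to a string before the write, for example with Spark's to_json function (available since Spark 2.1): df.withColumn("blah", to_json(col("blah"))). The sketch below illustrates the same serialize-before-write idea in plain Python with sqlite3, since a Spark cluster isn't needed to show the principle; the record shape and column names are hypothetical, loosely modeled on the error message above.

    ```python
    import json
    import sqlite3

    # A nested record, analogous to a Spark row with a StructType column.
    # A plain SQL table cannot store the nested dict directly, just as
    # Redshift cannot store a StructType column over JDBC.
    record = {"id": 1, "blah": {"blah": {"foo": True}}}

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (id INTEGER, blah TEXT)")

    # Serialize the nested struct to a JSON string so it fits a TEXT
    # (VARCHAR) column -- the same idea as Spark's to_json workaround.
    conn.execute(
        "INSERT INTO events VALUES (?, ?)",
        (record["id"], json.dumps(record["blah"])),
    )

    # Reading it back, the nested structure is recoverable with json.loads.
    row = conn.execute("SELECT id, blah FROM events").fetchone()
    restored = json.loads(row[1])
    print(restored["blah"]["foo"])  # prints True
    ```

    The trade-off is that the nested fields are no longer individually queryable as columns on the target side; consumers must parse the JSON string (e.g. with Redshift's JSON functions) or the schema must be flattened into top-level columns before the write instead.
    
    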
