java.lang.RuntimeException


  • WSO2 DAS does not support Postgres?
    via Stack Overflow by raekwon.ha
    • java.lang.RuntimeException: Don't know how to save StructField(max_request_time,DecimalType(30,0),true) to JDBC
          at org.apache.spark.sql.jdbc.carbon.JDBCRelation.insert(JDBCRelation.scala:194)
          at org.apache.spark.sql.sources.InsertIntoDataSource.run(commands.scala:53)
          at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
          at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
          at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:68)
          at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
          at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
          at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
          at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:87)
          at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:950)
          at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:950)
          at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
          at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:128)
          at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
          at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:755)
          at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:731)
          at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQuery(SparkAnalyticsExecutor.java:709)
          at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeQuery(CarbonAnalyticsProcessorService.java:201)
          at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeScript(CarbonAnalyticsProcessorService.java:151)
          at org.wso2.carbon.analytics.spark.core.AnalyticsTask.execute(AnalyticsTask.java:59)
          at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:67)
          at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
          at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
          at java.util.concurrent.FutureTask.run(FutureTask.java:262)
          at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
          at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
          at java.lang.Thread.run(Thread.java:745)
      Caused by: java.lang.IllegalArgumentException: Don't know how to save StructField(max_request_time,DecimalType(30,0),true) to JDBC
          at org.apache.spark.sql.jdbc.carbon.package$JDBCWriteDetails$$anonfun$schemaString$1$$anonfun$2.apply(carbon.scala:55)
          at org.apache.spark.sql.jdbc.carbon.package$JDBCWriteDetails$$anonfun$schemaString$1$$anonfun$2.apply(carbon.scala:42)
          at scala.Option.getOrElse(Option.scala:120)
          at org.apache.spark.sql.jdbc.carbon.package$JDBCWriteDetails$$anonfun$schemaString$1.apply(carbon.scala:41)
          at org.apache.spark.sql.jdbc.carbon.package$JDBCWriteDetails$$anonfun$schemaString$1.apply(carbon.scala:38)
          at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
          at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
          at org.apache.spark.sql.jdbc.carbon.package$JDBCWriteDetails$.schemaString(carbon.scala:38)
          at org.apache.spark.sql.jdbc.carbon.JDBCRelation.insert(JDBCRelation.scala:180)
          ... 26 more
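    The `Caused by` frame shows the failure happens in the carbon JDBC dialect's `schemaString`, which cannot map the column type `DecimalType(30,0)` to a Postgres column when writing. A common workaround in this situation is to cast the offending column to a type the dialect does handle before the insert runs. The sketch below is a hypothetical DAS Spark SQL script fragment (the table and column names other than `max_request_time` are made up for illustration); it assumes the values fit in a 64-bit integer, since `DecimalType(30,0)` has no fractional part:

    ```sql
    -- Hypothetical analytics-script fragment: cast the DecimalType(30,0)
    -- column to BIGINT so the JDBC writer can derive a Postgres column type.
    INSERT INTO TABLE request_summary_pg
    SELECT host,
           CAST(max_request_time AS BIGINT) AS max_request_time
    FROM request_stats;
    ```

    If the values can exceed the 64-bit range, casting to DOUBLE (with a loss of exactness) or reducing the declared decimal precision in the source schema are the remaining options.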