
Recommended solutions based on your search

Samebug tips

  1. Delete the corrupted index files; on startup, the Kafka server will rebuild them.
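A minimal sketch of that tip, assuming the broker keeps its logs under /tmp/kafka-logs (check the log.dirs entry in server.properties for the real path): with the broker stopped, it deletes the .index files so Kafka rebuilds them on the next startup.

    import java.nio.file.{Files, Paths}
    import scala.collection.JavaConverters._

    object CleanKafkaIndexes {
      def main(args: Array[String]): Unit = {
        // Hypothetical log directory; replace with the value of log.dirs.
        val logDir = Paths.get("/tmp/kafka-logs")

        // Remove every .index file under the log directory; the broker
        // recreates them when the server starts up again.
        Files.walk(logDir).iterator().asScala
          .filter(_.toString.endsWith(".index"))
          .foreach { p =>
            println(s"Deleting index file: $p")
            Files.delete(p)
          }
      }
    }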

Solutions on the web

via Stack Overflow by Robin, 1 year ago
java.lang.IllegalArgumentException: requirement failed: Duplicate SQLConfigEntry. spark.sql.hive.convertCTAS has been registered
    at scala.Predef$.require(Predef.scala:224)
    at org.apache.spark.sql.internal.SQLConf$.org$apache$spark$sql$internal$SQLConf$$register(SQLConf.scala:44)
    at org.apache.spark.sql.internal.SQLConf$SQLConfigBuilder$$anonfun$apply$1.apply(SQLConf.scala:51)
    at org.apache.spark.sql.internal.SQLConf$SQLConfigBuilder$$anonfun$apply$1.apply(SQLConf.scala:51)
    at org.apache.spark.internal.config.TypedConfigBuilder$$anonfun$createWithDefault$1.apply(ConfigBuilder.scala:122)
    at org.apache.spark.internal.config.TypedConfigBuilder$$anonfun$createWithDefault$1.apply(ConfigBuilder.scala:122)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.internal.config.TypedConfigBuilder.createWithDefault(ConfigBuilder.scala:122)
    at org.apache.spark.sql.hive.HiveUtils$.<init>(HiveUtils.scala:103)
    at org.apache.spark.sql.hive.HiveUtils$.<clinit>(HiveUtils.scala)
    at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:48)
    at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:47)
    at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:54)
    at org.apache.spark.sql.hive.HiveSharedState.externalCatalog(HiveSharedState.scala:54)
    at org.apache.spark.sql.hive.HiveSessionState.catalog$lzycompute(HiveSessionState.scala:50)
    at org.apache.spark.sql.hive.HiveSessionState.catalog(HiveSessionState.scala:48)
    at org.apache.spark.sql.hive.HiveSessionState$$anon$1.<init>(HiveSessionState.scala:63)
    at org.apache.spark.sql.hive.HiveSessionState.analyzer$lzycompute(HiveSessionState.scala:63)
    at org.apache.spark.sql.hive.HiveSessionState.analyzer(HiveSessionState.scala:62)
    at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
    at hiveTest$.main(hiveTest.scala:34)
    at hiveTest.main(hiveTest.scala)
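The bottom frames (hiveTest$.main calling SparkSession.sql) show the exception firing on the first query that touches the Hive catalog, since that is when HiveUtils lazily registers its spark.sql.hive.* config entries. A minimal sketch of a driver with that shape follows; the object name comes from the trace, while the app name, the query, and the comment on the likely cause are assumptions, not taken from the linked answer.

    import org.apache.spark.sql.SparkSession

    object hiveTest {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("hiveTest")     // assumed app name
          .enableHiveSupport()     // brings HiveSessionState / HiveUtils into play
          .getOrCreate()

        // The first spark.sql(...) call initializes the Hive metadata client,
        // which registers SQLConf entries such as spark.sql.hive.convertCTAS.
        // If spark-hive classes from two different Spark builds end up on the
        // classpath, that registration can run twice and throw the
        // "Duplicate SQLConfigEntry" IllegalArgumentException shown above.
        spark.sql("SHOW TABLES").show()  // assumed query; line 34 in the trace
      }
    }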