java.lang.ExceptionInInitializerError

Stack Overflow | Robin | 3 months ago
  1. GitHub comment 1988#201105812

    GitHub | 8 months ago | clockfly
    java.lang.IllegalArgumentException: requirement failed: Cannot push port (responseOut) twice
  2. replaceWithContext troubles

    GitHub | 8 months ago | gebner
    java.lang.IllegalArgumentException: requirement failed: Incorrect shallow formula: y = d != c = d
  3. Requests with method 'HEAD' must have an empty entity

    GitHub | 8 months ago | jrudnick
    java.lang.IllegalArgumentException: requirement failed: Requests with method 'HEAD' must have an empty entity
  4. GitHub comment 20209#205896543

    GitHub | 8 months ago | hochgi
    java.lang.IllegalArgumentException: requirement failed: contentLength must be positive (use `HttpEntity.empty(contentType)` for empty entities)

Root Cause Analysis

  1. java.lang.IllegalArgumentException

    requirement failed: Duplicate SQLConfigEntry. spark.sql.hive.convertCTAS has been registered

    at scala.Predef$.require()
  2. Scala
    Predef$.require
    1. scala.Predef$.require(Predef.scala:224)
    1 frame
  3. org.apache.spark
    TypedConfigBuilder$$anonfun$createWithDefault$1.apply
    1. org.apache.spark.sql.internal.SQLConf$.org$apache$spark$sql$internal$SQLConf$$register(SQLConf.scala:44)
    2. org.apache.spark.sql.internal.SQLConf$SQLConfigBuilder$$anonfun$apply$1.apply(SQLConf.scala:51)
    3. org.apache.spark.sql.internal.SQLConf$SQLConfigBuilder$$anonfun$apply$1.apply(SQLConf.scala:51)
    4. org.apache.spark.internal.config.TypedConfigBuilder$$anonfun$createWithDefault$1.apply(ConfigBuilder.scala:122)
    5. org.apache.spark.internal.config.TypedConfigBuilder$$anonfun$createWithDefault$1.apply(ConfigBuilder.scala:122)
    5 frames
  4. Scala
    Option.foreach
    1. scala.Option.foreach(Option.scala:257)
    1 frame
  5. org.apache.spark
    TypedConfigBuilder.createWithDefault
    1. org.apache.spark.internal.config.TypedConfigBuilder.createWithDefault(ConfigBuilder.scala:122)
    1 frame
  6. Spark Project Hive
    HiveSessionState.analyzer
    1. org.apache.spark.sql.hive.HiveUtils$.<init>(HiveUtils.scala:103)
    2. org.apache.spark.sql.hive.HiveUtils$.<clinit>(HiveUtils.scala)
    3. org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:48)
    4. org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:47)
    5. org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:54)
    6. org.apache.spark.sql.hive.HiveSharedState.externalCatalog(HiveSharedState.scala:54)
    7. org.apache.spark.sql.hive.HiveSessionState.catalog$lzycompute(HiveSessionState.scala:50)
    8. org.apache.spark.sql.hive.HiveSessionState.catalog(HiveSessionState.scala:48)
    9. org.apache.spark.sql.hive.HiveSessionState$$anon$1.<init>(HiveSessionState.scala:63)
    10. org.apache.spark.sql.hive.HiveSessionState.analyzer$lzycompute(HiveSessionState.scala:63)
    11. org.apache.spark.sql.hive.HiveSessionState.analyzer(HiveSessionState.scala:62)
    11 frames
  7. Spark Project SQL
    SparkSession.sql
    1. org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
    2. org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
    3. org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
    3 frames
  8. Unknown
    hiveTest.main
    1. hiveTest$.main(hiveTest.scala:34)
    2. hiveTest.main(hiveTest.scala)
    2 frames