java.lang.ExceptionInInitializerError: This exception has no message.


Recommended solutions

Samebug tips


This happened when I tried to insert an empty list of elements into a MongoDB collection.

via JIRA by Stephan Schroevers

You are trying to insert an empty list into a MongoDB server older than 3.5.0.

Check that the list you are inserting is not empty, or upgrade your server.
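
A minimal sketch of that check, assuming the MongoDB Java driver 3.x; the host, port, database, collection, and buildBatch() names below are placeholders, not taken from the report:

import com.mongodb.MongoClient;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

import java.util.ArrayList;
import java.util.List;

public class SafeInsert {
    public static void main(String[] args) {
        MongoClient client = new MongoClient("localhost", 27017);
        try {
            MongoCollection<Document> collection =
                    client.getDatabase("mydb").getCollection("mycollection");

            List<Document> docs = buildBatch(); // may legitimately be empty

            // Per the tip above: do not attempt to insert an empty list.
            if (!docs.isEmpty()) {
                collection.insertMany(docs);
            }
        } finally {
            client.close();
        }
    }

    private static List<Document> buildBatch() {
        // Stand-in for whatever produces the documents to insert.
        return new ArrayList<Document>();
    }
}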

Solutions on the web

via GitHub by tuxdna, 2 months ago: this exception has no message.
via GitHub by mdeang, 2 months ago: this exception has no message.
via Google Groups by Geoffrey Knauth, 1 year ago:
java.lang.ExceptionInInitializerError:
at com.mongodb.assertions.Assertions.isTrueArgument(Assertions.java:99)
at com.mongodb.WriteConcern.<init>(WriteConcern.java:316)
at com.mongodb.WriteConcern.<init>(WriteConcern.java:227)
at com.mongodb.casbah.WriteConcern$.<init>(WriteConcern.scala:41)
at com.mongodb.casbah.WriteConcern$.<clinit>(WriteConcern.scala)
at com.mongodb.casbah.BaseImports$class.$init$(Implicits.scala:162)
at com.mongodb.casbah.Imports$.<init>(Implicits.scala:142)
at com.mongodb.casbah.Imports$.<clinit>(Implicits.scala)
at com.mongodb.casbah.MongoClient.apply(MongoClient.scala:219)
at com.stratio.datasource.mongodb.partitioner.MongodbPartitioner.isShardedCollection(MongodbPartitioner.scala:78)
at com.stratio.datasource.mongodb.partitioner.MongodbPartitioner$$anonfun$computePartitions$1.apply(MongodbPartitioner.scala:67)
at com.stratio.datasource.mongodb.partitioner.MongodbPartitioner$$anonfun$computePartitions$1.apply(MongodbPartitioner.scala:66)
at com.stratio.datasource.util.using$.apply(using.scala:38)
at com.stratio.datasource.mongodb.partitioner.MongodbPartitioner.computePartitions(MongodbPartitioner.scala:66)
at com.stratio.datasource.mongodb.rdd.MongodbRDD.getPartitions(MongodbRDD.scala:42)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
at org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:65)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$3.apply(PairRDDFunctions.scala:331)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$3.apply(PairRDDFunctions.scala:331)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
at com.stratio.datasource.mongodb.schema.MongodbSchema.schema(MongodbSchema.scala:47)
at com.stratio.datasource.mongodb.MongodbRelation.com$stratio$datasource$mongodb$MongodbRelation$$lazySchema$lzycompute(MongodbRelation.scala:63)
at com.stratio.datasource.mongodb.MongodbRelation.com$stratio$datasource$mongodb$MongodbRelation$$lazySchema(MongodbRelation.scala:60)
at com.stratio.datasource.mongodb.MongodbRelation$$anonfun$1.apply(MongodbRelation.scala:65)
at com.stratio.datasource.mongodb.MongodbRelation$$anonfun$1.apply(MongodbRelation.scala:65)
at scala.Option.getOrElse(Option.scala:120)
at com.stratio.datasource.mongodb.MongodbRelation.<init>(MongodbRelation.scala:65)
at com.stratio.datasource.mongodb.DefaultSource.createRelation(DefaultSource.scala:36)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
at com.askingdata.test.TestSpark.main(TestSpark.java:23)
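
For context, the bottom of the trace is a Spark SQL read through the Stratio spark-mongodb data source (DataFrameReader.load called from TestSpark.main), and the error is raised inside load() while the connector computes partitions and infers the schema through Casbah. A rough sketch of that kind of read, assuming Spark 1.6 and the Stratio connector; the host, database, and collection values are placeholders, not taken from the trace:

import java.util.HashMap;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class TestSparkSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("mongodb-read").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        SQLContext sqlContext = new SQLContext(sc);

        // Placeholder connection options for the Stratio spark-mongodb data source.
        Map<String, String> options = new HashMap<>();
        options.put("host", "localhost:27017");
        options.put("database", "mydb");
        options.put("collection", "mycollection");

        // The ExceptionInInitializerError in the trace above is thrown inside load(),
        // while the connector opens a Casbah MongoClient to compute partitions
        // and infer the schema, before any DataFrame is returned.
        DataFrame df = sqlContext.read()
                .format("com.stratio.datasource.mongodb")
                .options(options)
                .load();

        df.show();
        sc.stop();
    }
}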

