java.lang.ExceptionInInitializerError

Tip

This happened when I tried to insert an empty list of elements into a MongoDB collection.
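Guarding against the empty batch before calling the driver avoids the failure described above. This is a minimal stdlib-only sketch under that assumption: the hypothetical `insertAll` method stands in for the real collection insert (e.g. `collection.insertMany(docs)`), which is not part of the standard library.

```java
import java.util.Collections;
import java.util.List;

public class SafeInsert {
    // Hypothetical stand-in for a MongoDB batch insert: skip the driver
    // call entirely when the batch is empty, since the driver rejects
    // empty batches via its internal argument assertions.
    static int insertAll(List<String> docs) {
        if (docs.isEmpty()) {
            return 0; // nothing to insert; do not call the driver
        }
        // ... here the real code would call collection.insertMany(docs) ...
        return docs.size();
    }

    public static void main(String[] args) {
        System.out.println(insertAll(Collections.emptyList())); // prints 0
        System.out.println(insertAll(List.of("a", "b")));       // prints 2
    }
}
```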


poroszd


java.lang.ExceptionInInitializerError
    at com.mongodb.casbah.BaseImports$class.$init$(Implicits.scala:162)
    at com.mongodb.casbah.Imports$.<init>(Implicits.scala:142)
    at com.mongodb.casbah.Imports$.<clinit>(Implicits.scala)
    at com.mongodb.casbah.MongoClient.apply(MongoClient.scala:219)
    at com.stratio.datasource.mongodb.partitioner.MongodbPartitioner.isShardedCollection(MongodbPartitioner.scala:78)
    at com.stratio.datasource.mongodb.partitioner.MongodbPartitioner$$anonfun$computePartitions$1.apply(MongodbPartitioner.scala:67)
    at com.stratio.datasource.mongodb.partitioner.MongodbPartitioner$$anonfun$computePartitions$1.apply(MongodbPartitioner.scala:66)
    at com.stratio.datasource.util.using$.apply(using.scala:38)
    at com.stratio.datasource.mongodb.partitioner.MongodbPartitioner.computePartitions(MongodbPartitioner.scala:66)
    at com.stratio.datasource.mongodb.rdd.MongodbRDD.getPartitions(MongodbRDD.scala:42)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:65)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$3.apply(PairRDDFunctions.scala:331)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$3.apply(PairRDDFunctions.scala:331)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    at org.apache.spark.rdd.PairRDDFunctions.reduceByKey(PairRDDFunctions.scala:330)
    at com.stratio.datasource.mongodb.schema.MongodbSchema.schema(MongodbSchema.scala:47)
    at com.stratio.datasource.mongodb.MongodbRelation.com$stratio$datasource$mongodb$MongodbRelation$$lazySchema$lzycompute(MongodbRelation.scala:63)
    at com.stratio.datasource.mongodb.MongodbRelation.com$stratio$datasource$mongodb$MongodbRelation$$lazySchema(MongodbRelation.scala:60)
    at com.stratio.datasource.mongodb.MongodbRelation$$anonfun$1.apply(MongodbRelation.scala:65)
    at com.stratio.datasource.mongodb.MongodbRelation$$anonfun$1.apply(MongodbRelation.scala:65)
    at scala.Option.getOrElse(Option.scala:120)
    at com.stratio.datasource.mongodb.MongodbRelation.<init>(MongodbRelation.scala:65)
    at com.stratio.datasource.mongodb.DefaultSource.createRelation(DefaultSource.scala:36)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
    at com.askingdata.test.TestSpark.main(TestSpark.java:23)
Caused by: java.lang.IllegalArgumentException: state should be: w >= 0
    at com.mongodb.assertions.Assertions.isTrueArgument(Assertions.java:99)
    at com.mongodb.WriteConcern.<init>(WriteConcern.java:316)
    at com.mongodb.WriteConcern.<init>(WriteConcern.java:227)
    at com.mongodb.casbah.WriteConcern$.<init>(WriteConcern.scala:41)
    at com.mongodb.casbah.WriteConcern$.<clinit>(WriteConcern.scala)
    ... 37 more
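The root cause in the trace is the driver-side precondition "state should be: w >= 0" failing while Casbah's WriteConcern object initializes; this commonly points to a version mismatch between Casbah and the underlying mongo-java-driver, where a newer driver rejects a negative w value that older code still passes. The failing check can be sketched with a hypothetical `checkW` helper that mirrors what `com.mongodb.assertions.Assertions.isTrueArgument` does (the helper name is an assumption for illustration, not the driver's API):

```java
public class WriteConcernCheck {
    // Hypothetical mirror of the driver's argument assertion: the
    // WriteConcern constructor requires an acknowledgement count w >= 0
    // and throws IllegalArgumentException otherwise.
    static int checkW(int w) {
        if (w < 0) {
            throw new IllegalArgumentException("state should be: w >= 0");
        }
        return w;
    }

    public static void main(String[] args) {
        System.out.println(checkW(1)); // a valid acknowledgement count
        try {
            checkW(-1); // a negative w reproduces the failure mode
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Because the exception is thrown inside a static initializer (`<clinit>` of `com.mongodb.casbah.WriteConcern$`), it surfaces as `java.lang.ExceptionInInitializerError` rather than the underlying `IllegalArgumentException`.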

Users with the same issue

rp: 1 time
reni: 1 time
poroszd: 2 times