java.lang.IllegalArgumentException: Couldn't connect and authenticate to get collection

Stack Overflow | D.Asare | 6 months ago
Tip: your exception is missing from the Samebug knowledge base. Here are the best solutions we found on the Internet.
  1. MultiCollectionSplitBuilder not working

     Stack Overflow | 6 months ago | D.Asare
     java.lang.IllegalArgumentException: Couldn't connect and authenticate to get collection
  2. Mongo Hadoop Connector Issue

     Stack Overflow | 2 years ago | Kevin
     java.lang.IllegalArgumentException: Couldn't connect and authenticate to get collection
  3. MultiCollectionSplitBuilder in Mongo-hadoop does not work.

     Google Groups | 2 years ago | Alessandro Gelormini
     java.lang.IllegalArgumentException: Couldn't connect and authenticate to get collection
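
All three reports fail at the same call: com.mongodb.hadoop.util.MongoConfigUtil cannot build a MongoClient from the job's mongo.input.uri, so the splitter dies before any input splits are computed. That usually points to a missing, incomplete, or unparsable connection string. For orientation, here is a minimal sketch of how a mongo-hadoop read is typically wired into Spark, assuming Spark 1.x with the mongo-hadoop connector on the classpath; the credentials, host, and mydb.mycoll namespace are placeholders, not values taken from the reports above.

    import org.apache.hadoop.conf.Configuration
    import org.apache.spark.{SparkConf, SparkContext}
    import org.bson.BSONObject
    import com.mongodb.hadoop.MongoInputFormat

    object MongoHadoopReadSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("mongo-hadoop-read"))

        val hadoopConf = new Configuration()
        // mongo.input.uri must be a complete, parsable connection string that
        // names db.collection; user, pass, host and mydb.mycoll are placeholders.
        hadoopConf.set("mongo.input.uri",
          "mongodb://user:pass@host:27017/mydb.mycoll?authSource=admin")

        // MongoInputFormat yields Object keys and BSONObject document values
        val rdd = sc.newAPIHadoopRDD(
          hadoopConf,
          classOf[MongoInputFormat],
          classOf[Object],
          classOf[BSONObject])

        println(s"read ${rdd.count()} documents")
        sc.stop()
      }
    }

If the URI is well formed and the error persists, checking that authSource matches the database the user was actually created in is a common next step.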

    Root Cause Analysis

    java.lang.NullPointerException
        at com.mongodb.Mongo.createCluster(Mongo.java:613)
        at com.mongodb.Mongo.<init>(Mongo.java:283)
        at com.mongodb.MongoClient.<init>(MongoClient.java:265)
        at com.mongodb.hadoop.util.MongoConfigUtil.getMongoClient(MongoConfigUtil.java:999)
        at com.mongodb.hadoop.util.MongoConfigUtil.getCollection(MongoConfigUtil.java:439)
        at com.mongodb.hadoop.splitter.MongoSplitterFactory.getSplitterByStats(MongoSplitterFactory.java:72)
        at com.mongodb.hadoop.splitter.MongoSplitterFactory.getSplitter(MongoSplitterFactory.java:113)
        at com.mongodb.hadoop.MongoInputFormat.getSplits(MongoInputFormat.java:56)
        at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:95)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
        at org.apache.spark.rdd.MapPartitionsWithPreparationRDD.getPartitions(MapPartitionsWithPreparationRDD.scala:40)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
        at org.apache.spark.ShuffleDependency.<init>(Dependency.scala:82)
        at org.apache.spark.sql.execution.ShuffledRowRDD.getDependencies(ShuffledRowRDD.scala:59)
        at org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:226)
        at org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:224)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.dependencies(RDD.scala:224)
        at org.apache.spark.scheduler.DAGScheduler.visit$1(DAGScheduler.scala:351)
        at org.apache.spark.scheduler.DAGScheduler.getParentStages(DAGScheduler.scala:363)
        at org.apache.spark.scheduler.DAGScheduler.getParentStagesAndId(DAGScheduler.scala:266)
        at org.apache.spark.scheduler.DAGScheduler.newResultStage(DAGScheduler.scala:300)
        at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:734)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1463)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1455)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1444)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
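
The root cause is a NullPointerException raised inside the MongoClient constructor itself, before mongo-hadoop ever reaches the collection; the connector then surfaces it as the IllegalArgumentException above. A quick way to isolate the problem is to build the same client outside Spark. The sketch below uses the legacy getDB API, which exists in both the 2.x and 3.x Java drivers, and every URI component in it is a placeholder.

    import com.mongodb.{MongoClient, MongoClientURI}

    // Standalone connectivity check: if this throws, the problem is the URI or
    // the server, not mongo-hadoop or Spark. All URI parts are placeholders.
    object MongoUriCheck {
      def main(args: Array[String]): Unit = {
        val uri = new MongoClientURI(
          "mongodb://user:pass@host:27017/mydb.mycoll?authSource=admin")
        val client = new MongoClient(uri) // the constructor that NPEs in the trace
        try {
          // getDB is the legacy API; on driver 3.x, getDatabase also works
          val count = client.getDB("mydb").getCollection("mycoll").count()
          println(s"connected; mycoll has $count documents")
        } finally {
          client.close()
        }
      }
    }

If this check succeeds with the exact URI the Spark job uses, attention shifts back to how the job sets mongo.input.uri on its Hadoop configuration.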