java.io.IOException: com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=am0101.test.com:27101, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSecurityException: Exception authenticating MongoCredential{mechanism=null, userName='dmdbxcc_write', source='dxccdb01', password=<hidden>, mechanismProperties={}}}, caused by {com.mongodb.MongoCommandException: Command failed with error 18: 'Authentication failed.' on server am0101.test.com:27101. The full response is { "ok" : 0.0, "code" : 18, "errmsg" : "Authentication failed." }}}, {address=am0102.test.com:27101, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSecurityException: Exception authenticating MongoCredential{mechanism=null, userName='dmdbxcc_write', source='dxccdb01', password=<hidden>, mechanismProperties={}}}, caused by {com.mongodb.MongoCommandException: Command failed with error 18: 'Authentication failed.' on server am0102.test.com:27101. The full response is { "ok" : 0.0, "code" : 18, "errmsg" : "Authentication failed." }}}]

Stack Overflow | Maddy | 4 months ago
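
The root cause here is not the timeout itself but the repeated MongoSecurityException: every member answers the login attempt with error 18 ("Authentication failed."), so server selection never finds a usable server. With mechanism=null the driver negotiates the default mechanism, and source='dxccdb01' means the credential is checked against the dxccdb01 database, which points at either a wrong username/password or a user created in a different authentication database. Below is a minimal Java sketch for verifying the credentials outside Hadoop; the hosts, user name and database are copied from the trace, while <password> is a placeholder and authSource=admin is an assumption that must be changed to wherever 'dmdbxcc_write' was actually created.

    import com.mongodb.MongoClient;
    import com.mongodb.MongoClientURI;
    import org.bson.Document;

    public class AuthProbe {
        public static void main(String[] args) {
            // Hosts, user name and database are taken from the exception above.
            // <password> is a placeholder; authSource=admin is an assumption and
            // must name the database in which 'dmdbxcc_write' was created.
            MongoClientURI uri = new MongoClientURI(
                    "mongodb://dmdbxcc_write:<password>@am0101.test.com:27101,am0102.test.com:27101"
                            + "/dxccdb01?authSource=admin");
            MongoClient client = new MongoClient(uri);
            try {
                // Forces server selection and authentication; a wrong credential
                // fails here with the same MongoSecurityException as in the trace.
                Document pong = client.getDatabase("dxccdb01").runCommand(new Document("ping", 1));
                System.out.println("authenticated, ping => " + pong.toJson());
            } finally {
                client.close();
            }
        }
    }

If the ping only succeeds once an explicit authSource is added, the likely fix is adding the same option to the connection string the mongo-hadoop connector is given (its input URI setting), rather than changing anything on the Hive side.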
  1.

    Not able to import data into Hive from MongoDB using the mongo-hadoop connector

    Stack Overflow | 4 months ago | Maddy
    java.io.IOException: com.mongodb.MongoTimeoutException (same authentication-failure trace as shown at the top of this page)
  2.

    Unable to load data from MongoDB to Pig

    Stack Overflow | 2 years ago
    com.mongodb.MongoException$Network: Exception opening the socket}, caused by {java.net.ConnectException: Connection refused}}]
  3.

    I have a testing setup where I SSH tunnel from my local machine to a machine in a different VPC that hosts a MongoDB hidden secondary. It seems to be impossible to read from that secondary without giving the driver access to the primary as well. I get the following exception when I try:

      feature-dev 20:00:43.216 INFO org.mongodb.driver.cluster - Monitor thread successfully connected to server with description ServerDescription{address=localhost:27018, type=STANDALONE, state=CONNECTED, ok=true, version=ServerVersion{versionList=[3, 0, 2]}, minWireVersion=0, maxWireVersion=3, maxDocumentSize=16777216, roundTripTimeNanos=10440806}
      feature-dev 20:00:43.224 INFO org.mongodb.driver.cluster - Monitor thread successfully connected to server with description ServerDescription{address=localhost:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[3, 0, 4]}, minWireVersion=0, maxWireVersion=3, maxDocumentSize=16777216, roundTripTimeNanos=9489844, setName='mongo1', hosts=[10.0.0.126:27017, 10.0.0.124:27017, 10.0.0.125:27017], passives=[], arbiters=[], primary='10.0.0.124:27017', tagSet=TagSet{[]}}
      feature-dev[ERROR] com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches PrimaryServerSelector. Client view of cluster state is {type=REPLICA_SET, servers=[{address=localhost:27017, type=REPLICA_SET_SECONDARY, roundTripTime=10.6 ms, state=CONNECTED}]
      feature-dev[ERROR] at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:370)
      feature-dev[ERROR] at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
      feature-dev[ERROR] at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
      feature-dev[ERROR] at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
      feature-dev[ERROR] at com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
      feature-dev[ERROR] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:104)
      feature-dev[ERROR] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:97)
      feature-dev[ERROR] at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:85)
      feature-dev[ERROR] at com.mongodb.operation.CommandWriteOperation.execute(CommandWriteOperation.java:55)
      feature-dev[ERROR] at com.mongodb.Mongo.execute(Mongo.java:745)
      feature-dev[ERROR] at com.mongodb.Mongo$2.execute(Mongo.java:728)
      feature-dev[ERROR] at com.mongodb.DB.executeCommand(DB.java:583)
      feature-dev[ERROR] at com.mongodb.DBCollection.getStats(DBCollection.java:1858)
      feature-dev[ERROR] at com.mongodb.hadoop.splitter.MongoSplitterFactory.getSplitterByStats(MongoSplitterFactory.java:73)
      feature-dev[ERROR] at com.mongodb.hadoop.splitter.MongoSplitterFactory.getSplitter(MongoSplitterFactory.java:113)
      ...

    (I can send the full stack trace if necessary; I am unable to attach it.) I'm able to read from the primary fine. Is this behaviour unavoidable? Can the driver not function without being able to query the primary for some reason? Do I need a testing setup that has access to both the primary and the hidden secondary?
    (A direct-connection workaround for this scenario is sketched just after this list.)

    JIRA | 1 year ago | Ratan Sebastian
    com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches PrimaryServerSelector. Client view of cluster state is {type=REPLICA_SET, servers=[{address=localhost:27017, type=REPLICA_SET_SECONDARY, roundTripTime=10.6 ms, state=CONNECTED}]
  4.

    MongoDB "Exception authenticating MongoCredential" using java-driver

    Stack Overflow | 4 months ago | Prayag Upd
    play.api.http.HttpErrorHandlerExceptions$$anon$1: Execution exception[[MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=secondary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=staging-node3.shaharma.com:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSecurityException: Exception authenticating MongoCredential{mechanism=SCRAM-SHA-1, userName='userId', source='events', password=<hidden>, mechanismProperties={}}}, caused by {com.mongodb.MongoCommandException: Command failed with error 18: 'Authentication failed.' on server staging-node3.shaharma.com:27017. The full response is { "ok" : 0.0, "code" : 18, "errmsg" : "Authentication failed." }}}]]]
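
For the hidden-secondary report quoted in entry 3 above, the trace itself shows why the primary is required: DBCollection.getStats() runs through a CommandWriteOperation, which selects the primary, and the member addresses the driver learns from the set (10.0.0.x) are unreachable through the SSH tunnel anyway. One possible workaround, assuming a read directly from the tunnelled member is acceptable, is to connect to that single address so the driver stays in single-server mode and never tries to discover or reach the rest of the set. This is only a sketch; host, port, database and collection names are placeholders, not values from the report:

        import com.mongodb.MongoClient;
        import com.mongodb.MongoClientOptions;
        import com.mongodb.ReadPreference;
        import com.mongodb.ServerAddress;

        public class DirectSecondaryRead {
            public static void main(String[] args) {
                // Connecting to one explicit address (the local end of the SSH
                // tunnel) keeps the driver in single-server mode, so it never
                // waits for a primary it cannot reach at the 10.0.0.x addresses.
                MongoClientOptions opts = MongoClientOptions.builder()
                        .readPreference(ReadPreference.secondaryPreferred())
                        .build();
                MongoClient client =
                        new MongoClient(new ServerAddress("localhost", 27017), opts);
                try {
                    long n = client.getDatabase("test")        // placeholder database
                                   .getCollection("example")   // placeholder collection
                                   .count();
                    System.out.println("documents visible via the secondary: " + n);
                } finally {
                    client.close();
                }
            }
        }

Whether this is acceptable depends on the job: reads against the tunnelled member work, but anything that must write, or that insists on the primary, still needs a route to it.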

    Root Cause Analysis

    1. com.mongodb.MongoTimeoutException

      Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=am0101.test.com:27101, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSecurityException: Exception authenticating MongoCredential{mechanism=null, userName='dmdbxcc_write', source='dxccdb01', password=<hidden>, mechanismProperties={}}}, caused by {com.mongodb.MongoCommandException: Command failed with error 18: 'Authentication failed.' on server am0101.test.com:27101. The full response is { "ok" : 0.0, "code" : 18, "errmsg" : "Authentication failed." }}}, {address=am0102.test.com:27101, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSecurityException: Exception authenticating MongoCredential{mechanism=null, userName='dmdbxcc_write', source='dxccdb01', password=<hidden>, mechanismProperties={}}}, caused by {com.mongodb.MongoCommandException: Command failed with error 18: 'Authentication failed.' on server am0102.test.com:27101. The full response is { "ok" : 0.0, "code" : 18, "errmsg" : "Authentication failed." }}}]

      at com.mongodb.connection.BaseCluster.createTimeoutException()
    2. MongoDB Java Driver
      DBCollection.getStats
      1. com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
      2. com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
      3. com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
      4. com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
      5. com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
      6. com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:89)
      7. com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
      8. com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
      9. com.mongodb.Mongo.execute(Mongo.java:772)
      10. com.mongodb.Mongo$2.execute(Mongo.java:759)
      11. com.mongodb.DB.executeCommand(DB.java:653)
      12. com.mongodb.DBCollection.getStats(DBCollection.java:2083)
      12 frames
    3. com.mongodb.hadoop
      HiveMongoInputFormat.getSplits
      1. com.mongodb.hadoop.splitter.MongoSplitterFactory.getSplitterByStats(MongoSplitterFactory.java:73)
      2. com.mongodb.hadoop.splitter.MongoSplitterFactory.getSplitter(MongoSplitterFactory.java:113)
      3. com.mongodb.hadoop.hive.input.HiveMongoInputFormat.getSplits(HiveMongoInputFormat.java:240)
      4. com.mongodb.hadoop.hive.input.HiveMongoInputFormat.getSplits(HiveMongoInputFormat.java:68)
      4 frames
    4. Hive Query Language
      CombineHiveInputFormat.getSplits
      1. org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:298)
      2. org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplitsInternal(HiveInputFormat.java:412)
      3. org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:330)
      4. org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getCombineSplits(CombineHiveInputFormat.java:311)
      5. org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplitsInternal(CombineHiveInputFormat.java:519)
      6. org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:463)
      6 frames
    5. Hadoop
      Job$10.run
      1. org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:328)
      2. org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:320)
      3. org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:196)
      4. org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
      5. org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
      5 frames
    6. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:422)
      2 frames
    7. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
      1 frame
    8. Hadoop
      JobClient$1.run
      1. org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
      2. org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:575)
      3. org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:570)
      3 frames
    9. Java RT
      Subject.doAs
      1. java.security.AccessController.doPrivileged(Native Method)
      2. javax.security.auth.Subject.doAs(Subject.java:422)
      2 frames
    10. Hadoop
      UserGroupInformation.doAs
      1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
      1 frame
    11. Hadoop
      JobClient.submitJob
      1. org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:570)
      2. org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:561)
      2 frames
    12. Hive Query Language
      Driver.run
      1. org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:429)
      2. org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
      3. org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
      4. org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
      5. org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1618)
      6. org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1379)
      7. org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1192)
      8. org.apache.hadoop.hive.ql.Driver.run(Driver.java:1019)
      9. org.apache.hadoop.hive.ql.Driver.run(Driver.java:1009)
      9 frames
    13. org.apache.hadoop
      CliDriver.main
      1. org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:201)
      2. org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:153)
      3. org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:364)
      4. org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:712)
      5. org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:631)
      6. org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:570)
      6 frames
    14. Java RT
      Method.invoke
      1. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      2. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      3. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      4. java.lang.reflect.Method.invoke(Method.java:498)
      4 frames
    15. Hadoop
      RunJar.main
      1. org.apache.hadoop.util.RunJar.run(RunJar.java:221)
      2. org.apache.hadoop.util.RunJar.main(RunJar.java:136)
      2 frames