com.mongodb.MongoException

There are no available Samebug tips for this exception yet.

  • invalid sort specified for tailable cursor
    via Stack Overflow by Prasanna
  • invalid-sort-specified-for-tailable-cursor
    by Prasanna Ganesh
  • Data retrieval from mongodb java api
    via Stack Overflow by Jose Ramon
  • In our test environment we have MongoDB 2.4.10 with 2 shards, each shard a 3-member replica set, and a mongos server sitting in front of 3 config servers. We are testing our Hadoop development on Amazon EMR, which pulls data from a sharded collection in the DB and writes its results to another collection in the same DB. The inputURI that we provide has the following format:

    mongodb://hadoop:password@mongos-server:27017/inputDB.inputcollection

    We have configured the MongoDB permissions as follows.

    admin DB:
    db.addUser({user:"hadoop", userSource:"inputDB", roles:["clusterAdmin", "readWrite"], otherDBRoles:{config:["readWrite"]}})

    inputDB DB:
    db.addUser({user:"hadoop", pwd:"**************", roles:["readWrite"]})

    We keep getting the following error:

    com.mongodb.MongoException: not authorized for query on config.chunks

    In the init() method of MongoCollectionSplitter.java the inputURI is parsed to get the inputCollection, and from this it works back to get the Mongo object. Then, in the calculateSplits() method of ShardChunkMongoSplitter.java, that Mongo object is used to switch to the config DB and query the chunks collection. The Java Mongo driver uses lazy authentication, so authentication only appears to be triggered when trying to access the config.chunks collection. I was able to get the code working by adding the following line to the init() method of MongoCollectionSplitter.java:

    DBObject one = this.inputCollection.findOne();

    This triggers authentication and then everything else seems to work correctly (an illustrative sketch of this workaround appears after the list below). Do we need to make any changes to how we have configured our permissions, or is this a bug in the code? The full stack trace is below.

    2014-12-03 10:52:19,346 INFO com.mongodb.hadoop.util.MongoTool (main): Setting up and running MapReduce job in foreground, will wait for results. {Verbose? false}
    2014-12-03 10:52:19,787 INFO org.apache.hadoop.yarn.client.RMProxy (main): Connecting to ResourceManager at /xxx.xxx.xxx.xxx:9022
    2014-12-03 10:52:23,596 INFO com.mongodb.hadoop.splitter.MongoSplitterFactory (main): Retrieved Collection from authURI mongodb://user:password@mongodb:27017/inputDB.inputCollection
    2014-12-03 10:52:23,999 INFO com.mongodb.hadoop.MongoInputFormat (main): Using com.mongodb.hadoop.splitter.ShardChunkMongoSplitter@476b1425 to calculate splits.
    2014-12-03 10:52:24,003 INFO com.mongodb.hadoop.splitter.ShardChunkMongoSplitter (main): TargetShards flag is false
    2014-12-03 10:52:24,028 INFO org.apache.hadoop.mapreduce.JobSubmitter (main): Cleaning up the staging area /tmp/hadoop-yarn/staging/user/hadoop/.staging/job_1417600633426_0002
    2014-12-03 10:52:24,041 ERROR com.mongodb.hadoop.util.MongoTool (main): Exception while executing job...
    com.mongodb.MongoException: not authorized for query on config.chunks
        at com.mongodb.MongoException.parse(MongoException.java:82)
        at com.mongodb.DBApiLayer$MyCollection.__find(DBApiLayer.java:314)
        at com.mongodb.DBApiLayer$MyCollection.__find(DBApiLayer.java:295)
        at com.mongodb.DBCursor._check(DBCursor.java:368)
        at com.mongodb.DBCursor._hasNext(DBCursor.java:459)
        at com.mongodb.DBCursor.hasNext(DBCursor.java:484)
        at com.mongodb.hadoop.splitter.ShardChunkMongoSplitter.calculateSplits(ShardChunkMongoSplitter.java:94)
        at com.mongodb.hadoop.MongoInputFormat.getSplits(MongoInputFormat.java:58)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
        at com.mongodb.hadoop.util.MongoTool.runMapReduceJob(MongoTool.java:222)
        at com.mongodb.hadoop.util.MongoTool.run(MongoTool.java:96)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)

    by Kevin Brady
    • com.mongodb.MongoException: Unable to execute query: error processing query: ns=local.oplog.rs limit=0 skip=0 Tree: $and Sort: { $natural: -1 } Proj: {} invalid sort specified for tailable cursor: { $natural: -1 }
          at com.mongodb.MongoException.parse(MongoException.java:82)
          at com.mongodb.DBApiLayer$MyCollection.__find(DBApiLayer.java:314)
          at com.mongodb.DBApiLayer$MyCollection.__find(DBApiLayer.java:295)
          at com.mongodb.DBCursor._check(DBCursor.java:368)
          at com.mongodb.DBCursor._hasNext(DBCursor.java:459)
          at com.mongodb.DBCursor.hasNext(DBCursor.java:484)
          at com.trainings.core.Test.lambda$0(Test.java:32)
          at java.lang.Thread.run(Unknown Source)
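    The last trace above fails because it opens a tailable cursor on local.oplog.rs with a { $natural: -1 } sort, and tailable cursors only return documents in forward natural order. A common workaround, not taken from the original post, is to read the newest oplog entry with an ordinary query first and then tail forward from its timestamp. The following is a minimal sketch against the legacy 2.x Java driver API visible in the trace (DBCursor, Bytes query options); the host, port, and empty-oplog handling are assumptions.

    import com.mongodb.BasicDBObject;
    import com.mongodb.Bytes;
    import com.mongodb.DB;
    import com.mongodb.DBCollection;
    import com.mongodb.DBCursor;
    import com.mongodb.DBObject;
    import com.mongodb.MongoClient;
    import org.bson.types.BSONTimestamp;

    public class OplogTailSketch {
        public static void main(String[] args) {
            // Assumed connection details; point this at a replica-set member.
            MongoClient client = new MongoClient("localhost", 27017);
            DB local = client.getDB("local");
            DBCollection oplog = local.getCollection("oplog.rs");

            // Read the newest entry with a plain query; a reverse $natural sort
            // is allowed here because this cursor is not tailable.
            DBObject newest = oplog.find()
                    .sort(new BasicDBObject("$natural", -1))
                    .limit(1)
                    .next(); // assumes the oplog is not empty
            BSONTimestamp lastTs = (BSONTimestamp) newest.get("ts");

            // Tail forward from that timestamp. No sort is specified: a tailable
            // cursor always follows natural order, which is exactly what the
            // server was rejecting in the trace above.
            DBCursor tail = oplog
                    .find(new BasicDBObject("ts", new BasicDBObject("$gt", lastTs)))
                    .addOption(Bytes.QUERYOPTION_TAILABLE)
                    .addOption(Bytes.QUERYOPTION_AWAITDATA);

            while (tail.hasNext()) {
                DBObject op = tail.next();
                System.out.println(op);
            }

            client.close();
        }
    }

    If the server closes the tailable cursor after it reaches the end of the oplog, the same forward query can simply be reopened with the last timestamp seen.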

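    For the "not authorized for query on config.chunks" report above, the workaround the reporter describes is to force the lazily authenticating 2.x driver to authenticate against the input collection before the splitter switches to the config database. The sketch below only illustrates that same idea outside mongo-hadoop; the URI is a placeholder in the shape of the report's inputURI, and whether the config.chunks query then succeeds still depends on the otherDBRoles grant shown in the report.

    import com.mongodb.BasicDBObject;
    import com.mongodb.DBCollection;
    import com.mongodb.DBCursor;
    import com.mongodb.DBObject;
    import com.mongodb.MongoClient;
    import com.mongodb.MongoClientURI;

    public class LazyAuthProbeSketch {
        public static void main(String[] args) {
            // Placeholder URI in the same shape as the report's inputURI.
            MongoClientURI uri = new MongoClientURI(
                    "mongodb://hadoop:password@mongos-server:27017/inputDB.inputcollection");
            MongoClient client = new MongoClient(uri);

            DBCollection inputCollection =
                    client.getDB(uri.getDatabase()).getCollection(uri.getCollection());

            // With the 2.x driver's lazy authentication described in the report,
            // this read is what actually triggers authentication with the URI
            // credentials, the same effect as the findOne() the reporter added
            // to MongoCollectionSplitter.init().
            DBObject probe = inputCollection.findOne();
            System.out.println("Authenticated against input collection, sample: " + probe);

            // Only after that does a query against the chunk metadata, like the
            // one ShardChunkMongoSplitter issues, run on an authenticated
            // connection (given the config-database roles from the report).
            DBCollection chunks = client.getDB("config").getCollection("chunks");
            DBCursor cursor = chunks.find(new BasicDBObject(
                    "ns", uri.getDatabase() + "." + uri.getCollection()));
            while (cursor.hasNext()) {
                System.out.println(cursor.next());
            }

            client.close();
        }
    }

    Whether forcing authentication this way is the right long-term answer, or the splitter should authenticate explicitly before querying the config database, is exactly the question the report raises.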