com.mongodb.MongoException: not authorized for query on config.chunks

JIRA | Kevin Brady | 2 years ago
  1.

    In our test environment we have a MongoDB deployment running 2.4.10 with 2 shards, each shard having 3 replica set members, and a mongos server sitting in front of 3 config servers. We are testing our Hadoop development on Amazon EMR, which pulls data from a sharded collection in the DB and writes its results to another collection in the DB. The inputURI we provide has the following format:

        mongodb://hadoop:password@mongos-server:27017/inputDB.inputcollection

    We have configured the MongoDB permissions as follows:

    admin DB:

        db.addUser({user:"hadoop", userSource:"inputDB", roles:["clusterAdmin", "readWrite"], otherDBRoles:{config:["readWrite"]}})

    inputDB:

        db.addUser({user:"hadoop", pwd:"**************", roles:["readWrite"]})

    We keep getting the following error:

        com.mongodb.MongoException: not authorized for query on config.chunks

    In the init() method of MongoCollectionSplitter.java the connector parses the inputURI to get the inputCollection and then works back from it to get the Mongo object. Then, in the calculateSplits() method of ShardChunkMongoSplitter.java, it uses this Mongo object to switch to the config DB and query the chunks collection. The Java Mongo driver uses lazy authentication, so authentication only appears to be triggered when trying to access the config.chunks collection. I was able to get the code working by adding the following line to the init() method in MongoCollectionSplitter.java:

        DBObject one = this.inputCollection.findOne();

    This triggers authentication, and then everything else seems to work correctly. Do we need to make any changes to how we have configured our permissions, or is this a bug in the code? Full stack trace is below:

    {quote}
    2014-12-03 10:52:19,346 INFO com.mongodb.hadoop.util.MongoTool (main): Setting up and running MapReduce job in foreground, will wait for results. {Verbose? false}
    2014-12-03 10:52:19,787 INFO org.apache.hadoop.yarn.client.RMProxy (main): Connecting to ResourceManager at /xxx.xxx.xxx.xxx:9022
    2014-12-03 10:52:23,596 INFO com.mongodb.hadoop.splitter.MongoSplitterFactory (main): Retrieved Collection from authURI mongodb://user:password@mongodb:27017/inputDB.inputCollection
    2014-12-03 10:52:23,999 INFO com.mongodb.hadoop.MongoInputFormat (main): Using com.mongodb.hadoop.splitter.ShardChunkMongoSplitter@476b1425 to calculate splits.
    2014-12-03 10:52:24,003 INFO com.mongodb.hadoop.splitter.ShardChunkMongoSplitter (main): TargetShards flag is false
    2014-12-03 10:52:24,028 INFO org.apache.hadoop.mapreduce.JobSubmitter (main): Cleaning up the staging area /tmp/hadoop-yarn/staging/user/hadoop/.staging/job_1417600633426_0002
    2014-12-03 10:52:24,041 ERROR com.mongodb.hadoop.util.MongoTool (main): Exception while executing job...
    com.mongodb.MongoException: not authorized for query on config.chunks
        at com.mongodb.MongoException.parse(MongoException.java:82)
        at com.mongodb.DBApiLayer$MyCollection.__find(DBApiLayer.java:314)
        at com.mongodb.DBApiLayer$MyCollection.__find(DBApiLayer.java:295)
        at com.mongodb.DBCursor._check(DBCursor.java:368)
        at com.mongodb.DBCursor._hasNext(DBCursor.java:459)
        at com.mongodb.DBCursor.hasNext(DBCursor.java:484)
        at com.mongodb.hadoop.splitter.ShardChunkMongoSplitter.calculateSplits(ShardChunkMongoSplitter.java:94)
        at com.mongodb.hadoop.MongoInputFormat.getSplits(MongoInputFormat.java:58)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
        at com.mongodb.hadoop.util.MongoTool.runMapReduceJob(MongoTool.java:222)
        at com.mongodb.hadoop.util.MongoTool.run(MongoTool.java:96)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    {quote}

    JIRA | 2 years ago | Kevin Brady
    com.mongodb.MongoException: not authorized for query on config.chunks
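    The permission setup described in the report can be sketched in 2.4-era mongo shell syntax as below. This is an illustration of one possible configuration, not a verified fix: it assumes (untested) that the splitter only needs read access to config.chunks, so the config database is granted "read" rather than "readWrite"; the user, database, and placeholder password names are taken from the report.

    ```javascript
    // Run against mongos. Sketch only; assumes MongoDB 2.4's old-style
    // db.addUser() user documents.

    // Credentials are stored on inputDB; the admin entry delegates to them
    // via userSource and adds cluster-wide and config-DB privileges.
    var admin = db.getSiblingDB("admin");
    admin.addUser({
        user: "hadoop",
        userSource: "inputDB",                // look up the password in inputDB
        roles: ["clusterAdmin", "readWrite"],
        otherDBRoles: { config: ["read"] }    // the splitter only queries config.chunks
    });

    // The actual credential document on the input database.
    var inputDB = db.getSiblingDB("inputDB");
    inputDB.addUser({ user: "hadoop", pwd: "<password>", roles: ["readWrite"] });
    ```

    If the error persists even with these roles, the reporter's observation about lazy authentication suggests forcing an initial authenticated operation against the input collection (the findOne() call quoted above) before the splitter queries config.chunks.
    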
  2.

    Error while using mongo hadoop connector with input splits enabled

    Google Groups | 2 years ago | TC
    org.apache.pig.backend.executionengine.ExecException: ERROR 2118: not authorized for query on config.chunks
  3.

    mongodb-user - [mongodb-user] Invalid access error on initial sync - msg#00126 - Recent Discussion OSDir.com

    osdir.com | 12 months ago
    org.apache.pig.backend.executionengine.ExecException: ERROR 2118: not authorized for query on config.chunks


Root Cause Analysis

  1. com.mongodb.MongoException

    not authorized for query on config.chunks

    at com.mongodb.MongoException.parse()
  2. MongoDB Java Driver
    DBCursor.hasNext
    1. com.mongodb.MongoException.parse(MongoException.java:82)
    2. com.mongodb.DBApiLayer$MyCollection.__find(DBApiLayer.java:314)
    3. com.mongodb.DBApiLayer$MyCollection.__find(DBApiLayer.java:295)
    4. com.mongodb.DBCursor._check(DBCursor.java:368)
    5. com.mongodb.DBCursor._hasNext(DBCursor.java:459)
    6. com.mongodb.DBCursor.hasNext(DBCursor.java:484)
    6 frames
  3. com.mongodb.hadoop
    MongoInputFormat.getSplits
    1. com.mongodb.hadoop.splitter.ShardChunkMongoSplitter.calculateSplits(ShardChunkMongoSplitter.java:94)
    2. com.mongodb.hadoop.MongoInputFormat.getSplits(MongoInputFormat.java:58)
    2 frames
  4. Hadoop
    Job$10.run
    1. org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493)
    2. org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
    3. org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
    4. org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    5. org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    5 frames
  5. Java RT
    Subject.doAs
    1. java.security.AccessController.doPrivileged(Native Method)
    2. javax.security.auth.Subject.doAs(Subject.java:415)
    2 frames
  6. Hadoop
    UserGroupInformation.doAs
    1. org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    1 frame
  7. Hadoop
    Job.waitForCompletion
    1. org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    2. org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
    2 frames
  8. com.mongodb.hadoop
    MongoTool.run
    1. com.mongodb.hadoop.util.MongoTool.runMapReduceJob(MongoTool.java:222)
    2. com.mongodb.hadoop.util.MongoTool.run(MongoTool.java:96)
    2 frames
  9. Hadoop
    ToolRunner.run
    1. org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    2. org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    2 frames