How to take a backup based on ISODate or ObjectId using mongodump v4.2.2?

How do I take a backup based on an ISODate or ObjectId in MongoDB v4.2.2? Can you please let me know the mongodump command?

Can you clarify what you mean by “based on ISODate or ObjectId”?

Do you mean you want to dump all the records in a collection that have some field greater than a particular ISODate or ObjectId?

If you want to select only the documents matching a filter criteria (for example, restricting the documents based on a date field’s value), you can specify a filter with the --query option.
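For example, a minimal sketch of that option, assuming a test database and collection and a hypothetical string field called status (adjust the names to your own data):

> mongodump --db=test --collection=test --query="{ \"status\": \"active\" }" --out dump_dir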

I mean filtered documents, not all documents.
Example: a collection has 100000 documents and I need to take only a limited set of them based on _id or the createdAt date. If you know that command, please let me know.

I hope the example below makes it clear what I am asking.
One more example: documents have a salary field with values from 1 to 100000, and I want to back up only the documents with salary from 1000 to 20000; that kind of filter works for my backup.
But when I try to filter based on _id or a date, the mongodump filter is not working.

I tried the command below, but no documents were written to the output. The command must be wrong; if so, please let me know the correct command.
mongodump --port 27017 --db pms --collection purchase --query {"_id":{“gte”:“ObjectId(“5e464233b5b419ebe5e5c035”)”}} --out D:\backup_export\admin\admin\

OUTPUT:


2020-02-28T14:43:50.612+0530 writing pms.purchase to
2020-02-28T14:43:50.631+0530 done dumping pms.purchase (0 documents)

I see you are using Windows OS. I couldn’t get the query filter with _id (an ObjectId) working, but the following worked fine. Assuming there is a field called name of string type, I could export the documents with the name “Krish”:

mongodump --db=test --collection=test --query="{\"name\": \"Krish\"}"

The documentation says that if the query filter uses ObjectId or date type fields, they must be specified as Extended JSON in the filter.


I tried it with a string and it worked, but when I used a date or _id it did not work.
Example query below:
–query {“createdAt”:{"$lte":“ISODate(“2020-01-23T09:25:48Z”)”}}
Can you please provide the correct command?

These will work on Windows OS. The first is a filter by ObjectId and the second by a date field:

> mongodump --db=test --collection=test --query="{ \"_id\": { \"$gte\" : { \"$oid\": \"5e58e938707edd40784daf83\" } } }"

> mongodump --db=test --collection=test --query="{ \"dt\": { \"$gte\" : { \"$date\": \"2020-02-26T01:00:00.000Z\" } } }"
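The same Extended JSON form also works for a date range; for instance, a sketch with arbitrary example bounds on the same dt field:

> mongodump --db=test --collection=test --query="{ \"dt\": { \"$gte\" : { \"$date\": \"2020-02-01T00:00:00.000Z\" }, \"$lt\" : { \"$date\": \"2020-03-01T00:00:00.000Z\" } } }"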


The --queryFile option:

You can use the --queryFile option instead of the --query, in which case you can put the query filter JSON in a file. For example:

> mongodump --db=test --collection=test --queryFile=qry.json

and the qry.json has the following:
{ "_id": { "$gte" : { "$oid": "5e5911fe9476bac92059a747" } } }

Note the query file option’s JSON is cleaner without the backslash escapes.
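The same approach works for a date filter; for example, assuming the dt field from above, qry.json could instead contain:

{ "dt": { "$gte" : { "$date": "2020-02-26T01:00:00.000Z" } } }

and the mongodump command stays exactly the same.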


Hi Prasad,

Thanks for the solutions. I tried the commands: the first one worked, but both the date and queryFile versions have issues. The errors are given below; please let me know if I made any mistake.
This one was successful:

mongodump --db=test --collection=test --query=“{ "_id": { "$gte" : { "$oid": "5e58e938707edd40784daf83" } } }”

With the command below, which filters based on a date, no documents were written. Can you please help me understand why the documents are not written properly?

  1. date wise:
    D:\bin>mongodump --port 27019 --db pms --collection purchase --query “{"dt":{"$gte":{"$date":"2020-02-12T04:07:34Z"}}}” --out D:\backup_export\admin\admin
    2020-03-03T16:19:00.483+0530 writing pms.purchase to
    2020-03-03T16:19:00.527+0530 done dumping pms.purchase (0 documents)

I tried --queryFile as you mentioned, and I am getting the error below. Can you please help with this?

  1. queryFile wise:

mongodump --port 27019 --db pms --collection purchase --queryFile “{"_id":{"$gte":{"$oid":"5e461d06dba04739ea454892"}}}” --out D:\backup_export\admin\admin\ --gzip
2020-03-03T15:47:53.961+0530 Failed: error reading queryFile: open {“_id”:{“$gte”:{“$oid”:“5e461d06dba04739ea454892”}}}: The filename, directory name, or volume label syntax is incorrect.

In your query:

mongodump --port 27019 --db pms --collection purchase --queryFile “{”_id":{"$gte":{"$oid":“5e461d06dba04739ea454892”}}}" --out D:\backup_export\admin\admin\ --gzip

You are using the --queryFile option, but you have supplied the query itself instead of a file name, hence the error. Change --queryFile to --query (or put the query JSON into a file and pass that file’s path to --queryFile).
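If you do want to keep --queryFile, a sketch of that variant would be to save your filter into a file first (the path D:\qry.json is just an example) with this content:

{ "_id": { "$gte" : { "$oid": "5e461d06dba04739ea454892" } } }

and then pass only the file path to the option:

mongodump --port 27019 --db pms --collection purchase --queryFile D:\qry.json --out D:\backup_export\admin\admin --gzip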


Since there is no error in this case, the issue must be with the query criteria and the data available in the collection. Also, the type of quotes you are using might be the problem; use straight quotes " " instead of curly quotes “ ”.

I used two kinds of commands, the first with backslash escapes and the second with straight quotes " " instead of “ ”, but I am still not getting the proper output. Please confirm whether anything in the commands below needs to change.

D:\bin>mongodump --port 27019 --db pms --collection purchase --query “{“dt”:{”$gte":{"$date":“2020-02-14T04:07:34Z”}}}" --out D:\backup_export\admin\admin\ --gzip
Output:


2020-03-03T19:22:16.014+0530 writing pms.purchase to
2020-03-03T19:22:16.059+0530 done dumping pms.purchase (0 documents)

D:\bin>mongodump --port 27019 --db pms --collection purchase --query “{“dt”:{”$gte":{"$date":“2020-02-12T04:07:34Z”}}}" --out D:\backup_export\admin\admin
Output:


2020-03-03T19:24:06.198+0530 Failed: error parsing query as Extended JSON: invalid JSON input

mongodump --port 27019 --db pms --collection purchase --query "{ \"dt\":{ \"$gte\": { \"$date\": \"2020-02-14T04:07:34Z\" } } }" --out D:\backup_export\admin\admin --gzip

Please note that:

  • D:\backup_export\admin\admin should be a directory.
  • --gzip is optional; it compresses the dump output.
  • dt is the name of the field in the purchase collection of the pms database.

Make sure there is data in your collection by running this query from the Mongo Shell; these are the documents that will be exported:
db.purchase.find( { dt: { $gte: ISODate("2020-02-14T04:07:34Z") } } )
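If you just want to check how many documents the filter matches, append a count to the same query:

db.purchase.find( { dt: { $gte: ISODate("2020-02-14T04:07:34Z") } } ).count()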

For more details on mongodump see the documentation.

On Linux/macOS, the shell treats $ inside double quotes as a variable, so escape the $.

Like this:
-q "{'_id':{'\$lte':ObjectId('ffffffff0000000000000000')}}"