Best way to update multiple documents

Hello to all,

This is my first post here since I am new to MongoDB 🙂

We are building a system using C# that collects transactions from an external provider by calling their APIs.
The transactions are stored in a collection in our DB.
The transactions model from the provider is like this:
{
  id: int,
  date: datetime,
  amount: decimal,
  validated: true/false
}

We call this endpoint daily, get back the day's transactions, and store them in our collection with the same values, except for the id, where we use MongoDB's default one (ObjectId).

When we call the external endpoint we always get back the same data, with a change to the validated field once a transaction has been validated.

So, is there a way to update all the existing documents at once, without looping over the stored DB transactions and comparing them with the incoming ones?

Thanks for your help
Yannis

Do you retain the original ID from the API when you save to Mongo? If so, you could get all the transactions from the API call and put them in a working collection, then run an aggregation with a $merge into the existing collection, with the $merge matching on the ID that identifies the transaction. That would be one server call to update all records, and the server would do the heavy lifting.
Just need to make sure indexes are in place for fast matching.

You can even specify what happens when a document matches or not (whenMatched / whenNotMatched), so you can insert a new item or update an existing one and have different logic applied.

Then when done, clear the working collection.
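A minimal mongosh sketch of that flow, assuming the provider's id is stored in a field called transactionId and the collections are named pendingTransactions and transactions (the names are illustrative):

// Load the daily API response into the working collection first (from the
// C# app), then let the server do the matching and updating in one call.
db.pendingTransactions.aggregate([
  {
    $merge: {
      into: "transactions",        // existing transactions collection
      on: "transactionId",         // the ID that identifies the transaction
      whenMatched: "replace",      // update documents we already have
      whenNotMatched: "insert"     // add transactions we have not seen yet
    }
  }
])

// When done, clear the working collection.
db.pendingTransactions.drop()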


Thanks a lot for your reply! I will test your solution and come back to you.

I've tried your proposal, but I get this error and I cannot resolve it:
“Cannot find index to verify that join fields will be unique”

So, I have a pendingTransactions collection and a transactions collection.
I store the data from the external API call into pendingTransactions,
and then I run:

db.pendingTransactions.aggregate([
  {
    $merge: {
      into: "transactions",
      on: ["transactionId"],
      whenMatched: "replace",
      whenNotMatched: "insert"
    }
  }
]);

The transactionId field belongs to a unique index on the transactions collection, but when I run the aggregation I still see this error.

Any ideas on that?

Update:

I found it. I was using MongoDB Compass to create the indexes and, for some reason, although the index was present, it still gave this error.
Anyway, I created the index through the mongosh command db.transactions.createIndex and it worked.
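For reference, roughly the mongosh command I mean (assuming the field is named transactionId as above; $merge needs a unique index whose key matches the on fields exactly):

// Unique index matching $merge's "on" field; without it the
// "Cannot find index to verify that join fields will be unique" error appears.
db.transactions.createIndex({ transactionId: 1 }, { unique: true })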

Thanks again
Yannis

I had a play and you're right, it throws an error. I got around it with two things:

  • Add a unique index on transactionId to the Transactions collection
  • Project out the _id field from the pending collection's documents and remap it to transactionId
db.getCollection("PendingTransactions").aggregate([
  {
    $project: {
      _id: 0,
      transactionId: "$_id",
      Approved: 1
    }
  },
  {
    $merge: {
      into: "Transactions",
      on: "transactionId"
    }
  }
])

You can play about with the merge options to work out whether you want to replace or merge objects that already exist… if you've updated some documents to add new fields etc., you may not want to overwrite them with all the data from the incoming pending transaction.
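For example (same collection and field names as above; whenMatched: "merge" only overwrites the fields present in the incoming document, while "replace" swaps out the whole document):

// Keep any extra fields already stored on the target document and only
// update the fields that arrive from the pending transaction.
db.getCollection("PendingTransactions").aggregate([
  { $project: { _id: 0, transactionId: "$_id", Approved: 1 } },
  {
    $merge: {
      into: "Transactions",
      on: "transactionId",
      whenMatched: "merge",        // or "replace" to overwrite the whole document
      whenNotMatched: "insert"
    }
  }
])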


This topic was automatically closed 5 days after the last reply. New replies are no longer allowed.