How to import a field from collection-A to collection-B?

I am trying to update a field named Date in collection-B from collection-A by comparing the articleid of both collections. If the articleid matches, then the date from collection-A should be written into the matching document of collection-B.
How will I update the date field?

Collection-A

{
    "_id" : ObjectId("6368cef0cb0c042cbc5cc4e8"),
    "articleid" : "159448182",
    "type" : "online",
    "Date" : "2023-01-01"
}

Collection-B

{
    "_id" : ObjectId("34342dd123b0c042cbc5cc4e8"),
    "articleid" : "159448182",
    "guide" : "yes",
    "Date" : "2023-04-01"
}

Hello @Utsav_Upadhyay2 ,

I am a bit confused about what you want to do with your data; can you please clarify?

  • Do you want to update the already existing Date field in collection-B from collection-A?
  • Do you want to insert a new date field & value from collection-A into collection-B, while keeping the old date?
  • What is your MongoDB version?
  • What is the expected output document, based on the two documents you have provided (as the articleid value is the same in both)?

Regards,
Tarun

  • I want to update the existing Date field of collection-B from collection-A.

  • I need to update the Date field from A to B and do not want to keep the old date in collection-B once it is updated from collection-A.

  • The MongoDB version is 4.2.14.

  • Yes, the date should be updated in collection-B only if the articleid values match.

One way to do it is the following, assuming articleid is unique in collection-A.

const c_A = "collection-A" 
const c_B = "collection-B"
const article_id = "159448182"

const match = { "$match" : { 
    "articleid" : article_id
} } 

const lookup = { "$lookup" : {
    "from" : c_A ,
    /* MongoDB 4.2 does not allow localField/foreignField together with
       pipeline (that shorthand needs 5.0+), so join with $expr instead */
    "let" : { "aid" : "$articleid" } ,
    "pipeline" : [
        { "$match" : { "$expr" : { "$eq" : [ "$articleid" , "$$aid" ] } } } ,
        { "$project" : { "_id" : 0 , "Date" : 1 } }
    ] ,
    "as" : "Date"
} }

const match_found = { "$match" : {
    "Date.0" : { "$exists" : true }
} }

const set = { "$set" : {
    /* after $unwind, Date is a single { Date : ... } document */
    "Date" : "$Date.Date"
} }

const unwind  = { "$unwind" : "$Date" }

const project = { "$project" : {
    "_id" : 1 ,
    "Date" : 1
} }

const merge = { "$merge" : {
    "into" : c_B ,
    "on" : "_id"
} }

/* And the whole pipeline being */
const pipeline = [ match , lookup , match_found , unwind , set , project , merge ]

I really do not know if it is the best approach or if the performance is any good.

What I like about doing it with $merge rather than a $set update is that you can verify what will be updated by removing the $merge stage. This way the $merge acts like a commit: you can build your update incrementally and commit once you are happy.
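To make the "merge as commit" idea concrete, here is a minimal sketch; the commented-out names stand for the stages defined above, and only the array handling matters here:

```javascript
// Hypothetical sketch: the same pipeline runs as a dry run when the
// trailing $merge stage is dropped, and as a "commit" when it is kept.
const merge = { "$merge" : { "into" : "collection-B" , "on" : "_id" } }

// the full pipeline from the answer above, ending in $merge
const pipeline = [ /* match , lookup , match_found , unwind , set , project , */ merge ]

// dry run: everything except the final $merge, so nothing is written
const preview = pipeline.slice(0, -1)

// db.getCollection("collection-B").aggregate(preview)   // inspect the result
// db.getCollection("collection-B").aggregate(pipeline)  // then commit
```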


Thanks for this answer, but what if I need to match all articleids, not only one (that is, wherever an articleid of collection-A matches collection-B, update the field pubdate in collection-B from A)? Also, can I use it with a date range in collection-A? If yes, please give a small example.

The $match stage is optional; it was included only so that the pipeline updates the sample document you provided.

There is no field named pubdate in the sample documents you shared.

I do not understand the following.

Oh, that was totally my mistake, a typo. I meant to say: what if I need to match data from collection-A to collection-B within a range of the Date field?

  • How can I update the Date field of collection-B from collection-A by matching all articleids? If an articleid matches, then only that matching document's date should be updated. It would also be helpful (and quicker) if we could restrict collection-A to a date range using the Date field,
    i.e. first select the data within a date range in collection-A, then match the articleid against collection-B, and update the date only in the matching documents of collection-B from A.

The match_found stage makes sure that only B's documents for which an A's document is found are updated.

You may easily add a $match stage to the pipeline, or inside the $lookup.
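For example, here is a sketch of the whole pipeline with a date-range $match inside the $lookup (the range values are placeholders; the $expr join form is used because on MongoDB 4.2 localField/foreignField cannot be combined with pipeline):

```javascript
// Hypothetical sketch: update collection-B from collection-A for every
// matching articleid, but only consider A documents within a date range.
const lookup = { "$lookup" : {
    "from" : "collection-A" ,
    "let" : { "aid" : "$articleid" } ,
    "pipeline" : [
        { "$match" : {
            "$expr" : { "$eq" : [ "$articleid" , "$$aid" ] } ,
            // restrict A to a date range before joining (placeholder values)
            "Date" : { "$gte" : "2023-01-01" , "$lt" : "2023-02-01" }
        } } ,
        { "$project" : { "_id" : 0 , "Date" : 1 } }
    ] ,
    "as" : "Date"
} }

const pipeline = [
    lookup ,
    { "$match" : { "Date.0" : { "$exists" : true } } } , // keep only B docs with a match
    { "$unwind" : "$Date" } ,
    { "$set" : { "Date" : "$Date.Date" } } ,
    { "$project" : { "_id" : 1 , "Date" : 1 } } ,
    { "$merge" : { "into" : "collection-B" , "on" : "_id" } }
]

// db.getCollection("collection-B").aggregate(pipeline)
```

Since the dates are stored as "YYYY-MM-DD" strings in the sample documents, plain string comparison works for the range; if your real data uses Date objects, use ISODate values in the range instead.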