Merging documents into one document and accumulating fields and values

I thought it would be easiest to express what I want in code, so I’ve written an example.

const documents = [
  {
    id: '1',
    name: 'Ben'
  },
  {
    id: '2',
    name: 'John',
    age: 24
  },
  {
    id: '3',
    name: 'Jane',
    country: 'USA'
  }
]

const result = documents.reduce((acc, doc) => {
  // loop through the given document
  // add 'key(field name)' to the accumulator object and
  // push the 'value' to the set
  Object.entries(doc).forEach(([key, value]) => {
    // ignore id
    if (key === 'id') {
      return
    }
    acc[key] = acc[key] || new Set()
    acc[key].add(value)
  })
  return acc
}, {})

console.log(result)

// resulting in
// {
//   name: Set(3) { 'Ben', 'John', 'Jane' },
//   age: Set(1) { 24 },
//   country: Set(1) { 'USA' }
// }

How would I achieve this aggregation in MongoDB? I have some clues but am not really sure how.
I could make use of $mergeObjects, $group, $map, $reduce. The code is already there, so I could simply run it on Node.js, but I’m leaving that as a last resort. And if I use $map or $reduce like in the code above, does that cause performance issues compared to other, smarter-looking aggregations?
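For reference, the Node.js fallback mentioned above might look roughly like this (a minimal sketch: the connection string, database, and collection names are placeholders, and the reduce is the same one shown earlier):

const { MongoClient } = require('mongodb')

async function accumulateFields() {
  // placeholder connection string, database and collection names
  const client = new MongoClient('mongodb://localhost:27017')
  await client.connect()
  const docs = await client.db('test').collection('documents').find().toArray()

  // same reduce as above, skipping the driver's _id as well as id
  const result = docs.reduce((acc, doc) => {
    Object.entries(doc).forEach(([key, value]) => {
      if (key === 'id' || key === '_id') return
      acc[key] = acc[key] || new Set()
      acc[key].add(value)
    })
    return acc
  }, {})

  await client.close()
  return result
}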

One way to do it would be:

set_stage = { "$set" : { "_array" : { "$objectToArray" : "$$ROOT" } } }
unwind_stage = { "$unwind" : "$_array" }
group_stage = { '$group': { _id: '$_array.k', v: { '$push': '$_array.v' } } }
pipeline = [ set_stage , unwind_stage , group_stage ]
collection.aggregate( pipeline )
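
Note that $push keeps duplicate values, whereas the original code accumulates values into a Set. If deduplicated values are wanted, $addToSet could be swapped in for $push (a variant sketch, not part of the original pipeline):

group_stage = { '$group': { _id: '$_array.k', v: { '$addToSet': '$_array.v' } } }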

I leave it as an exercise for the reader to filter out the unwanted keys (with a $match between $unwind and $group) as per the requirement.
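
For instance, assuming both the collection's _id and the id field should be excluded, the extra stage might look like this (a sketch of one possible $match, not the only way to write it):

match_stage = { '$match': { '_array.k': { '$nin': [ '_id', 'id' ] } } }
pipeline = [ set_stage , unwind_stage , match_stage , group_stage ]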
