I am trying to implement a post-like system with event queues. Each like operation creates a Like document {postId, userId}, and if the upsert succeeds, the PostMetadata document {postId, likesCount} is updated with an incremented likesCount. The unlike operation deletes the corresponding Like document if it exists and decrements likesCount if the deletion succeeded. Likes are processed in parallel batches by workers, and each batch issues a bulkWrite with a list of operations taken from the queue. The problem is that I cannot guarantee consistency for unlike operations: based on the result from the bulk, I don't know which PostMetadata documents to update. Can you recommend a workaround or approach, or is this considered missing functionality in the API?
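To make the setup concrete, here is a sketch of how one queued batch might map to bulkWrite operations against the Like collection. The event shape and the `build_like_ops` helper are assumptions for illustration; the operation documents follow the command-document form that drivers accept.

```python
def build_like_ops(events):
    """Map queued like/unlike events to bulkWrite-style operation
    documents for the Like collection.

    `events` is assumed to be a list of dicts like
    {"op": "like" | "unlike", "postId": ..., "userId": ...}.
    """
    ops = []
    for ev in events:
        key = {"postId": ev["postId"], "userId": ev["userId"]}
        if ev["op"] == "like":
            # Upsert so duplicate likes from the queue stay idempotent.
            ops.append({
                "updateOne": {
                    "filter": key,
                    "update": {"$setOnInsert": key},
                    "upsert": True,
                }
            })
        else:
            # deleteOne succeeds silently even when nothing matches;
            # the bulk result only reports a total deletedCount, which
            # is exactly the consistency gap described above.
            ops.append({"deleteOne": {"filter": key}})
    return ops


batch = [
    {"op": "like", "postId": 1, "userId": 10},
    {"op": "unlike", "postId": 2, "userId": 11},
]
ops = build_like_ops(batch)
```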
Perhaps you could leverage transactions? That way both your Like document creation/deletion and the corresponding PostMetadata update occur within a single transaction, ensuring consistency. Transactions add some overhead, however. What does the volume look like? How many likes/unlikes are there per post, and over what time period for a popular post?
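For reference, MongoDB drivers handle write-conflict aborts by retrying transactions labeled "TransientTransactionError". Below is a minimal sketch of that retry pattern with the database calls stubbed out as a plain callable, so the control flow is visible without a running server; the stubbed transaction body and `state` dict are assumptions, not driver API.

```python
class TransientTransactionError(Exception):
    """Stand-in for a write-conflict abort (real drivers attach the
    "TransientTransactionError" label to such errors)."""


def run_in_transaction(txn_body, max_retries=3):
    """Run `txn_body`, retrying when the transaction is aborted with
    a transient (write-conflict) error."""
    for attempt in range(max_retries):
        try:
            return txn_body()
        except TransientTransactionError:
            if attempt == max_retries - 1:
                raise
            # A real driver would also abort the session's
            # transaction before retrying.


# Simulate a like transaction that conflicts once, then succeeds.
state = {"likesCount": 0, "conflicts_left": 1}

def like_txn():
    if state["conflicts_left"] > 0:
        state["conflicts_left"] -= 1
        raise TransientTransactionError()
    # Both writes would happen atomically here:
    # 1. insert the Like document
    # 2. $inc the PostMetadata likesCount
    state["likesCount"] += 1
    return state["likesCount"]

result = run_in_transaction(like_txn)
```

The retry loop is precisely the overhead mentioned above: every concurrent increment of the same counter may force a retry.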
This is wrong. Imagine two users like the same post concurrently: two transactions will try to increment likesCount on the same document, so one of them will be aborted due to a write conflict (the transaction ACID guarantees). As I pointed out, I use event queues and the system is large scale. What I want is for each operation in the batch, e.g. the third operation which is an unlike, to be able to check whether the deletion actually removed a document, so that a later bulk can increment/decrement the corresponding counters.
I think the MongoDB team will not add this to bulkWrite for efficiency reasons. But this makes it impossible to do this entire operation in two bulk jobs.
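One possible fallback, given that the bulk result lacks per-operation delete results, is to pull unlikes out of the bulk and issue individual delete calls, which do report whether a document was removed (pymongo's `delete_one` result exposes `deleted_count`). A tiny in-memory stand-in for the collection keeps the sketch self-contained; the helper names are assumptions.

```python
from collections import defaultdict


class FakeLikes:
    """In-memory stand-in for the Like collection. Real code would
    call pymongo's delete_one and read result.deleted_count."""

    def __init__(self, docs):
        self.docs = set(docs)

    def delete_one(self, key):
        existed = key in self.docs
        self.docs.discard(key)
        return 1 if existed else 0  # mimics deleted_count


def apply_unlikes(likes, events):
    """Delete each Like individually, accumulating per-post deltas
    only for deletions that actually removed a document."""
    deltas = defaultdict(int)
    for post_id, user_id in events:
        if likes.delete_one((post_id, user_id)) == 1:
            deltas[post_id] -= 1
    # `deltas` would then feed a single bulkWrite of $inc updates
    # against the PostMetadata collection.
    return dict(deltas)


likes = FakeLikes({(1, 10), (1, 11), (2, 20)})
deltas = apply_unlikes(likes, [(1, 10), (1, 99), (2, 20)])
```

The trade-off is one round trip per unlike instead of one per batch, so this only pays off if unlikes are a small fraction of the traffic.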
I think I found a workaround, but I am not sure whether it is efficient for a large number of likes; imagine you have 1k like operations and each post has 10k likes. The idea is to count the number of likes for each post in an aggregation pipeline and then use $merge to update likesCount in the metadata collection. What do you think? Is this going to be efficient?
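That recount-and-merge pipeline might look like the following; the collection name `post_metadata` and the field names are assumptions. The pipeline is shown as plain Python data, the shape a pymongo `aggregate` call would take.

```python
# Recount likes per post and upsert the totals into the metadata
# collection; db.likes.aggregate(recount_pipeline) would run the
# whole recount server-side.
recount_pipeline = [
    # One output document per post with its current like total.
    {"$group": {"_id": "$postId", "likesCount": {"$sum": 1}}},
    # Rename _id back to postId to match the metadata schema.
    {"$project": {"_id": 0, "postId": "$_id", "likesCount": 1}},
    # Upsert counts into post_metadata, keyed on postId.
    {"$merge": {
        "into": "post_metadata",
        "on": "postId",
        "whenMatched": "merge",
        "whenNotMatched": "insert",
    }},
]
```

Two caveats worth checking: `$merge` with `on: "postId"` requires a unique index on that field in the target collection, and without a leading `$match` restricting the pipeline to the postIds touched by the batch, every recount scans all Like documents for every post.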