Maybe there is a better way with the new update-with-aggregation-pipeline feature, but my approach would be to use $lookup to get the correct value and then $merge it back into the collection.
This is untested, with some blanks still to fill.
We start with a $match stage:
match_stage = { "$match" : { "type" : "DEVICE" } }
Then a $lookup stage:
lookup_stage = { "$lookup" : {
"from" : "col1" ,
"localField" : "id" ,
"foreignField" : "id" ,
"as" : _tmp_lookup_result
} }
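After the $lookup, each matched document should look roughly like this (the id value here is made up for illustration):
{ _id: ObjectId(...), type: "DEVICE", id: "abc", _tmp_lookup_result: [ { _id: ObjectId(...), id: "abc", Entity_ID: 111 } ] }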
Then a small utility stage to get the first (and what should be the only) element of the lookup result.
set_first_stage = { "$set" : { "_tmp_first" : { "$first": "$_tmp_lookup_result" } } }
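Note that $first as an array expression requires MongoDB 4.4; on 4.2 (the minimum version for $merge anyway) the equivalent is $arrayElemAt:
set_first_stage = { "$set" : { "_tmp_first" : { "$arrayElemAt" : [ "$_tmp_lookup_result" , 0 ] } } }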
The next stage is a $project that only keeps the _id (which $project keeps by default) and the value we want to merge into the collection. So far I used _tmp* field names because I like to see the intermediate results when I debug; the $project gets rid of the _tmp* fields.
project_stage = { "$project" : { "entityId" : "$_tmp_first.Entity_ID" } }
If we just aggregate the pipeline
pipeline = [ match_stage , lookup_stage , set_first_stage , project_stage ]
we should get results like
[ { _id: ObjectId(...), entityId: 111 } ]
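For example, with pymongo it could be run like this (a sketch; the connection details are placeholders, and I am assuming the pipeline runs on col2, the collection we later merge back into):
from pymongo import MongoClient

client = MongoClient()  # placeholder connection
db = client["test"]     # placeholder database name

pipeline = [ match_stage , lookup_stage , set_first_stage , project_stage ]
for doc in db["col2"].aggregate(pipeline):
    print(doc)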
If we're happy with the result, we can add a $merge stage to write the new value back into the collection:
merge_stage = { "$merge" : { "into" : "col2" , "on" : "_id" } }
By default $merge merges the new fields into the matching documents (whenMatched: "merge"), which is what we want here. So the final pipeline that updates the documents is
pipeline = [ match_stage , lookup_stage , set_first_stage , project_stage , merge_stage ]
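Again a sketch with pymongo; nothing is written until aggregate() actually runs, and a $merge pipeline only returns an empty cursor:
pipeline = [ match_stage , lookup_stage , set_first_stage , project_stage , merge_stage ]
db["col2"].aggregate(pipeline)  # $merge writes the updates server-side; the cursor yields no documents

If the aggregation indeed runs on col2 itself, note that $merge into the same collection being aggregated requires MongoDB 4.4 or later.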