Transfer a table from BigQuery to MongoDB using Dataflow

I am using Dataflow to move a table from BigQuery to MongoDB. Dataflow creates a new VM to run the migration job, and every time the job runs the new VM gets its own IP. If I don't whitelist that IP, the job fails. How can I fix this challenge and make the pipeline work on a schedule?

Hi @Arun_Shankar, welcome to the community.

Is the target of the migration a MongoDB Atlas cluster?

The Atlas Administration API, or the Atlas CLI authenticated with an API key, can help automate updating the IP access list.

Using the API directly:
https://www.mongodb.com/docs/atlas/security/ip-access-list/#add-ip-access-list-entries
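If you go the API route, a call like the one below could be scripted. This is only a minimal sketch: PUBLIC_KEY, PRIVATE_KEY, and PROJECT_ID are placeholders for your own programmatic API key and project ID, and the ifconfig.me lookup is just one way to discover the caller's public IP.

```sh
# Minimal sketch: PUBLIC_KEY, PRIVATE_KEY and PROJECT_ID are placeholders.
# Looks up this machine's public IP (ifconfig.me is just one option) and
# adds it to the project IP access list using HTTP digest auth.
CURRENT_IP=$(curl -s https://ifconfig.me)

curl --user "PUBLIC_KEY:PRIVATE_KEY" --digest \
  --header "Content-Type: application/json" \
  --request POST \
  "https://cloud.mongodb.com/api/atlas/v1.0/groups/PROJECT_ID/accessList" \
  --data "[{\"ipAddress\": \"${CURRENT_IP}\", \"comment\": \"Dataflow run\"}]"
```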

Using the Atlas CLI:
https://www.mongodb.com/docs/atlas/security/ip-access-list/#add-ip-access-list-entries

The Atlas CLI has a handy --currentIp flag, so something like this could be automated to add the current IP:
atlas accessLists create --currentIp --deleteAfter $(date -d 'tomorrow' --iso-8601=seconds)
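
For a scheduled pipeline, that command could be wrapped in a small pre-step that runs on the machine whose IP needs access, right before the migration starts. The sketch below assumes the Atlas CLI is installed and already authenticated there (for example with an API-key profile); the gcloud line at the end is only a placeholder for however you actually launch your Dataflow job.

```sh
# Sketch only: assumes the Atlas CLI is installed and authenticated on this
# machine, and that the gcloud command below matches your own job setup.
set -euo pipefail

# Temporarily allow this machine's public IP; the entry expires after a day.
atlas accessLists create --currentIp \
  --deleteAfter "$(date -d 'tomorrow' --iso-8601=seconds)" \
  --comment "scheduled BigQuery-to-MongoDB migration"

# Give the access-list change a moment to propagate before starting the job.
sleep 30

# Placeholder launch step: replace the template path and region with your own.
gcloud dataflow flex-template run "bq-to-mongodb-$(date +%s)" \
  --template-file-gcs-location "gs://YOUR_BUCKET/templates/bq-to-mongodb.json" \
  --region "us-central1"
```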

If you know which IP ranges Dataflow uses, adding those entire ranges to the access list would simplify the flow, with the tradeoff of being less secure. An example of adding a whole range is sketched below.
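
For example, a whole CIDR block could be added in one go. The range below is a placeholder, and --type cidrBlock is the CLI option I'd expect to use for this, so treat it as a sketch rather than a tested command.

```sh
# Sketch: 192.0.2.0/24 is a placeholder; substitute the range(s) your
# Dataflow workers actually use in your region.
atlas accessLists create 192.0.2.0/24 \
  --type cidrBlock \
  --comment "Dataflow worker IP range"
```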