Hi guys, we are currently developing a Node.js API that reads a CSV file of 3,000,000 records and uploads it to a collection in MongoDB Atlas (M40). We are currently averaging about 1,600 records per second across reading and writing. Do you know of any way to speed up the process so the file loads in the shortest possible time?
The CSV file is loaded into memory from the file system.
Additionally, how can we ensure that when reading so many records, the connection to MongoDB does not hit timeouts, regardless of the number of records?
We are using the insertMany method, with the 3 million records split into batches of 5,000.
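For reference, here is a minimal sketch of our batched insertMany loop (database and collection names are placeholders, and the exact parsing code is omitted; `ordered: false` is an assumption on our part that the batches can tolerate unordered inserts):

```javascript
const BATCH_SIZE = 5000;

// Split the parsed CSV rows into fixed-size batches.
function toBatches(records, batchSize = BATCH_SIZE) {
  const batches = [];
  for (let i = 0; i < records.length; i += batchSize) {
    batches.push(records.slice(i, i + batchSize));
  }
  return batches;
}

// Insert all batches with insertMany, one batch at a time.
// 'mydb' / 'records' are placeholder names for illustration.
async function loadRecords(records, uri) {
  const { MongoClient } = require('mongodb');
  const client = new MongoClient(uri);
  await client.connect();
  try {
    const coll = client.db('mydb').collection('records');
    for (const batch of toBatches(records)) {
      // ordered: false lets the server continue past individual
      // document errors instead of aborting the whole batch
      await coll.insertMany(batch, { ordered: false });
    }
  } finally {
    await client.close();
  }
}
```

With 3,000,000 records and a batch size of 5,000, this loop issues 600 insertMany calls sequentially.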
Thanks for your help guys.