Improving performance when importing and exporting data from MongoDB

Hi,

I am stuck in a situation and want to know if there is any solution for this.
While exporting and importing data from the MongoDB database, the APIs are taking more than 20,000 ms. I have created indexes on the fields that are necessary, using both compound and single indexes, but I am still not able to improve the performance of my APIs.
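One way to check whether the queries behind the slow APIs actually use those indexes is `explain()`. Below is a minimal sketch with the Node.js driver; the connection string, database, collection (`orders`) and field names are hypothetical placeholders, so adapt them to the real schema:

```ts
import { MongoClient } from "mongodb";

// Hypothetical connection string, collection and field names -- adjust to your setup.
const client = new MongoClient("mongodb://localhost:27017");

async function checkIndexUsage() {
  await client.connect();
  const coll = client.db("mydb").collection("orders");

  // Ask the planner how it would run the query that backs the slow GET API.
  const plan = await coll
    .find({ status: "ACTIVE", createdAt: { $gte: new Date("2023-01-01") } })
    .explain("executionStats");

  // IXSCAN in the winning plan means an index is used; COLLSCAN means a full collection scan.
  console.log(JSON.stringify(plan.queryPlanner.winningPlan, null, 2));
  console.log("docs examined:", plan.executionStats.totalDocsExamined);

  await client.close();
}

checkIndexUsage().catch(console.error);
```

If the winning plan shows a collection scan, or `totalDocsExamined` is much larger than the number of documents returned, the indexes are not being used by that query.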

Any help or idea is greatly appreciated.

Thanks,
Nishchitha

These are the things I need to address:

  • All APIs should return responses quickly
  • The GET API is slow
  • The bulk import API is slow when importing from an Excel sheet (see the sketch after this list)
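If the Excel rows are being inserted one at a time, batching them is usually the first fix for the bulk import. A minimal sketch with the Node.js driver, assuming the sheet has already been parsed into plain objects; the connection string, database, collection name and batch size are placeholders:

```ts
import { MongoClient, Document } from "mongodb";

// Hypothetical names; replace with your real connection string and collection.
const client = new MongoClient("mongodb://localhost:27017");

async function bulkImport(rows: Document[]) {
  await client.connect();
  const coll = client.db("mydb").collection("imports");

  // One round trip per batch instead of one per row; ordered:false lets the
  // server continue past individual failures instead of stopping at the first one.
  const batchSize = 1000;
  for (let i = 0; i < rows.length; i += batchSize) {
    const batch = rows.slice(i, i + batchSize);
    await coll.insertMany(batch, { ordered: false });
  }

  await client.close();
}
```

With only ~3000 documents, a single batched `insertMany` call (or a few) should normally finish in well under a second unless something else is the bottleneck.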

You are sharing far too little information about your full data set for anyone to be able to help.

What is the total size of everything?

What is your system configuration? Client? Server? Connection? RAM? Storage?

My collection has almost 3000 documents. For bulk imports, it is taking too much time to fetch the data.
RAM - 16 GB
System Type - x64-based PC
Processor - Intel(R) Core™ i5-10210U CPU @ 1.60 GHz, 2112 MHz, 4 Core(s), 8 Logical Processor(s)
Disk storage size - 238.47 GB (256,052,966,400 bytes)

Still not enough information.

So the client and server are running on the same machine and fighting for the same resources.

Standalone instance or a 3-node replica set on the same machine?

What is the source of your documents? JSON files, CSV, …?

Which client are you using for bulk imports? mongoimport, mongosh, a Node.js app?

3000 documents can be a small amount of data (e.g. 3000 × 1 KB ≈ 3 MB) or a HUGE one (3000 × 1 MB ≈ 3 GB).

Disk storage size is one thing, but more important is the storage type: SSD, HDD, NAS, SAN? What?

Is the permanent DB storage on the same partition of the same disk as the source file? Everything has an impact.

Is that single machine dedicated to the import, or does it also serve as a web browsing or dev machine? Any IDE running?

Linux or Windows?
