Bulk insert a directory of JSON files into a single collection with mongoimport, in parallel

The following works for us; it imports the single file our_file.json into our MongoDB collection:

    mongoimport --uri "mongodb+srv://<username>:<password>@our-cluster.dwxnd.gcp.mongodb.net/dbname" \
      --collection our_coll_name --drop --file /tmp/our_file.json

The following does not work, because --file cannot point at a directory (/tmp/our_directory):

    mongoimport --uri "mongodb+srv://<username>:<password>@our-cluster.dwxnd.gcp.mongodb.net/dbname" \
      --collection our_coll_name --drop --file /tmp/our_directory

We predictably get the error:

    Failed: error processing document #1: read /tmp/our_directory: is a directory

We’ve come up with the following solution:

    cat /tmp/our_directory/*.json \
      | mongoimport --uri "mongodb+srv://<username>:<password>@our-cluster.dwxnd.gcp.mongodb.net/dbname" \
          --collection our_coll_name --drop

It took 11 minutes to mongoimport the 103 JSON files in /tmp/our_directory (combined size ~1 GB) into our MongoDB collection. We also tested the mongoimport speed with a single 1 GB file (rather than 103 files), and it likewise took roughly 11 minutes.
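For reference, the single-file comparison can be reproduced roughly like this (assuming the single file is just the concatenation of the 103 files; /tmp/combined.json is an illustrative name, not something from our setup):

    # Build one ~1 GB file from the 103 JSON files. This assumes the files
    # contain newline-delimited JSON documents, which is what mongoimport
    # expects by default (and which must hold for the cat pipeline above to work).
    cat /tmp/our_directory/*.json > /tmp/combined.json

    # Import the combined file; for us this took roughly the same 11 minutes.
    mongoimport --uri "mongodb+srv://<username>:<password>@our-cluster.dwxnd.gcp.mongodb.net/dbname" \
      --collection our_coll_name --drop --file /tmp/combined.json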

  1. Is this the right / best / optimal approach for bulk inserting an entire directory of JSON files into a single collection?
  2. Is it possible to parallelize, or use multi-threading, so that mongoimport of the 103 files outperforms mongoimport of the single file (the 103 files combined)? Something like the sketch below is what we have in mind.
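For concreteness, the kind of parallel run we are imagining looks roughly like the following. This is an untested sketch: it picks 4 workers arbitrarily via xargs -P, drops the collection once up front with mongosh instead of passing --drop to each worker, and assumes concurrent mongoimport processes can insert into the same collection without issue.

    # Drop the collection once, up front, rather than passing --drop to each
    # parallel mongoimport (which would wipe out the other workers' inserts).
    mongosh "mongodb+srv://<username>:<password>@our-cluster.dwxnd.gcp.mongodb.net/dbname" \
      --eval 'db.our_coll_name.drop()'

    # Import the 103 files with up to 4 mongoimport processes at a time.
    # -print0 / -0 keep filenames with spaces intact; -I {} substitutes one
    # file per mongoimport invocation.
    find /tmp/our_directory -name '*.json' -print0 \
      | xargs -0 -P 4 -I {} \
          mongoimport --uri "mongodb+srv://<username>:<password>@our-cluster.dwxnd.gcp.mongodb.net/dbname" \
            --collection our_coll_name --file {}

We are unsure whether the bottleneck is client-side parsing or the cluster's write throughput (the identical ~11-minute timings suggest it may not be the number of files), which is partly why we are asking whether this kind of parallelization can help at all.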