Getting Connection Errors Randomly While Using mongodump in a Bash Script

Let me explain my use case first:

I have 30 databases, each containing approximately 150 collections. I wrote a bash script to dump all the collections of all these databases based on a specified query. While running the script, I sometimes get the following errors (roughly once per 30 collections dumped):

Failed: can't create session: failed to connect to {CONNECTION_STRING} connection() error occurred during connection handshake: context deadline exceeded.

Failed: can't create session: failed to connect to {CONNECTION_STRING} connection() error occurred during connection handshake: dial tcp: lookup i/o timeout.

Since we can't dump more than one collection with a specific query in a single mongodump command, and I have a large number of collections to dump, I intentionally run the dump commands from the bash script in parallel to speed things up. I did try a fix that slept for 1s after launching each dump command (a sketch of that variant is right below), but that made the whole dump take far too long to complete.
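
The 1s-sleep variant I tried was essentially the inner dump loop with a sleep after each launch (a minimal sketch; the variable names match the full script that follows):

for collection in $DB_COLLECTIONS;
do
    echo "Dumping ${collection} collection..."
    mongodump --uri="$CONNECTION_STRING" --db "$db-Client" --collection "${collection}" --query "$QUERY" --readPreference secondary --out "${DEMO_TEMPLATE_ID}" &
    bg_process_ids+=($!)
    sleep 1   # throttle: pause 1s before launching the next dump
done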

My full bash script is below:


#!/bin/bash

set -e

dumpAllClientDB() {

    # Read the client database list and the connection string from config.json
    All_Client_DB=($(jq -r '.AllDemoDB[]' config.json))
    CONNECTION_STRING=$(jq -r '.CONNECTION_STRING' config.json)

    QUERY="{\"CompanyId\": \"${ChoosenCompanyId}\"}"
    
    GET_COLLECTION_COMMAND="var collections = db.getCollectionNames(); \
                       for (var i = 0; i < collections.length; i++) { \
                           print(collections[i]); \
                       }"

    bg_process_ids=()

    for db in "${All_Client_DB[@]}";
    do
        echo "Dumping ${db} database for CompanyId ${ChoosenCompanyId}..."

        DB_COLLECTIONS=$("mongosh" "$CONNECTION_STRING/$db-Client" --quiet --eval "$GET_COLLECTION_COMMAND" | tr -d "',[]")

        # Start multiple background processes
        for collection in $DB_COLLECTIONS;
        do
            echo "Dumping ${collection} collection..."
            mongodump --uri="$CONNECTION_STRING" --db "$db-Client" --collection "${collection}" --query "$QUERY" --readPreference secondary --out "${DEMO_TEMPLATE_ID}" &
            bg_process_ids+=($!)
        done
    done

    # Wait for all background processes to complete
    for pid in "${bg_process_ids[@]}"; do
        wait "$pid"
    done

    echo -n "mongodump completed successfully."
}

DEMO_TEMPLATE_ID=$1

echo "DEMO_TEMPLATE_ID: $DEMO_TEMPLATE_ID"

# Look up the CompanyId mapped to this template in config.json
ChoosenCompanyId=$(jq -r --arg key "${DEMO_TEMPLATE_ID}_companyId" '.[$key]' config.json)

mkdir -p "${DEMO_TEMPLATE_ID}"

dumpAllClientDB
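
For reference, the job invokes the script roughly like this (the script file name and template id are illustrative; the config.json keys listed in the comments are simply what the jq lookups above expect):

# config.json is expected to provide:
#   CONNECTION_STRING               - the MongoDB URI used by both mongosh and mongodump
#   AllDemoDB                       - array of client database name prefixes ("-Client" is appended)
#   <DEMO_TEMPLATE_ID>_companyId    - the CompanyId used to build the dump query
./dump_client_dbs.sh demo-template-01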


What could be a probable solution to the problem mentioned above? Or can you suggest an alternative approach for my use case, keeping in mind that the dump has to run as a job, not manually?