Mongo to S3 in Parquet

Hope you are doing well. I have a requirement to export a huge MongoDB collection (>60 million docs) to S3 in Parquet format. I want an initial full extract to S3, and from then on incremental loads that pick up documents updated since the previous successful export's date-time. I tried following the instructions in "How to Automate Continuous Data Copying from MongoDB to S3" on the MongoDB site, but my trigger errors out with an "execution time limit exceeded" error. It looks like a single trigger run cannot export a collection this large to S3. I'm looking for options.
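To illustrate the pattern I'm after, here is a minimal Python sketch of the two pieces: a watermark filter (full extract on the first run, then only docs updated since the last successful export) and fixed-size batching so each unit of work stays well under any execution-time limit. The field name `updatedAt` and the batch size are assumptions; the actual read from MongoDB and write of each batch as a Parquet file to S3 (e.g. via PyMongo, PyArrow, and boto3) would plug in around this.

```python
from datetime import datetime
from itertools import islice
from typing import Optional

def incremental_filter(last_success: Optional[datetime]) -> dict:
    """Build the Mongo query filter for one export run.

    None -> initial full extract (empty filter); otherwise only
    documents updated after the last successful export time.
    `updatedAt` is an assumed field name -- substitute whatever
    last-modified timestamp your documents actually carry.
    """
    if last_success is None:
        return {}
    return {"updatedAt": {"$gt": last_success}}

def batches(cursor, size=100_000):
    """Split an iterable of documents into fixed-size batches,
    writing one Parquet file per batch, so no single step runs
    long enough to hit a trigger execution-time limit."""
    it = iter(cursor)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# Simulate 250k docs to see how many Parquet files one run would produce.
docs = ({"_id": i} for i in range(250_000))
n_files = sum(1 for _ in batches(docs, size=100_000))
```

After a run completes, the maximum `updatedAt` seen (or the run's start time) would be persisted as the new watermark for the next incremental load.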