How can I import data from AWS S3 buckets into an Atlas cluster with Atlas Data Federation

Please can you help me.
I am working on a project to migrate data from DynamoDB to MongoDB Atlas (which already contains collections). I need to insert the data into the existing collections and create the ones that don't exist yet.

I have not found an ETL tool that lets me link the two databases, so I decided to create an S3 bucket in which I will store my data exported from DynamoDB.

My task is now to migrate the data from S3 to MongoDB Atlas.
I opted for Atlas Data Federation and was able to connect S3 to Atlas.
For the moment, though, I can't write the triggers that will retrieve the data.

On Google I could only find documentation covering the migration from MongoDB Atlas to S3, not the opposite direction (as in my case).

So I ask for your help

Hi @tim_Ran and welcome to the MongoDB Community :muscle: !

I think you are looking for this:


Give it a try and let me know how it goes :slightly_smiling_face:


Alas, I am still at an impasse.
Is it possible to have a sample trigger to import data from S3 into Data Federation?

In App Services you can create a Trigger on a MongoDB Atlas write operation in a collection but not when something happens in S3.

Using $out though, you can write something to an Atlas collection or an S3 bucket.
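As a rough sketch of that `$out` idea: you run an aggregation against the *federated database instance's* connection string, reading from the collection that maps to your S3 data and writing into a cluster collection. All names below (`Cluster0`, `s3_db`, `orders_raw`, `migrated_db`, `orders`) are placeholders, and the exact `$out`-to-Atlas options (e.g. whether you need `projectId`) should be checked against the Data Federation docs:

```python
# Pipeline sketch: read from the federated S3-backed collection and write
# the documents into a collection on an Atlas cluster via $out.
pipeline = [
    # Optional reshaping/filtering stages would go here ($match, $project, ...).
    {
        "$out": {
            "atlas": {
                "clusterName": "Cluster0",  # placeholder cluster name
                "db": "migrated_db",        # target database on the cluster
                "coll": "orders",           # target collection
            }
        }
    },
]

def run_migration(federated_uri: str) -> None:
    """Run the pipeline on the federated database instance (not the cluster)."""
    from pymongo import MongoClient  # deferred import; requires pymongo installed

    client = MongoClient(federated_uri)
    client["s3_db"]["orders_raw"].aggregate(pipeline)
```

Note that this is a one-shot (or scheduled) batch copy, not an event-driven trigger.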

Maybe there is an equivalent service in AWS that listens to write operations in S3 and triggers an event?
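There is: S3 Event Notifications can invoke a Lambda function whenever an object is created in a bucket. A minimal boto3 sketch, assuming the Lambda already exists and has the right resource policy (the ARN and bucket name are placeholders):

```python
# Configure the bucket to invoke a Lambda on every object-created event.
notification_config = {
    "LambdaFunctionConfigurations": [
        {
            # Placeholder ARN of the Lambda that will load data into Atlas.
            "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:s3-to-atlas",
            "Events": ["s3:ObjectCreated:*"],
        }
    ]
}

def enable_notifications(bucket: str) -> None:
    import boto3  # deferred import; needs AWS credentials at runtime

    s3 = boto3.client("s3")
    s3.put_bucket_notification_configuration(
        Bucket=bucket,
        NotificationConfiguration=notification_config,
    )
```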

Maybe this?

From a Lambda function you can use the MongoDB Driver or the Atlas Data API to write stuff into MongoDB.
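A rough sketch of such a Lambda handler, assuming each S3 object holds a JSON array of documents (the connection URI, database, and collection names are placeholders):

```python
import json
import urllib.parse

def parse_s3_event(event: dict) -> list[tuple[str, str]]:
    """Extract (bucket, key) pairs from an S3 event notification payload."""
    return [
        (
            rec["s3"]["bucket"]["name"],
            urllib.parse.unquote_plus(rec["s3"]["object"]["key"]),
        )
        for rec in event.get("Records", [])
    ]

def handler(event, context):
    # Deferred imports: bundle pymongo with the deployment package.
    import boto3
    from pymongo import MongoClient

    s3 = boto3.client("s3")
    client = MongoClient("mongodb+srv://...")  # placeholder Atlas URI
    coll = client["migrated_db"]["orders"]

    for bucket, key in parse_s3_event(event):
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        docs = json.loads(body)  # assumption: each object is a JSON array
        if docs:
            coll.insert_many(docs)
```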

Take a look at this blog post:

And this doc to avoid creating a new connection with each Lambda execution (a big anti-pattern):
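The gist of that pattern: create the client once, outside the handler (or memoize it lazily), so warm Lambda invocations reuse the existing connection instead of opening a new one every time. A sketch with a placeholder URI:

```python
# Cache the client at module scope so it survives across warm invocations.
_client = None

def get_client(factory):
    """Return the cached client, creating it with `factory` on first call only."""
    global _client
    if _client is None:
        _client = factory()
    return _client

def handler(event, context):
    from pymongo import MongoClient  # deferred import for illustration

    # The MongoClient is constructed at most once per Lambda container.
    client = get_client(lambda: MongoClient("mongodb+srv://..."))  # placeholder URI
    # ... use `client` to write documents here ...
```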