MongoDB Atlas Data Lake is an analytics-optimized object storage service for extracted data. It is optimized for flat or nested data and delivers low-latency query performance.
Atlas Data Lake requires an
M10 or higher backup-enabled Atlas cluster
with cloud backup jobs running on a specified cadence. To learn more
about cloud backups, see Back Up Your Database Deployment.
Atlas Data Lake supports collection snapshots from Atlas clusters as a data source for extracted data. Atlas Data Lake automatically ingests data from the snapshots, then partitions and stores it in an analytics-optimized format.
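Conceptually, partitioning groups documents by the values of selected fields so that a query scoped to those fields reads only the matching partitions rather than the whole dataset. A minimal sketch of the idea in plain Python (the documents and partition field are hypothetical illustrations, not an Atlas API):

```python
from collections import defaultdict

def partition(docs, key_fields):
    """Group documents into partitions keyed by the values of key_fields.

    A query that filters on the partition fields only needs to read the
    matching partitions instead of scanning every document.
    """
    partitions = defaultdict(list)
    for doc in docs:
        key = tuple(doc.get(f) for f in key_fields)
        partitions[key].append(doc)
    return dict(partitions)

# Hypothetical extracted documents:
docs = [
    {"region": "br", "year": 2022, "total": 10},
    {"region": "br", "year": 2023, "total": 12},
    {"region": "us", "year": 2023, "total": 7},
]
parts = partition(docs, ["region"])
# A query scoped to region "br" now reads only that partition.
```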
You can use Atlas Data Lake to:
Isolate analytical workloads from your operational cluster.
Provide a consistent view of cluster data from a snapshot for long-running aggregations.
Query and compare across versions of your cluster data at different points in time.
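As an illustration of the last use case, the comparison across points in time can be expressed as a standard aggregation pipeline run against two snapshot-backed datasets. The dataset and field names below are hypothetical; with pymongo you would pass the pipeline to `aggregate()` on a federated database connection:

```python
# Hypothetical: compare order totals between two point-in-time datasets.
# With pymongo this would run as, e.g.:
#   client["my_federated_db"]["orders_2023_06_01"].aggregate(pipeline)
pipeline = [
    # Join the older snapshot's dataset against the newer one on order_id.
    {"$lookup": {
        "from": "orders_2023_07_01",   # hypothetical dataset name
        "localField": "order_id",
        "foreignField": "order_id",
        "as": "later",
    }},
    {"$unwind": "$later"},
    # Keep only orders whose total changed between the two points in time.
    {"$match": {"$expr": {"$ne": ["$total", "$later.total"]}}},
    {"$project": {"order_id": 1,
                  "old_total": "$total",
                  "new_total": "$later.total"}},
]
```

Because the pipeline reads from snapshot-backed datasets, it sees a consistent view of the data and places no load on the operational cluster.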
Atlas Data Lake provides optimized storage in the following AWS regions:
Sao Paulo, Brazil
Atlas Data Lake automatically selects the region closest to your Atlas cluster for storing ingested data.
You incur Atlas Data Lake charges per GB per month based on the AWS region where the ingested data is stored. You incur Atlas Data Lake costs for the following items:
Ingestion of data from your data source
Storage on the cloud object storage
Atlas Data Lake charges you for the resources used to extract, upload, and transfer data. Atlas Data Lake charges for snapshot export operations are based on the following:
Cost per GB for snapshot extraction
Cost per hour on the AWS server for snapshot export download
Cost per GB per hour for snapshot export restore storage
Cost per IOPS per hour for snapshot export storage IOPS
Atlas Data Lake charges for storing and accessing stored data are based on the following:
Cost per GB per day
Cost for every one thousand storage access requests, where each access request is a dataset partition being processed
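Putting the storage-and-access line items together, a rough monthly estimate can be sketched as below. All rates are placeholder values for illustration only, not actual Atlas prices; consult the Atlas pricing page for real figures:

```python
def estimate_monthly_cost(
    stored_gb,
    access_requests,              # number of dataset partitions processed
    rate_per_gb_day=0.01,         # placeholder rate, not an Atlas price
    rate_per_1k_requests=0.005,   # placeholder rate, not an Atlas price
    days=30,
):
    """Rough Data Lake storage-and-access cost for one month."""
    storage = stored_gb * rate_per_gb_day * days          # cost per GB per day
    access = (access_requests / 1000) * rate_per_1k_requests
    return round(storage + access, 2)

# 500 GB stored for 30 days plus 2 million partition reads:
cost = estimate_monthly_cost(500, 2_000_000)
```

Note that snapshot export charges (extraction, download, restore storage, and IOPS) are billed separately from this storage-and-access estimate.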
To learn more, see the Atlas pricing page.