Visualize MongoDB Atlas Database Audit Logs
MongoDB Atlas offers advanced security capabilities, and audit logs are one of them. Simply put, enabling audit logs on an Atlas cluster allows you to track what happened in the database, by whom, and when.
In this blog post, I’ll walk you through how you can visualize MongoDB Atlas Database Audit Logs with MongoDB Atlas Charts.
- In Atlas App Services Values, the Atlas Admin API public and private keys and the AWS API access key ID and secret access key have been defined.
- aws-sdk node package has been added as a dependency to Atlas Functions.
- Atlas Data Federation has been configured to query the data in a cloud object storage bucket (Amazon S3 or Microsoft Azure Blob Storage).
- Atlas Function retrieves both Atlas Admin API and AWS API credentials.
- Atlas Function calls the Atlas Admin API with the credentials and other relevant parameters (time interval for the audit logs) and fetches the compressed audit logs.
- Atlas Function uploads the compressed audit logs as a file into a cloud object storage bucket where Atlas has read access.
- Atlas Charts visualizes the data in S3 through Atlas Data Federation.
The following items must be completed before working on the steps.
- Provision an Atlas cluster where the tier is at least M10. This is because auditing is not supported on free (M0) or shared-tier (M2, M5) clusters.
- Under the Security section on the left-hand menu of the main dashboard, select Advanced. Then toggle Database Auditing on and click Audit Filter Settings.
- For the sake of simplicity, check All actions to be tracked in the Audit Configuration, as shown in the screenshot below.
- If you don’t have your own load generator to create database activity to visualize later with MongoDB Charts, you can review the load generator in this blog post’s GitHub repository.
- Create an app in Atlas App Services that will implement our functions inside it. If you haven’t created an app in Atlas App Services before, please follow this tutorial.
- Create an AWS account and generate the following credentials: an AWS Access Key ID and an AWS Secret Access Key.
- Set up an AWS IAM role that has the privilege to write to the cloud object storage bucket.
- Later, Atlas will assume this role to perform write operations in the S3 bucket.
The Atlas Admin API allows you to automate your Atlas environment. With a REST client, you can perform a wide variety of management activities, such as retrieving audit logs.
In order to utilize the Atlas Admin API, we need to create API keys and use them later in Atlas Functions. Follow the instructions to create an API key for your project.
After you’ve successfully created public and private keys for the Atlas project, we can store the Atlas Admin API keys and AWS credentials in App Services Values and Secrets.
App Services Values and App Services Secrets are static, server-side constants that you can access or link to from other components of your application.
In the following part, we’ll create four App Services Values and two App Services Secrets to manage both MongoDB Atlas Admin API and AWS credentials. In order to create App Services Values and Secrets, navigate to your App Services app, and on the left-hand menu, select Values. This will bring you to a page showing the secrets and values available in your App Services app.
In this section, we’ll create two App Services Values and one App Services Secret to store the Atlas Admin API credentials.
Value 1: AtlasAdminAPIPublicKey
This Atlas App Services Value keeps the value of the public key of the Atlas Admin API. Values should be wrapped in double quotes, as shown in the following example.
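For instance, a hypothetical public key would be entered like this (the key itself is a placeholder):

```
"your-public-key-here"
```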
Secret 1: AtlasAdminAPIPrivateKey
This Atlas App Services Secret keeps the value of the private key of Atlas Admin API. You should not wrap the secret in quotation marks.
Value 2: AtlasAdminAPIPrivateKeyLinkToSecret
We can’t directly access secrets in our Atlas Functions. That’s why we have to create a new value and link it to the secret containing our private key.
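As a quick illustration, here is how both kinds of values are read inside a function; the linked value resolves to the content of the underlying secret:

```javascript
// Plain value: returns the string stored in the Value itself
const publicKey = context.values.get("AtlasAdminAPIPublicKey");

// Linked value: returns the content of the Secret it points to
const privateKey = context.values.get("AtlasAdminAPIPrivateKeyLinkToSecret");
```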
So far, we’ve defined the necessary App Services Values and Secrets to access the Atlas Admin API from an App Services app.
In order to access our S3 bucket, we need to utilize the AWS SDK. Therefore, we need a similar configuration for the AWS SDK keys.
In this section, we’ll create two App Services Values and one App Services Secret to store AWS Credentials. Learn how to get your AWS Credentials.
Value 3: AWSAccessKeyId
This Atlas App Services Value keeps the value of the access key ID of the AWS SDK.
Secret 2: AWSSecretAccessKey
This Atlas App Services Secret keeps the value of the secret access key of AWS SDK.
Value 4: AWSSecretAccessKeyLinkToSecret
This Atlas App Services Value keeps the link to the Atlas App Services Secret that holds the secret access key of the AWS SDK.
And after you have all these values and secrets as shown below, you can deploy the changes to make them permanent.
An external dependency is an external library that includes logic you'd rather not implement yourself, such as string parsing, convenience functions for array manipulations, and data structure or algorithm implementations. You can upload external dependencies from the npm repository to App Services and then import those libraries into your functions with a require('external-module') statement.
In order to work with AWS S3, we will add the official aws-sdk npm package.
In your App Services app, on the left-side menu, navigate to Functions. And then, navigate to the Dependencies pane in this page.
Click Add Dependency.
Provide aws-sdk as the package name and leave the package version empty. That will install the latest version of the aws-sdk node package.
Now, the aws-sdk package is ready to be used in our Atlas App Services App.
In this tutorial, we won’t go through all the steps to create a federated database instance in Atlas. Please check out our Atlas Data Federation resources for the complete steps to create an Atlas Federated Database Instance.
As an output of this step, we’d expect a ready Federated Database Instance as shown below.
I have already added the S3 bucket I own (named fuat-sungur-bucket) to this Federated Database Instance as a data source, and I created the collection auditlogscollection inside the database auditlogs in this Federated Database Instance.
Now, if I have files in this S3 bucket (fuat-sungur-bucket), I’ll be able to query them using the MongoDB aggregation framework or Atlas SQL.
Let’s create an Atlas function, give it the name RetrieveAndUploadAuditLogs, and choose System for authentication.
We also provide the following piece of code in the Function Editor and run the function. We’ll see that the credentials have been printed out in the console.
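The original snippet isn't reproduced here; a minimal sketch that reads the four values defined above and prints them could look like this:

```javascript
exports = async function() {
  // Atlas Admin API credentials (the private key resolves through the linked secret)
  const publicKey = context.values.get("AtlasAdminAPIPublicKey");
  const privateKey = context.values.get("AtlasAdminAPIPrivateKeyLinkToSecret");

  // AWS credentials (the secret access key resolves through the linked secret)
  const awsAccessKeyId = context.values.get("AWSAccessKeyId");
  const awsSecretAccessKey = context.values.get("AWSSecretAccessKeyLinkToSecret");

  console.log(publicKey, privateKey, awsAccessKeyId, awsSecretAccessKey);
};
```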
We now continue to enhance our existing Atlas function, RetrieveAndUploadAuditLogs, by adding the HTTP/S request that retrieves the audit logs.
The following piece of code generates an HTTP GET request, calls the relevant Atlas Admin API resource to retrieve the audit logs of the last 1440 minutes, and converts the compressed audit data into a JavaScript Buffer.
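A sketch of that request, assuming hypothetical groupId and host variables that hold your Atlas project ID and the hostname of the cluster node, and reusing the publicKey and privateKey variables from the previous snippet:

```javascript
// Time window in epoch seconds: the last 1440 minutes (24 hours)
const endDate = Math.round(Date.now() / 1000);
const startDate = endDate - 1440 * 60;

// The Atlas Admin API uses HTTP digest authentication
const response = await context.http.get({
  url: `https://cloud.mongodb.com/api/atlas/v1.0/groups/${groupId}/clusters/${host}/logs/mongodb-audit-log.gz?startDate=${startDate}&endDate=${endDate}`,
  headers: { "Accept": ["application/gzip"] },
  username: publicKey,
  password: privateKey,
  digestAuth: true
});

// response.body is binary gzip data; convert it into a Node.js Buffer
const auditData = Buffer.from(response.body.toBase64(), "base64");
```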
Up to this point, our Atlas function retrieves the audit logs for the given interval. Now, we’ll upload this data into the S3 bucket as a compressed file.
Firstly, we import the aws-sdk Node.js library and then configure the credentials for AWS S3. We have already retrieved the AWS credentials from App Services Values and Secrets and assigned them to function variables.
After that, we configure the S3-related parameters: the bucket name, the key (folder and file name), and the body (the actual payload, which is our compressed audit file stored in a JavaScript Buffer). Finally, we run our upload command (s3.putObject()).
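A sketch of that upload step, reusing the variables from the previous snippets; the region is an assumption you should replace with your bucket’s own:

```javascript
const AWS = require("aws-sdk");

// Configure the S3 client with the credentials read from Values/Secrets
const s3 = new AWS.S3({
  accessKeyId: awsAccessKeyId,
  secretAccessKey: awsSecretAccessKey,
  region: "us-east-1" // assumption: replace with your bucket's region
});

await s3.putObject({
  Bucket: "fuat-sungur-bucket",                      // the bucket added in Step 4
  Key: `auditlogs/mongodb-audit-log-${endDate}.gz`,  // folder and file name
  Body: auditData                                    // Buffer holding the compressed logs
}).promise();
```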
Here you can find the entire function code:
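The complete code from the original post isn't reproduced here; the following is a minimal end-to-end sketch that stitches the pieces above together. The groupId, host, region, and key prefix are hypothetical placeholders:

```javascript
exports = async function() {
  // Credentials from App Services Values/Secrets
  const publicKey = context.values.get("AtlasAdminAPIPublicKey");
  const privateKey = context.values.get("AtlasAdminAPIPrivateKeyLinkToSecret");
  const awsAccessKeyId = context.values.get("AWSAccessKeyId");
  const awsSecretAccessKey = context.values.get("AWSSecretAccessKeyLinkToSecret");

  // Placeholders: your Atlas project ID and the cluster node hostname
  const groupId = "<your-project-id>";
  const host = "<your-cluster-hostname>";

  // Audit logs of the last 1440 minutes, in epoch seconds
  const endDate = Math.round(Date.now() / 1000);
  const startDate = endDate - 1440 * 60;

  // Fetch the compressed audit logs from the Atlas Admin API (digest auth)
  const response = await context.http.get({
    url: `https://cloud.mongodb.com/api/atlas/v1.0/groups/${groupId}/clusters/${host}/logs/mongodb-audit-log.gz?startDate=${startDate}&endDate=${endDate}`,
    headers: { "Accept": ["application/gzip"] },
    username: publicKey,
    password: privateKey,
    digestAuth: true
  });
  const auditData = Buffer.from(response.body.toBase64(), "base64");

  // Upload the compressed logs into the S3 bucket
  const AWS = require("aws-sdk");
  const s3 = new AWS.S3({
    accessKeyId: awsAccessKeyId,
    secretAccessKey: awsSecretAccessKey,
    region: "us-east-1" // assumption: replace with your bucket's region
  });
  await s3.putObject({
    Bucket: "fuat-sungur-bucket",
    Key: `auditlogs/mongodb-audit-log-${endDate}.gz`,
    Body: auditData
  }).promise();

  return `uploaded auditlogs/mongodb-audit-log-${endDate}.gz`;
};
```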
After we run the Atlas function, we can check out the S3 bucket and verify that the compressed audit file has been uploaded.
First, we need to add the Federated Database Instance that we created in Step 4 as a data source to the Charts application that we created in the prerequisites section. This Federated Database Instance allows us to run queries with the MongoDB aggregation framework on the data in the cloud object storage (S3, in this case).
Before doing anything with Atlas Charts, let’s connect to the Federated Database Instance and query the audit logs to make sure we’ve established the data pipeline correctly.
Now, we can get a record from the auditlogscollection.
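For example, in mongosh connected to the Federated Database Instance:

```javascript
use auditlogs
db.auditlogscollection.findOne()
```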
Let’s check the audit log of an update operation.
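Assuming the audit filter captures authorization checks (authCheck events), one way to pull an update event is to filter on the command recorded in the param field:

```javascript
db.auditlogscollection.findOne({ "atype": "authCheck", "param.command": "update" })
```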
You can import this dashboard into your Charts application. You might need to reconfigure the data source name for the charts. In the given dashboard, the data source was the collection auditlogscollection in the database auditlogs in the Atlas Federated Database Instance named FederatedDatabaseInstance0, as shown below.
The following topics can be considered for more effective and efficient audit log analysis.
- You could retrieve logs from all the hosts rather than a single node.
- This way, you can track the data modifications even in the case of primary node failures.
- You might consider tracking only the relevant activities rather than tracking all the activities in the database. Tracking all the activities in the database might impact the performance.
- You can schedule your triggers.
- The Atlas function in the example runs once manually, but it can be scheduled via Atlas scheduled triggers.
- Then, the date intervals (start and end time) for the audit logs need to be calculated properly for each execution; see the sketch after this list.
- You could improve read efficiency.
- You might consider partitioning in the S3 bucket by most frequently filtered fields. For further information, please check the docs on optimizing query performance.
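As referenced in the scheduling item above, here is a minimal sketch of the interval calculation, assuming an hourly scheduled trigger so that consecutive runs neither overlap nor leave gaps:

```javascript
// With a trigger that fires every hour, each run covers exactly the
// previous 3600 seconds.
const endDate = Math.round(Date.now() / 1000);
const startDate = endDate - 3600;
// Pass startDate and endDate to the Atlas Admin API call shown earlier.
```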
MongoDB Atlas is not only a database-as-a-service platform for running your MongoDB workloads. It also provides a wide variety of components to build end-to-end solutions. In this blog post, we explored some of these capabilities, such as App Services, Atlas Charts, and Atlas Data Federation, and saw how to combine them to build a real-world scenario.
Questions on this tutorial? Thoughts, comments? Join the conversation over at the MongoDB Community Forums!