MongoDB auditing feature

Hi,
I am aware of MongoDB's auditing feature, which is only available in the Enterprise edition.
I am using the Community edition. Is there any third-party tool that can be used for auditing, or to watch all the ongoing operations in the database, like we would usually get from the auditing feature?

Please advise.

Thanks

I think you can try Percona or Studio 3T

Is there a functional and technical difference between MongoDB and Percona MongoDB?

I want to expand on this question…

OK, so auditing is only available in Enterprise…
If you have Enterprise, can you audit any/all changes on documents (based on conditions) and then output the audit stream to another MongoDB platform, i.e. Atlas?
The installed/to-be-monitored system is MongoDB on Ubuntu on z/VM (IBM Z).

G

OK, so auditing is only available in Enterprise…
If you have Enterprise, can you audit any/all changes on documents (based on conditions) and then output the audit stream to another MongoDB platform, i.e. Atlas?

You can connect Kafka to a MongoDB change stream and capture changes into another platform. You can also capture the audit logs, download them via an API call, and then ingest those audit logs somewhere.
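
Roughly, the change-stream side could look like this (just a sketch, not tested; the URIs and namespaces are placeholders, and the source has to be a replica set because change streams need the oplog):

```python
from pymongo import MongoClient

# Placeholder connection strings: the source must be a replica set,
# since change streams require an oplog.
source = MongoClient("mongodb://source-host:27017/?replicaSet=rs0")
target = MongoClient("mongodb+srv://user:pass@cluster0.example.mongodb.net")

audit_sink = target["audit"]["change_events"]

# Watch one collection; watch() also exists on the database and client
# if you want to capture changes more broadly.
with source["appdb"]["orders"].watch(full_document="updateLookup") as stream:
    for change in stream:
        # Each change event records the operation type, namespace,
        # document key and (here) the full updated document.
        audit_sink.insert_one(change)
```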

If you’re using Atlas, the audit logging is not so great: you have to review the audit logs and then you also have to review the activity logs. For example, an admin with cloud-level permissions adding users does not show up in the audit log but in the Atlas activity logs, so then you have to join them.
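
For pulling the audit log of a host by API, something roughly like this (the project ID, hostname and API keys are placeholders, and the endpoint/log name should be double-checked against the current Atlas docs):

```python
import requests
from requests.auth import HTTPDigestAuth

# Placeholders: Atlas project ID, host name and programmatic API keys.
GROUP_ID = "<projectId>"
HOSTNAME = "cluster0-shard-00-00.example.mongodb.net"
url = (f"https://cloud.mongodb.com/api/atlas/v1.0/groups/{GROUP_ID}"
       f"/clusters/{HOSTNAME}/logs/mongodb-audit-log.gz")

resp = requests.get(
    url,
    auth=HTTPDigestAuth("<publicKey>", "<privateKey>"),
    params={"startDate": 1700000000, "endDate": 1700086400},  # epoch seconds
    headers={"Accept": "application/gzip"},
    stream=True,
)
resp.raise_for_status()

# Save the compressed audit log so it can be ingested elsewhere.
with open("mongodb-audit-log.gz", "wb") as f:
    for chunk in resp.iter_content(chunk_size=65536):
        f.write(chunk)
```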

The installed/to-be-monitored system is MongoDB on Ubuntu on z/VM (IBM Z).

You can move them at this point and monitor them on a system of your choice, like the one you mentioned?

… so the primary database is MongoDB Enterprise on Ubuntu on z/VM on IBM Z (LinuxONE)… this is an on-premises deployment in the customer's DC.

Is there a way to tell this Enterprise deployment to push/insert/move the audit stream to a remote MongoDB Atlas deployment… i.e. native MongoDB functionality, not using Kafka or CDC…

Yes, I realise I can log locally to a collection, then use a Kafka source connector to source the data and a Kafka sink connector at Atlas.

Looking for more native functionality with a remote output/push here.

G

Anyone?

The customer (a bank) sort of needs a well-designed audit solution, and at the moment auditing to the local server is not exactly ideal.

G

So, scouring the docs a bit, it seems Enterprise can only output to syslog, JSON text, or BSON on the system itself…

Anyone willing to share how they created a pipeline to push this onto, say, a secondary/remote system for storage/analytics?

Originally I was thinking of something like a Filebeat reader, but that can't insert directly into MongoDB; it needs either Logstash or Kafka, and I'm trying to keep this light.

Some file watcher of sorts… that can read lines as they're appended… which in turn pushes them to a Kafka or MongoDB Atlas store.
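
Roughly what I have in mind, as a sketch only (the audit file path and Atlas URI are placeholders, and it ignores log rotation entirely):

```python
import json
import time
from pymongo import MongoClient

# Placeholders: audit file path (JSON destination) and the remote Atlas URI.
AUDIT_FILE = "/var/log/mongodb/auditLog.json"
sink = MongoClient("mongodb+srv://user:pass@cluster0.example.mongodb.net")["audit"]["events"]

with open(AUDIT_FILE, "r") as f:
    f.seek(0, 2)              # start at the end of the file, like `tail -f`
    while True:
        line = f.readline()
        if not line:
            time.sleep(0.5)   # wait for new audit entries to be appended
            continue
        sink.insert_one(json.loads(line))
```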

Otherwise, make the file size for the switch small enough to impart a 10-minute switch cycle, push that file to S3, and have a Python/Lambda trigger that then uploads the file.
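
For the Lambda route, roughly something like this sketch (the Atlas URI and collection are placeholders, and pymongo would need to be packaged with the function):

```python
import json
import boto3
from pymongo import MongoClient

s3 = boto3.client("s3")
# Placeholder Atlas URI and namespace.
sink = MongoClient("mongodb+srv://user:pass@cluster0.example.mongodb.net")["audit"]["events"]

def handler(event, context):
    # Triggered by an S3 PUT of a rotated audit file.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        # One JSON audit record per line.
        docs = [json.loads(line) for line in body.splitlines() if line.strip()]
        if docs:
            sink.insert_many(docs)
```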

Another option… same 10-minute switch cycle, move the file to a staging directory and use the mongoimport utility to upload the JSON docs in the file to a collection.
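
And the staging-directory version as a rough sketch (paths, URI and collection are placeholders; it assumes one JSON document per line, which is what mongoimport expects by default):

```python
import pathlib
import shutil
import subprocess

# Placeholder directories and namespace.
STAGING = pathlib.Path("/var/lib/audit-staging")
DONE = pathlib.Path("/var/lib/audit-done")

for audit_file in sorted(STAGING.glob("*.json")):
    # mongoimport defaults to --type json, one document per line.
    subprocess.run(
        [
            "mongoimport",
            "--uri", "mongodb://audit-server:27017/audit",
            "--collection", "events",
            "--file", str(audit_file),
        ],
        check=True,
    )
    # Move the loaded file out of staging so it is only imported once.
    shutil.move(str(audit_file), str(DONE / audit_file.name))
```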

Just some ideas; I'd like to hear how others are managing auditing of their systems…

G

So, scouring the docs a bit, it seems Enterprise can only output to syslog, JSON text, or BSON on the system itself…

Did you ever find a solution?

nope…

Although there are potentially some ideas…

  1. On file switch, copy/push the file to AWS S3, then use a Lambda process to push directly into MongoDB Atlas, or push onto Kafka and then use a sink connector (see the sketch below).
  2. Use Filebeat to push to Kafka and then a sink connector into MongoDB Atlas.

But neither of those really follows KISS. There are a lot of benefits in the above pipelines, as you can put a lot of value-added processes onto the pipeline… but if all you want is to keep it simple…

Another option might be the MongoDB import utility (mongoimport): pulling into a local MongoDB, then using a stream process to remotely push to MongoDB Atlas or a dedicated audit server/collection.
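
For the Kafka leg in options 1 and 2, registering the MongoDB sink connector via the Kafka Connect REST API would look roughly like this (the Connect host, topic, Atlas URI and namespace are placeholders, and it assumes the MongoDB Kafka connector plugin is installed on the workers):

```python
import requests

# Placeholders: Kafka Connect host, topic name, Atlas URI and namespace.
connector = {
    "name": "audit-atlas-sink",
    "config": {
        "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
        "topics": "mongodb.audit",
        "connection.uri": "mongodb+srv://user:pass@cluster0.example.mongodb.net",
        "database": "audit",
        "collection": "events",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter.schemas.enable": "false",
    },
}

# Register the sink with the Kafka Connect REST API.
resp = requests.post("http://connect-host:8083/connectors", json=connector)
resp.raise_for_status()
print(resp.json())
```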

For me, first prize would be if the audit config allowed the administrator to define a remote server/collection as an option.

G

For me, first prize would be if the audit config allowed the administrator to define a remote server/collection as an option.

Did you already explore outputting auditing as syslog? Maybe a tool like this could push syslogs to an Atlas cluster? (Just throwing out ideas.)
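
Very roughly, the idea would be something that listens for syslog messages and writes them straight into Atlas; a minimal sketch (port, Atlas URI and namespace are placeholders, and it doesn't parse the syslog format at all):

```python
import socketserver
from datetime import datetime, timezone
from pymongo import MongoClient

# Placeholder Atlas URI and namespace.
sink = MongoClient("mongodb+srv://user:pass@cluster0.example.mongodb.net")["audit"]["syslog"]

class SyslogHandler(socketserver.BaseRequestHandler):
    def handle(self):
        # For UDP servers, self.request is (data, socket).
        data = self.request[0].decode("utf-8", errors="replace").strip()
        sink.insert_one({
            "received_at": datetime.now(timezone.utc),
            "source": self.client_address[0],
            "message": data,   # raw syslog line, no parsing done here
        })

if __name__ == "__main__":
    # Listen on UDP 5140, an unprivileged alternative to the standard 514.
    with socketserver.UDPServer(("0.0.0.0", 5140), SyslogHandler) as server:
        server.serve_forever()
```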

looking… (thanks)

But any time an audit stream is output to a text file and then read again, it introduces the possibility of it being edited while in the text file…

With audit logs you want to make sure you eliminate as many potential points of interference as possible…

G


With audit logs you want to make sure you eliminate as many potential points of interference as possible…

That’s a good point: less chance of tampering.