MongoDB Auditing Documents using Change Streams

Written 5 months and 1 week ago

If you are using MongoDB Enterprise, then consider the native audit feature available to you.

MongoDB Enterprise includes an auditing capability for mongod and mongos instances. The auditing facility allows administrators and users to track system activity for deployments with multiple users and applications.
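For reference, on an Enterprise deployment the audit log is enabled through the auditLog configuration options; a minimal mongod.conf fragment might look like this (the file path is only an example):

```yaml
# Enterprise-only: write audit events as JSON to a file.
auditLog:
  destination: file
  format: JSON
  path: /var/log/mongodb/audit.json
```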

MongoDB Atlas runs Enterprise instances, so you can enable auditing from the Security tab.

If you do not use MongoDB Enterprise, then you can implement auditing using change streams.

// When updating a document, store the whole document, not just the changes.
// This makes lookup and restoration easy, but increases storage usage.
const watchOptions = {
    fullDocument: 'updateLookup',
};

// Alternatively, when updating a document, store only the differences,
// not the whole document.
// This improves storage usage, but makes lookup and restoration harder.
const watchOptions = {};

// Watch all collections for changes
const changeStream = db.watch(watchOptions);

// Watch a single collection for changes
const changeStream = db.collection('...').watch(watchOptions);

// See the MongoDB change events documentation for a full list
// of operation types.
// We are only watching document related changes.
const auditOperationTypes = new Set(['insert', 'replace', 'update', 'delete']);
function handleChange(event) {
    if (event.ns.coll !== 'audit' && auditOperationTypes.has(event.operationType)) {
        db.collection('audit').insertOne({
            model: event.ns.coll,
            operation: event.operationType,
            // If it's a delete, then fullDocument is undefined; use the
            // documentKey instead, which is { _id: ... }.
            document: event.fullDocument || event.documentKey,
            // If it's an update and we are not storing the full document,
            // then store the updateDescription.
            // Otherwise, default to undefined.
            updateDescription: (!event.fullDocument && event.updateDescription) || undefined,
        });
    }
}

changeStream.on('change', handleChange);

WARNING: Enabling auditing of authorization successes (auditAuthorizationSuccess) can severely impact cluster performance. Enable this option with caution.

With our solution above, we can expect similar degradation, because we create an audit document every time we insert, update, or delete a document.

If real-time auditing is not a requirement, then we can buffer our logs and insert them using insertMany in bulk. We can achieve this using the DataBuffer data structure.

const dataBuffer = new DataBuffer({ size: 100 });
// Wrap insertMany in an arrow function so it keeps its `this` binding.
dataBuffer.on('flush', (documents) => db.collection('audit').insertMany(documents));

Next, in our handleChange function, replace the call to db.collection('audit').insertOne with dataBuffer.insert.
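Putting it together, the buffered handler then looks like this (it assumes the dataBuffer and auditOperationTypes defined earlier are in scope):

```javascript
// handleChange now queues audit entries into the dataBuffer instead
// of writing each one immediately; the 'flush' listener performs the
// bulk insertMany.
function handleChange(event) {
    if (event.ns.coll !== 'audit' && auditOperationTypes.has(event.operationType)) {
        dataBuffer.insert({
            model: event.ns.coll,
            operation: event.operationType,
            // For deletes, fullDocument is undefined; fall back to documentKey.
            document: event.fullDocument || event.documentKey,
            // Only store the updateDescription when the full document is absent.
            updateDescription: (!event.fullDocument && event.updateDescription) || undefined,
        });
    }
}
```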
