
Testing and Packaging a Python Library

Mark Smith • 8 min read • Published Jan 04, 2024 • Updated Aug 14, 2024

Testing & Packaging a Python Library

This tutorial will show you how to build some helpful pytest fixtures for testing code that interacts with a MongoDB database. On top of that, I'll show how to package a Python library using the popular hatchling library, and publish it to PyPI.
This is the second tutorial in a series! Feel free to check out the first tutorial if you like, but it's not necessary if you want to just read on.

Coding with Mark?

This tutorial is loosely based on the second episode of a new livestream I host, called "Coding with Mark." I'm streaming on Wednesdays at 2 p.m. GMT (that's 9 a.m. Eastern or 6 a.m. Pacific, if you're an early riser!). If that time doesn't work for you, you can always catch up by watching the recording!
Currently, I'm building an experimental data access layer library that should provide a toolkit for abstracting complex document models from the business logic layer of the application that's using them.
You can check out the code in the project's GitHub repository!

The problem with testing data

Testing is easier when the code you're testing is relatively standalone and can be tested in isolation. Sadly, code that works with data within MongoDB is at the other end of the spectrum — it's an integration test by definition because you're testing your integration with MongoDB.
You have two options when writing tests that work with MongoDB:
  • Mock out MongoDB, so instead of working with MongoDB, your code works with an object that just looks like MongoDB but doesn't really store data. mongomock is a good solution if you're following this technique.
  • Work directly with MongoDB, but ensure the database is in a known state before your tests run (by loading test data into an empty database) and then clean up any changes you make after your tests are run.
The first approach is architecturally simpler — your tests don't run against MongoDB, so you don't need to configure or run a real MongoDB server. On the other hand, you need to manage an object that pretends to be a MongoClient, or a Database, or a Collection, so that it responds in accurate ways to any calls made against it. And because it's not a real MongoDB connection, it's easy to use those objects in ways that don't accurately reflect a real MongoDB connection.
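For a sense of what the first approach looks like, here's a minimal sketch using mongomock (it's not part of this tutorial's code, and the database and collection names are just for illustration):

import mongomock

def test_insert_and_find_with_mock():
    client = mongomock.MongoClient()  # An in-memory stand-in for pymongo.MongoClient
    collection = client["docbridge"]["tests"]
    collection.insert_one({"_id": "doc1", "status": "mocked"})
    assert collection.find_one({"_id": "doc1"})["status"] == "mocked"

No server is involved, which is exactly the appeal and the limitation of this approach.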
My preferred approach is the latter: My tests will run against a real MongoDB instance, and I will have the test framework clean up my database after each run using transactions. This makes it harder to run the tests and they may run more slowly, but it should do a better job of highlighting real problems interacting with MongoDB itself.

Some alternative approaches

Before rushing in to write my own plugin for pytest, I decided to see what others had done before me. I am building my own ODM, after all — there's only so much room for Not Invented Here™ in my life. There are two reasonably popular pytest integrations for use with MongoDB: pytest-mongo and pytest-mongodb. Sadly, neither did quite what I wanted, but they both look good — if they do what you want, then I recommend using them.

pytest-mongo

Pytest-mongo is a pytest plugin that enables you to test code that relies on a running MongoDB database. It allows you to specify fixtures for the MongoDB process and client, and it will spin up a MongoDB process to run tests against, if you configure it to do so.

pytest-mongodb

Pytest-mongodb is a pytest plugin that enables you to test code that relies on a database connection to MongoDB and expects certain data to be present. It allows you to specify fixtures for database collections in JSON/BSON or YAML format. Under the hood, it uses mongomock to simulate a MongoDB connection, or you can use a real MongoDB connection, if you prefer.
Both of these offer useful features — especially the ability to provide fixture data that's specified in files on disk. Pytest-mongo even provides the ability to clean up the database after each test! When I looked a bit further, though, it does this by deleting all the collections in the test database, which is not the behavior I was looking for.
I want to use MongoDB transactions to automatically roll back any changes made by each test. This way, the test never actually commits any changes to MongoDB; only the changes the test itself made need to be rolled back, so the database is efficiently left in the correct state after each test run.

Pytest fixtures for MongoDB

I'm going to use pytest's fixtures feature to provide both a MongoDB connection object and a transaction session to each test that requires them. Behind the scenes, each fixture object will clean up after itself when it is finished.

How fixtures work

Fixtures in pytest are defined as functions, usually in a file called conftest.py. The thing that often surprises people new to fixtures, however, is that pytest will magically provide them to any test function with a parameter with the same name as the fixture. It's a form of dependency injection and is probably easier to show than to describe:
# conftest.py
@pytest.fixture
def sample_fixture():
    return "Hello, World"

# test_sample.py
def test_sample_fixture(sample_fixture):
    assert sample_fixture == "Hello, World"
As well as pytest providing fixture values to test functions, it will also do the same with other fixture functions. I'll be making use of this in the second fixture I write.
Fixtures are called once for their scope, and by default, a fixture's scope is "function" which means it'll be called once for each test function. I want my "session" fixture to be called (and rolled back) for each function, but it will be much more efficient for my "mongodb" client fixture to be called once per session — i.e., at the start of my whole test run.
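As a quick illustration (this is just a sketch, not part of the docbridge test suite), a session-scoped fixture is created once per test run, while a function-scoped fixture that depends on it is re-created for every test:

import pytest

@pytest.fixture(scope="session")
def expensive_resource():
    # Created once, at the start of the whole test run.
    return object()

@pytest.fixture  # default scope is "function"
def per_test_value(expensive_resource):
    # Created once per test function, reusing the session-scoped fixture above.
    return {"resource": expensive_resource}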
The final bit of pytest fixture theory I want to explain is that if you want something cleaned up after a scope is over — for example, when the test function is complete — the easiest way to accomplish this is to write a generator function using yield instead of return, like this:
@pytest.fixture
def sample_fixture():
    # Any code here will be executed *before* the test run
    yield "Hello, World"
    # Any code here will be executed *after* the test run
I don't know about you, but despite the magic, I really like this setup. It's nice and consistent, once you know how to use it.

A MongoClient fixture

The first fixture I need is one that returns a MongoClient instance that is connected to a MongoDB cluster.
Incidentally, MongoDB Atlas Serverless clusters are perfect for this as they don't cost anything when you're not using them. If you're only running your tests a few times a day, or even less, then this could be a good way to save on hosting costs for test infrastructure.
I want to provide configuration to the test runner via an environment variable, MDB_URI, which will be the connection string provided by Atlas. In the future, I may want to provide the connection string via a command-line flag, which is something you can do with pytest, but I'll leave that to later.
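For what it's worth, the usual way to add such a flag is pytest's pytest_addoption hook. Here's a hedged sketch of what that might look like later on (the --mdb-uri flag name and the mdb_uri fixture are hypothetical, not something the project defines yet):

# conftest.py
import os
import pytest

def pytest_addoption(parser):
    # Hypothetical flag; falls back to the MDB_URI environment variable below.
    parser.addoption(
        "--mdb-uri",
        action="store",
        default=None,
        help="MongoDB connection string to run the tests against",
    )

@pytest.fixture(scope="session")
def mdb_uri(request):
    return request.config.getoption("--mdb-uri") or os.environ["MDB_URI"]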
As I mentioned before, the scope of the fixture should be "session" so that the client is configured once at the start of the test run and then closed at the end. I'm actually going to leave clean-up to Python, so I won't do that explicitly myself.
Here's the fixture:
import pytest
import pymongo
import os

@pytest.fixture(scope="session")
def mongodb():
    client = pymongo.MongoClient(os.environ["MDB_URI"])
    assert client.admin.command("ping")["ok"] != 0.0  # Check that the connection is okay.
    return client
The above code means that I can write a test that reads from a MongoDB cluster:
# test_fixtures.py

def test_mongodb_fixture(mongodb):
    """ This test will pass if MDB_URI is set to a valid connection string. """
    assert mongodb.admin.command("ping")["ok"] > 0

Transactions in MongoDB

As I mentioned, the fixture above is fine for reading from an existing database, but any changes made to the data would be persisted after the tests were finished. In order to correctly clean up after the test run, I need to start a transaction before the test run and then abort the transaction after the test run so that any changes are rolled back. This is how Django's test runner works with relational databases!
In MongoDB, to create a transaction, you first need to start a session which is done with the start_session method on the MongoClient object. Once you have a session, you can call its start_transaction method to start a transaction and its abort_transaction method to roll back any database updates that were run between the two calls.
One warning here: You must provide the session object to all your queries or they won't be considered part of the session you've started. All of this together looks like this:
session = mongodb.start_session()
session.start_transaction()
my_collection.insert_one(
    {"this document": "will be erased"},
    session=session,
)
session.abort_transaction()
That's not too bad. Now, I'll show you how to wrap up that logic in a fixture.

Wrapping up a transaction in a fixture

The fixture takes the code above, replaces the middle with a yield statement, and wraps it in a fixture function:
@pytest.fixture
def rollback_session(mongodb):
    session = mongodb.start_session()
    session.start_transaction()
    try:
        yield session
    finally:
        session.abort_transaction()
This time, I haven't specified the scope of the fixture, so it defaults to "function" which means that the abort_transaction call will be made after each test function is executed.
Just to be sure that the test fixture both rolls back changes and allows subsequent queries to access data inserted during the transaction, I have a test in my test_docbridge.py file:
def test_update_mongodb(mongodb, rollback_session):
    mongodb.docbridge.tests.insert_one(
        {
            "_id": "bad_document",
            "description": "If this still exists, then transactions aren't working.",
        },
        session=rollback_session,
    )
    assert (
        mongodb.docbridge.tests.find_one(
            {"_id": "bad_document"}, session=rollback_session
        )
        is not None
    )
Note that the calls to insert_one and find_one both provide the rollback_session fixture value as a session argument. If you forget it, unexpected things will happen!
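To make the failure mode concrete, here's a hypothetical (bad) example: because the write below isn't attached to the session, it isn't part of the transaction and won't be rolled back when the fixture aborts it.

def test_update_without_session(mongodb, rollback_session):
    # No session= argument, so this write commits immediately...
    mongodb.docbridge.tests.insert_one({"_id": "leaky_document"})
    # ...and the document will still be in the collection after the test finishes.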

Packaging a Python library

Packaging a Python library has always been slightly daunting, and it's made more so by the fact that these days, the packaging ecosystem changes quite a bit. At the time of writing, a good back end for building Python packages is hatchling from the Hatch project.
In broad terms, for a simple Python package, the steps to publishing your package are these:
  • Describe your package.
  • Build your package.
  • Push the package to PyPI.
Before you go through these steps, it's worth installing the following packages into your development environment:
  • build - used for installing your build dependencies and packaging your project
  • twine - used for securely pushing your packages to PyPI
You can install both of these with:
python -m pip install --upgrade build twine

Describing the package

First, you need to describe your project. Once upon a time, this would have required a setup.py file. These days, pyproject.toml is the way to go. I'm just going to link to the pyproject.toml file in GitHub. You'll see that the file describes the project. It lists pymongo as a dependency. It also states that "hatchling.build" is the build back end in a couple of lines toward the top of the file.
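For reference, a minimal pyproject.toml for a package like this looks roughly like the sketch below. The build-system table and the pymongo dependency match what's described above; the description and Python version are just placeholders, so check the linked file for the real values.

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "docbridge"
version = "0.0.1"
description = "An experimental data access layer for MongoDB"  # placeholder
requires-python = ">=3.8"                                      # placeholder
dependencies = [
    "pymongo",
]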
It's not super interesting, but it does allow you to do the next step...

Building the package

Once you've described your project, you can build a distribution from it by running the following command:
$ python -m build
* Creating venv isolated environment...
* Installing packages in isolated environment... (hatchling)
* Getting build dependencies for sdist...
* Building sdist...
* Building wheel from sdist
* Creating venv isolated environment...
* Installing packages in isolated environment... (hatchling)
* Getting build dependencies for wheel...
* Building wheel...
Successfully built docbridge-0.0.1.tar.gz and docbridge-0.0.1-py3-none-any.whl

Publishing to PyPI

Once the wheel and the gzipped tarball have been created, they can be published to PyPI (assuming the library name is still unique!) by running Twine:
$ python -m twine upload dist/*
Uploading distributions to https://upload.pypi.org/legacy/
Enter your username: bedmondmark
Enter your password:
Uploading docbridge-0.0.1-py3-none-any.whl
100% ━━━━━━━━━━━━━━━━━━━━ 6.6/6.6 kB • 00:00 • ?
Uploading docbridge-0.0.1.tar.gz
100% ━━━━━━━━━━━━━━━━━━━━ 8.5/8.5 kB • 00:00 • ?
View at:
https://pypi.org/project/docbridge/0.0.1/
And that's it! I don't know about you, but I always go and check that it really worked.
I'm on PyPI!
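One optional extra check is to install the freshly published package from PyPI (ideally into a fresh virtual environment) and make sure it resolves, for example:

python -m pip install --upgrade docbridge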

Onwards and upwards

This week's work has been super satisfying. I know that this up-front work, making the tests run in transactions and publishing the library while it's still relatively simple, will pay for itself in time.
In my next tutorial, I'm going to start looking at data embedded in documents. I'm going to extend the docbridge framework so that it can deal with embedded arrays, as a step toward what I really want to get done next. I'm going to teach docbridge that arrays aren't always entirely contained within a document — sometimes they're subsets, and sometimes they're extended references!
I'm really excited about some of the abstraction building blocks I have planned, so make sure to read my next tutorial, or if you prefer, join me on the livestream at 2 p.m. GMT on Wednesdays!