Database Integration Testing – The Secret Recipe
Let’s talk about testing… Not about the importance of testing in general, but rather about automated tests against NoSQL databases: the simplicity of writing and orchestrating them in build pipelines, and the power they unveil – high confidence in reliable code after making changes and before writing new code (TDD).
Table of contents
Why test against an actual database?
Some choose the easy path of avoiding (or delaying) integration tests by mocking the database interaction.
I don’t want to diminish the value of unit tests for code in isolation, but often they aren’t convincing enough evidence that the overall system works. In that case, projects seek to confirm specific behaviour in integration tests, and a mature persistence layer is the first candidate for test coverage.
The usual catch-22 for testing code that communicates with a database is that
- Unit tests for a simple project can be quickly done by isolating the persistence layer (e.g. Repository pattern). But this wouldn’t leave much of the code to test, so the value of those tests might become negligible.
- Step up in project size, and the persistence layer becomes more mature (via an ORM or the abilities of the DB) – with rich entities, aggregates and other abstractions. The gains of separating it out (into a Repository, for example) diminish, and the need for integration tests grows (see Oren Eini’s thoughts on the Repository pattern).
How is testing usually done?
Usually, testing against a database is not trivial. Even before writing the first test, one has to:
- Spin up a DB server
- Create and populate a test database
- Clean up seeded records after each test
- Tear down the database afterwards, etc.
And it all needs to be maintained through potential changes to the database schema, DB server version, etc… Yes, it is a pain (welp, and it’s an even bigger pain with SQL Server). And only then do we start writing tests. How many takers did we get by this stage? 😊
The unique approach of RavenDB
One of the key distinctive features of RavenDB is the availability of an Embedded Server that fully supports the Client API.
It’s really a hidden gem that is not stressed enough in the docs or other articles. Just 80MB for an unpacked embedded server that comes as a NuGet package, works on all .NET platforms and supports all the indexes, queries, commands, etc. of its “big brother” – RavenDB Server. It’s like running SQLite with the full power of MS SQL Server in the RDBMS world.
That Embedded Server paved the way for RavenDB.TestDriver and easy integration testing. The TestDriver package is tiny and built around the RavenTestDriver class – a recommended base class for the tests. It’s framework-agnostic, so it works with both xUnit and NUnit.
What does it do? For each test it:
- Takes care of creating an embedded RavenDB server,
- Makes populating a new DB a trivial task, and
- Cleans up the database on disposal.
That takes away a lot of pain for developers.
Let’s see the tests!
One of the practical examples from the YABT project would be a test against the map-reduce index named BacklogItems_Tags that I used for a TDD approach to build the functionality. The purpose of the tested index is to provide a list of used tags for all tickets with the number of tickets per tag. Here is a xUnit test for that:
public class BacklogItemsTests : RavenTestDriver
{
    private readonly IAsyncDocumentSession _session;

    public BacklogItemsTests()
    {
        // Set up the database (it'll be disposed along with the class instance)
        var store = GetDocumentStore();
        // Create all the indexes registered in the assembly
        IndexCreation.CreateIndexes(typeof(BacklogItemTask).Assembly, store, null, store.Database);
        // Open an async DB session
        _session = store.OpenAsyncSession();
        // Wait for index updates on each SaveChanges call
        _session.Advanced.WaitForIndexesAfterSaveChanges();
    }

    [Fact]
    public async Task BacklogItems_Queried_By_Tag_Test()
    {
        // GIVEN 2 backlog items, where one has tag "t1"
        await _session.StoreAsync(new BacklogItem { Title = "Test1", Tags = new[] { "t1" } });
        await _session.StoreAsync(new BacklogItem { Title = "Test2" });
        await _session.SaveChangesAsync();

        // WHEN querying backlog items filtered by tag
        var items = await _session.Query<BacklogItemTagsIndexed, BacklogItems_Tags>()
                                  .Where(t => t.Name == "t1")
                                  .ToArrayAsync();

        // THEN the only backlog item with the tag gets returned
        Assert.Single(items);
        Assert.Equal(1, items.Single().Count);
    }

    public override void Dispose()
    {
        // The session would be disposed without a direct call, but we keep up the good practice
        _session.Dispose();
        base.Dispose();
    }
}
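For reference, the tested map-reduce index could be defined along these lines. This is only a sketch of what BacklogItems_Tags might look like (the actual YABT definition and the shape of BacklogItemTagsIndexed may differ):

```csharp
// A sketch of the map-reduce index and its result type;
// the actual YABT implementation may differ.
public class BacklogItemTagsIndexed
{
    public string Name { get; set; }
    public int Count { get; set; }
}

public class BacklogItems_Tags : AbstractIndexCreationTask<BacklogItem, BacklogItemTagsIndexed>
{
    public BacklogItems_Tags()
    {
        // Map: emit one entry per tag of each backlog item
        Map = items => from item in items
                       from tag in item.Tags
                       select new { Name = tag, Count = 1 };

        // Reduce: group entries by tag name and sum up the counts
        Reduce = results => from r in results
                            group r by r.Name into g
                            select new { Name = g.Key, Count = g.Sum(x => x.Count) };
    }
}
```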
An actual test project would be less verbose, as most of the boilerplate goes into helper methods.
A sneak peek at the DB when debugging
When running against an embedded database, sometimes it feels like working with a black box that doesn’t provide much transparency. Luckily for devs, there is an excellent feature allowing you to check out the state of the embedded database in the Studio when puzzled while debugging your tests.
To pause the tests and open the Studio in the browser, just drop this command anywhere in the code (docs):
WaitForUserToContinueTheTest(_session.Advanced.DocumentStore);
It does help to learn about Raven’s behaviour in complex scenarios and has saved me tons of debugging time.
Subtleties
Several things might need your attention when setting up a test project:
Single session instance
The example above uses a single session instance (_session) for both seeding and querying data. Just keep in mind that sessions are meant to be short-lived (that’s the case for any ORM), and a long-lived session in tests may lead to two issues:
- Unexpected cached data. The session caches data that sometimes may interfere with test results. If caching is a concern, then either use isolated sessions for seeding and querying or call _session.Advanced.Clear() after saving changes.
- Hitting the ceiling on the number of requests per session. By default, the maximum number of requests per session is limited to 30. Expensive remote calls are not a concern in tests, so feel free to increase the limit by setting MaxNumberOfRequestsPerSession.
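Raising the limit is a one-liner on the session’s advanced options (a minimal sketch; the value of 1000 is an arbitrary choice):

```csharp
// Allow more requests in a single long-lived test session
// (the default limit is 30); 1000 is an arbitrary test-friendly value
_session.Advanced.MaxNumberOfRequestsPerSession = 1000;
```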
And sure, creating a new session instance for each code block in test methods would also prevent the above issues:
using (var session = _store.OpenAsyncSession())
{
    await session.StoreAsync(...);
    await session.SaveChangesAsync();
}
Stale indexes
Waiting for indexes to get updated becomes important in fast-running tests, as RavenDB indexes are eventually consistent and a query may hit a stale index right after writing data. Here WaitForIndexesAfterSaveChanges() on opening the session or WaitForNonStaleResults() on querying come to the rescue.
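The two options can be sketched like this (reusing the index and types from the example above):

```csharp
// Option 1: make every SaveChanges call wait for the affected indexes
// to catch up before returning
_session.Advanced.WaitForIndexesAfterSaveChanges();

// Option 2: wait for non-stale results on a particular query only
var items = await _session.Query<BacklogItemTagsIndexed, BacklogItems_Tags>()
                          .Customize(x => x.WaitForNonStaleResults())
                          .Where(t => t.Name == "t1")
                          .ToArrayAsync();
```

Option 1 suits test classes where every write should be immediately visible; option 2 keeps the waiting local to the queries that need it.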
Explicit disposing
RavenDB.TestDriver does a pretty good job in disposing the DocumentSession and DocumentStore objects at the end of the test class lifecycle (along with temporary files created by the embedded server).
Keep in mind that xUnit, by default, creates a new instance of the test class for each test (if no fixtures are used), while in NUnit a single fixture instance is reused for all tests in the class (unless FixtureLifeCycle says otherwise). Hence, explicitly disposing of the session and the store is required for the default NUnit setup and can be omitted for xUnit.
For BDD tests, you’re likely to test some logic that relies on Dependency Injection, where your project has an instance of IAsyncDocumentSession (and maybe IDocumentStore) registered within a DI container that controls the lifecycle. The same goes for the YABT domain tests. The usual test implementation would require a new class instance per test, with the test constructors resolving the session instance without explicitly creating or disposing of it.
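For such DI-driven tests, the wiring could look along these lines. This is a sketch using Microsoft.Extensions.DependencyInjection; GetDocumentStore() comes from RavenTestDriver, and the lifetimes shown are one reasonable choice, not the only one:

```csharp
// A sketch of registering the test document store and session
// in a DI container, so the container controls their lifecycle.
var services = new ServiceCollection();

// One embedded store shared across the tests of the class/fixture
services.AddSingleton<IDocumentStore>(_ => GetDocumentStore());

// A fresh session per scope (i.e. per test), disposed with the scope
services.AddScoped<IAsyncDocumentSession>(sp =>
    sp.GetRequiredService<IDocumentStore>().OpenAsyncSession());
```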
YABT examples
If you need more examples of RavenDB tests, then the YABT project has plenty at different levels:
| Project | Description |
| --- | --- |
| Database.Tests | Tests for the indexes and entities (to a lesser extent) that are used in TDD and tweaking indexes. |
| Database.Migration.Tests | Tests for the migration logic to reduce the risk of messing up the production database on deploying critical DB updates. |
| Domain.Tests | Tests for the domain services. It’s the main test project, full of integration tests to ensure the expected behaviour around querying/filtering data and persisting changes. |
These examples show that RavenDB has made writing and debugging DB tests effortless and straightforward.
How do other NoSQL servers compare?
To be fair in claiming the superiority of RavenDB testing, let’s check out the approaches for other database engines (something more powerful than SQLite).
MongoDB – Docker or third-party NuGet
For MongoDB, the official packages, like the C# Driver, run tests against a live instance (e.g. one running in Docker) and pass the connection string via environment variables.
Meanwhile, the community has tried to ease the suffering and came up with the Mongo2Go NuGet package, which instantiates MongoDB Community Server from included binaries (only v4.4.4, for Windows, Linux and macOS) that blow out the package size. Alternatively, it can download the latest version from the Internet when kicking off the tests. All of this comes bundled with concerns about version availability and compatibility, Internet connectivity, etc.
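For comparison, a Mongo2Go-based setup is also only a few lines (a sketch following the package’s documented API; the database and collection names are arbitrary):

```csharp
// Spins up a local MongoDB instance from the bundled binaries;
// disposing the runner shuts it down and cleans up temp files
using var runner = MongoDbRunner.Start();

// Connect the official driver to the throwaway instance
var client = new MongoClient(runner.ConnectionString);
var database = client.GetDatabase("IntegrationTests");
var collection = database.GetCollection<BsonDocument>("items");
```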
Cosmos DB – emulator
Cosmos DB doesn’t provide any wiggle room to get off the Microsoft hook. A Docker image or testing NuGet package is unheard-of in that realm. However, it provides an Emulator for local use that runs on Windows only (with an antique workaround to run on Linux/macOS). Its limitations are less of an obstacle, though.
If you think of integrating it with your CI/CD pipeline, you are on your own unless using Azure DevOps, which provides a pipeline task.
DynamoDB – Docker or Java app
Amazon is more progressive here and provides DynamoDB Local, which can be installed locally (it requires a JRE) or run in Docker. It even supports an in-memory mode that speeds up testing. Not bad, but it still misses the mark.
As you can see, RavenDB provides the combination of an embedded server and a powerful SDK that prioritizes testing and gives impressive results.
That’s it for today. Now go and write some tests!