
Many years ago, when I was entering the world of enterprise software development, my company was awarded a contract to maintain an e-government product, and I was one of the first programmers to look after it. I spent hours making the product buildable in our development environment and was literally pulling my hair out trying to get the integration tests to pass. I failed. It turned out that the tests depended on an external test database that nobody had maintained for months. The last production release of the product dated back roughly as far. I can see two possible reasons for this: either the product was truly great and bug free, or there were simply no end users using it. Since the customer wanted to extend the existing application, and the development team that created it had fallen apart, the application had to be moved to a new development environment. The question arose: should we restore the database just to make the tests pass, or simply silence them? One of my managers said: “that’s the pain of maintaining integration tests that depend on an external database”. To be honest, I avoided thinking about integration tests for years after that. I was disappointed by the time I had wasted fighting an opponent I couldn’t beat.
One possible solution to this problem is to use an alternative, embeddable database management system that runs in-memory and requires no maintenance. It is not a bad substitute, and databases of this kind have been used in many projects I have dealt with. They are probably suitable for more than just testing, but for integration tests they have one disadvantage I kept running into: they never fully reflect the database system used in production, as there are always some differences. They imitate other database engines through various compatibility modes (see the sketch below), but extra work was always needed to carry the solution over to the production system, because every database engine behaves a little differently.
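As an illustration of that compatibility-mode approach, here is a minimal sketch (not from the original story) that opens an in-memory H2 database pretending to be PostgreSQL. The URL and credentials are just placeholders; the point is that even with MODE=PostgreSQL the engine underneath is still H2, so its behaviour can diverge from the real production database.

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class H2CompatibilityModeExample {

    public static void main(String[] args) throws Exception {
        // MODE=PostgreSQL switches on H2's PostgreSQL compatibility mode;
        // DB_CLOSE_DELAY=-1 keeps the in-memory database alive between connections.
        String url = "jdbc:h2:mem:testdb;MODE=PostgreSQL;DB_CLOSE_DELAY=-1";

        try (Connection connection = DriverManager.getConnection(url, "sa", "")) {
            // Prints "H2" -- a reminder that this is only an imitation of PostgreSQL.
            System.out.println(connection.getMetaData().getDatabaseProductName());
        }
    }
}
```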
The ideal solution would be the ability to run a piece of software in every environment, just for testing purposes, in isolation, with no side effects, no maintenance, no updates, and so on. Now, in the era of software containerization, applications, binaries and all their dependencies can be packaged up and run independently on any infrastructure. Isn’t that exactly what we are looking for?

Testcontainers is the answer to these problems. According to testcontainers.org: “Testcontainers […] support […] tests, providing lightweight, throwaway instances of common databases, Selenium web browsers, or anything else that can run in a Docker container.” Testcontainers was originally created by Richard North as a Java library supporting JUnit tests, but there are now many other incarnations for Scala, Node.js, Python, .NET and more (visit https://github.com/testcontainers to find out more). Looking for a way to write data access layer integration tests? Use a containerized instance of MySQL, PostgreSQL or Elasticsearch, or pick whichever of the 30+ existing modules (https://github.com/testcontainers/testcontainers-java/tree/master/modules) fits your needs, with no complex configuration on developer machines. If you build services with Spring Boot, you will probably want a set of Spring Boot auto-configurations, and the Testcontainers library does not provide them out of the box. Thanks to the generosity of the folks at Playtika, we also have an extended set of modules written for Spring Boot developers: https://github.com/Playtika/testcontainers-spring-boot, with over 20 ready-to-use modules built on top of the Testcontainers library.
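To give a feel for how little setup is involved, here is a minimal sketch of a JUnit 5 test backed by a throwaway PostgreSQL container. It assumes the org.testcontainers:postgresql and org.testcontainers:junit-jupiter test dependencies and a local Docker daemon; the class name and image tag are my own choices for illustration.

```java
import org.junit.jupiter.api.Test;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import static org.junit.jupiter.api.Assertions.assertEquals;

// @Testcontainers starts and stops the annotated containers around the tests.
@Testcontainers
class PostgresDataAccessIT {

    // A real PostgreSQL instance, pulled and started by Testcontainers and
    // thrown away when the test run ends -- no shared test database to maintain.
    @Container
    private static final PostgreSQLContainer<?> POSTGRES =
            new PostgreSQLContainer<>("postgres:13-alpine");

    @Test
    void talksToRealPostgres() throws Exception {
        try (Connection connection = DriverManager.getConnection(
                POSTGRES.getJdbcUrl(), POSTGRES.getUsername(), POSTGRES.getPassword());
             Statement statement = connection.createStatement();
             ResultSet resultSet = statement.executeQuery("SELECT 1")) {
            resultSet.next();
            assertEquals(1, resultSet.getInt(1));
        }
    }
}
```

The container’s lifecycle is tied to the test class, so every build gets a clean, real PostgreSQL instead of a long-lived external database that somebody has to keep alive.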
If your favourite language or database is not there – start your own fork and change the world for the better 😉 Happy coding 🙂
Przemysław Fusik