Gone are the days when you could plead ignorance and dodge responsibility. In the era of information overload, ignorance is like trying to hide a pebble in a sea of sand. In fact, it's as risky as letting a baby drive a car.
As software developers, we hold the power to build amazing products that can change lives, but with great power comes great responsibility. And there's only one way to live up to that responsibility: let Wednesday take the reins. Just kidding (or are we?).
Jokes aside, the secret to consistently delivering high-quality software is to write tests. I know, I know, testing can be tedious, but it's like brushing your teeth - you may not feel the difference immediately, but in the long run, you'll thank yourself for doing it. Plus, let's face it, nobody wants to deal with a codebase that's more fragile than a porcelain vase in an earthquake.
So embrace testing; write integration tests, unit tests, and all the tests in between. It's the only way to ensure your code is as reliable as a Swiss knife, or a Wednesday product (had to throw that in there, sorry not sorry).
First of all, start with unit tests, and make sure each unit and each module works independently, exactly as intended, in all situations. The assumption here is that since each module works correctly, and the interfaces that connect them have been unit tested for their inputs and outputs, the entire system will just work.
With unit tests, you ensure that you're testing all the code that you've written, while assuming that the code you haven't written just works. You mock your ORM, database, network calls, cache, etc. Since you're able to mock a vast range of scenarios, you're able to ensure that the code you wrote works in all of them.
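For instance, a unit test might stub out the data layer entirely. Here's a minimal, hypothetical sketch; the module paths and function names are illustrative, not from the starter repo:

```javascript
// products.unit.test.js - the DAO is mocked, so only the service logic runs.
jest.mock('../daos/products'); // hypothetical module path; auto-mocks its exports
const { findProduct } = require('../daos/products');
const { getProductName } = require('../services/products'); // hypothetical service that calls findProduct

describe('getProductName', () => {
  it('returns the name of the product the DAO resolves', async () => {
    // We decide exactly what the "database" returns, no Postgres required.
    findProduct.mockResolvedValue({ id: 1, name: 'Chair' });
    await expect(getProductName(1)).resolves.toBe('Chair');
  });
});
```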
But what happens when:
- You update your database/cache version
- You’ve got raw queries that must be vetted against an actual database to ensure that the result is correct.
- You've got complicated workflows that depend on multiple data points across various tables, yada yada yada
You guessed it, INTEGRATION TESTS!
Oh, don't be fooled, integration tests are better than unit tests, but by no means are they a silver bullet. Robust E2E tests are the best way to guarantee release quality. Please take a look at this brilliant article from the talented Poojesh Shetty that takes you through seamlessly setting up and running E2E tests for your applications.
As you move up the test pyramid, the effort and time required to build out tests increases, and so does the value. But at the end of the day, the tech exists to support the business. In an ideal world, I'd love to have E2E, integration, and unit tests. In most practical cases, though, I have to settle for unit and integration tests to tackle 99.99% of scenarios. Load tests and E2E tests are extremely valuable, and depending on the stage of the business, investing in robust pipelines is non-negotiable. For the purpose of this article, however, I'd like to take you through the nuances of setting up your first integration test suite.
Alright, so I’ll be using this as the starter base to add integration tests. But don’t worry. You can follow along in your own repo as well. Just make sure you set up the following properly.
- configure the paths that should be included while creating the coverage report
- configure the coverage thresholds
- configure paths that should be ignored
- We’re using sequelize as the ORM and sequelize cli to run migrations.
- We typically follow a db-first approach for migrations; it gives more granular control and allows us to switch between tools easily. We've written some nifty functions to help us with these migrate utils.
- We're using sequelize to seed data into all our tables. This is super handy for - you guessed it - integration tests, but also, for example, if you want the entire local setup to work end-to-end when onboarding a new dev.
- There's quite a lot of stuff going on there, but here are the scripts of interest
- `"test": "export ENVIRONMENT_NAME=test && npx jest --silent --forceExit --detectOpenHandles --ci --coverage --testLocationInResults --json --outputFile=\"report.json\""`

We need a few packages for this setup. All of these should be added as dev dependencies:
- docker-compose
- is-port-reachable
- dotenv
Ok, now let’s take stock of the situation. At this point, you should have a node application with:
- a package.json with scripts for
- running database migrations
- seeding the database with values for each table
- running tests using jest
- a docker-compose setup that creates a network with
- a node application
- a database
- a cache component (optional)
Next, create a folder that contains the setup for the integration tests. Create a test_setup folder in the root directory.
Okay fam, I hope you enjoyed the ride so far cause it's about to get crazy. Let’s jump in.
Setting env for test
We'll be using the dotenv package to set up environment variables. If you're already using it and have a .env.test for test environment variables, that's great; otherwise, please create a .env.test file.
Paste the contents below
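The exact variables depend on your application; here's an illustrative sketch. The names and values are assumptions and must line up with the docker-compose configuration we'll define later:

```bash
# .env.test - illustrative values; keep them in sync with docker-compose.yml
ENVIRONMENT_NAME=test
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=test_db
POSTGRES_USER=test_user
POSTGRES_PASSWORD=test_password
REDIS_HOST=localhost
REDIS_PORT=6379
```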
In this basic setup, we have the credentials of the database and the cache cluster that we created in the previous steps. This enables us to perform migrations and seed data as needed during the test setup, ensuring that the test environment is properly configured for running tests.
Setting jest config and jest setup
We already have jest.setup.js and jest.config.json, but we will be using different setups for unit and integration tests. In the jest.setup.js file, we mock some things for the unit tests that we don't want mocked in integration tests, so it's better to keep the two separate.
Now let’s create a jest.setup.integration.js file.
Paste the content below into the newly created jest.setup.integration.js file
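The contents will vary with your app, but the key point is that, unlike jest.setup.js, nothing is mocked here. A minimal sketch might just load the test environment variables and raise the default timeout, since Docker-backed calls are slower than mocked ones:

```javascript
// jest.setup.integration.js - no mocks here; tests talk to the real containers.
const dotenv = require('dotenv');

// Load the credentials we defined in .env.test
dotenv.config({ path: '.env.test' });

// Containerised Postgres/Redis calls take longer than in-memory mocks, so be generous.
jest.setTimeout(30000);
```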
Now let's create the jest.config.integration.json file.
In jest.config.integration.json, change setupFilesAfterEnv to point at jest.setup.integration.js and roots to the __tests__ directory, like below. __tests__ is where we will be writing our integration tests.
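A hedged sketch of the two options that change; copy the rest of your settings over from your existing jest.config.json:

```json
{
  "setupFilesAfterEnv": ["<rootDir>/jest.setup.integration.js"],
  "roots": ["<rootDir>/__tests__"]
}
```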
We don't want integration tests running alongside the unit tests, so add the __tests__ directory to the ignored paths in the jest.config.json file, as shown below.
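One way to do this, assuming a standard jest.config.json, is via testPathIgnorePatterns (keep your other options as they are):

```json
{
  "testPathIgnorePatterns": ["/node_modules/", "<rootDir>/__tests__/"]
}
```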
Finally, let’s add a script for running integration tests in the package.json file. Add the below script.
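A sketch that mirrors the main test script shown earlier, with the config flag added:

```json
"scripts": {
  "test:integration": "export ENVIRONMENT_NAME=test && npx jest --config=jest.config.integration.json --silent --forceExit --detectOpenHandles --ci --coverage --testLocationInResults --json --outputFile=\"report.json\""
}
```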
The only difference compared to our main test script is that we set the config to jest.config.integration.json.
Lastly, we need to add a CI job for integration tests. Add a job that runs yarn test:integration. With this, we have clearly separated jobs for unit and integration tests, as shown in .github/workflows/ci.yml.
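The exact workflow depends on your repository, but the two jobs could look roughly like this (Node and action versions are illustrative):

```yaml
jobs:
  unit-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 18
      - run: yarn install --frozen-lockfile
      - run: yarn test

  integration-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 18
      - run: yarn install --frozen-lockfile
      - run: yarn test:integration
```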
With this, we have different setups for running unit and integration tests.
Setting up Docker Compose
Docker-compose will allow us to run multiple docker containers within a network. We define and configure our containers in a docker-compose.yml file. Create docker-compose.yml within the test_setup folder. This file will define the configuration for our PostgreSQL and Redis containers, along with any other services we want to add.
Paste the contents below into the file
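Here's a sketch reconstructed from the annotations that follow; adjust image versions, credentials, and ports to your own setup. The # 1, # 1.1, etc. markers map to the explanations below:

```yaml
version: '3.8'

services:
  # 1. Postgres service
  db:
    # 1.1 lightweight Alpine-based Postgres 12 image
    image: postgres:12-alpine
    # 1.2 options that trade durability for speed - fine for throwaway test data
    command: postgres -c fsync=off -c synchronous_commit=off -c full_page_writes=off -c random_page_cost=1.0
    # 1.3 credentials the tests (and .env.test) will use
    environment:
      POSTGRES_DB: test_db
      POSTGRES_USER: test_user
      POSTGRES_PASSWORD: test_password
    # 1.4 host port : container port
    ports:
      - '5432:5432'
    # 1.5 keep the data directory in RAM - fast, and gone when the container stops
    tmpfs:
      - /var/lib/postgresql/data
    networks:
      - test-server-network

  # 2. Redis service
  redis:
    image: redis:6.2-alpine
    ports:
      - '6379:6379'
    networks:
      - test-server-network

# 3. shared network so containers can reach each other by service name
networks:
  test-server-network:
```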
- Above is the Docker Compose configuration file where we define our services. In #1 we define our Postgres service with the following properties.
- In #1.1 we specify the Docker image for our Postgres service: Postgres version 12 on Alpine Linux, an exceptionally lightweight and secure Linux distribution.
- In #1.2 we pass additional command-line options to tune Postgres; since this instance is not for production, we can drop some safety guarantees. With fsync=off, Postgres no longer forces writes to be flushed to disk, which improves write performance but increases the risk of data loss, something we don't have to worry about here. Similarly, synchronous_commit=off disables synchronous commits. We also disable full-page writes and set the estimated cost of fetching a page to 1.0, which helps query planning on this in-memory setup.
- In #1.3 we set the environment variables for the container, specifying the database name, username, and password.
- In #1.4, in the ports property, we specify the host port and the container port. The host port listens for incoming traffic, which is then forwarded to the container port. This allows external access to the PostgreSQL service running inside the Docker container.
- In #1.5 we use tmpfs to increase performance by mounting a temporary in-memory file system at /var/lib/postgresql/data. Data is stored in RAM instead of on the host's file system and is not persistent, so note that it will be lost when the container stops. That works for our use case, though.
- In #2 we define our Redis service with the Redis 6.2 image and specify the port it should listen on within the network.
- In #3 we define a network named test-server-network under the networks section, which allows the containers to communicate with each other. On a user-defined network, containers can reach each other by container or service name instead of by IP address. If no driver is specified, Docker uses its default bridge driver.
Starting Containers
We will use the docker-compose up command to start the containers and configure our services before running the integration tests.
Create global-setup.js in test_setup. We will point our integration Jest config at global-setup.js, and it will take care of starting and setting up the containers before the tests execute.
Create the globalSetup file
Paste the content below in the newly created file.
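A hedged sketch of global-setup.js; the migration and seed script names, and the environment variable names, are assumptions, so substitute the ones from your own package.json and .env.test:

```javascript
// test_setup/global-setup.js
const path = require('path');
const { execSync } = require('child_process');
const dotenv = require('dotenv');
// is-port-reachable v3 supports require(); newer major versions are ESM-only.
const isPortReachable = require('is-port-reachable');
const dockerCompose = require('docker-compose');

module.exports = async () => {
  // Make the test credentials available to migrations and seeders.
  dotenv.config({ path: '.env.test' });

  // 1. Check if the database is already running (e.g. from a previous local run).
  const isDBReachable = await isPortReachable(Number(process.env.POSTGRES_PORT), {
    host: process.env.POSTGRES_HOST,
  });

  if (!isDBReachable) {
    // 2. Spin up all the containers defined in docker-compose.yml.
    await dockerCompose.upAll({ cwd: path.join(__dirname), log: true });

    // 3. Run the database migrations (script name is an assumption; use yours).
    execSync('yarn db:migrate', { stdio: 'inherit' });

    // 4. Seed the database tables (script name is an assumption; use yours).
    execSync('yarn db:seed', { stdio: 'inherit' });
  }
};
```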
- Check if the DB is already running. Only if it is not, do we need to start the container.
- Spin up all the containers based on the docker-compose.yml file
- Start running the database migrations
- Seed the database tables
Now add the value for the globalSetup in the jest.config.integration.json file
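With the layout above, the entry would look something like this:

```json
{
  "globalSetup": "<rootDir>/test_setup/global-setup.js"
}
```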
And voila!
💡 Bonus Tip: Leveraging faker along with custom data for database seeding
In our project setup, we use seeders with the Faker library to generate dummy data. However, in some cases we may need specific values alongside the randomly generated data. To achieve this, we concatenate the manually added data with the data generated by the seeders during execution, allowing us to seamlessly incorporate both kinds of data into our test database. This lets us set up test data that aligns with our existing tests and specific requirements while running it all inside the Docker environment. Here is an example of a seeder file for the same:
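A sketch of such a seeder; the table name, columns, and faker fields are illustrative, and older projects may import the legacy faker package instead of @faker-js/faker:

```javascript
// seeders/xxxxxxxx-products.js - Sequelize seeder mixing fixed rows with faker-generated ones.
const { faker } = require('@faker-js/faker');

// Fixed rows that our tests can assert on deterministically.
const customProducts = [
  { name: 'Test Chair', price: 100, created_at: new Date(), updated_at: new Date() },
];

// Random rows to pad the table with realistic-looking data.
const generatedProducts = Array.from({ length: 10 }, () => ({
  name: faker.commerce.productName(),
  price: Number(faker.commerce.price()),
  created_at: new Date(),
  updated_at: new Date(),
}));

module.exports = {
  up: (queryInterface) =>
    queryInterface.bulkInsert('products', [...customProducts, ...generatedProducts]),
  down: (queryInterface) => queryInterface.bulkDelete('products', null, {}),
};
```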
In the global setup file, we spin up containers and run migrations for the project. However, we also need to take care of the local setup to avoid creating containers repeatedly. This is because during local testing, we may need to run tests multiple times or run individual tests, and running migrations and seeding data each time can be time-consuming.
To address this, we implement a check to determine if the database is already up and running. If the database is already available, we skip running migrations and seeding data to avoid redundant setup. This approach helps optimize the local setup process, allowing us to run tests more efficiently without unnecessary overhead of recreating containers and re-running migrations and seeding data.
Running tests
Now here comes the final part where we will create and run our first integration test.
For this, create a __tests__ folder; we will add all of the integration tests here.
Let's add the test for products. Within __tests__, create the folders server -> gql -> model and add a products.test.js file there.
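Here's a hedged sketch of that test. getMockDBEnv() and getResponse() are helpers from the starter repo; the import paths, GraphQL query, and fields are illustrative:

```javascript
// __tests__/server/gql/model/products.test.js
const redis = require('redis-mock');
// Helpers from the starter repo; paths are illustrative.
const { getResponse } = require('../../../../utils/testUtils');
const { getMockDBEnv } = require('../../../../utils/mockDBEnv');

describe('products integration tests', () => {
  const OLD_ENV = process.env;

  beforeEach(() => {
    // Point the app at the containerised database and the Redis port.
    process.env = { ...OLD_ENV, ...getMockDBEnv(), REDIS_PORT: '6379' };
  });

  afterAll(() => {
    // Restore the original environment once the suite is done.
    process.env = OLD_ENV;
  });

  it('should return a list of products with an id and a name', async () => {
    const productsQuery = `
      query {
        products {
          id
          name
        }
      }
    `;
    const response = await getResponse(productsQuery);
    const products = response.body.data.products;

    expect(products.length).toBeGreaterThan(0);
    expect(products[0]).toHaveProperty('id');
    expect(products[0]).toHaveProperty('name');
  });

  it('should set and get a value from redis', (done) => {
    const client = redis.createClient();
    client.set('framework', 'jest', () => {
      client.get('framework', (err, value) => {
        expect(value).toBe('jest');
        done();
      });
    });
  });
});
```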
The test setup sets environment variables using a test utility function, getMockDBEnv(), and configures a Redis port. The beforeEach() and afterAll() hooks set the environment variables before each test and restore them after the whole suite has run.
The first test case uses the getResponse() function to send a GraphQL query and retrieve the response. It then uses expect assertions to check that the response contains at least one product with an id and a name property.
The second test case creates a Redis client using the redis-mock library, sets a key-value pair in the Redis database, and retrieves the value using a callback. It then uses expect assertions to check that the retrieved value matches the expected value and marks the test as done using the done() callback.
Overall, this code demonstrates integration testing techniques for checking GraphQL queries and working with a Redis database in a test environment.
Destroying Containers
After the tests are done, we need to destroy the containers. For this, create global-teardown.js in the test_setup folder. In the jest.config.integration.json file, add a path for globalTeardown as shown below.
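With the folder layout above, the two entries would look something like this:

```json
{
  "globalSetup": "<rootDir>/test_setup/global-setup.js",
  "globalTeardown": "<rootDir>/test_setup/global-teardown.js"
}
```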
In global-teardown.js, add the following code, which brings the containers down.
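A hedged sketch of the teardown, using the same docker-compose package as the setup file:

```javascript
// test_setup/global-teardown.js
const path = require('path');
const dockerCompose = require('docker-compose');

module.exports = async () => {
  // Only tear the containers down on CI; locally we keep them running
  // so repeated test runs don't have to migrate and seed all over again.
  if (process.env.CI) {
    await dockerCompose.down({ cwd: path.join(__dirname), log: true });
  }
};
```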
To optimize the setup during local testing, we add a check for whether the current environment is a Continuous Integration (CI) environment. On CI we tear the containers down after the run; locally we leave them running to save time and resources, since individual tests can then be run against an already-up container without recreating it or re-running migrations and seeds.
Bonus: How to cache docker images with Github Actions
We have successfully completed our setup. Now, Docker image building is time-consuming and resource-intensive because it builds an image from scratch. To address this, we can cache our Docker images, which will allow us to avoid unnecessary rebuilds, resulting in faster build times and improved efficiency.
We can use GitHub Actions to cache Docker images. Let's see how we can do this:
Docker Image Layer Caching
With this, we can cache and save the state of the Docker image on the runner's file system, layer by layer. Each layer acts as a cache: if nothing in a layer has changed, we can simply reuse it without building it again.
To set up the cache, we use GitHub's official actions/cache@v2 action. The action automatically takes care of fetching the cache when the build starts and uploading it after the build succeeds.
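A hedged sketch of such a step; the path assumes you build with Buildx and point its local cache at /tmp/.buildx-cache:

```yaml
      - name: Cache Docker layers
        uses: actions/cache@v2
        with:
          path: /tmp/.buildx-cache
          key: ${{ runner.os }}-buildx-${{ github.sha }}
          restore-keys: |
            ${{ runner.os }}-buildx-
```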
This is a simple example; let's break it down. We use the cache action provided by GitHub with uses: actions/cache@v2.
The path represents the path or directory we need to cache or restore. It can be an absolute or relative path. This is required.
The key is required as well. It can be any combination of variables, context values, strings, and functions, and must not exceed 512 characters.
restore-keys is an optional property that the system can use for finding the cache if there is no cache hit for the primary key.
This can come at a significant price, though. You can check out the pricing plan for the same here.
Docker image registry caching
You can also cache Docker images in a Docker image registry, such as Docker Hub or a private container registry. During the build process, you can push the Docker images to the registry, and in subsequent workflow runs, you can pull them from the registry, which can be faster than rebuilding the images from scratch. Note that you will need to provide the appropriate authentication credentials and registry information for pushing and pulling Docker images from a Docker image registry.
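A hedged sketch using Docker's official actions; the registry, image names, and secrets are illustrative and assume Docker Hub:

```yaml
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2

      - name: Log in to the registry
        uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Build and push with registry cache
        uses: docker/build-push-action@v4
        with:
          context: .
          push: true
          tags: myorg/myapp:latest
          # Reuse layers cached in the registry from previous runs,
          # and push the updated cache back for the next run.
          cache-from: type=registry,ref=myorg/myapp:buildcache
          cache-to: type=registry,ref=myorg/myapp:buildcache,mode=max
```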
Wrap Up!
Congratulations! You now possess the knowledge to level up your integration tests with Docker and jest using docker-compose. By seamlessly integrating your tests with Docker containers, you can create a realistic and powerful testing environment that mirrors real-world scenarios. Check out the code here.
With the ability to easily configure and spin up containerized services, you can thoroughly validate your application's functionality, performance, and resilience. By harnessing the power of Docker, you can ensure that your tests are run in a consistent and reproducible environment, enabling you to catch potential issues early and deliver high-quality software.
So, say goodbye to mundane and unrealistic test data, and embrace the flexibility and versatility of Dockerized integration tests. Whether you're working on a small project or a large-scale application, Docker and jest with docker-compose can become your trusted allies in building robust, reliable, and scalable software.
So, go forth and Dockerize your tests, harness the power of containers, and watch your testing game soar to new heights. Happy testing, and may your code always sail smoothly in the Dockerized seas of software development!