Unlocking Test Portability

Leveraging Containerization for Seamless End-to-End Testing in Distributed API Environments

Austin Mehmet
Senior Technology Engineer

As a software engineer, I’ve learned that end-to-end testing is crucial to ensuring the smooth functioning of complex systems; however, it’s also one of the biggest pain points in the software development lifecycle. It becomes even more challenging when dealing with multiple APIs in a distributed environment, and harder still when those services all interact together in a sequence and require foundational data to work.

The Scenario

Assume a fleet of three APIs: A, B, and C. Service A sets up foundational data that will be used by the other services later. Service B consumes that foundational data and performs its own functions on top of it, and Service C does the same as B. Neither Service B nor Service C can perform its own operations until the data created by Service A exists. A simple diagram is shown below.

In the world of insurance, we can think of these systems as one that quotes policy lines, one that generates policy documents, and a third that allows purchase of the policy. You cannot generate policy documents without first issuing the customer a quote, and the same goes for purchasing the policy. Writing any test for the documents or purchase service will require setting up initial quote data. Inversely, validating the quote service may require running the documents and purchase service tests to ensure there is no negative downstream impact. How, then, can we create a framework that tests these various APIs? How do we organize our tests so that we can leverage reusable code? And how can we set up a separate testing framework so that these individual APIs can run tests in their own CI/CD pipelines?

Let’s explore how containerization can streamline end-to-end testing in such an environment. We are going to set up a framework that maintains a single core testing library and lets us run our tests seamlessly across different repositories using Docker, CodeceptJS, and Gitlab CI/CD.

Creating the tests with CodeceptJS

Let’s start by setting up an initial testing repository using CodeceptJS. The specific testing framework isn’t very important here, so feel free to try this out with whatever your favorite testing library is.

For CodeceptJS, we can create a new project with:

npx create-codeceptjs .

CodeceptJS is more traditionally used as a web UI testing framework; however, you can use it to test REST APIs as well. It comes with a lot of features and functions out of the box that are incredibly useful to a testing library, and the framework itself is fairly extensible. For the purposes of this article, we will not go into the specifics of writing tests and configuring the CodeceptJS library; if you are interested, check out the CodeceptJS documentation at https://codecept.io/.

One feature worth mentioning is that, like many other testing frameworks, CodeceptJS allows you to tag your tests so that you can organize and categorize them. We want to create three tests and tag them @ServiceA, @ServiceB, and @ServiceC so that later on we can use these tags to run only the individual service tests in CI.

One additional thing to note about project setup: when you create the CodeceptJS project, it may try to download Playwright or another web UI driver. You can discard this, as we will only be making REST API calls. Your codecept.conf.js will only require the REST helper.

// codecept.conf.js
exports.config = {
    ...
    helpers: {
        REST: {
            timeout: 30000,
            prettyPrintJson: true,
            onRequest: (request) => {
                // manipulate the request object on all REST calls
                request.headers.auth = '123'
            }
        }
    }
    ...
}

Additional documentation of the REST helper can be found at https://codecept.io/helpers/REST/.

For now, a simple test can look like the following:

Scenario('Sample ServiceA Test', async ({ I }) => {
  const response = await I.sendPostRequest('https://servicea.your-own-domain.com/api/v1/resource', {some: 'data'})

  I.expect(response.status).toBe(200)
  I.expect(response.data.id).toBeDefined()
}).tag('@ServiceA')

Pro tip: you can add the expect library to your I actor and perform all your assertions through it, as seen above.

// src/steps_file.js
const expect = require('expect')

module.exports = function () {
  return actor({
    expect,
  })
}

And in your codecept.conf.js you will need to include:

exports.config = {
  include: {
    I: './src/steps_file.js',
  }
}

Go ahead and create a few more tests, one for each service. Once you have these tests in place, you have the fundamental structure of this testing framework. The simple fact that all your test code is now organized under this centralized testing framework solves one of the challenges of our scenario. With only this one testing repository, you can easily share code across services: when we write our tests for Service A, all of that code will be available to use when we write tests for Services B and C. This almost monorepo-like structure, on top of the containerization covered in the next step, will give us everything we need to solve our testing challenges.
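For example, the setup code for Service A's foundational data can live in a small shared module that the Service B and Service C tests import. Here is a minimal sketch; the module path, payload fields, and defaults are hypothetical stand-ins for whatever your quote API actually expects:

```javascript
// src/helpers/quote.js (hypothetical shared module)
// Builds the payload Service A expects when creating a foundational quote,
// so Service B and Service C tests can reuse it instead of duplicating setup.
function buildQuotePayload(overrides = {}) {
  return {
    product: 'auto',
    coverageLevel: 'standard',
    effectiveDate: new Date().toISOString().slice(0, 10), // today, YYYY-MM-DD
    ...overrides, // individual tests tweak only the fields they care about
  };
}

module.exports = { buildQuotePayload };
```

A Service B test could then call `I.sendPostRequest(..., buildQuotePayload())` to create its prerequisite quote before exercising its own endpoint.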

If you are looking past the foundational setup, you can also consider extending your testing framework to accept a range of environment variables that change its behavior. These can be passed from CI and give the framework additional flexibility, as each service may have its own requirements. Some examples of variables our internal framework accepts are:

LANE: Allows you to pick the environment you hit during testing, e.g. dev, QA, or production.

TAGS: Allows you to pick which tags run in CI, e.g. @ServiceA. CodeceptJS also accepts regex, so "\@ServiceA|\@ServiceB" runs the Service A and Service B tests, and "(?=.*)^(?!.*@ServiceC)" runs all tests except Service C's.

GATE: Determines what percentage of tests must pass in CI for the job to be successful. Useful when you have flaky tests: if a large percentage pass, you can still be confident in the system for a release.

ENABLE_GATEWAY: Determines whether we hit our API Gateway or the service URLs directly.

URL_OVERRIDE: Lets the consumer override the testing framework’s base URL. We create dynamic environments in CI per branch and need to pass those URLs to the framework.
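The GATE variable, for instance, boils down to a small pass-percentage check. A rough sketch of that logic (the function name and wiring are hypothetical; in practice you would feed it the pass/fail counts parsed from the test run's summary):

```javascript
// Returns true when the run clears the configured gate percentage.
// GATE defaults to 100, i.e. every test must pass.
function meetsGate(passed, total, gatePercent = Number(process.env.GATE || 100)) {
  if (total === 0) return false; // no tests ran: treat as a failure
  const passRate = (passed / total) * 100;
  return passRate >= gatePercent;
}

module.exports = { meetsGate };
```

A CI wrapper script could then call `process.exit(meetsGate(passed, total) ? 0 : 1)` after the run, so the job still succeeds when a tolerated fraction of flaky tests fail.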

Containerization with Docker

Once you have a few tests for each service, the goal now is to make them portable. We want these tests to run anywhere and be easy to run from individual applications' CI pipelines.

We can containerize our tests with a super simple Dockerfile like the following:

FROM node:latest

WORKDIR /app/tests
  
COPY . .

RUN npm install --production

CMD ["bash"]

From your testing project's CI, build the Docker image and push it to your project's container registry.

build:
  image: docker:20.10.16
  stage: build
  services:
    - docker:20.10.16-dind
  script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - docker build -t $CI_REGISTRY/group/project/image:latest .
    - docker push $CI_REGISTRY/group/project/image:latest

Creating Consumption Templates with Gitlab CI/CD

Once the image is deployed to your container registry, anyone can access the tests by creating a Gitlab CI/CD job that uses that image as its base. To get your service consumers off the ground faster, provide them with a template CI that lives in the testing framework repository but can be imported into their own CI. Create a file named .gitlab-ci-consumer-template.yml and include the following:

.testing-template:
  image: $CI_REGISTRY/group/project/image:latest
  script:
    - |
        cd /app/tests
        if [ -z "$TAGS" ]; then
            codeceptjs run;
        else
            codeceptjs run --grep "$TAGS";
        fi

Now, any service that wants to consume these tests only needs to include the following in its .gitlab-ci.yml file:

include:
  - project: "group/project"
    file: "/.gitlab-ci-consumer-template.yml"

e2e testing:
  stage: e2e
  extends: .testing-template
  variables:
    LANE: QA
    TAGS: "@ServiceA"
    GATE: 80

Each service can now pull in the consumer template and run whatever subset of tests it likes. Service A could now run not only its own tests but also a combination of other services' tests to ensure that changes to Service A do not negatively impact dependent downstream services.
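For example, Service A's pipeline might run its own tests plus the downstream services' tests in a single job, combining tags with the regex syntax described earlier (this assumes the same template and variables shown above):

```yaml
e2e downstream:
  stage: e2e
  extends: .testing-template
  variables:
    LANE: QA
    TAGS: "@ServiceA|@ServiceB|@ServiceC"
```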

Conclusion

You now have a framework where all your tests are consolidated into a single repository, allowing code reuse across services, and you have the ability to hand these tests off to API teams via containerization, letting individual applications run the tests they find important within their own CI/CD pipelines. Containerizing your end-to-end tests and maintaining a centralized testing library provides a unified approach to test automation across multiple repositories and distributed APIs. By leveraging Gitlab CI/CD and containerization, developers and testers can efficiently manage and execute tests in a distributed API environment, which leads to faster development cycles, reduced overhead, and more reliable releases.

To learn more about technology careers at State Farm®, or to join our team, visit https://www.statefarm.com/careers.

Information contained in this article may not be representative of actual use cases. The views expressed in the article are personal views of the author and are not necessarily those of State Farm Mutual Automobile Insurance Company, its subsidiaries and affiliates (collectively “State Farm”). Nothing in the article should be construed as an endorsement by State Farm of any non-State Farm product or service.