
CircleCI Parallel Test Automation Pipeline Workflow

By Dennis Gurock · 9 min read
As your projects and the number of automated tests grow, running your entire test suite as part of your CircleCI workflow can take more and more time. This is problematic as it slows down feedback for your development team and makes deployments slower. Fortunately, automated tests are usually easy to run in parallel, as each individual test should be executable independently of other tests. In this article we explain how to add concurrency to your CircleCI workflow and speed up the execution time of your automated tests in your CI pipeline. If you are new to setting up test automation with a CI pipeline, we recommend starting with our introductory guide on test automation with CircleCI. Once you understand the basics of setting up the CI workflow, you can further optimize your test suites with parallel execution as outlined below.

Parallel CI Test Automation Strategy

In order to execute tests in parallel for faster test execution times, we need to run multiple test jobs at the same time and execute only a subset of our tests in each job. Once all test jobs have completed successfully, we can continue with our CI workflow. This is what a typical CI workflow with parallel test execution looks like:
[Diagram: CI workflow with a build job, multiple parallel test jobs and a deploy job]

If you want to deploy your code at the end of the CI workflow as in the above illustration, the pipeline needs to wait until all test jobs have completed without errors. This is pretty easy in CircleCI (and other CI tools), as you would just set a job requirement/dependency.

In CircleCI you do this as part of the workflows section. In the following example we have build, test and deploy jobs. By setting the dependencies accordingly, CircleCI waits until all previous jobs have completed successfully. This also includes parallel jobs: in order for the workflow to continue, all parallel jobs need to complete and run without errors. If a job fails, the rest of the workflow is stopped automatically.

# .circleci/config.yml
# [..]
workflows:
  version: 2
  full:
    jobs:
      - build
      - test:
          requires:
            - build
      - deploy:
          requires:
            - test

Example Project & Test Automation Suite

For this article we are extending our previous example with a more extensive test suite. Instead of a single test suite with just a couple of tests, we are adding more test files under the tests directory. This allows us to run different test suites in multiple parallel jobs. Here's the basic file structure with the relevant files for this example project:

.circleci/config.yml

package-lock.json
package.json

tests/test1.js
tests/test2.js
tests/test3.js
tests/test4.js
tests/test5.js
tests/test6.js
tests/test7.js
tests/test8.js

We store the files in a new example-circleci-parallel repository on GitHub and follow the project in CircleCI so that newly pushed code triggers a new CI run. You can take a look at the full repository on GitHub. For this project we are using some basic Mocha/Chai JavaScript tests, but you can use the same approach with any other test automation framework and programming language. All our test suites include the same basic test cases and look like this:

// tests/test1.js
const chai = require('chai');
const assert = chai.assert;

describe('files', function () {
    describe('export', function () {
        it('should export pdf', function () {
            assert.isTrue(true);
        });

        it('should export html', function () {
            assert.isTrue(true);
        });

        it('should export yml', function () {
            assert.isTrue(true);
        });

        it('should export text', function () {
            assert.isTrue(true);
        });
    });
});

Parallel Execution With CircleCI

We start by updating our CircleCI config.yml file to run multiple instances of our test job in parallel. This is very easy to do, as you just need to add the parallelism option to the CircleCI job definition. We have 8 test suites, and we are going to tell CircleCI to run 4 instances of our test job in parallel to speed up the test execution.

Now if we just executed our Mocha tests in our jobs without changes, each of the 4 containers would always run the entire test suite with all 8 files. Instead we need to split the entire test suite into smaller pieces and only run a subset in each container.

Fortunately this is very easy, as CircleCI's built-in circleci command line tool comes with features to accomplish exactly this. We first use the circleci tests glob command to list all test files. We then pipe the result to the circleci tests split command, which selects and returns a different subset of our tests in each container. It uses the number of containers and the current container index to make sure that each container executes a different subset, and that all tests are executed overall.

We can then pass the subset of files to our test runner, which will execute the subset of tests for the container and generate the JUnit XML result files. By default the circleci command splits the tests equally based on the number of files. But it can also split the tests based on file size, past execution times etc. You can learn more about advanced splitting options in CircleCI's documentation.

# .circleci/config.yml
version: 2.1

jobs:
  build:
    # [..]

  test:
    docker:
      - image: cimg/node:current
    parallelism: 4
    steps:
      # [..]

      # Split and run tests
      - run:
          name: Split and run tests
          command: |
            circleci tests glob 'tests/**/*.js' |
            circleci tests split |
            xargs npm run mocha-junit
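
The config calls an npm script named mocha-junit, which runs the selected test files and writes the JUnit XML result files; the script itself is not shown in this article. As a rough sketch, assuming the mocha-junit-reporter package is used as the reporter and the XML files are written to a results directory (both are assumptions, your setup may differ), the relevant part of package.json could look like this:

{
  "scripts": {
    "mocha-junit": "mocha --reporter mocha-junit-reporter --reporter-options mochaFile=results/results-[hash].xml"
  },
  "devDependencies": {
    "chai": "^4.0.0",
    "mocha": "^10.0.0",
    "mocha-junit-reporter": "^2.0.0"
  }
}

Mocha accepts the file names appended by xargs as positional arguments, so each container only runs its own subset of the test files.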

Once you push the new configuration to GitHub, CircleCI runs the workflow and starts 4 separate containers for the test job. Each instance reuses the build output and the stored workspace files of the build job, so any files we prepare in the build step are available in all test instances.

At the end of the test job we also store the JUnit result files as artifacts with our build in CircleCI and submit the test results. You can see the full example in this article's example repository.
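
The storing steps are not reproduced here, but as a sketch they only require CircleCI's built-in store_test_results and store_artifacts steps at the end of the test job (the results directory name is an assumption based on this example):

# .circleci/config.yml (sketch of the final steps of the test job)
      # Show the JUnit results in CircleCI's test summary
      - store_test_results:
          path: results

      # Keep the raw XML files as downloadable build artifacts
      - store_artifacts:
          path: results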


Test Reporting & Management

Now that we have configured the testing jobs to run in parallel, next we can look at submitting the test results to a test management tool such as our Testmo. Reporting the test results allows us to track results over time, makes them more accessible to the entire dev team, helps us identify slow, flaky or broken tests, and stores a full archive of our test automation results.

As we are executing our tests in parallel, it would be good to see the results of each container separately, as well as the entire test suite combined. Fortunately Testmo fully supports parallel test execution with its unique Threads feature. It allows us to submit each container's results as a separate thread, along with execution statistics such as execution time, console output, etc. You can easily see all test results for the entire suite combined, optionally focus just on errors, identify slow tests and so on.

This is what an automated test run with multiple threads looks like in Testmo (note the separate threads in the Execution section below):

[Screenshot: Testmo automation run with separate threads in the Execution section]

So how do we submit the test results to Testmo during our CI run? This is very easy, as we can just use the testmo command line tool. We could install the tool in each test container at runtime with npm install @testmo/testmo-cli inside the CircleCI pipeline, and we did just that in our previous example article. This time we need the testmo command in multiple containers though, so instead of installing the package in each container at runtime, we add the package dependency to our package.json. Because we install all required packages during our build job and store the project files for the following jobs, the package is already installed and ready to use in every job of our CI workflow:

# Run this inside our dev container and then 
# commit package.json & package-lock.json to GitHub
$ npm install --save-dev @testmo/testmo-cli
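
The build job itself is not the focus of this article, but as a rough sketch (matching the workspace layout used by the config snippets below), it could check out the code, install all dependencies including the Testmo CLI, and persist the project directory for the following jobs:

# .circleci/config.yml (sketch of the build job)
  build:
    docker:
      - image: cimg/node:current
    steps:
      - checkout

      # Install all packages from package.json, including @testmo/testmo-cli
      - run:
          name: Install dependencies
          command: npm ci

      # Store the project (with node_modules) for the following jobs
      - persist_to_workspace:
          root: ~/
          paths:
            - project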

In our previous simple example we could just use the testmo automation:run:submit command to create a test run in Testmo, submit all test result files and mark the run as completed all in one step. This time, because we want to submit multiple separate threads, we want to use separate commands for each task.

In order to submit the test results to Testmo, we want to create a test run before our parallel testing jobs execute, then submit a separate thread with its results in each test container, and finally mark the test run as completed at the end of the test execution. To implement this, we simply add additional test-setup and test-complete jobs to our workflow like this:

[Diagram: CI workflow with a test-setup job, parallel test jobs and a test-complete job]

In our CircleCI configuration file we add the test-setup job to create a new test run in Testmo. Testmo returns the ID of the new test run, which we need to remember for the subsequent calls to submit the threads and test results. We simply write the ID to a file called testmo-run-id.txt and use CircleCI's persist_to_workspace command to store the file for the following jobs.

We also update the test job to submit the test results to Testmo. We submit a separate thread with its results from each parallel container, referencing the previously created test run ID.

To make things easier, we moved the entire CircleCI split command to a separate alias in our package.json file, so we can pass the whole script as a single parameter to the testmo call. Testmo automatically measures the test execution time, captures any console output and then uploads the results.

# .circleci/config.yml
# [..]
test-setup: # [..]
    steps:
      # Create test run in Testmo and store ID in file
      - run:
          name: Create test run in Testmo
          command: |
            npx testmo automation:run:create \
              --instance "$TESTMO_URL" \
              --project-id 1 \
              --name "Parallel mocha test run" \
              --source "unit-tests" > testmo-run-id.txt

      # Update workspace to persist run ID file
      - persist_to_workspace:
          root: ~/
          paths:
            - project

test: # [..]
    parallelism: 4
    steps:
      # Start with the initial build
      - attach_workspace:
          at: ~/

      # Split & run tests and report results to Testmo
      - run:
          name: Split and run tests
          command: |
            npx testmo automation:run:submit-thread \
              --instance "$TESTMO_URL" \
              --run-id "$(cat testmo-run-id.txt)" \
              --results results/*.xml \
              -- npm run mocha-junit-parallel # Note space after --

      # Store test artifacts and results
      # [..]
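
The mocha-junit-parallel script used above is the package.json alias mentioned earlier. As a sketch, it could simply wrap the glob/split pipeline from the previous section around the existing mocha-junit script (the script name follows this example; the exact command is an assumption):

{
  "scripts": {
    "mocha-junit-parallel": "circleci tests glob 'tests/**/*.js' | circleci tests split | xargs npm run mocha-junit"
  }
}

Because testmo launches this script after the -- separator, it can measure the execution time and capture the console output of the whole pipeline for the thread.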

Finally, in our test-complete job we let Testmo know that we are done uploading results and mark the test run as completed. Testmo automatically flags the run as passed or failed based on the submitted test results, so we don't need to do anything else here:

# .circleci/config.yml
# [..]
test-complete: # [..]
    steps:
      # Start with the initial build
      - attach_workspace:
          at: ~/

      # Mark run as completed
      - run:
          name: Mark run as completed
          command: |
            npx testmo automation:run:complete \
              --instance "$TESTMO_URL" \
              --run-id "$(cat testmo-run-id.txt)"

What about failures?

If one of the previous test jobs fails, CircleCI doesn't execute any of the following jobs. In our scenario this means that CircleCI would not call our test-complete job if a test failed, so we cannot tell Testmo that the run should be completed. Many other CI tools support running jobs on failures as well, and according to their roadmap CircleCI also plans to implement this in the future.

In the meantime this is not a problem for us here though, as Testmo automatically marks test runs as completed after 30 minutes without new results by default. So even if any tests failed and we cannot mark the run as completed, Testmo automatically takes care of this for us.

After submitting the results, we can review our test results by thread or for the entire test suite at once in Testmo. Here's what a test automation run looks like in Testmo:

[Screenshot: test automation run results in Testmo]

Test Suites, Threads & Parallel Performance

Now we have everything in place to work with our test results in our test management tool. Whenever we push new code to GitHub, CircleCI starts a new CI workflow, checks out and prepares the code and runs all tests in parallel. We also submit all test results to Testmo so we can easily track the test suite over time, review individual threads, measure test times, identify problematic tests and archive our test run results.

If our test suite grows over time, we can also easily increase the number of parallel jobs we use to run our tests. We only need to update the parallelism option in our workflow configuration; all other aspects of our workflow, such as the test splitting logic and the thread submission, automatically adjust. So we can fine-tune the number of parallel containers as needed over time without changing anything else.
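
For example, scaling from 4 to 8 containers would be a one-line change (sketch):

# .circleci/config.yml
  test:
    # [..]
    parallelism: 8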


Testmo also comes with full test case management and exploratory testing built-in, allowing for unified testing in a single tool. Making the test results and test run archive accessible to the entire QA & dev team helps increase awareness of build performance, flaky tests and test failures to improve the quality of the entire test suite.

Improving your build times and configuring your CI workflow to run your test automation in parallel (as explained in this article) is a good first step towards scalable CI automation, so give it a try!

PS: We regularly publish original software testing & QA research, including free guides, reports and news. To receive our next postings, you can subscribe to updates. You can also follow us on Twitter and LinkedIn.
