Add automated tests to your full-stack JavaScript project with Mocha, Jest and Polly

Published: 2019-07-30 by Lars

(updated on 2020-10-25)

Preconditions

This tutorial builds on top of my previous tutorial Build and deploy your own full-stack JavaScript project from scratch with React and PostgreSQL, and you should complete that one first to follow along here. Again, familiarity with automated testing might be useful, but the instructions are meant to be detailed enough that you can complete the tutorial even without knowing Mocha, Jest or Polly yet.

If you get stuck you may refer to this project on GitHub which has a fully working example built with this guide.

Introduction

We have already created a very simple web application that can display a list of things ("dreams") stored in a database. You are able to run the application on your own computer as well as in the cloud.

In this tutorial we will add automated tests so we can easily re-test both the back-end code and the front-end code as we evolve the application. By the end of the tutorial, the tests will be fast and full integration tests.

Back-end tests

We will start with a simple test for our server code, and then incrementally add improvements. We will use Mocha and Chai to write back-end tests. Install them into your project with:

cd server
npm install mocha chai --save-dev

The server has a single end-point (/dreams), so the test should invoke that end-point and verify the response. On the front-end we use the fetch() API provided by the browser, and we want our tests to do the same, so we also need to install a package providing the fetch() API in Node.js:

npm install isomorphic-fetch --save-dev

Now create the initial version of our test in server/test/server.test.js:

const { expect } = require('chai');
require('isomorphic-fetch');
const { describe, it } = require('mocha');

const port = 8888;

describe('server', function () {
  it('should fetch all dreams', async function () {
    const response = await fetch(`http://localhost:${port}/.netlify/functions/dreams`);
    expect(response.status).to.equal(200);
    const dreamList = await response.json();
    expect(dreamList).to.deep.equal([
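      // copy the generated ids from the response, so the test only pins down the titles and their order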
      {title: 'Learn French', id: dreamList[0].id},
      {title: 'Visit Albania', id: dreamList[1].id}
    ]);
  });
});

Before we can run the test we need to tell Mocha how to find our test file. In the scripts section in server/package.json, replace the existing "test" line with:

"test": "mocha --recursive test/**/*.test.js --watch"

Now make sure that the server is running in one terminal window:

cd server
npm start

and then start the Mocha test runner in another terminal window:

cd server
npm test 

This will run Mocha in watch-mode: When you make changes to the code Mocha will automatically re-run all the tests. You can stop Mocha with Ctrl+C.

The test will fail with a very detailed and informative error message, looking somewhat like this:

server
  1) should fetch all dreams

0 passing (623ms)
1 failing

1) server
     should fetch all dreams:

    AssertionError: expected [ Array(3) ] to deeply equal [ Array(2) ]
    + expected - actual

     [
       {
         "id": "1"
    -    "title": "Compose a tune"
    +    "title": "Learn French"
       }
       {
         "id": "2"
    -    "title": "Visit Uruguay"
    +    "title": "Visit Albania"
       }
    -  {
    -    "id": "3"
    -    "title": "Write a sci-fi novel"
    -  }
     ]

    at Context.<anonymous> (test\server.test.js:14:31)

Our test fails - for a good reason: the test is expecting different "dreams" than those that are actually in the production database.

We don't want to change the test to match whatever data happens to be in the production database at the moment. Instead we will establish some standard test data.

Test data

The server code was originally written to always fetch data from the production database. To make the server code testable we will need to refactor it, so the tests can run the server code against a different test database containing predictable test data. We will divide the refactoring into smaller steps:

  1. Create netlify.js
  2. Adapt db.js
  3. Create test database instance
  4. Populate with test data
  5. Extend server.test.js

Create netlify.js

We originally used netlify dev to start our server, but when running tests we want to start and stop a separate test instance of the server and configure it to use a separate test database. As this is a somewhat tricky way of using Netlify, we will create a separate code file for this in server/test/netlify.js:

const { spawn } = require('child_process');
const kill = require('tree-kill');

async function starting({port}) {
  const netlifyProcess = spawn(`netlify dev --port ${port}`, {
    env: {
      ...process.env,
      NODE_ENV: 'test'
    },
    shell: true,
    stdio: 'inherit'
  });
  await new Promise(resolve => setTimeout(resolve, 10000)); // Note: wait for server to start
  return netlifyProcess;
}

async function stopping({netlifyProcess}) {
  kill(netlifyProcess.pid);
}

module.exports = {
  starting,
  stopping
};
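
The fixed 10-second wait keeps things simple, but it is both slow and brittle. If you prefer, you can poll the server until it responds - a minimal sketch with a hypothetical waitingUntilReady() helper, assuming isomorphic-fetch is installed as above:

require('isomorphic-fetch'); // provides fetch() in Node.js

async function waitingUntilReady({port, timeoutMs = 30000}) {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    try {
      // any response at all means the server is accepting connections
      await fetch(`http://localhost:${port}/.netlify/functions/dreams`);
      return;
    } catch (error) {
      await new Promise(resolve => setTimeout(resolve, 500)); // not up yet, retry shortly
    }
  }
  throw new Error(`server did not start within ${timeoutMs} ms`);
}

You would then call await waitingUntilReady({port}) in starting() instead of the setTimeout line.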

We are using a third-party library to stop the running test server, so we need to install that library from the terminal:

cd server
npm install --save-dev tree-kill

Adapt db.js

The way our netlify.js helper tells the test server that it should use a test database is by setting the environment variable NODE_ENV to the value test. We can adapt db.js to load the right database configuration depending on the value of this environment variable. Change the first line of server/src/utils/db.js to:

const env = process.env.NODE_ENV === 'test' ? require('../env.test.json') : require('../env.json')
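
For reference, the complete db.js from the previous tutorial might look roughly like this after the change (a sketch, assuming pg-promise as used in the tests below - your file may differ):

const pgp = require('pg-promise')();

const env = process.env.NODE_ENV === 'test' ? require('../env.test.json') : require('../env.json');

const db = pgp(env.DATABASE);

module.exports = db;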

This completes the refactoring, although the tests are not quite ready yet. Having changed the server code, we should now verify that the server still works as expected. Switch back to the terminal window where the server is running, stop it with Ctrl+C and restart it with npm start. Then open http://localhost:8888/.netlify/functions/dreams in a browser and verify that it still returns the dreams we know are in the production database:

[
  {"id": "1", "title": "Compose a tune"},
  {"id": "2", "title": "Visit Uruguay"},
  {"id": "3", "title": "Write a sci-fi novel"}
]

Next we will work on getting the tests to pass.

Create test database instance

First we need to create an additional database instance dedicated to testing. Follow all the instructions from the Database section of the previous blog post.

This new database instance will have a different connection string. Create a new file server/src/env.test.json with this content:

{
  "DATABASE": "your PostgreSQL connection URL for your test database instance"
}

Change your root .gitignore file to ignore all the env*.json files to avoid accidentally pushing your secret database passwords:

env*.json

Populate with test data

When writing tests, we want to ensure that the database contains a predictable set of standard test data, so we know what the tests can expect. We will create a function to reset and populate the test database in a new file server/test/setupTestData.js:

async function resetting ({db}) {
  await db.query(`
      delete from dream;
      alter sequence dream_id_seq restart;
      insert into dream (title) values ('Visit Albania');
      insert into dream (title) values ('Learn French');
    `);
}

module.exports = {
  resetting
};
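
If you want to try the reset by hand before wiring it into the tests, a one-off script along these lines works (a sketch, saved next to setupTestData.js in server/test, assuming pg-promise as used in the test below):

const pgp = require('pg-promise')();
const env = require('../src/env.test.json');
const { resetting } = require('./setupTestData');

(async () => {
  const db = pgp(env.DATABASE);
  await resetting({db}); // wipe the dream table and insert the two standard rows
  pgp.end();
})();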

Extend server.test.js

We now have all the building blocks to extend our test to run against a freshly reset test database. The test itself (the it block) is unchanged; the new before and after blocks set up the environment for the test to run in:

const { expect } = require('chai');
require('isomorphic-fetch');
const { describe, it } = require('mocha');
const pgp = require('pg-promise')();

const env = require('../src/env.test.json');
const { starting, stopping } = require('./netlify');
const { resetting } = require('./setupTestData');

const dbConnectionString = env.DATABASE;
const port = 3011;

describe('server', function () {
  let db, netlifyProcess;

  before(async function () {
    this.timeout(30000);
    db = pgp(dbConnectionString);
    await resetting({db});
    netlifyProcess = await starting({port});
  });

  after(async function () {
    if (netlifyProcess) await stopping({netlifyProcess});
    pgp.end();
  });

  it('should fetch all dreams', async function () {
    const response = await fetch(`http://localhost:${port}/.netlify/functions/dreams`);
    expect(response.status).to.equal(200);
    const dreamList = await response.json();
    expect(dreamList).to.deep.equal([
      { title: 'Learn French', id: dreamList[0].id },
      { title: 'Visit Albania', id: dreamList[1].id }
    ]);
  });    
});

When we now re-run our tests (you might need to Ctrl+C and then npm test), the test will succeed with output similar to:

server
  √ should fetch all dreams (231ms)

1 passing (10s)

We have now completed the back-end testing part of this tutorial. Time for a break before we continue with the front-end testing!


Front-end tests

Because we used create-react-app to create the initial skeleton for our front-end code, we already have Jest and React Testing Library installed.

We also need to use a slightly newer version of jsdom, so change the line with the test script in app/package.json to:

"test": "react-scripts test --env=jsdom-fourteen",

Now create the initial version of our test in app/src/DreamList.test.js:

import React from 'react';
import {render, screen} from '@testing-library/react';
import DreamList from './DreamList';

describe('DreamList', function () {
  describe('render', function () {
    it('should render data', async () => {
      render(<DreamList />);
      await screen.findByText('All my dreams');
      const dreamElements = await screen.findAllByRole('listitem');
      expect(dreamElements.map(el => el.textContent)).toEqual([
        'Learn French',
        'Visit Albania'
      ]);
    });
  });
});
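
For reference, the test assumes a DreamList component roughly like the one built in the previous tutorial - a sketch only, and your actual component may differ in its details:

import React, { useEffect, useState } from 'react';

export default function DreamList() {
  const [dreams, setDreams] = useState([]);

  useEffect(() => {
    // fetch the dreams from the back-end once, after the first render
    fetch('/.netlify/functions/dreams')
      .then(response => response.json())
      .then(setDreams);
  }, []);

  return (
    <div>
      <h1>All my dreams</h1>
      <ul>
        {dreams.map(dream => <li key={dream.id}>{dream.title}</li>)}
      </ul>
    </div>
  );
}

Note the relative URL in the fetch() call: in the browser it resolves against the page's own origin, while in jsdom it resolves against the default location on port 80 - which explains the error we are about to see.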

We can start the Jest test runner in a new terminal window:

cd app
npm test 

This will run Jest in watch-mode: When you make changes to the code, Jest will automatically re-run the affected tests. You can stop Jest with Ctrl+C.

Initially the test fails with an error message similar to this:

● DreamList › render › should render data

  TypeError: Network request failed

This is because the front-end expects the server to listen on port 80, but the server is actually listening on port 8888.

We can temporarily restart the server on port 80 instead (note: this might require administrative privileges, so you may skip this step):

netlify dev --port 80

and then re-run the front-end tests. Instead we get this more familiar error message:

● DreamList › render › should render data

  expect(received).toEqual(expected) // deep equality

  - Expected
  + Received

    Array [
  -   "Learn French",
  -   "Visit Albania",
  +   "Compose a tune",
  +   "Visit Uruguay",
  +   "Write a sci-fi novel",
    ]

So we can see that the test properly waits for the data to be loaded and the component to be re-rendered before running the expect checks. The test nonetheless still fails, now because the data being returned is production data, not standard test data.

(Note: continue here if you skipped because of missing privileges for port 80).

However, we still don't want to run our tests against the production server, so let's restart the server on port 8888 as before and find a better solution:

cd server
npm start

So we need to ensure one more thing: when testing the front-end, the fetch() calls should return predictable responses. There are three potential ways to ensure this:

  1. Make every front-end test reset test data and stop and start the server
  2. Manually write mock data for the calls to fetch
  3. Record and playback HTTP responses

We want to avoid coupling the front-end tests to a real back-end, primarily for performance reasons. Most of the code in modern web applications is written for the front-end, so we want those tests to be fast. So we don't want to do 1).

We also want to avoid manually mocking the calls to fetch, because those mocks very easily grow out of sync with the actual back-end. If we did manual mocking we would no longer be doing integration tests. So we also don't want to do 2).
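
For contrast, option 2) would look something like this in Jest - a minimal sketch, assuming fetch is available as a global in the test environment, and exactly the kind of hand-maintained response data that drifts away from the real back-end:

beforeEach(() => {
  // hand-written response data - nothing keeps this in sync with the real server
  jest.spyOn(global, 'fetch').mockResolvedValue({
    status: 200,
    json: async () => [
      { id: '2', title: 'Learn French' },
      { id: '1', title: 'Visit Albania' }
    ]
  });
});

afterEach(() => {
  global.fetch.mockRestore();
});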

So instead we will record and playback HTTP responses. This allows our front-end tests to be fast and full integration tests. Read an earlier blog post of mine for more information: Unit test your service integration layer.

Record and playback HTTP requests

We will use a tool called Polly to record and playback HTTP requests. Since Polly needs to be involved in both the back-end tests (record) and the front-end tests (playback), we will create a new folder, test, outside of server and app:

cd your-project-directory-name
mkdir test
cd test
npm init --yes

And then install the Polly modules we need:

npm install @pollyjs/core @pollyjs/adapter-node-http @pollyjs/persister-fs --save-dev

We will parameterize the Polly configuration for each of the two use-cases in a new file test/setupPolly.js (and note that you can easily enable logging if you need to debug a failing test):

const path = require('path');

const { Polly } = require('@pollyjs/core');
const NodeHttpAdapter = require('@pollyjs/adapter-node-http');
const FsPersister = require('@pollyjs/persister-fs');
const PollyUtils = require('@pollyjs/utils');

Polly.register(NodeHttpAdapter);
Polly.register(FsPersister);

function mockHttp(name, mode) {
  const pollyOptions = {
    mode,
    adapters: ['node-http'],
    persister: 'fs',
    persisterOptions: {
      fs: {
        recordingsDir: path.join(__dirname, './recordings')
      }
    },
    // logging: true,
    recordFailedRequests: mode === PollyUtils.MODES.RECORD,
    recordIfMissing: mode === PollyUtils.MODES.RECORD,
    matchRequestsBy: {
      headers: false,
      order: false,
      url: {
        port: false,
        hostname: false,
        protocol: false,
      }
    }
  };
  return new Polly(name, pollyOptions);
}

const recordHttp = (name) => mockHttp(name, PollyUtils.MODES.RECORD);
const stubHttp = (name) => mockHttp(name, PollyUtils.MODES.REPLAY);

module.exports = {
  recordHttp,
  stubHttp
};

We can now use Polly in server.test.js. First make it available:

const { recordHttp } = require('../../test/setupPolly');

and declare a polly instance inside describe():

let polly;

Start recording as the last line of before():

polly = recordHttp('dream');

And stop recording as the first line of after():

if (polly) await polly.stop();
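
Putting the pieces together, the hooks in server.test.js now look like this:

before(async function () {
  this.timeout(30000);
  db = pgp(dbConnectionString);
  await resetting({db});
  netlifyProcess = await starting({port});
  polly = recordHttp('dream');
});

after(async function () {
  if (polly) await polly.stop();
  if (netlifyProcess) await stopping({netlifyProcess});
  pgp.end();
});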

After re-running the server tests (npm test) you can now inspect the recorded HTTP interaction in test/recordings/dream_####/recording.har. This file is in the HAR-file format (HTTP Archive), which is also supported by the Network tab in browsers' developer tools. You should be able to find the following lines in there:

    "url": "http://localhost:3011/.netlify/functions/dreams"
    ...
    "text": "[{\"id\":\"2\",\"title\":\"Learn French\"},{\"id\":\"1\",\"title\":\"Visit Albania\"}]"

This is exactly the data we need in our front-end test, so let's switch over to DreamList.test.js.

First we will make Polly available:

import {stubHttp} from '../../test/setupPolly';

Then we will start and stop playback inside describe('DreamList'):

let polly;

beforeAll(function () {
  polly = stubHttp('dream');
});

afterAll(async function () {
  if (polly) await polly.stop();
});

Re-running the tests (npm test) will now show that the front-end tests pass as well:

PASS  src/DreamList.test.js

Congratulations: Your application is now covered with fast and full integration tests that you can easily evolve to keep your application working as it grows!

Learn more

To extend your tests you may need to learn more about how to use the tools we used. Here is a list of links to documentation and tutorials.
