Add automated tests to your full-stack JavaScript project with Mocha, Jest and Polly

Published: 2019-07-30 by Lars

Preconditions

This tutorial builds on my previous tutorial, Build and deploy your own full-stack JavaScript project from scratch with React and PostgreSQL, so to follow along you should first have completed that one. Again, familiarity with automated testing might be useful, but the instructions are meant to be detailed enough that you can complete the tutorial even without knowing Mocha, Jest or Polly yet.

If you get stuck, you can refer to this project on GitHub, which contains a fully working example built with this guide.

Introduction

We have already created a very simple web application that can display a list of things ("dreams") stored in a database. You are able to run the application on your own computer as well as in the cloud.

In this tutorial we will add automated tests so we can easily re-test both the back-end code and the front-end code as we evolve the application. By the end of the tutorial, the tests will be fast and full integration tests.

Back-end test

We will start with a simple test for our server code, and then incrementally add improvements. We will use Mocha and Chai to write back-end tests. Install them into your project with:

cd server
npm install mocha chai -D

The server has a single end-point (/api/dreams) so the test should invoke that end-point and verify the response. On the front-end we use the fetch() API provided by the browser, and we want our tests to do the same, so we also need to install a package providing the fetch() API in Node.js:

npm install isomorphic-fetch -D 

Now create the initial version of our test in server/src/server.test.js:

const { expect } = require('chai');
require('isomorphic-fetch');
const { describe, it } = require('mocha');

const port = 3001;

describe('server', function () {
  it('should fetch all dreams', async function () {
    const response = await fetch(`http://localhost:${port}/api/dreams`);
    expect(response.status).to.equal(200);
    const dreamList = await response.json();
    expect(dreamList).to.deep.equal([
      // ids are generated by the database, so copy them from the response
      {title: 'Learn French', id: dreamList[0].id},
      {title: 'Visit Albania', id: dreamList[1].id}
    ]);
  });
});

Before we can run the test we need to tell Mocha how to find our test file. In the scripts section in server/package.json, replace the existing "test" line with:

"test": "mocha --recursive src/**/*.test.js --watch"

Now make sure that the server is running in one terminal window

cd server
npm start

and then start the Mocha test runner in another terminal window:

cd server
npm test 

This will run Mocha in watch-mode: When you make changes to the code Mocha will automatically re-run all the tests. You can stop Mocha with Ctrl+C.
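
The --watch flag is what keeps Mocha running. If you later also want a one-shot run, for example on a CI server, you could add a second script without the flag (a sketch; the test:ci name is just a suggestion):

"test": "mocha --recursive src/**/*.test.js --watch",
"test:ci": "mocha --recursive src/**/*.test.js"

Run it with npm run test:ci.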

The test will fail with a very detailed and informative error message, looking somewhat like this:

server
  1) should fetch all dreams

0 passing (623ms)
1 failing

1) server
     should fetch all dreams:

    AssertionError: expected [ Array(3) ] to deeply equal [ Array(2) ]
    + expected - actual

     [
       {
         "id": "1"
    -    "title": "Compose a tune"
    +    "title": "Learn French"
       }
       {
         "id": "2"
    -    "title": "Visit Uruguay"
    +    "title": "Visit Albania"
       }
    -  {
    -    "id": "3"
    -    "title": "Write a sci-fi novel"
    -  }
     ]

    at Context.<anonymous> (src\server.test.js:10:31)

Our test fails - and for a good reason: the test expects different "dreams" than those actually in the production database.

We don't want to change the test to match whatever data happens to be in the production database at the moment. Instead we will establish some standard test data.

Test data

The server code was originally written to always fetch data from the production database. To make the server code testable we will need to refactor it, so the tests can run the server code against a different test database containing predictable test data. We will divide the refactoring into smaller steps:

  1. Extract db.js
  2. Extract server.js
  3. Rewrite main.js
  4. Create test database instance
  5. Populate with test data
  6. Extend server.test.js

Extract db.js

We will need to access the database from multiple places in the code (from the server code and when populating test data). Create a new file server/src/db.js with the following code, which has mostly been extracted from the original main.js:

// Initialize pg-promise once for the whole process
const pgp = require('pg-promise')();

function connect (dbConnectionString) {
  const db = pgp(dbConnectionString);
  return db;
}

function disconnect () {
  // Close all connections held by pg-promise, allowing the process to exit
  pgp.end();
}

module.exports = {
  connect,
  disconnect
};
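
These two functions will be used from main.js and from the tests alike; usage is simply this (shown here for illustration, the real call sites follow below):

const { connect, disconnect } = require('./db');

const db = connect(process.env.DATABASE);
// ... run queries with db.query(...) ...
disconnect();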

Extract server.js

We will also need to start and stop the server from multiple places in the code (from the bootstrapping code and from the test code). Create a new file server/src/server.js with the following code, which has also been extracted from the original main.js:

const express = require('express');

async function starting({db, port}) {
  const app = express();

  app.use(express.static('app'));
  app.get('/', function (request, response) {
    response.sendFile(__dirname + '/app/index.html');
  });

  // Fetch all dreams from the database, sorted by title
  async function dreamsGetHandler(request, response, next) {
    try {
      const rowList = await db.query('select * from dream order by title');
      response.send(rowList);
    } catch (error) {
      response.status(500).send(error.message);
    }
  }
  app.get('/api/dreams', dreamsGetHandler);

  // app.listen() is callback-based, so wrap it in a Promise
  // to make server start-up awaitable from the tests
  let server;
  await new Promise(resolve => {
    server = app.listen(port, resolve)
  });
  return server;
}

async function stopping({server}) {
  // server.close() is also callback-based; wrap it so shutdown can be awaited
  await new Promise(resolve => server.close(resolve));
}

module.exports = {
  starting,
  stopping
};

Rewrite main.js

With these two new files, main.js can be rewritten as a much smaller piece of bootstrapping code, again mostly extracted from the original main.js:

require('dotenv').config();
const { connect } = require('./db');
const { starting } = require('./server');

const dbConnectionString = process.env.DATABASE;
const db = connect(dbConnectionString);

const port = 3001; // Note: must match port of the "proxy" URL in app/package.json

async function running() {
  await starting({db, port});
  console.log(`Server is listening on port ${port}`);
}

running();

This completes the refactoring, although the tests are not quite passing yet. We should now verify that the server still works as expected. Switch back to the terminal window where the server is running, stop it with Ctrl+C and restart it with npm start. Then open http://localhost:3001/api/dreams in a browser and verify that it still returns the dreams we know are in the production database:

[
  {"id": "1", "title": "Compose a tune"},
  {"id": "2", "title": "Visit Uruguay"},
  {"id": "3", "title": "Write a sci-fi novel"}
]

Next we will work on getting the tests to pass.

Create test database instance

First we need to create an additional database instance dedicated to testing. Follow all the instructions from the Database section of the previous blog post.
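
If you are recreating the schema by hand, a minimal table definition consistent with the queries in this tutorial might look like this (a sketch; the id column is assumed to be a serial primary key so that the dream_id_seq sequence used later exists):

create table dream (
  id serial primary key,
  title text not null
);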

This new database instance will have a different connection string. Create a new file server/.env.test with this content:

DATABASE=your PostgreSQL connection URL for your test database instance

Change your root .gitignore file to ignore all .env files, to avoid accidentally pushing your secret database passwords.

.env*

Populate with test data

When writing tests, we want to ensure that the database contains a predictable set of standard test data, so we know what the test can expect. We will create a function to reset and populate the test database in a new file server/src/setupTestData.js:

async function resetting ({db}) {
  await db.query(`
      delete from dream;
      -- restart the id sequence so generated ids are predictable between runs
      alter sequence dream_id_seq restart;
      insert into dream (title) values ('Visit Albania');
      insert into dream (title) values ('Learn French');
    `);
}

module.exports = {
  resetting
};

Extend server.test.js

We now have all the building blocks to extend our test to run against a freshly reset test database. The test itself (the it block) is unchanged; what is new are the extra require lines at the top and the before and after hooks that set up and tear down the environment the test runs in:

require('dotenv').config({path: '.env.test'});
const { expect } = require('chai');
require('isomorphic-fetch');
const { describe, it, before, after } = require('mocha');

const { connect, disconnect } = require('./db');
const { starting, stopping } = require('./server');
const { resetting } = require('./setupTestData');

const dbConnectionString = process.env.DATABASE;
const port = 3011;

describe('server', function () {
  let db, server;

  before(async function () {
    db = connect(dbConnectionString);
    await resetting({db});
    server = await starting({db, port});
  });

  after(async function () {
    if (server) await stopping({server});
    disconnect();
  });

  it('should fetch all dreams', async function () {
    const response = await fetch(`http://localhost:${port}/api/dreams`);
    expect(response.status).to.equal(200);
    const dreamList = await response.json();
    expect(dreamList).to.deep.equal([
      // ids are generated by the database, so copy them from the response
      { title: 'Learn French', id: dreamList[0].id },
      { title: 'Visit Albania', id: dreamList[1].id }
    ]);
  });
});

When we now re-run our tests (you might need to Ctrl+C and npm test), the test will succeed with output similar to:

server
  √ should fetch all dreams (48ms)

1 passing (228ms)

We have now completed the back-end testing part of this tutorial. Time for a break before we continue with the front-end testing!


Front-end tests

Because we used create-react-app to create the initial skeleton for our front-end code, we already have Jest installed. However, we also want to install Enzyme to simplify component testing:

cd app
npm install enzyme enzyme-adapter-react-16 -D 

We can tell Jest to initialize Enzyme by creating a file app/src/setupTests.js, which create-react-app automatically loads before running the tests:

import { configure } from 'enzyme';
import Adapter from 'enzyme-adapter-react-16';

configure({ adapter: new Adapter() });

Now create the initial version of our test in app/src/DreamList.test.js:

import React from 'react';
import {mount} from 'enzyme';
import DreamList from './DreamList';

describe('DreamList', function () {
  describe('render', function () {
    it('should render data', async () => {
      const w = mount(<DreamList />);
      expect(w.find('h3').text()).toEqual('All my dreams');
      expect(w.find('ul > li').map(e => e.text())).toEqual([
        'Learn French',
        'Visit Albania'
      ]);
    });
  });
});

We can start the Jest test runner in a new terminal window:

cd app
npm test 

This will run Jest in watch-mode: When you make changes to the code, Jest will automatically re-run the affected tests. You can stop Jest with Ctrl+C.
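
As an aside: create-react-app runs the tests once, instead of watching, when it detects a CI environment, so a one-shot run can be done with (on Linux/macOS):

CI=true npm test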

The test will fail with a very detailed and informative error message, looking somewhat like this:

● DreamList › render › should render data

  expect(received).toEqual(expected) // deep equality

  - Expected
  + Received

  - Array [
  -   "Learn French",
  -   "Visit Albania",
  - ]
  + Array []

     8 |       const w = mount(<DreamList />);
     9 |       expect(w.find('h3').text()).toEqual('All my dreams');
  > 10 |       expect(w.find('ul > li').map(e => e.text())).toEqual([
       |                                                    ^
    11 |         'Learn French',
    12 |         'Visit Albania'
    13 |       ]);

    at Object.toEqual (src/DreamList.test.js:10:52)

Note that the first expect on line 9 succeeds, so the component was rendered, but with an empty list of "dreams". To get the second expect on line 10 to pass, we need to ensure a few more things.

When you run the app (npm start) you can see that the component initially renders without any "dreams" listed, and after a short while the "dreams" appear. The test above only tests the initial render, so first we need to make the test wait for the asynchronous DreamList.componentDidMount() to finish fetching data and re-rendering the component.

React does not provide any mechanism for letting a test know when life-cycle methods such as componentDidMount() have finished running, so we will make the DreamList component signal this explicitly.

We need a small utility function that creates a Promise which can be resolved explicitly, from outside the promise constructor's scope. Create it in app/src/resolvable.js:

export function resolvable () {
  let _resolve;
  // Capture the resolve callback so the promise can be resolved from outside
  const promise = new Promise(resolve => _resolve = resolve);
  promise.resolve = () => _resolve();
  return promise;
}
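
In isolation the utility behaves like this (a toy illustration, not part of the application code):

const p = resolvable();
p.then(() => console.log('resolved'));  // logs once p.resolve() is called
p.resolve();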

Import this function into DreamList.test.js:

import {resolvable} from "./resolvable";

and expand the mount(<DreamList />) line into these 4 lines:

const loading = resolvable();
const w = mount(<DreamList loading={loading}/>);
await loading;
w.update();

and add this line to the end of componentDidMount() in DreamList.js:

if (this.props.loading) this.props.loading.resolve();
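
After this change, componentDidMount() might look something like this (a sketch; the exact fetch call and state name come from the previous tutorial and are assumptions here):

async componentDidMount () {
  const response = await fetch('/api/dreams');
  const dreamList = await response.json();
  this.setState({dreamList});
  // Let a waiting test know that loading has finished
  if (this.props.loading) this.props.loading.resolve();
}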

Now the test fails with an error message similar to this:

● DreamList › render › should render data

  TypeError: Network request failed

This is because, in the test environment, the front-end's fetch() call resolves to port 80, while the server is actually listening on port 3001. If we temporarily change server/src/main.js to use port 80 instead of port 3001, restart the server and re-run the front-end tests, we instead get this more familiar error message:

● DreamList › render › should render data

  expect(received).toEqual(expected) // deep equality

  - Expected
  + Received

    Array [
  -   "Learn French",
  -   "Visit Albania",
  +   "Compose a tune",
  +   "Visit Uruguay",
  +   "Write a sci-fi novel",
    ]

So we can see that the test now properly waits for the data to be loaded and the component to be re-rendered before running the expect checks, which nonetheless still fail, now because the data being returned is production data, not standard test data.

However, we still don't want to run our tests against the production server, so let's change server/src/main.js back to use port 3001 as before and find a better solution.
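
For reference, the reason the app works in the browser while the test could not reach the server: app/package.json contains a "proxy" setting like the line below (from the previous tutorial), which the create-react-app development server uses to forward API requests to port 3001. Jest does not run the development server, so the setting has no effect in the tests.

"proxy": "http://localhost:3001"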

So we need to ensure one more thing: when testing the front-end, the fetch() calls should receive predictable responses. There are three potential ways to ensure this:

  1. Make every front-end test reset test data and stop and start the server
  2. Write manual mocks for the calls to fetch
  3. Record and playback HTTP responses

We want to avoid coupling the front-end tests to a real back-end, primarily for performance reasons. Most of the code in a modern web application is written for the front-end, so we want those tests to be fast. So we don't want to do 1).

We also want to avoid manually mocking the calls to fetch, because such mocks very easily grow out of sync with the actual back-end; with manual mocking we would no longer be doing integration tests. So we also don't want to do 2).
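
For contrast, option 2) would mean hard-coding responses directly in the test, along these lines (a sketch of what we are choosing not to do):

// Manual mock of fetch(): the hard-coded response silently
// drifts out of sync with the real back-end over time
global.fetch = jest.fn().mockResolvedValue({
  status: 200,
  json: async () => [
    {id: '2', title: 'Learn French'},
    {id: '1', title: 'Visit Albania'}
  ]
});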

So instead we will record and playback HTTP responses. This allows our front-end tests to be fast and full integration tests. Read an earlier blog post of mine for more information: Unit test your service integration layer.

Record and playback HTTP requests

We will use a tool called Polly to record and playback HTTP requests. Since Polly needs to be involved in both the back-end tests (recording) and the front-end tests (playback), we will create a new folder, test, outside of server and app:

cd your-project-directory-name
mkdir test
cd test
npm init --yes

And then install the Polly modules we need:

npm install @pollyjs/core @pollyjs/adapter-node-http @pollyjs/persister-fs -D

We will parameterize the Polly configuration for each of the two use-cases in a new file test/setupPolly.js (and note that you can easily enable logging if you need to debug a failing test):

const path = require('path');

const { Polly } = require('@pollyjs/core');
const NodeHttpAdapter = require('@pollyjs/adapter-node-http');
const FsPersister = require('@pollyjs/persister-fs');
const PollyUtils = require('@pollyjs/utils');

Polly.register(NodeHttpAdapter);
Polly.register(FsPersister);

function mockHttp(name, mode) {
  const pollyOptions = {
    mode,
    adapters: ['node-http'],
    persister: 'fs',
    persisterOptions: {
      fs: {
        recordingsDir: path.join(__dirname, './recordings')
      }
    },
    // logging: true,
    recordFailedRequests: mode === PollyUtils.MODES.RECORD,
    recordIfMissing: mode === PollyUtils.MODES.RECORD,
    matchRequestsBy: {
      headers: false,
      order: false,
      url: {
        port: false,
        hostname: false,
        protocol: false,
      }
    }
  };
  return new Polly(name, pollyOptions);
}

const recordHttp = (name) => mockHttp(name, PollyUtils.MODES.RECORD);
const stubHttp = (name) => mockHttp(name, PollyUtils.MODES.REPLAY);

module.exports = {
  recordHttp,
  stubHttp
};

We can now use Polly in server.test.js. First make it available:

const { recordHttp } = require('../../test/setupPolly');

and declare a polly instance inside describe():

let polly;

Start recording as the last line of before():

polly = recordHttp('dream');

And stop recording as the first line of after():

if (polly) await polly.stop();
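
Putting these pieces together, the before() and after() hooks in server.test.js now read:

before(async function () {
  db = connect(dbConnectionString);
  await resetting({db});
  server = await starting({db, port});
  polly = recordHttp('dream');
});

after(async function () {
  if (polly) await polly.stop();
  if (server) await stopping({server});
  disconnect();
});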

After re-running the server tests (npm test) you can now inspect the recorded HTTP interaction in test/recordings/dream_####/recording.har. This file is in the HAR format, which is also supported by the Network tab in your browser's developer tools. You should be able to find the following lines in there:

"url": "http://localhost:3011/api/dreams"
...
"text": "[{\"id\":\"2\",\"title\":\"Learn French\"},{\"id\":\"1\",\"title\":\"Visit Albania\"}]"

This is exactly the data we need in our front-end test, so let's switch over to DreamList.test.js.

First we will make Polly available:

import {stubHttp} from '../../test/setupPolly';

Then we will start and stop playback inside describe('DreamList'):

let polly;

beforeAll(function () {
  polly = stubHttp('dream');
});

afterAll(async function () {
  if (polly) await polly.stop();
});
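
For reference, the complete DreamList.test.js now looks like this:

import React from 'react';
import {mount} from 'enzyme';
import DreamList from './DreamList';
import {resolvable} from './resolvable';
import {stubHttp} from '../../test/setupPolly';

describe('DreamList', function () {
  let polly;

  beforeAll(function () {
    polly = stubHttp('dream');
  });

  afterAll(async function () {
    if (polly) await polly.stop();
  });

  describe('render', function () {
    it('should render data', async () => {
      const loading = resolvable();
      const w = mount(<DreamList loading={loading}/>);
      await loading;
      w.update();
      expect(w.find('h3').text()).toEqual('All my dreams');
      expect(w.find('ul > li').map(e => e.text())).toEqual([
        'Learn French',
        'Visit Albania'
      ]);
    });
  });
});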

Re-running the tests (npm test) will now show that the front-end tests pass as well:

PASS  src/DreamList.test.js

Congratulations: Your application is now covered with fast and full integration tests that you can easily evolve to keep your application working as it grows!

Learn more

To extend your tests you may need to learn more about the tools we used; the official documentation for Mocha, Chai, Jest, Enzyme and Polly is a good place to start.
