
Improving Code Quality

  • 18 min read
  • 22 Sep 2014


In this article by Alexandru Vlăduţu, author of Mastering Web Application Development with Express, we are going to see how to test Express applications and how to improve our code quality by leveraging existing NPM modules.


Creating and testing an Express file-sharing application

Now, it's time to see how to develop and test an Express application with what we have learned previously.

We will create a file-sharing application that allows users to upload files and optionally password-protect them. After a file is uploaded to the server, we will create a unique ID for it, store its metadata along with the content (as a separate JSON file), and redirect the user to the file's information page. When trying to access a password-protected file, an HTTP basic authentication pop-up will appear, and the user will only have to enter the password (no username is required in this case).

The package.json file, so far, will contain the following code:

{
  "name": "file-uploading-service",
  "version": "0.0.1",
  "private": true,
  "scripts": {
    "start": "node ./bin/www"
  },
  "dependencies": {
    "express": "~4.2.0",
    "static-favicon": "~1.0.0",
    "morgan": "~1.0.0",
    "cookie-parser": "~1.0.1",
    "body-parser": "~1.0.0",
    "debug": "~0.7.4",
    "ejs": "~0.8.5",
    "connect-multiparty": "~1.0.5",
    "cuid": "~1.2.4",
    "bcrypt": "~0.7.8",
    "basic-auth-connect": "~1.0.0",
    "errto": "~0.2.1",
    "custom-err": "0.0.2",
    "lodash": "~2.4.1",
    "csurf": "~1.2.2",
    "cookie-session": "~1.0.2",
    "secure-filters": "~1.0.5",
    "supertest": "~0.13.0",
    "async": "~0.9.0"
  },
  "devDependencies": {
  }
}

When bootstrapping an Express application using the CLI, a /bin/www file is automatically created for you. The following is the version we have adapted to extract the name of the application from the package.json file. This way, if we ever decide to change the application's name, we won't have to alter our debugging code, because it will automatically adapt, as shown in the following code:

#!/usr/bin/env node
var pkg = require('../package.json');
var debug = require('debug')(pkg.name + ':main');
var app = require('../app');

app.set('port', process.env.PORT || 3000);

var server = app.listen(app.get('port'), function() {
  debug('Express server listening on port ' + server.address().port);
});

The application configurations will be stored inside config.json:

{
  "filesDir": "files",
  "maxSize": 5
}

The properties listed in the preceding code refer to the files folder (where the files will be uploaded), which is relative to the root, and to the maximum allowed file size (in megabytes).

The main file of the application is named app.js and lives in the root. We need the connect-multiparty module to support file uploads, and the csurf and cookie-session modules for CSRF protection. The rest of the dependencies are standard and we have used them before. The full code for the app.js file is as follows:

var express = require('express');
var path = require('path');
var favicon = require('static-favicon');
var logger = require('morgan');
var cookieParser = require('cookie-parser');
var session = require('cookie-session');
var bodyParser = require('body-parser');
var multiparty = require('connect-multiparty');
var Err = require('custom-err');
var csrf = require('csurf');
var ejs = require('secure-filters').configure(require('ejs'));
var csrfHelper = require('./lib/middleware/csrf-helper');

var homeRouter = require('./routes/index');
var filesRouter = require('./routes/files');

var config = require('./config.json');
var app = express();
var ENV = app.get('env');

// view engine setup
app.engine('html', ejs.renderFile);
app.set('views', path.join(__dirname, 'views'));
app.set('view engine', 'html');

app.use(favicon());
app.use(bodyParser.json());
app.use(bodyParser.urlencoded());
// Limit uploads to X Mb
app.use(multiparty({
  maxFilesSize: 1024 * 1024 * config.maxSize
}));
app.use(cookieParser());
app.use(session({
  keys: ['rQo2#0s!qkE', 'Q.ZpeR49@9!szAe']
}));
app.use(csrf());
// add CSRF helper
app.use(csrfHelper);

app.use('/', homeRouter);
app.use('/files', filesRouter);

app.use(express.static(path.join(__dirname, 'public')));

/// catch 404 and forward to error handler
app.use(function(req, res, next) {
  next(Err('Not Found', { status: 404 }));
});

/// error handlers

// development error handler
// will print stacktrace
if (ENV === 'development') {
  app.use(function(err, req, res, next) {
    res.status(err.status || 500);
    res.render('error', {
      message: err.message,
      error: err
    });
  });
}

// production error handler
// no stacktraces leaked to user
app.use(function(err, req, res, next) {
  res.status(err.status || 500);
  res.render('error', {
    message: err.message,
    error: {}
  });
});

module.exports = app;

Instead of directly binding the application to a port, we are exporting it, which makes our lives easier when testing with supertest. We won't need to care about things such as the default port availability or specifying a different port environment variable when testing.

To avoid having to create the whole input when including the CSRF token, we have created a helper for that inside lib/middleware/csrf-helper.js:

module.exports = function(req, res, next) {
  res.locals.csrf = function() {
    return "<input type='hidden' name='_csrf' value='" + req.csrfToken() + "' />";
  }

  next();
};

For the password-protection functionality, we will use the bcrypt module and create a separate file, lib/hash.js, for the hash generation and password comparison functionality:

var bcrypt = require('bcrypt');
var errTo = require('errto');

var Hash = {};

Hash.generate = function(password, cb) {
  bcrypt.genSalt(10, errTo(cb, function(salt) {
    bcrypt.hash(password, salt, errTo(cb, function(hash) {
      cb(null, hash);
    }));
  }));
};

Hash.compare = function(password, hash, cb) {
  bcrypt.compare(password, hash, cb);
};

module.exports = Hash;
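The errTo helper used throughout these examples is a tiny continuation-passing utility. A minimal sketch of the pattern (the real module may differ in details) looks like this:

```javascript
// Minimal sketch of the errTo pattern: on failure, short-circuit to the
// final callback with the error; on success, drop the null error and
// hand the remaining results to the success handler.
function errTo(cb, fn) {
  return function(err) {
    if (err) { return cb(err); }
    fn.apply(null, Array.prototype.slice.call(arguments, 1));
  };
}

// A fake async step standing in for a bcrypt call, so the control flow
// of Hash.generate is visible without external dependencies.
function fakeGenSalt(cb) {
  process.nextTick(function() { cb(null, 'a-salt'); });
}

var result;
fakeGenSalt(errTo(console.error, function(salt) {
  result = salt;
  console.log('got ' + salt);
}));
```

This is what lets Hash.generate nest two bcrypt calls without repeating the `if (err) return cb(err);` boilerplate at every level.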

The biggest file of our application will be the file model, because that's where most of the functionality will reside. We will use the cuid module to create unique IDs for files, and the native fs module to interact with the filesystem.

The following code snippet contains the most important methods for models/file.js:

function File(options, id) {
  this.id = id || cuid();
  this.meta = _.pick(options, ['name', 'type', 'size', 'hash', 'uploadedAt']);
  this.meta.uploadedAt = this.meta.uploadedAt || new Date();
};

File.prototype.save = function(path, password, cb) {
  var _this = this;

  this.move(path, errTo(cb, function() {
    if (!password) { return _this.saveMeta(cb); }

    hash.generate(password, errTo(cb, function(hashedPassword) {
      _this.meta.hash = hashedPassword;

      _this.saveMeta(cb);
    }));
  }));
};

File.prototype.move = function(path, cb) {
  fs.rename(path, this.path, cb);
};

For the full source code of the file, browse the code bundle. Next, we will create the routes for the file (routes/files.js), which will export an Express router. As mentioned before, the authentication mechanism for password-protected files will be the basic HTTP one, so we will need the basic-auth-connect module. At the beginning of the file, we will include the dependencies and create the router:

var express = require('express');
var basicAuth = require('basic-auth-connect');
var errTo = require('errto');
var pkg = require('../package.json');
var File = require('../models/file');
var debug = require('debug')(pkg.name + ':filesRoute');

var router = express.Router();

We will have to create two routes that will include the id parameter in the URL, one for displaying the file information and another one for downloading the file. In both of these cases, we will need to check if the file exists and require user authentication in case it's password-protected. This is an ideal use case for the router.param() function because these actions will be performed each time there is an id parameter in the URL. The code is as follows:

router.param('id', function(req, res, next, id) {
  File.find(id, errTo(next, function(file) {
    debug('file', file);

    // populate req.file, will need it later
    req.file = file;

    if (file.isPasswordProtected()) {
      // Password-protected file, check for password using HTTP basic auth
      basicAuth(function(user, pwd, fn) {
        if (!pwd) { return fn(); }

        // ignore user
        file.authenticate(pwd, errTo(next, function(match) {
          if (match) {
            return fn(null, file.id);
          }

          fn();
        }));
      })(req, res, next);
    } else {
      // Not password-protected, proceed normally
      next();
    }
  }));
});

The rest of the routes are fairly straightforward, using response.download() to send the file to the client, or using response.redirect() after uploading the file:

router.get('/', function(req, res, next) {
  res.render('files/new', { title: 'Upload file' });
});

router.get('/:id.html', function(req, res, next) {
  res.render('files/show', {
    id: req.params.id,
    meta: req.file.meta,
    isPasswordProtected: req.file.isPasswordProtected(),
    hash: hash,
    title: 'Download file ' + req.file.meta.name
  });
});

router.get('/download/:id', function(req, res, next) {
  res.download(req.file.path, req.file.meta.name);
});

router.post('/', function(req, res, next) {
  var tempFile = req.files.file;
  if (!tempFile.size) { return res.redirect('/files'); }

  var file = new File(tempFile);

  file.save(tempFile.path, req.body.password, errTo(next, function() {
    res.redirect('/files/' + file.id + '.html');
  }));
});

module.exports = router;

The view for uploading a file contains a multipart form with a CSRF token inside (views/files/new.html):

<%- include ../layout/header.html %>

<form action="/files" method="POST" enctype="multipart/form-data">
  <div class="form-group">
    <label>Choose file:</label>
    <input type="file" name="file" />
  </div>

  <div class="form-group">
    <label>Password protect (leave blank otherwise):</label>
    <input type="password" name="password" />
  </div>

  <div class="form-group">
    <%- csrf() %>
    <input type="submit" />
  </div>
</form>

<%- include ../layout/footer.html %>

To display the file's details, we will create another view (views/files/show.html). Besides showing the basic file information, we will display a special message in case the file is password-protected, so that the client is notified that a password should also be shared along with the link:

<%- include ../layout/header.html %>

<table>
  <tr>
    <th>Name</th>
    <td><%= meta.name %></td>
  </tr>
  <tr>
    <th>Type</th>
    <td><%= meta.type %></td>
  </tr>
  <tr>
    <th>Size</th>
    <td><%= meta.size %> bytes</td>
  </tr>
  <tr>
    <th>Uploaded at</th>
    <td><%= meta.uploadedAt %></td>
  </tr>
</table>

<p>
  <a href="/files/download/<%- id %>">Download file</a> | 
  <a href="/files">Upload new file</a>
</p>

<p>
  To share this file with your friends use the <a href="/files/<%- id %>">current link</a>.
  <% if (isPasswordProtected) { %>
  <br />
  Don't forget to tell them the file password as well!
  <% } %>
</p>

<%- include ../layout/footer.html %>

Running the application

To run the application, we need to install the dependencies and run the start script:

$ npm i
$ npm start

The default port for the application is 3000, so if we visit http://localhost:3000/files, we should see the following page:

[Image: the file upload page]

After uploading the file, we should be redirected to the file's page, where its details will be displayed:

[Image: the file details page]

Unit tests

Unit testing allows us to test individual parts of our code in isolation and verify their correctness. By making our tests focused on these small components, we decrease the complexity of the setup, and most likely, our tests should execute faster.

Using the following command, we'll install a few modules to help us in our quest:

$ npm i mocha should sinon --save-dev

We are going to write unit tests for our file model, but there's nothing stopping us from doing the same thing for our routes or other files from /lib.

The dependencies will be listed at the top of the file (test/unit/file-model.js):

var should = require('should');
var path = require('path');
var config = require('../../config.json');
var sinon = require('sinon');

We will also need to require the native fs module and the hash module, because these modules will be stubbed later on. Apart from these, we will create an empty callback function and reuse it, as shown in the following code:

// will be stubbing methods on these modules later on
var fs = require('fs');
var hash = require('../../lib/hash');

var noop = function() {};

The tests for the instance methods will be created first:

describe('models', function() {
  describe('File', function() {
    var File = require('../../models/file');

    it('should have default properties', function() {
      var file = new File();

      file.id.should.be.a.String;
      file.meta.uploadedAt.should.be.a.Date;
    });

    it('should return the path based on the root and the file id', function() {
      var file = new File({}, '1');
      file.path.should.eql(File.dir + '/1');
    });

    it('should move a file', function() {
      var stub = sinon.stub(fs, 'rename');

      var file = new File({}, '1');
      file.move('/from/path', noop);

      stub.calledOnce.should.be.true;
      stub.calledWith('/from/path', File.dir + '/1', noop).should.be.true;

      stub.restore();
    });

    it('should save the metadata', function() {
      var stub = sinon.stub(fs, 'writeFile');
      var file = new File({}, '1');
      file.meta = { a: 1, b: 2 };

      file.saveMeta(noop);

      stub.calledOnce.should.be.true;
      stub.calledWith(File.dir + '/1.json', JSON.stringify(file.meta), noop).should.be.true;

      stub.restore();
    });

    it('should check if file is password protected', function() {
      var file = new File({}, '1');

      file.meta.hash = 'y';
      file.isPasswordProtected().should.be.true;

      file.meta.hash = null;
      file.isPasswordProtected().should.be.false;
    });

    it('should allow access if matched file password', function() {
      var stub = sinon.stub(hash, 'compare');

      var file = new File({}, '1');
      file.meta.hash = 'hashedPwd';
      file.authenticate('password', noop);

      stub.calledOnce.should.be.true;
      stub.calledWith('password', 'hashedPwd', noop).should.be.true;

      stub.restore();
    });

We are stubbing the functionalities of the fs and hash modules because we want to test our code in isolation. Once we are done with the tests, we restore the original functionality of the methods.

Now that we're done testing the instance methods, we will go on to test the static ones (assigned directly onto the File object):

    describe('.dir', function() {
      it('should return the root of the files folder', function() {
        path.resolve(__dirname + '/../../' + config.filesDir).should.eql(File.dir);
      });
    });

    describe('.exists', function() {
      var stub;

      beforeEach(function() {
        stub = sinon.stub(fs, 'exists');
      });

      afterEach(function() {
        stub.restore();
      });

      it('should callback with an error when the file does not exist', function(done) {
        File.exists('unknown', function(err) {
          err.should.be.an.instanceOf(Error).and.have.property('status', 404);
          done();
        });

        // call the function passed as argument[1] with the parameter `false`
        stub.callArgWith(1, false);
      });

      it('should callback with no arguments when the file exists', function(done) {
        File.exists('existing-file', function(err) {
          (typeof err === 'undefined').should.be.true;
          done();
        });

        // call the function passed as argument[1] with the parameter `true`
        stub.callArgWith(1, true);
      });
    });

  });
});

To stub asynchronous functions and execute their callbacks, we use the stub.callArgWith() function provided by sinon, which invokes the function that was passed to the stub at the given argument index, calling it with the subsequent arguments we supply. For more information, check out the official documentation at http://sinonjs.org/docs/#stubs.
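To make the mechanics concrete, here is a miniature stand-in (not sinon itself) that implements just the callArgWith behavior described above:

```javascript
// A tiny stand-in for sinon's stub.callArgWith: the stub records the
// arguments of each call, and callArgWith(index, ...) invokes the
// function found at that argument index of the last recorded call.
function makeStub() {
  var calls = [];
  function stub() {
    calls.push(Array.prototype.slice.call(arguments));
  }
  stub.callArgWith = function(index) {
    var params = Array.prototype.slice.call(arguments, 1);
    var call = calls[calls.length - 1];
    call[index].apply(null, params);
  };
  return stub;
}

// Mirrors the fs.exists stub from the tests: the callback sits at
// argument index 1, so callArgWith(1, false) fires it with `false`.
var exists = makeStub();
var result;
exists('some/path', function(found) { result = found; });
exists.callArgWith(1, false);
console.log(result); // false
```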

When running tests, Node developers expect the npm test command to trigger the test suite, so we need to add that script to our package.json file. However, since we are going to have different types of tests, it would be even better to add a unit-tests script and make npm test run that for now. The scripts property should look like the following code:

  "scripts": {
    "start": "node ./bin/www",
    "unit-tests": "mocha --reporter=spec test/unit",
    "test": "npm run unit-tests"
  },

Now, if we run the tests, we should see the following output in the terminal:

[Image: terminal output of the unit tests]

Functional tests

So far, we have tested each method to check whether it works fine on its own, but now, it's time to check whether our application works according to the specifications when wiring all the things together.

Besides the existing modules, we will need to install and use the following ones:

  • supertest: This is used to test the routes in an expressive manner
  • cheerio: This is used to extract the CSRF token out of the form and pass it along when uploading the file
  • rimraf: This is used to clean up our files folder once we're done with the testing

We will create a new file called test/functional/files-routes.js for the functional tests. As usual, we will list our dependencies first:

var fs = require('fs');
var request = require('supertest');
var should = require('should');
var async = require('async');
var cheerio = require('cheerio');
var rimraf = require('rimraf');
var app = require('../../app');

There will be a couple of scenarios to test when uploading a file, such as:

  • Checking whether a file that is uploaded without a password can be publicly accessible
  • Checking that a password-protected file can only be accessed with the correct password

We will create a function called uploadFile that we can reuse across different tests. This function will use the same supertest agent when making requests so it can persist the cookies, and will also take care of extracting and sending the CSRF token back to the server when making the post request. In case a password argument is provided, it will send that along with the file.

The function will assert that the status code for the upload page is 200 and that the user is redirected to the file page after the upload. The full code of the function is listed as follows:

function uploadFile(agent, password, done) {
  agent
    .get('/files')
    .expect(200)
    .end(function(err, res) {
      (err == null).should.be.true;

      var $ = cheerio.load(res.text);
      var csrfToken = $('form input[name=_csrf]').val();

      csrfToken.should.not.be.empty;

      var req = agent
        .post('/files')
        .field('_csrf', csrfToken)
        .attach('file', __filename);

      if (password) {
        req = req.field('password', password);
      }

      req
        .expect(302)
        .expect('Location', /files\/(.*)\.html/)
        .end(function(err, res) {
          (err == null).should.be.true;

          var fileUid = res.headers['location'].match(/files\/(.*)\.html/)[1];

          done(null, fileUid);
        });
    });
}

Note that we will use rimraf in an after function to clean up the files folder, but it would be best to have a separate path for uploading files while testing (other than the one used for development and production):

describe('Files-Routes', function() {
  after(function() {
    var filesDir = __dirname + '/../../files';
    rimraf.sync(filesDir);
    fs.mkdirSync(filesDir);
  });

When testing the file uploads, we want to make sure that without providing the correct password, access will not be granted to the file pages:

  describe("Uploading a file", function() {
    it("should upload a file without password protecting it", function(done) {
      var agent = request.agent(app);

      uploadFile(agent, null, done);
    });

    it("should upload a file and password protect it", function(done) {
      var agent = request.agent(app);
      var pwd = 'sample-password';

      uploadFile(agent, pwd, function(err, filename) {
        async.parallel([
          function getWithoutPwd(next) {
            agent
              .get('/files/' + filename + '.html')
              .expect(401)
              .end(function(err, res) {
                (err == null).should.be.true;
                next();
              });
          },
          function getWithPwd(next) {
            agent
              .get('/files/' + filename + '.html')
              .set('Authorization', 'Basic ' + new Buffer(':' + pwd).toString('base64'))
              .expect(200)
              .end(function(err, res) {
                (err == null).should.be.true;
                next();
              });
          }
        ], function(err) {
          (err == null).should.be.true;
          done();
        });
      });
    });
  });
});

It's time to do the same thing we did for the unit tests: make a script so we can run them with npm by using npm run functional-tests. At the same time, we should update the npm test script to include both our unit tests and our functional tests:

  "scripts": {
    "start": "node ./bin/www",
    "unit-tests": "mocha --reporter=spec test/unit",
    "functional-tests": "mocha --reporter=spec --timeout=10000 --slow=2000 test/functional",
    "test": "npm run unit-tests && npm run functional-tests"
  }

If we run the tests, we should see the following output:

[Image: terminal output of the full test suite]

Running tests before committing in Git

It's a good practice to run the test suite before committing to git and only allowing the commit to pass if the tests have been executed successfully. The same applies for other version control systems.

To achieve this, we should add the .git/hooks/pre-commit file, which should take care of running the tests and exiting with an error in case they failed. Luckily, this is a repetitive task (which can be applied to all Node applications), so there is an NPM module that creates this hook file for us. All we need to do is install the pre-commit module (https://www.npmjs.org/package/pre-commit) as a development dependency using the following command:

$ npm i pre-commit --save-dev

This should automatically create the pre-commit hook file so that all the tests are run before committing (using the npm test command).

The pre-commit module also supports running custom scripts specified in the package.json file. For more details on how to achieve that, read the module documentation at https://www.npmjs.org/package/pre-commit.
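For instance, given the scripts defined earlier, a configuration along the following lines in package.json could run both suites explicitly instead of npm test (the exact key and format may vary by version, so confirm against the module documentation):

```json
{
  "pre-commit": ["unit-tests", "functional-tests"]
}
```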

Summary

In this article, we have learned about writing tests for Express applications and in the process, explored a variety of helpful modules.
