Performance Testing Express, Fastify, and NestJS (With Express/Fastify)

Generating insightful benchmarks with Artillery.io

Over the past few years, I have used Express quite extensively before moving to the amazing NestJS. I have also used Fastify occasionally and, while it's not the framework I have the most experience with, I'm well aware of its performance capabilities and maturity.

NestJS with Express or Fastify?

If you aren't aware yet, NestJS is not an HTTP server framework by itself. It relies on an actual server framework under the hood: it's configured with Express by default and provides easy integration with Fastify.
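Switching adapters is essentially a one-line change at bootstrap. A minimal sketch, assuming @nestjs/platform-fastify is installed and AppModule is your root module:

```typescript
import { NestFactory } from '@nestjs/core';
import {
  FastifyAdapter,
  NestFastifyApplication,
} from '@nestjs/platform-fastify';
import { AppModule } from './app.module';

async function bootstrap() {
  // Pass a FastifyAdapter instead of relying on the default Express adapter
  const app = await NestFactory.create<NestFastifyApplication>(
    AppModule,
    new FastifyAdapter(),
  );
  // With Fastify, bind to 0.0.0.0 to accept connections on all interfaces
  await app.listen(3000, '0.0.0.0');
}
bootstrap();
```

The rest of the application (controllers, services, pipes) stays unchanged, which is what makes this comparison possible.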

If you are just like me, you've read everywhere that higher-level frameworks tend to have lower performance. The features and ease of development often come with additional abstractions and processing, leading to a decline in performance.

But is that really true? Does Fastify indeed have better performance than Express? Also, what's the impact of using NestJS?

Applications

Luckily, I personally maintain four boilerplate projects with the same features, made with Express, Fastify, NestJS (with Express), and NestJS (with Fastify).

They're quite simple back-end applications: a newsletter! To be more precise, they are built with:

  • TypeScript

  • Prisma and PostgreSQL

  • Passport (and JWT) for authentication

  • Class-validator for data validation

They provide a restricted set of functionalities:

  • Authentication (Register, Login)

  • Manage users (Update, Delete)

  • Manage articles (Create, Delete, List, Detail)

Testing context

With those applications in hand, it's possible to performance-test them and compare the results. Before doing so, let's define the context of our tests.

First of all, the database will be hosted on AWS RDS, and the applications themselves will run on AWS Elastic Beanstalk (on a t2.medium instance).

Then, we need a solution to actually run the performance tests and generate results. I do so thanks to the amazing Artillery.

Artillery Setup

You can find the complete Artillery setup on GitHub.

My strategy is quite straightforward. I'll set up four environments with the different applications, and apply the same scenario on each of them.

In this scenario, I:

  • Create a new user

  • Login

  • Create a new article

  • Get a list of articles
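A common-scenario.yml covering those four steps might look like the following sketch; the endpoint paths, variable names, and the captured accessToken field are assumptions for illustration, not necessarily the exact ones from the repository:

```yaml
scenarios:
  - name: "Register, login, create and list articles"
    flow:
      - post:
          url: "/auth/register"
          json:
            email: "{{ email }}"
            password: "{{ password }}"
      - post:
          url: "/auth/login"
          json:
            email: "{{ email }}"
            password: "{{ password }}"
          capture:
            json: "$.accessToken"
            as: "token"
      - post:
          url: "/articles"
          headers:
            Authorization: "Bearer {{ token }}"
          json:
            title: "{{ title }}"
            content: "Some content"
      - get:
          url: "/articles"
          headers:
            Authorization: "Bearer {{ token }}"
```

Each virtual user runs this flow from top to bottom, reusing the token captured at login for the authenticated requests.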

In order to run this script, I need to generate unique data. It's necessary to avoid creating a user with the same email, or articles with the same title. With Artillery, it can be done thanks to a processor:
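A processor is a plain JavaScript module whose exported functions Artillery calls to populate scenario variables. A minimal sketch, where the function and variable names are assumptions chosen to match the placeholders above:

```javascript
// processor.js: generates unique data for each virtual user.
let counter = 0;

function generateUserData(context, events, done) {
  // Combine a timestamp with a counter to guarantee unique values
  const unique = `${Date.now()}-${counter++}`;
  context.vars.email = `user-${unique}@example.com`;
  context.vars.password = 'S0me-str0ng-Passw0rd';
  context.vars.title = `Article ${unique}`;
  return done();
}

module.exports = { generateUserData };
```

The scenario file then references the processor (`processor: "./processor.js"`) and calls the function through a `beforeScenario` hook, so every virtual user registers with a fresh email and article title.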

Finally, I have three external configurations:

Those files define the number of requests made in different phases. Artillery works with virtual users, whose creation is controlled by arrivalRate: the number of new virtual users created per second.
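As a sketch of what such configurations look like (the durations and exact figures here are illustrative assumptions, not the repository's values):

```yaml
# low-usage.yml: a steady, light load
config:
  phases:
    - duration: 120      # run for two minutes...
      arrivalRate: 2     # ...creating 2 new virtual users per second

# failure-usage.yml: ramp the load up until the server stops responding
config:
  phases:
    - duration: 600
      arrivalRate: 1
      rampTo: 20         # linearly increase from 1 to 20 users per second
```

The rampTo option is what makes the failure test useful: the arrival rate grows over time, so the point at which errors appear tells us roughly how much load each server can absorb.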

The goal with those configurations is to simulate different levels of loads on our servers. This way, we can see the evolution of response time depending on usage. Also, with the failure-usage configuration, we can see at which point our server fails.

Running tests

With Artillery, we can run our tests with the created configuration, for example with:

artillery run -o ./report-low-usage.json -c low-usage.yml -e fastify ./common-scenario.yml

Here, I'm running my main scenario, with the fastify environment, using the low-usage configuration. The results are then exported to ./report-low-usage.json.

I'm not really interested in a JSON file. Thankfully, we can generate a visual report based on it:

artillery report ./report-low-usage.json

We now have access to an HTML report, with the following information at the top:

I'm not really interested in this high-level overview, but we have access to more detailed results below. For example, we can have a look at the created virtual users:

Here, we have a report of the created virtual users every 10 seconds. On average, 20 users are created per window, which makes sense as we used the configuration with an arrivalRate of 2 users per second.

Then, we have the most interesting part of the report under http.response_time:

Here, we can find the minimum, maximum, and median response times, as well as p95 and p99. If you aren't sure what those mean:

  • p95 is the maximum response time among the fastest 95% of requests; in other words, 95% of requests completed within that time.

  • p99 is the maximum response time among the fastest 99% of requests.
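As a concrete illustration with made-up numbers, a percentile can be computed from a sorted list of response times. This nearest-rank sketch is one common convention; Artillery's own aggregation may differ slightly:

```javascript
// Nearest-rank percentile: the smallest value such that at least
// p% of all measurements are less than or equal to it.
function percentile(sortedTimes, p) {
  const rank = Math.ceil((p / 100) * sortedTimes.length);
  return sortedTimes[rank - 1];
}

// 20 made-up response times in milliseconds, already sorted.
const times = [
  12, 13, 14, 15, 15, 16, 17, 18, 18, 19,
  20, 21, 22, 24, 27, 30, 35, 48, 120, 950,
];

console.log(percentile(times, 50)); // → 19 (the median)
console.log(percentile(times, 95)); // → 120
console.log(percentile(times, 99)); // → 950
```

Notice how p95 ignores the single 950 ms outlier while the max (and p99, with only 20 samples) is dominated by it, which is why percentiles are more robust indicators than the max.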

Comparing performance

While the previous report is a great way to read a single result, it doesn't allow for easy comparison. Instead, I'll merge results from every application, and separate them by the simulated load.

Low load

First, we can have a look at the response time under a low load:

Here, we can find that the results are quite similar. The biggest difference is in the max response time, but I don't think it's significant: the max is not a very good indicator, as a single slow request is enough to skew it.

Medium load

A more interesting report is for a medium load, closer to real-world usage. To be more precise, for the median, p95, and p99 responses:

This time, we have very interesting results. To be honest, they are even quite surprising! It seems like Fastify indeed has significantly better performance.

Not only does Fastify have better performance, but the NestJS counterparts of both Express and Fastify perform essentially the same as the underlying HTTP frameworks.

Failure

Because the third configuration overloads the applications until they can no longer respond, the response time becomes meaningless. Instead, on the default report, we have access to an errors.ETIMEDOUT section.

It tells us how many requests fail, but also when they start to fail. Below, you can find a comparison of the Fastify-based application (top) and Express-based application (bottom).

Here, we can see the Express-based application fails to answer requests earlier and more often than the Fastify application.

For Express, it starts to fail around 2 minutes, which corresponds to 5 virtual users on our configuration. For Fastify, it starts to fail 1 minute later, when we have 6 virtual users.

Interestingly enough, the results are exactly the same for NestJS with Express and NestJS with Fastify.

Conclusion

From those results, I have two main conclusions:

Fastify indeed has significantly better performance, and this carries over to NestJS with Fastify.

NestJS is even better than I thought: not only does it bring us higher-level tools, but it also barely has any impact on performance.


This article was originally a video. If you want to learn how to create production-ready back-end applications, have a look here.