Photo by Victoire Joncheray on Unsplash
Performance Testing a Real-World Back-End: NodeJS vs Go
Is Go's performance the real deal?
To be more precise, I'm about to compare two Back-End apps made with NestJS (configured with Fastify) and Gin.
Why this specific choice, you may ask? NestJS and Gin are both used extensively and are great solutions for building production-ready Back-End applications.
Configuring NestJS with Fastify makes it one of the most performant setups for NodeJS. You could see it as a handicap for Go and Gin: can Gin still crush one of the best NodeJS solutions, performance-wise?
This article is built on top of a previous performance comparison for NodeJS, feel free to have a look.
Applications
Both applications provide the same functionalities: a basic newsletter with authentication, user management, and article management.
The NodeJS application is built with TypeScript and Prisma, and does some validation with class-validator.
On the other side, the Gin application is made with GORM and uses the integrated validator.
They both provide authentication capabilities with JWTs and are configured with a PostgreSQL database.
Again, the database is hosted on AWS RDS, while the applications are deployed on Beanstalk (on a t2.medium instance). If you read the previous article, you already know I'm about to use Artillery to run the actual performance tests.
Tests configuration
I'll apply the same strategy to both environments (with Nest and Gin):
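A sketch of what this Artillery scenario could look like (the target URL, endpoint paths, and payload fields are assumptions on my part, not the exact ones from the apps):

```yaml
config:
  target: "http://localhost:3000"   # assumption: replace with the Beanstalk URL
  processor: "./processor.js"
  phases:
    - duration: 60
      arrivalRate: 4                # 4 virtual users created per second
scenarios:
  - flow:
      # Endpoint paths and field names below are assumptions
      - post:
          url: "/auth/register"
          beforeRequest: "generateUserData"
          json:
            email: "{{ email }}"
            password: "{{ password }}"
      - post:
          url: "/auth/login"
          json:
            email: "{{ email }}"
            password: "{{ password }}"
          capture:
            json: "$.token"
            as: "token"
      - post:
          url: "/articles"
          headers:
            Authorization: "Bearer {{ token }}"
          json:
            title: "{{ title }}"
            content: "{{ content }}"
      - get:
          url: "/articles?page=1"
      - get:
          url: "/articles?page=2"
      - get:
          url: "/articles?page=3"
```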
Here, I'm creating a new user and logging in, before creating a new article and fetching different pages of articles (5 on each page). I still need a processor to generate users and articles with unique data:
If you don't know yet, Artillery works with a virtual user system. The number of virtual users can be configured, and one will be created per second, to execute the entire scenario you can see above.
In our current scenario, we have 6 requests:
Register
Login
Create an article
Fetch 3 pages of articles (3 separate requests)
We can multiply the number of virtual users by 6 to get the request count per second (or by 360 to get the count per minute, etc.).
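That back-of-the-envelope math can be written down as a tiny helper (the names are mine, not part of the test setup):

```javascript
// Each virtual user executes the full 6-request scenario,
// and one scenario starts per virtual user every second.
const REQUESTS_PER_SCENARIO = 6;

function requestsPerSecond(virtualUsers) {
  return virtualUsers * REQUESTS_PER_SCENARIO;
}

function requestsPerMinute(virtualUsers) {
  return requestsPerSecond(virtualUsers) * 60;
}

console.log(requestsPerMinute(4)); // 1440
```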
Response time
To get started, we can run our performance test by simulating a medium load on our server. This way, we get usable results, without either of our Back-Ends actually failing.
To do so, 4 virtual users are enough (making 1,440 requests per minute, if you don't feel like doing the math). With Artillery, we can generate a nice visual report, with response times among other things:
I'm not a monster; I'll provide you with the actual values:
| (In milliseconds) | NestJS + Fastify | Gin |
| --- | --- | --- |
| Min | 31 | 29 |
| Max | 993 | 246 |
| Median | 153 | 40.9 |
| p95 | 487.9 | 165.7 |
| p99 | 550.1 | 190.6 |
The first time I saw those values, I thought to myself:
Wow, it's looking grim for NodeJS
The Min and Max response times are interesting metrics, but that's not what I'm looking for here. A single request among thousands performing unusually well or badly is enough to skew those values.
On the other hand, the median response time is an incredible metric, as are p95 and p99. They respectively show the maximum response time among the fastest 95% and 99% of requests.
Here, the Gin-based Back-End is indeed destroying the NestJS one, performance-wise. The difference is almost threefold; even I didn't expect that!
Making our servers crash
With the visual report from Artillery, we can also read which requests timed out. It's a great indicator of when our APIs start to fail. I used the following configuration with my NestJS application:
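A sketch of what this phased load could look like in the Artillery configuration (the target URL is an assumption; the scenario section stays the same):

```yaml
config:
  target: "http://localhost:3000"   # assumption: replace with the Beanstalk URL
  phases:
    - duration: 60                  # 1 minute at 4 virtual users per second
      arrivalRate: 4
    - duration: 120                 # 2 minutes at 5 virtual users per second
      arrivalRate: 5
    - duration: 120                 # 2 minutes at 6 virtual users per second
      arrivalRate: 6
```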
Here, we have 4 virtual users during the first minute. Then, we go up to 5 users for 2 minutes, and to 6 users for 2 more minutes. We can see the timed-out requests for NestJS below:
Here, we can see our server starts to fail around the 4-minute mark, once we reach 6 virtual users (2,160 requests per minute).
We can observe a similar graph with the Go/Gin application, with... 14 virtual users.
Conclusion
I'm first and foremost a JavaScript developer, with extensive experience in the NodeJS environment.
I've worked with NestJS, Prisma, and everything around them for years now. I even have Prisma stickers and developed a library dedicated to unit testing it.
If I made some configuration mistake or introduced bad practices somewhere, it is more likely on the Gin application than on the NestJS one.
If there is one thing to remember from this exercise, it's that Go, even with a handicap, performs two to three times better than NodeJS.