Node.js, lots of ways to block your event-loop (and how to avoid it)
In this article, we will look at several ways to quickly block or slow down the Node.js Event-loop. If you are not familiar with the concept of the Event-loop in Node.js, I recommend reading up on the subject first. The most important thing to remember is: the Event-loop is single-threaded, so if you block it or slow it down, this will impact your entire application.
Event loop quick overview
I will not go into a long explanation about the Event-loop since many have done it before and better than me. To understand this article you just have to remember 2 things:
the Event-loop is the heart of Node.js, it can be seen as an abstraction of how Node.js executes code and runs your application
it must run without interruption and without slowing down; otherwise, your users will quickly become frustrated
The Event-loop can be schematized as follows (thanks to Bert Belder for the diagram):
Event-loop overview
Event loop limitations & dangers
As previously said, it's crucial to keep the Event-loop running and its latency as low as possible. The latency is basically the average time between two successive iterations of your Event-loop.
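To make that concrete, here is a rough, illustrative way to observe this latency yourself: schedule a repeating timer and measure how much later than expected each tick fires (the 1000 ms interval is an arbitrary choice, not part of any official API).

// Rough Event-loop lag measurement (illustrative sketch)
let last = Date.now()
setInterval(() => {
  const now = Date.now()
  const lag = now - last - 1000 // how much later than the expected 1000ms this tick fired
  console.log(`event loop lag: ~${lag}ms`)
  last = now
}, 1000)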
A potential point of failure in a Node.js application can come from two factors:
1. The Event-loop is single-threaded
Also known as the “one instruction to block them all” factor. Indeed, in Node.js a single blocking instruction in one request can block every other request. A good review of your code should always start by distinguishing blocking from non-blocking code.
Non-blocking code (simple instructions):
const n = 10 // illustrative size: with a small n this completes in well under a millisecond
let a = 0
for (let i = 1; i < n; i++) {
  a = a + i
}
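For contrast, here is an illustrative sketch of blocking code: a CPU-bound computation (a naive Fibonacci, chosen purely as an example) that can take on the order of a second and freezes the Event-loop for that whole time.

// Blocking code (illustrative): nothing else (requests, timers, I/O callbacks)
// can run until this synchronous computation returns
function fibonacci (n) {
  if (n < 2) return n
  return fibonacci(n - 1) + fibonacci(n - 2)
}

fibonacci(40)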
Observation 1: do not confuse blocking code with an infinite loop; blocking code is generally a long operation (more than a few milliseconds).
Observation 2: differentiate between long operations and operations that slow down or block the Event-loop. Some long operations can be handled asynchronously without disturbing your app (like database access).
Observation 3: a long response time in your application does not necessarily mean you have a blocking task (it can be related to slow DB access, external API calls, etc.).
Difference between long operations, blocking code, etc.
2. Thread pool limit
Node.js always tries to process a blocking operation with async APIs or with a thread pool. In this manner, some blocking operations become non-blocking from your application's point of view. Whenever possible it uses async APIs, since they are more powerful and lightweight, and it only falls back to the thread pool when there is no other choice. Why? Simply because a thread has a bigger footprint on your system and consumes more resources.
There are a few cases where Node.js has to use the thread pool:
all fs (File system) operations, except fs.FSWatcher() and the explicitly synchronous ones
dns.lookup()
some crypto functions (crypto.pbkdf2(), crypto.randomBytes(), crypto.randomFill(), etc.)
all zlib operations, except the explicitly synchronous ones
And this thread pool has a size limit: by default, Node.js has access to only 4 threads, so you can parallelize only 4 of these operations at the same time.
This value can be customized with the UV_THREADPOOL_SIZE environment variable.
UV_THREADPOOL_SIZE=16 node index.js
In any case, every operation that uses the thread pool behind the scenes is a potential performance bottleneck.
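A small, illustrative experiment makes this limit visible: fire more async pbkdf2 calls than the pool has threads and watch them complete in waves (the iteration count and the pbkdf2 parameters are arbitrary choices).

const crypto = require('crypto')

const start = Date.now()
for (let i = 1; i <= 8; i++) {
  // each async pbkdf2 call is dispatched to the libuv thread pool
  crypto.pbkdf2('password', 'salt', 100000, 512, 'sha512', () => {
    console.log(`task ${i} done after ${Date.now() - start}ms`)
  })
}
// with the default pool size of 4, the tasks typically finish in two waves of 4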
How to slow down the event loop
CPU-intensive operations: crypto
The Node.js crypto lib is known for having many CPU-intensive functions. In a real-world case, it means you can quickly slow down your application. The problem becomes critical when this lib is used for every incoming request. It will:
slow down every individual request (and generate user frustration)
force you to run more instances to compensate for the increase in CPU consumption
Instead, we prefer to generate the token only once and then reuse it. In this manner, you don't slow down the Event-loop for each new request.
const crypto = require('crypto')
const express = require('express')
const axios = require('axios')

const app = express()

// Generate your token where you want, but only once (here: at startup)
const salt = crypto.randomBytes(128).toString('base64')
const hash = crypto.pbkdf2Sync('myPassword', salt, 10000, 512, 'sha512').toString('hex')

app.get('/myRoute', async (req, res) => {
  try {
    // reuse the precomputed hash instead of recomputing it on every request
    const response = await axios.get(`/data`, { headers: { Authorization: hash } })
    res.status(200).send(response.data)
  } catch (err) {
    console.log(err)
    res.status(400).send({ data: null })
  }
})
Of course, it’s a simple example but here is the difference in terms of performance:
Before:
After:
We go from 195 requests in 10 seconds to 39,434: there is simply no comparison!
In a real-world case, it means you can reduce the number of instances needed to serve the same amount of requests, and/or use smaller servers to do the same work.
JSON.parse / JSON.stringify
Another interesting point is the famous JSON parser. We commonly use the JSON.stringify and JSON.parse functions, but these two methods have a complexity of O(n), where n is the size of the JSON you serialize or parse, and they run synchronously on the Event-loop.
Let’s see the difference when we use JSON.stringify with a small JSON file (~0.4 KB) and a large JSON file (~9 MB).
With a small JSON file
With a big JSON file
We go from 252 requests in 10 seconds to 75k. The solution can be to work with small files only, or to load and parse large files only once.
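As a minimal sketch of the “load it only once” approach (the file name, route and response shape are purely illustrative), parse the large file at startup and keep the result out of the request path:

const fs = require('fs')
const express = require('express')

const app = express()

// parsed once at startup, never again per request
const bigData = JSON.parse(fs.readFileSync('./bigConfig.json', 'utf8'))

app.get('/report', (req, res) => {
  // no JSON.parse in the hot path: we only read the already-parsed object
  res.status(200).send({ itemCount: Object.keys(bigData).length })
})

app.listen(3000)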
If you really need to work with large JSON objects, you should take a look at stream-based solutions (for example, npm packages like JSONStream or stream-json), which process the payload in chunks instead of one big blocking JSON.parse or JSON.stringify call.
As said before, each time you read a file you will potentially create a performance bottleneck, especially if you read a file each time a request is processed. Sometimes this operation is hidden inside a dependency and it’s hard to detect.
I will talk about a concrete example we encountered in one of our projects at Voodoo. We use the MaxMind database to extract the user's country from the IP address. To do that, we simply use an existing npm module. Basically, it uses readFile from the Node.js core fs module under the hood. It's an asynchronous operation, so it should be a piece of cake, right?
But for every new incoming request, we read the DB file (remember we have a limited number of threads for this). So in a high-traffic API, it tends to slow down the Event-loop.
Solution: store the whole DB in memory during server startup.
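Here is a sketch of that idea (the file path and the lookupCountry helper are illustrative placeholders, not the actual MaxMind module API): read the database file once before accepting traffic, keep it in memory, and query the in-memory copy on every request.

const fs = require('fs/promises')
const express = require('express')

const app = express()
let geoDb // in-memory copy of the database file

function lookupCountry (dbBuffer, ip) {
  // placeholder: a real implementation would query the binary DB format in dbBuffer
  return dbBuffer ? 'unknown' : null
}

app.get('/country', (req, res) => {
  // the request path only touches memory: no file read, no thread pool usage
  res.status(200).send({ country: lookupCountry(geoDb, req.ip) })
})

async function start () {
  // read the DB file once, at startup, instead of once per request
  geoDb = await fs.readFile('./GeoLite2-Country.mmdb')
  app.listen(3000)
}

start()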
The following chart should speak for itself concerning the performance gain.
Average latency after the deployment
Vulnerable regexp
A vulnerable regular expression is one on which your regular expression engine might take exponential time.
Most of the time your regexp complexity will be O(n) (where n is the length of your input), but in some cases it can become polynomial (O(n^2)) or even exponential, and this can lead to a ReDoS (Regular expression Denial of Service).
Let's see a simple regexp that checks whether an email address is valid.
[a-z]+@[a-z]+([a-z\.]+\.)+[a-z]+
Now we can measure the execution time with a simple email address and with a fake email.
Vulnerable regexp can block the Event-loop
If you add a few dots at the end of the input, it will quickly block your app. In this simple example, we go from 0.05s to 8.4s, and adding a few more dots is enough to completely block your Node.js instance.
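You can reproduce the measurement with a few lines (an illustrative sketch; the number of dots is arbitrary, and each extra dot roughly doubles the matching time on this pattern):

const vulnerableRegexp = /[a-z]+@[a-z]+([a-z\.]+\.)+[a-z]+/

const valid = 'myemail@provider.com'
// a long run of dots followed by a non-matching character triggers catastrophic backtracking
const malicious = 'myemail@provider' + '.'.repeat(22) + '!'

console.time('valid email')
vulnerableRegexp.test(valid)
console.timeEnd('valid email')

console.time('malicious input')
vulnerableRegexp.test(malicious) // can take seconds, during which the Event-loop is blocked
console.timeEnd('malicious input')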
To avoid this, you can check your regexps with tools like safe-regex, or use libraries that handle the regexp for you, like validator.js.
How to block the event loop
Programmatic errors
Of course, the easiest way to block your application is to insert an infinite loop. It seems easy to detect and to avoid, but it still happens, especially when you work a lot with modules or with events.
Sometimes this kind of behavior is created faster than you might think, even by good programmers. Let's see an example with Date and a while loop.
const end = Date.now() + 5000;
while (Date.now() < end) {
  // busy-waits for 5 seconds: the Event-loop is blocked the whole time
  // do something here ...
}
Still not convinced? What about process.nextTick()?
process.nextTick() & infinite loop
process.nextTick() will invoke a callback at the end of the current operation, before the next event loop tick starts.
process.nextTick() in the Event-loop
It can be used in some cases, but the problem is:
it will prevent the Event-loop from continuing its cycle until your callback is finished
it allows you to block all I/O by making recursive process.nextTick() calls. It's not technically an infinite loop, but it produces the same effect, like a bad recursive function without a termination condition (see the sketch below).
Recursion has something to do with infinity
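A minimal sketch of that starvation effect (the 100 ms timer is only there to show it never gets a chance to fire):

setTimeout(() => {
  console.log('you will never see this') // timers never run: nextTick callbacks always go first
}, 100)

function starve () {
  process.nextTick(starve) // reschedules itself before any I/O or timer can be processed
}

starve()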
Sync operations
This is no surprise: synchronous operations in Node.js are bad practice. If you have read this whole article, it should be obvious to you! Every time you use one, you block your entire application until the operation is finished. Node.js cannot hand the work to the thread pool or to async APIs, and the Event-loop activity is suspended.
const fs = require('fs')
const content = fs.readFileSync('myFile', 'utf8') // blocks until the whole file is read
// code waiting for the sync operation to be completed
// ...

const execSync = require('child_process').execSync
const version = execSync('node -v') // blocks until the child process exits
// code waiting for the sync operation to be completed
// ...
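For comparison, here is a sketch of the non-blocking equivalents using the promise-based APIs (same illustrative file name as above):

const fsPromises = require('fs/promises')
const { exec } = require('child_process')
const { promisify } = require('util')
const execAsync = promisify(exec)

async function main () {
  // handled by the thread pool / async APIs: the Event-loop stays free in the meantime
  const content = await fsPromises.readFile('myFile', 'utf8')
  const { stdout } = await execAsync('node -v')
  console.log(content.length, stdout.trim())
}

main().catch(console.error)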
How to create an infinite event loop (your program will never exit)
Let's say you want to create a simple program that exits after a simple task is finished, like a worker or a simple script. Such programs are supposed to stop in any case, and quickly. But you can create a situation where the Event-loop will never exit. Do you remember the first diagram? There is a ref, which is simply a counter of all pending tasks in the Event-loop. As long as this ref is greater than 0, the program will not exit and Node.js will keep checking every pending task. When a task finishes, the ref is decremented. So you will only be able to exit your program once all the tasks are finished, i.e. once the ref is equal to 0.
setInterval
Timers are the best example! If you introduce a simple setInterval inside a script and never clear this timer, it will run forever, and so will your program.
console.log('Start ...');
// do some stuff here
// ...
setInterval(() => {
  // whatever is here, you are trapped
}, 1000)
// do some stuff here
// ...
process.on('exit', (code) => {
  // you will never see this
  return console.log(`Stop with code ${code}`);
});
To avoid this, you can:
clear all your timers (clearInterval / clearTimeout) once they are no longer needed (see the sketch below)
use process.exit() or process.abort() or process.kill()
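A minimal sketch of the first option (the 5-second delay and the doPeriodicWork helper are illustrative); as an alternative, calling .unref() on the timer tells Node.js that this timer alone should not keep the process alive.

function doPeriodicWork () {
  // whatever periodic task the script needs
}

const interval = setInterval(doPeriodicWork, 1000)

// clear the timer once it is no longer needed, so the pending-task count drops back to 0
setTimeout(() => clearInterval(interval), 5000)

// alternative: interval.unref() lets this timer run without keeping the process alive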
Event listeners (no problem)
An event listener can be seen as a background task that goes on until you clean it up. You might assume it increments the ref counter of the Event-loop and therefore creates a kind of infinite loop, but that's not the case, even if you forget to remove your handlers.
const { EventEmitter } = require('events')

const eventEmitter = new EventEmitter()
eventEmitter.on('eventName', (data) => {
  // do some stuff
})
// ....
process.on('exit', () => {
  // you will reach this point
})
Even if you don't block the Event-loop with an EventEmitter, it's always a best practice to clean up your listeners. You can use the removeListener or removeAllListeners methods.
const { EventEmitter } = require('events')

const eventEmitter = new EventEmitter()
// register a listener: a background task that lives until you remove it
eventEmitter.on('eventName', (data) => {
  // do some stuff
})
process.on('exit', () => {
  // you will reach this point
})
// ....
eventEmitter.removeAllListeners('eventName')
Monitoring
Modules
Some tools can help you to inspect the Event-loop state and to visualize its behavior:
you can use the internal methods process._getActiveRequests() and process._getActiveHandles() directly, which give you the raw data about tasks inside your Event-loop (see the sketch after this list)
Some APM solutions provide information about the Event-loop and its latency. It can be useful to detect an instance in a bad state.
Some of them also display information about the Garbage Collector, which is another key concept to better understand Node.js and to debug your application. If you want to learn more about it, you can read my article about GC.
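As a quick sketch, the two internal helpers mentioned above can simply be polled periodically (keep in mind they are undocumented and may change between Node.js versions):

setInterval(() => {
  console.log('active requests:', process._getActiveRequests().length)
  console.log('active handles:', process._getActiveHandles().length)
}, 5000).unref() // unref'd so this monitoring timer doesn't keep the process alive by itself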