The first basic argument for node.js is that I/O is expensive.
With current programming technology, a great deal of time is wasted waiting for I/O to complete. There are several ways to deal with this performance impact:
- Synchronous: process one request at a time, each in turn. Pro: simple. Con: any single request can hold up all the other requests.
- Fork a new process: start a new process to handle each request. Pro: easy. Con: does not scale well; hundreds of connections mean hundreds of processes. fork() is the Unix programmer's hammer: because it is always at hand, every problem looks like a nail, so it tends to be overused. (Translator's note: the literal translation is a mouthful; my reading is that Unix programmers reach for fork() to solve all kinds of problems.)
- Threads: start a new thread to handle each request. Pro: easy, and friendlier to the kernel than fork(), since threads usually carry much less overhead. Con: your machine may not support threads, and threaded programming gets complicated very quickly, because you must worry about controlling access to shared resources.
The second basic argument is that a thread-per-connection model is memory-expensive.
Apache is multithreaded: it spawns a thread (or a process, depending on the configuration) for each request. You can see how that overhead eats up memory as the number of concurrent connections grows and more threads are needed to serve multiple clients at the same time. Nginx and node.js are not multithreaded, because threads and processes carry a heavy memory cost. They are single-threaded but event-based, which eliminates the overhead of thousands of threads or processes by handling many connections within a single thread.
node.js keeps a single thread for your code ...
It really is a single thread running: you cannot do any parallel code execution; for example, doing a "sleep" will block the server for one second:
var now = new Date().getTime();
while (new Date().getTime() < now + 1000) { /* do nothing: busy-wait for one second */ }
While this code is running, node.js will not respond to any other requests from clients, since there is only one thread for executing your code. The same is true if you do something CPU-intensive, such as resizing an image: it blocks all other requests.
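To make the effect concrete, here is a minimal sketch (not from the original article) of an HTTP server whose handler blocks in exactly this way; the port 8080 is an arbitrary choice.

var http = require('http');

http.createServer(function (req, res) {
  // Busy-wait for one second: while this loop runs, the single thread
  // cannot accept or answer any other request.
  var now = new Date().getTime();
  while (new Date().getTime() < now + 1000) {
    // do nothing
  }
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('done after blocking for one second\n');
}).listen(8080);

Fire two requests at the same time and the second one is only handled after the first handler's loop has finished.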
... however, everything runs in parallel, except your code
There is no way to make code run in parallel within a single request. However, all I/O is evented and asynchronous, so the following code will not block the server:
c.query('SELECT SLEEP(20);', function (err, results, fields) {
  if (err) { throw err; }
  res.writeHead(200, {'Content-Type': 'text/html'});
  res.end('');
  c.end();
});
If you do this in one request, other requests can be processed just fine while the database is running its sleep.
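To see this behaviour without setting up a database, here is a minimal sketch (not from the original article) that uses setTimeout as a stand-in for the 20-second SLEEP; the /slow path and port 8080 are arbitrary choices.

var http = require('http');

http.createServer(function (req, res) {
  if (req.url === '/slow') {
    // Stand-in for the asynchronous SELECT SLEEP(20): the callback fires
    // 20 seconds later, and only then is this particular response sent.
    setTimeout(function () {
      res.writeHead(200, {'Content-Type': 'text/plain'});
      res.end('slow response, 20 seconds later\n');
    }, 20000);
  } else {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('fast response\n');
  }
}).listen(8080);

Request /slow in one terminal and / in another: the fast responses keep arriving immediately, because the thread is idle rather than blocked while the slow request waits for its callback.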
Why is this good for you? When do we go from synchronous to asynchronous/parallel execution?
Synchronous execution is good because it makes code simpler to write (compared with multithreading, where concurrency issues have a tendency to result in WTFs). (Translator's note: WTF here is short for "what the fuck".)
When you do I/O in node.js, you just supply a callback; you are guaranteed that your code is never interrupted and that the I/O will not block other requests, without having to pay a per-request thread or process cost (for example, the memory overhead seen with Apache).
Asynchronous I/O is good because I/O is far more expensive than most code, and we should be doing something better than just waiting for it.
The event loop is what makes this work: I/O calls are the points at which node.js can switch from one request to another. When node.js makes an I/O call, your code saves its callback and returns control to the node.js runtime environment; the callback is invoked later, once the data is actually available.
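Here is a small sketch of that hand-off (not from the original article), using node's built-in fs module; the path /etc/hostname is just an arbitrary example file.

var fs = require('fs');

// readFile registers the callback and returns immediately; control goes
// back to the node.js runtime, which invokes the callback once the file
// contents are available.
fs.readFile('/etc/hostname', 'utf8', function (err, data) {
  if (err) { throw err; }
  console.log('2. callback runs once the data is ready: ' + data);
});

console.log('1. this line runs first, while the read is still pending');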
Of course, behind the scenes there are threads and processes for database access and process execution, but these are not explicitly exposed to your code, so you need not worry about them beyond knowing that the I/O interactions are there. From the perspective of each request, interactions with the database or with other processes are asynchronous, because the results from those threads are returned to your code via the event loop. Compared with the Apache model, there are far fewer threads and far less thread overhead, since a thread is not created for every connection; one is needed only when you absolutely must have something run in parallel, and even then the management is handled for you.
Apart from I/O calls, node.js expects all requests to return quickly; CPU-intensive tasks should be split off into another process that you interact with via events, or handled through an abstraction such as WebWorkers. This (obviously) means you cannot parallelize your code unless there is another thread running in the background that you interact with through events. Basically, every object that can emit events (for example, instances of EventEmitter) supports asynchronous evented interaction, and you can interact with blocking code this way: files, sockets and child processes are all EventEmitters in node.js. Multicore can be handled with this approach; see node-http-proxy.
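As a minimal sketch of that evented interaction (not from the original article), here is a child process spawned with node's built-in child_process module; "ls -l" merely stands in for whatever CPU-intensive work you would split off.

var spawn = require('child_process').spawn;

// The ChildProcess object is an EventEmitter, so the main thread only
// reacts to its events and stays free in between.
var child = spawn('ls', ['-l']);

child.stdout.on('data', function (chunk) {
  console.log('output chunk: ' + chunk);
});

child.on('exit', function (code) {
  console.log('child exited with code ' + code);
});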
Internal implementation
Internally, node.js relies on libev to provide the event loop, supplemented by libeio, which uses a thread pool to provide asynchronous I/O. To learn more, have a look at the libev documentation.
So how do you write asynchronous code in node.js?
- First-class functions. For example, we pass functions around as data, shuffle them around and execute them when needed.
- Function composition. Also known as anonymous functions or closures, which are executed when something happens in the evented I/O.
(Translator's note: the first passes a function that has been defined elsewhere; the second passes an anonymous function.)
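A small sketch of the two styles (not from the original article), using fs.readFile on a hypothetical file a.txt:

var fs = require('fs');

// First-class function: defined once, with a name, and passed around as data.
function handleFile(err, data) {
  if (err) { throw err; }
  console.log('named callback got ' + data.length + ' bytes');
}
fs.readFile('a.txt', handleFile);

// Anonymous function (closure): defined inline where the evented I/O call
// is made; it can capture surrounding variables such as label.
var label = 'inline callback';
fs.readFile('a.txt', function (err, data) {
  if (err) { throw err; }
  console.log(label + ' got ' + data.length + ' bytes');
});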
Original link http://blog.mixu.net/2011/02/01/understanding-the-node-js-event-loop/
Understanding the node.js event loop