The most common way to do real-time communication with Node.js is Socket.io. First, a look at Socket.io itself.
Advantages:
1. Easy to get started
2. Active community (an important indicator when evaluating open source projects)
3. Developer-friendly: it spares developers from handling browsers such as IE that do not support WebSocket
4. Complete ecosystem, with corresponding Android and iOS SDKs
Disadvantages:
Pit 1. High memory or CPU consumption in production
Pit 2. Poor scalability
Pit 3. The upgrade from 0.9.x is not backward-compatible
Pit 1: Solving the memory problem
After Socket.io runs for a while, memory consumption stays high and does not come back down. My first guess was that memory was not being released when clients disconnected. To check, I took heapdump snapshots in three states and compared them: 1) at service startup, 2) at peak usage, 3) at the usage trough. The comparison showed the guess was wrong: Socket.io's own variables had already been destroyed. After several optimization attempts, it turned out that the V8 engine simply had not released the memory back to the system. Solution: start Node with --expose-gc and call global.gc() on a schedule.
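The fix above can be sketched as follows. The interval value is an assumption for illustration; global.gc() only exists when Node is started with the --expose-gc flag, so the sketch guards for that:

```javascript
// Sketch: periodically force a full GC so V8 returns retained memory.
// Requires starting Node with the flag:
//   node --expose-gc server.js
function scheduleForcedGC(intervalMs) {
  if (typeof global.gc !== 'function') {
    // global.gc is only exposed when Node was started with --expose-gc
    return null;
  }
  return setInterval(function () {
    global.gc(); // trigger a full collection
  }, intervalMs);
}

// e.g. every 30 minutes (interval chosen arbitrarily here)
const gcTimer = scheduleForcedGC(30 * 60 * 1000);
```

Forcing GC is a workaround rather than a cure: it trades some CPU time during each collection for a lower steady-state memory footprint.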
Pit 2: Solving the CPU problem
At the time, the Socket.io version was 0.9.16 and there was no officially recommended load-balancing scheme. Running multiple worker processes with cluster, whether using the Redis store or the memory store, put heavy CPU pressure on the Node.js service or the Redis service and slowed responses. Solution: run a single process per instance and do not load-balance across worker processes (horizontal scaling of the service is discussed below).
Remaining optimization: on 64-bit machines, V8 by default allows a process to use about 1.4 GB of memory. To make fuller use of the server's memory, raise V8's memory limit with the startup parameters --max-old-space-size and --max-new-space-size.
Now for the most important part: horizontal scaling of the service.
Requirement 1. A user may have multiple web pages and mobile apps open at the same time, and the user's actions must stay synchronized across them (for example, switching chat targets or sending messages)
Requirement 2. Minimize message synchronization between servers
Status 1. All authentication obtains session information by having Node.js read the user's request cookie
Status 2. The common Nginx load-balancing schemes are round-robin, least-connected, and ip-hash, but none of the three satisfies the requirements well. As the number of backends grows, the official Node.js approach of synchronizing all messages between servers would generate an enormous amount of synchronization traffic.
Solution: move session handling forward into Nginx and route dynamically.
Use Nginx + Lua (OpenResty) to read the session information and dynamically route each connection to a specific backend server; for example, User1's five web connections and one app connection are all routed to Server1.
The routing rule for user connections borrows from common database sharding rules: the chat server is computed from the user ID, so a single user's connections are all pinned to a single server. The advantage is that Lua, Node.js, or any other language can easily compute which chat server a given user's connections belong to, which makes it easy for the chat service to integrate with other services.
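A minimal sketch of that routing rule, assuming the hash is simply the user ID modulo the server count (the server names and the modulo choice are illustrative assumptions; the same arithmetic can be reproduced in Lua on the Nginx side so both layers agree):

```javascript
// Hypothetical list of chat backend servers.
const chatServers = ['server1', 'server2', 'server3'];

// Deterministically map a numeric user ID to one chat server,
// like a database sharding rule. All of a user's connections
// (web pages, mobile apps) land on the same server.
function chatServerFor(userId) {
  return chatServers[userId % chatServers.length];
}
```

Because the mapping is a pure function of the user ID, any service in any language can compute it locally without asking a registry where a user's connections live.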
Message synchronization between servers: use Redis pub/sub. Each server subscribes to its own channel. When User1 sends a message to User2, who is connected to a different server, Node.js computes User2's server from the routing rule above and publishes the message to Server2's channel; Server2 then pushes the message down to User2.
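The flow above can be sketched as pure routing logic, with the actual Redis calls left as comments (the channel naming and payload shape are assumptions, not from the original post):

```javascript
// Each server subscribes to its own channel at startup, e.g. with a
// Redis client:
//   subscriber.subscribe(channelFor('server2'));
//   subscriber.on('message', (ch, msg) => deliverLocally(JSON.parse(msg)));

// One pub/sub channel per chat server (naming scheme is an assumption).
function channelFor(serverName) {
  return 'chat:' + serverName;
}

// Route a message: compute the recipient's server via the routing rule,
// then publish the serialized message to that server's channel.
// `serverForUser` and `publish` are injected so the logic stays testable.
function routeMessage(fromUserId, toUserId, text, serverForUser, publish) {
  const target = serverForUser(toUserId);
  publish(channelFor(target), JSON.stringify({ fromUserId, toUserId, text }));
  return target;
}
```

Keeping the server lookup and the publish call behind injected functions means the same routing code works whether the transport is Redis pub/sub or something else later.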
Only after writing this did I find I could not post code here; next time I will write it in Markdown...
Reference links
Nginx Lua Wiki
https://www.nginx.com/resources/wiki/modules/lua/
Openresty Chinese Wiki
https://github.com/iresty/nginx-lua-module-zh-wiki
Openresty Series Courses
http://www.stuq.org/course/detail/1015
Socket.io
Build a scalable Chat service with Nodejs