The Facebook Chat architecture article has been on InfoQ for quite a while, and piaoger has also seen a slide deck from a Facebook engineer at Erlang Factory.
# Keywords
Realtime messaging, C++, Erlang, long-polling, Thrift
# Challenges
How does synchronous messaging work on the web?
"Presence" is hard to scale
Need a system to queue and deliver messages
Millions of connections, mostly idle
Need logging, at least between page loads
Make it work in Facebook's Environment
Naively notifying all of a user's friends every time that user goes online or offline does not scale. The cost is O(average friend count × number of users online at peak × online/offline churn) messages per second, where churn is the average rate at which users change state. With each user having several hundred friends and millions of users online at peak, this approach is practically intolerable.
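A back-of-envelope calculation makes the cost concrete; the numbers below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope cost of naive presence broadcasting.
# All numbers are assumed for illustration only.
avg_friends = 300               # average friend-list size
peak_online_users = 1_000_000   # users online at peak
churn_per_user = 0.001          # on/offline transitions per user per second

# Each transition is broadcast to every friend of the user who changed state.
presence_events_per_sec = peak_online_users * churn_per_user
messages_per_sec = presence_events_per_sec * avg_friends

print(f"{presence_events_per_sec:,.0f} presence changes/s")
print(f"{messages_per_sec:,.0f} notification messages/s")
# => 1,000 presence changes/s, 300,000 notification messages/s
```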
Piaoger:
When will a product I work on run into this kind of problem? It would be painful and happy at the same time:
When a product's user base jumps from zero to 70 million overnight, scalability becomes something that has to be considered from the very beginning.
# System Overview
System Overview (front-end)
Mix of client-side JavaScript and server-side PHP
Regular Ajax for sending messages, fetching conversation history
Periodic Ajax polling for list of online friends
Ajax long-polling for messages (COMET)
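The front-end bullets above boil down to three kinds of requests the page makes. A hedged sketch of that mix (paths and intervals are assumptions, `http_get`/`http_post` stand in for the browser's XHR calls, and the real client is JavaScript, not Python):

```python
# Sketch of the three request types the chat front end uses.
import time

SEND_URL = "/ajax/chat/send"        # regular Ajax: send a message / fetch history
BUDDY_URL = "/ajax/chat/buddylist"  # periodic poll: list of online friends
POLL_URL = "/ajax/chat/poll"        # long poll: held open until a message arrives

def client_loop(http_get, user_id):
    last_buddy_refresh = 0.0
    while True:
        # 1. Periodic Ajax polling for the online-friends list (every ~30 s).
        if time.time() - last_buddy_refresh > 30:
            http_get(f"{BUDDY_URL}?user={user_id}")
            last_buddy_refresh = time.time()
        # 2. Ajax long polling for incoming messages (blocks on the server).
        http_get(f"{POLL_URL}?user={user_id}")

def send_message(http_post, sender, recipient, text):
    # 3. Regular Ajax for sending a message.
    http_post(SEND_URL, {"from": sender, "to": recipient, "text": text})
```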
System Overview (back-end)
Discrete responsibilities for each service
- Communicate via Thrift
Channel (Erlang): message queuing and delivery
- Queue messages in each user's "channel"
- Deliver messages as responses to long-polling HTTP requests
Presence (C++): aggregates online info in memory (pull-based presence)
Chatlogger (C++): stores conversations between page loads
Web Tier (PHP): serves our vanilla web requests
For its clustered, partitioned subsystems, Facebook chose a combination of C++ and Erlang: the C++ module records the chat logs, while the Erlang module "stores conversations of online users in memory and supports long-polling requests".
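Each service has a narrow responsibility and they talk to each other over Thrift. As a hedged, in-process sketch of how the pieces fit together (the class and method names below are my own stand-ins, not Facebook's actual Thrift interfaces):

```python
# Minimal sketch of the back-end split, with hypothetical in-process
# stand-ins for the services described in the slides.
from collections import defaultdict, deque

class ChannelService:            # Erlang in the real system
    def __init__(self):
        self.queues = defaultdict(deque)   # one short queue per user

    def enqueue(self, user_id, message):
        self.queues[user_id].append(message)

    def poll(self, user_id):
        q = self.queues[user_id]
        return q.popleft() if q else None  # the real system blocks (long poll)

class PresenceService:           # C++ in the real system
    def __init__(self):
        self.online = set()

    def set_online(self, user_id, is_online):
        if is_online:
            self.online.add(user_id)
        else:
            self.online.discard(user_id)

class ChatLogger:                # C++ in the real system
    def __init__(self):
        self.log = []

    def record(self, sender, recipient, message):
        self.log.append((sender, recipient, message))

# Web tier (PHP in the real system): glue the services together per request.
def send_message(channel, logger, sender, recipient, text):
    channel.enqueue(recipient, {"from": sender, "text": text})
    logger.record(sender, recipient, text)
```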
# Realtime messaging
Facebook has the client pull new messages directly from the server, in the style of Comet XHR long polling.
Each Facebook page loads an IFRAME for transferring messages to the user. The JavaScript code in this IFRAME sends an HTTP GET request that establishes a persistent connection with the server, held open until a message is returned to the user.
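Conceptually the client side is just a loop that re-issues a blocking GET. A minimal Python sketch, assuming a hypothetical `/channel/<user_id>/poll` endpoint and message shape (the real client is browser JavaScript inside the IFRAME):

```python
# Sketch of a Comet-style long-poll loop against an assumed endpoint.
import json
import urllib.request

def long_poll(base_url, user_id, last_seq=0):
    while True:
        url = f"{base_url}/channel/{user_id}/poll?seq={last_seq}"
        try:
            # The server holds this request open until a message is ready.
            with urllib.request.urlopen(url, timeout=60) as resp:
                messages = json.loads(resp.read())
        except OSError:
            continue  # idle timeout or transient error: reconnect immediately
        for msg in messages:
            last_seq = max(last_seq, msg["seq"])
            print(f"{msg['from']}: {msg['text']}")
```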
# Channel server architecture
Overview
One channel per user
Web Tier delivers messages for that user
Channel state: Short queue of sequenced messages
Long poll for streaming (COMET)
Clients make an HTTP request
- Server replies when a message is ready
- One active request per browser tab (see the sketch after this list)
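Putting the bullets above together, here is a minimal in-process sketch of a per-user channel: a short queue of sequenced messages plus a blocking poll standing in for the held-open HTTP request. The real channel servers are Erlang processes behind mochiweb; the names and limits here are assumptions.

```python
# One channel per user: short sequenced queue + blocking long poll.
import threading
from collections import deque

class Channel:
    MAX_QUEUE = 100                    # keep only a short backlog per user

    def __init__(self):
        self.cond = threading.Condition()
        self.queue = deque(maxlen=self.MAX_QUEUE)
        self.next_seq = 1

    def push(self, message):
        with self.cond:
            self.queue.append({"seq": self.next_seq, **message})
            self.next_seq += 1
            self.cond.notify_all()     # wake any waiting long-poll request

    def long_poll(self, after_seq, timeout=30.0):
        """Block until there is a message newer than after_seq, or time out."""
        with self.cond:
            self.cond.wait_for(
                lambda: any(m["seq"] > after_seq for m in self.queue),
                timeout=timeout,
            )
            return [m for m in self.queue if m["seq"] > after_seq]

channels = {}                          # user_id -> Channel

def get_channel(user_id):
    return channels.setdefault(user_id, Channel())
```

Sending a message from the web tier then amounts to `get_channel(recipient).push({"from": sender, "text": text})`, and each browser tab keeps exactly one `long_poll` request outstanding.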
Details
Distributed Design
- User ID space is partitioned (Division of Labor)
- Each partition is serviced by a cluster (availability)
Presence Aggregation
- Channel servers are authoritative
- Periodically shipped to presence servers (sketched below)
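As a sketch of the partitioning and pull-based presence aggregation described above (the partition count, hash function, and shipping interval are assumptions, not details from the talk):

```python
# Partition the user ID space across channel clusters, and periodically ship
# each channel server's authoritative presence set to the presence tier.
import hashlib

NUM_PARTITIONS = 64
CLUSTERS = [f"channel-cluster-{i:02d}" for i in range(NUM_PARTITIONS)]

def partition_for(user_id: int) -> str:
    """Map a user ID to the channel cluster responsible for its channel."""
    digest = hashlib.md5(str(user_id).encode()).hexdigest()
    return CLUSTERS[int(digest, 16) % NUM_PARTITIONS]

class ChannelServerPresence:
    """Channel-server side: knows exactly who holds an open channel."""
    def __init__(self):
        self.online_users = set()

    def snapshot(self):
        return frozenset(self.online_users)

class PresenceAggregator:
    """Presence-server side: merges periodic snapshots from channel servers."""
    def __init__(self):
        self.by_server = {}

    def ship(self, server_name, snapshot):
        # Called, say, every few seconds by each channel server.
        self.by_server[server_name] = snapshot

    def online_friends(self, friend_ids):
        online = set().union(*self.by_server.values()) if self.by_server else set()
        return online & set(friend_ids)
```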
Open Source: Erlang, mochiweb, thrift, scribe, fb303, et al.
Channel servers
Channel applications
# Dark launch
The way the service was launched is also interesting: the so-called "dark launch".
The secret to going from zero to 70 million users overnight is to avoid doing it all in one step. A large amount of real user traffic was simulated first, during the dark-launch phase: Facebook pages connected to the chat servers, queried presence information, and simulated message sends without drawing a single chat UI element on the page.
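Reusing the `Channel` and `PresenceService` sketches from above, the gating for such invisible traffic could look roughly like this; the rule and percentage are purely illustrative assumptions:

```python
# Sketch of dark-launch gating: a fraction of real page loads exercises the
# chat back end with no visible UI.
DARK_LAUNCH_PERCENT = 10   # roll the invisible traffic out to 10% of page loads

def maybe_dark_launch(user_id: int, channel, presence):
    if user_id % 100 >= DARK_LAUNCH_PERCENT:
        return                                   # this page load stays quiet
    # Exercise the back end exactly as a real client would, with no UI drawn:
    presence.set_online(user_id, True)           # report presence
    channel.long_poll(after_seq=0, timeout=1.0)  # hold a poll open briefly
    channel.push({"from": user_id, "text": "dark launch test message"})
```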
Piaoger: Is this comparable to our warmup?
# References
[Facebook chat architecture (CHS)](http://www.infoq.com/cn/news/2008/05/facebookchatarchitecture)
[Facebook chat architecture (en)](http://www.infoq.com/news/2008/05/facebookchatarchitecture)
[Facebook Chat](http://www.facebook.com/note.php?note_id=14218138919&id=9445547199&index=0)
[Erlang at Facebook](http://www.erlang-factory.com/upload/presentations/31/EugeneLetuchy-ErlangatFacebook.pdf)