Craigslist database architecture Author: fenng | English version URL: http://www.dbanotes.net/database/craigslist_database_arch.html
Craigslist is definitely an Internet legend. According to earlier reports:
More than 10 million people use the site each month, generating over 3 billion page views per month. (Craigslist adds nearly 1 billion new posts each month??) The site is growing at nearly 200% a year. Craigslist had only 18 employees at the time (there may be more now).
Tim O'Reilly interviewed Eric Scheide of Craigslist, so through Database War Stories #5: Craigslist we can get a look at Craigslist's database architecture and data volumes.
The database software is MySQL. To make full use of MySQL's capabilities, the database runs on 64-bit Linux servers with 14 local disks (72 GB × 14 ≈ 1 TB?) and 16 GB of RAM.
Different services use different database clusters.
Forum
1 master, 1 slave. The slave is mostly used for backup. MyISAM tables. Indexes total 17 GB, and the largest table has nearly 42 million rows.
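A minimal sketch of wiring up a slave like the one above, using the replication statements of MySQL versions from that era; the host, user, password, and binlog coordinates are placeholders, not details from the interview:

```sql
-- On the slave: point it at the master and start replicating.
-- All connection values below are hypothetical examples.
CHANGE MASTER TO
  MASTER_HOST     = 'forum-master.example.com',
  MASTER_USER     = 'repl',
  MASTER_PASSWORD = 'secret',
  MASTER_LOG_FILE = 'mysql-bin.000001',
  MASTER_LOG_POS  = 4;
START SLAVE;

-- Check replication health (Slave_IO_Running / Slave_SQL_Running):
SHOW SLAVE STATUS\G
```

Since the slave mainly serves as a backup, backups can be taken from it with the master left untouched.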
Classified ads
1 master, 12 slaves, with the slaves serving different purposes. Current indexes total 114 GB, and the largest table has 56 million rows (table data is archived regularly). MyISAM is used. How large is the classifieds volume, really? "Craigslist adds nearly 1 billion new posts each month" sounds exaggerated. Eric Scheide said there were more than 330,000 new postings the previous day; at that rate, the monthly volume of new posts is roughly 330,000 × 30 ≈ 10 million.
Archive database
1 master, 1 slave. Holds all posts older than 3 months. The structure is similar to the classifieds database, but larger: 238 GB of data, with the largest table at 96 million rows. A large number of MERGE tables are used for ease of management.
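To illustrate the MERGE approach, here is a sketch using MySQL's MERGE (MRG_MyISAM) storage engine; the table names and columns are invented for the example, not Craigslist's actual schema:

```sql
-- Each underlying archive table must have an identical MyISAM definition.
CREATE TABLE posts_2006_01 (
  id    INT NOT NULL,
  body  TEXT,
  INDEX (id)
) ENGINE = MyISAM;

CREATE TABLE posts_2006_02 LIKE posts_2006_01;

-- The MERGE table presents the monthly archives as one logical table;
-- retiring an old month is just an ALTER of the UNION list, with no
-- massive DELETE against a giant table.
CREATE TABLE posts_archive (
  id    INT NOT NULL,
  body  TEXT,
  INDEX (id)
) ENGINE = MERGE UNION = (posts_2006_01, posts_2006_02) INSERT_METHOD = LAST;
```

Queries against `posts_archive` fan out across the underlying tables, which keeps each physical table at a manageable size.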
Search databases
Four clusters across 16 servers. Active posts are partitioned by region/category and indexed with MyISAM full-text indexes; each cluster holds only a subset of the data. This indexing scheme still holds up for now, but may not over the next few years.
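The MyISAM full-text scheme mentioned above looks roughly like this; the table and column names are illustrative assumptions:

```sql
-- One partition of active posts (e.g. a single region/category),
-- with a MyISAM full-text index over the searchable columns.
CREATE TABLE posts_sfbay_housing (
  id    INT NOT NULL,
  title VARCHAR(255),
  body  TEXT,
  FULLTEXT (title, body)
) ENGINE = MyISAM;

-- Natural-language full-text search against that partition:
SELECT id, title
FROM posts_sfbay_housing
WHERE MATCH(title, body) AGAINST('sunny studio parking');
```

Because each cluster only indexes its own region/category subset, no single full-text index has to cover the whole site.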
Authdb
1 master and 1 slave, very small.
Currently, Craigslist ranks around 30 on Alexa. The figures above only reflect the time of the interview (April 28, 2006); after all, Craigslist's data volume is still growing at roughly 200% per year.
Craigslist's data solution is relatively low-cost in both hardware and software. An excellent MySQL DBA is a key asset for a Web 2.0 project.