I recently optimized a fairly large high-concurrency site and learned a great deal in the process. Here are the takeaways I'd like to share.
1. Vertical splitting of table fields. For frequently accessed tables and tables with many rows, keep the row length as small as possible and keep large fields out of the select list; split them off vertically. Example: in an article table, do not store the article body in the article information table. Move that field into a separate table, Articlecontent (Articleid, Articlecontext). Because the field is long, it slows down row scans at query time.
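The split described above might be sketched like this (the table and column names follow the example in the text; the remaining columns are hypothetical):

```sql
-- Narrow, frequently scanned article table: no large body field.
CREATE TABLE Article (
    Articleid INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    Title     VARCHAR(200) NOT NULL,
    Authorid  INT UNSIGNED NOT NULL,
    Created   DATETIME NOT NULL
) ENGINE=InnoDB;

-- The large body lives in its own table, joined only when needed.
CREATE TABLE Articlecontent (
    Articleid      INT UNSIGNED NOT NULL PRIMARY KEY,
    Articlecontext MEDIUMTEXT NOT NULL,
    FOREIGN KEY (Articleid) REFERENCES Article (Articleid)
) ENGINE=InnoDB;

-- List pages touch only the narrow table; the body is fetched
-- one row at a time on the article detail page.
SELECT Articleid, Title FROM Article ORDER BY Created DESC LIMIT 20;
SELECT Articlecontext FROM Articlecontent WHERE Articleid = 42;
```

The list query now scans short rows, so many more rows fit in each page read.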
2. Vertically split frequently updated fields into their own table. An update normally takes a row lock, and in some configurations escalates to a table lock; while the update runs, queries against the locked data queue up until it completes.
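A common case is a hit counter that is written on every page view. A hypothetical sketch of moving it out of the main table:

```sql
-- Hot counter in its own narrow table, so the constant UPDATE
-- traffic does not contend with reads of the article row itself.
CREATE TABLE Articlehits (
    Articleid INT UNSIGNED NOT NULL PRIMARY KEY,
    Hits      INT UNSIGNED NOT NULL DEFAULT 0
) ENGINE=InnoDB;

-- Each page view locks only one row in the counter table.
UPDATE Articlehits SET Hits = Hits + 1 WHERE Articleid = 42;
```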
3. Cache derived fields where possible to reduce join queries.
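One way to do this is to denormalize: store a redundant copy of a value that would otherwise require a join. The column and table names below are hypothetical:

```sql
-- Cache the author's display name directly in the article table.
ALTER TABLE Article ADD COLUMN Authorname VARCHAR(50) NOT NULL DEFAULT '';

-- Without the cached field: a join on every list query.
SELECT a.Articleid, a.Title, u.Username
FROM Article a
JOIN User u ON u.Userid = a.Authorid;

-- With the cached field: a single-table query.
SELECT Articleid, Title, Authorname FROM Article;
```

The trade-off is that the cached copy must be refreshed whenever the source value changes, so this fits best for values that rarely change.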
4. For data that is neither critical nor strongly real-time, update it on a schedule rather than in real time.
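In MySQL this can be done with the event scheduler (or an external cron job). A hypothetical sketch, assuming a Comment table and a cached Commentcount column, and requiring `event_scheduler = ON`:

```sql
-- Refresh the comment-count cache every 10 minutes instead of
-- recomputing it on every write.
CREATE EVENT refresh_article_stats
ON SCHEDULE EVERY 10 MINUTE
DO
  UPDATE Article a
  SET a.Commentcount = (SELECT COUNT(*) FROM Comment c
                        WHERE c.Articleid = a.Articleid);
```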
5. Indexing. A good index can make a query hundreds or thousands of times faster, but only on top of a well-designed table structure. Too many indexes slow down updates. Pay attention to the fields used in WHERE and ORDER BY clauses. I will cover index optimization in a separate article.
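As a quick illustration of the WHERE/ORDER BY point, a composite index can serve both clauses at once (column names are hypothetical):

```sql
-- One composite index covering both the filter and the sort.
CREATE INDEX idx_author_created ON Article (Authorid, Created);

-- MySQL can use the index to find Authorid = 7 and to return the
-- rows already ordered by Created, avoiding a filesort.
SELECT Articleid, Title
FROM Article
WHERE Authorid = 7
ORDER BY Created DESC;
```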
6. Rebuild indexes periodically. Over time, indexes become fragmented, much like a disk that needs defragmenting.
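In MySQL a periodic rebuild can be done like this (for InnoDB tables, `OPTIMIZE TABLE` is implemented as a table rebuild):

```sql
-- Rebuild the table and its indexes, reclaiming fragmented space.
OPTIMIZE TABLE Article;

-- Equivalent explicit rebuild for an InnoDB table:
ALTER TABLE Article ENGINE=InnoDB;
```

Schedule this during low-traffic windows, since the rebuild itself is expensive on large tables.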
7. Filter out slow queries. As a general rule, any query that takes more than 0.1 s is worth examining.
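MySQL's slow query log can capture these automatically; the threshold below follows the 0.1 s rule of thumb, and the log path is just an example:

```sql
-- Enable the slow query log at runtime (also settable in my.cnf).
SET GLOBAL slow_query_log      = 1;
SET GLOBAL long_query_time     = 0.1;  -- seconds; fractions allowed
SET GLOBAL slow_query_log_file = '/var/log/mysql/slow.log';
```

The resulting log can then be summarized with `mysqldumpslow` to find the worst offenders.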
8. Use EXPLAIN to analyze query cost. Any statement that performs a full table scan is a candidate for optimization, except on small tables of a few hundred rows.
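A quick sketch of how to spot a full table scan in the plan (table and column names are hypothetical):

```sql
EXPLAIN SELECT Articleid, Title FROM Article WHERE Authorid = 7;

-- In the output, type = ALL with key = NULL means a full table scan.
-- After adding an index on Authorid, type should change to ref and
-- the key column should show the index being used.
```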
9. Optimize configuration parameters, such as the temporary table size, the locking mechanism, and the storage engine.
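For example (the values below are illustrative only; tune them to your workload and available memory):

```sql
-- In-memory temporary table limit; queries exceeding it spill to disk.
SET GLOBAL tmp_table_size      = 67108864;  -- 64 MB
SET GLOBAL max_heap_table_size = 67108864;  -- effective limit is the
                                            -- smaller of the two

-- Engine choice affects locking: InnoDB locks rows, MyISAM locks
-- whole tables. Check which engine a table uses:
SHOW TABLE STATUS LIKE 'Article';
```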
Field experience: MySQL performance optimization for high-concurrency websites