A possible optimization method for mysql limit large offset
When a MySQL LIMIT clause uses a large offset, the first execution of the query is very slow. Because MySQL enables the query cache by default, re-running the same large-offset query is fast, so only the first run suffers.
Example: a table with 1,000,000 rows, where id is an auto_increment primary key.
The required query:
SELECT * FROM table ORDER BY id DESC LIMIT 990000, 100
is relatively slow.
The common workaround is:
SELECT * FROM table WHERE id <= (SELECT id FROM table ORDER BY id DESC LIMIT 990000, 1) ORDER BY id DESC LIMIT 100
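As a quick sanity check of this workaround, here is a minimal sketch using SQLite from Python as a stand-in for MySQL, with 1,000 rows instead of 1,000,000; the table name `t` and the row count are illustrative, not from the original article:

```python
import sqlite3

# Demo table: 1,000 rows with an auto-increment-style integer primary key
# (a small stand-in for the article's 1,000,000-row table).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO t (id) VALUES (?)",
                 [(i,) for i in range(1, 1001)])

# Naive form: the engine must step past all 900 skipped rows.
naive = conn.execute(
    "SELECT * FROM t ORDER BY id DESC LIMIT 900, 100").fetchall()

# Subquery form: locate the boundary id via the primary-key index,
# then read only the 100 wanted rows from there.
seek = conn.execute(
    "SELECT * FROM t"
    " WHERE id <= (SELECT id FROM t ORDER BY id DESC LIMIT 900, 1)"
    " ORDER BY id DESC LIMIT 100").fetchall()

print(naive == seek)  # True: both return ids 100 down to 1
```

Note the comparison is `id <=` because the sort is descending: the row at the offset has the largest id of the wanted page, and the rest of the page has smaller ids.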
But ids may not be consecutive, and the sort may not depend on id alone, so this method often cannot be applied in real projects.
For example, there are complex sorting methods in our actual project:
ORDER BY (column1 + column2) * column3 DESC
And so on. With 1,000,000 rows and an offset of 990,000, the wanted rows sit near the tail; under the reversed sort they sit close to the head:
0 ------------------------------------------------ [990,000 | 100 rows] -- 1,000,000
Implementation:
SELECT * FROM table ORDER BY columns DESC LIMIT 990000, 100
We can infer a solution: reverse the sort, take a slice near the head, then reverse the slice again. The result is as follows:
$head = max(1000000 - 990000 - 100, 0); // = 9900
SELECT * FROM (SELECT * FROM table ORDER BY columns ASC LIMIT $head, 100) AS t ORDER BY columns DESC
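The reverse-and-slice trick can be sketched end to end, again with SQLite from Python standing in for MySQL. The table, row count, and the concrete columns behind the `(column1 + column2) * column3` sort are all illustrative; id is added as a tiebreaker because reversing a sort with ties is only guaranteed to return the same page when the ordering is total:

```python
import sqlite3

# Small stand-ins for the article's 1,000,000 rows / offset 990,000.
N, OFFSET, PAGE = 1000, 850, 100

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY,"
             " c1 INTEGER, c2 INTEGER, c3 INTEGER)")
conn.executemany("INSERT INTO t VALUES (?, ?, ?, ?)",
                 [(i, i % 7, i % 11, 1 + i % 3) for i in range(1, N + 1)])

# Naive form: big offset against the complex DESC sort.
naive = conn.execute(
    f"SELECT * FROM t ORDER BY (c1 + c2) * c3 DESC, id DESC"
    f" LIMIT {OFFSET}, {PAGE}").fetchall()

# Optimized form: flip the sort, take a small head offset, flip back.
head = max(N - OFFSET - PAGE, 0)  # 1000 - 850 - 100 = 50
fast = conn.execute(
    f"SELECT * FROM (SELECT * FROM t ORDER BY (c1 + c2) * c3 ASC, id ASC"
    f" LIMIT {head}, {PAGE}) AS s"
    f" ORDER BY (c1 + c2) * c3 DESC, id DESC").fetchall()

print(fast == naive)  # True: same 100-row page, much smaller offset
```

The inner query skips only `head` rows (50 here, 9,900 in the article's numbers) instead of 990,000; the outer query merely re-sorts the 100-row slice.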
With this optimization, queries near the end of the table run as fast as those near the beginning; offsets in the middle of the table, however, see no improvement.