A cache (high-speed buffer memory) is widely used in computers from the 80386 class onward to improve system efficiency, and current systems may even have multiple cache levels. The cache is a small amount of very fast static RAM (SRAM) placed between the CPU and the DRAM main memory, typically 8 KB to 512 KB in size.
The cache works as follows. When the CPU needs to access memory, it first checks whether the required data is already in the cache. If it is, the CPU accesses the data directly, without inserting any wait states; this is the best case, called a cache hit. When the required data is not in the cache, the CPU must go to main memory instead, and because main memory is slower, wait states must be inserted; this case is called a cache miss. When the CPU accesses main memory, it also writes the fetched data into the cache according to an optimization policy, so that the next access to it is likely to hit. The same data may therefore exist in both main memory and the cache at the same time, and a replacement algorithm can evict rarely used data from the cache.
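The hit/miss check described above can be sketched as follows. This is a minimal, hypothetical model of a direct-mapped cache (the names `make_cache` and `read` and the cache geometry are illustrative, not from the original text): on a hit the cache answers directly, and on a miss the data is fetched from main memory and installed for a likely future hit.

```python
def make_cache(num_lines):
    """Each line holds (tag, data); None means the line is empty."""
    return [None] * num_lines

def read(cache, memory, addr):
    index = addr % len(cache)        # which cache line this address maps to
    tag = addr // len(cache)         # identifies the block stored in that line
    line = cache[index]
    if line is not None and line[0] == tag:
        return line[1], "hit"        # fast path: no wait states needed
    data = memory[addr]              # slow path: go to main memory
    cache[index] = (tag, data)       # install for a likely future hit
    return data, "miss"

memory = {a: a * 10 for a in range(64)}   # toy main memory
cache = make_cache(8)
print(read(cache, memory, 5))   # first access to address 5: a miss
print(read(cache, memory, 5))   # same address again: now a hit
```

Re-reading the same address hits because the first access installed the data; an address that maps to the same line would evict it again, which is why keeping recently used data resident matters.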
The best way to raise the hit rate is therefore to keep in the cache the instructions and data that the CPU has used most recently, and, once the cache is full, to evict data that has not been used for a long time. At the same time, consistency between the data in the cache and the data in main memory must be maintained: new data written by the CPU must not be lost or overwritten while it sits in the cache, and updates must be reflected in main memory promptly and accurately, otherwise later reads would return incorrect data. This is the write problem, and three policies are commonly used to handle it: write-through, buffered write-through, and write-back.
1. Write-through:
When the CPU writes data into the cache, the same data is written to main memory at the same time, which guarantees that the cache contents always match main memory. This method is intuitive, simple, and reliable, but every cache update must also write main memory, and this must go over the system bus. The bus is therefore kept busy, and overall system speed suffers.
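A minimal sketch of the write-through policy just described (the class and counter names are hypothetical): every CPU write updates both the cache and main memory, so the two are always consistent, at the cost of one bus write per CPU write.

```python
class WriteThroughCache:
    def __init__(self, memory):
        self.memory = memory      # main memory, modeled as a dict
        self.lines = {}           # cached data: addr -> value
        self.bus_writes = 0       # count of (slow) main-memory writes

    def write(self, addr, data):
        self.lines[addr] = data   # update the cache line...
        self.memory[addr] = data  # ...and main memory, every single time
        self.bus_writes += 1      # each write occupies the system bus

memory = {}
c = WriteThroughCache(memory)
for i in range(4):
    c.write(0, i)                 # four writes to the same address
print(c.bus_writes)               # -> 4: every write went over the bus
print(memory[0])                  # -> 3: memory is always up to date
```

Note that all four writes hit the bus even though only the last value ultimately matters; this is exactly the waste that write-back eliminates.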
2. Buffered write-through: To keep the bus traffic of plain write-through from slowing the CPU, a buffer is added on the path to main memory. Once the data bound for main memory has been latched into this buffer, the CPU can proceed with the next cycle without waiting for the memory write to complete. In effect, a one-entry, one-shot staging buffer is placed in front of main memory: a write cycle can be immediately followed by, for example, a read cycle that hits in the cache, avoiding the delay of plain write-through. However, the buffer can hold only one pending write, so when two write operations occur back to back, the CPU must still wait.
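A minimal sketch of buffered write-through under the assumptions above (the class, `drain`, and the stall counter are hypothetical names): a one-entry write buffer latches the data bound for main memory so the CPU can continue, and only a second back-to-back write forces a stall until the buffer drains.

```python
class BufferedWriteThrough:
    def __init__(self, memory):
        self.memory = memory
        self.lines = {}           # cached data: addr -> value
        self.buffer = None        # the single write-buffer entry
        self.stalls = 0           # times the CPU had to wait

    def write(self, addr, data):
        self.lines[addr] = data
        if self.buffer is not None:
            self.drain()          # buffer still full: CPU must wait
            self.stalls += 1
        self.buffer = (addr, data)  # latch the write; CPU proceeds at once

    def drain(self):
        """Memory side empties the buffer (normally overlaps CPU work)."""
        if self.buffer is not None:
            a, d = self.buffer
            self.memory[a] = d
            self.buffer = None

memory = {}
c = BufferedWriteThrough(memory)
c.write(1, 10)     # buffer was empty: no stall
c.write(2, 20)     # buffer still held (1, 10): one stall
print(c.stalls)    # -> 1
```

In real hardware the drain overlaps with subsequent CPU cycles; the stall only appears here because two writes arrived before the buffer could empty.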
3. Write-back: The two previous policies write main memory at the same time as the cache. This not only occupies bus bandwidth but also wastes valuable execution time, and in many cases it is unnecessary; an extra flag can be used to decide whether an update is really needed. The write-back policy adds an update bit to the tag field of each cache line to eliminate unnecessary writes to main memory. When the CPU updates data in the cache without updating main memory at the same time, the update bit of that line is set to 1. Whenever the CPU writes new content into the cache, it first checks the update bit of the target line: if the bit is 0, the data is simply written into the cache; if the bit is 1, the line's current contents are first written back to the corresponding location in main memory, and only then is the new data written into the cache.
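A minimal sketch of the write-back policy under the assumptions above (direct-mapped geometry and all names are hypothetical): each cache line carries the update ("dirty") bit, and main memory is written only when a dirty line is about to be replaced, not on every CPU write.

```python
class WriteBackCache:
    def __init__(self, memory, num_lines):
        self.memory = memory
        self.lines = [None] * num_lines   # each line: (tag, data, dirty)
        self.bus_writes = 0               # count of main-memory writes

    def write(self, addr, data):
        index = addr % len(self.lines)
        tag = addr // len(self.lines)
        line = self.lines[index]
        if line is not None and line[0] != tag and line[2]:
            # Replacing a dirty line: write its old data back first.
            old_tag, old_data, _ = line
            self.memory[old_tag * len(self.lines) + index] = old_data
            self.bus_writes += 1
        self.lines[index] = (tag, data, True)  # update bit set to 1

memory = {}
c = WriteBackCache(memory, 8)
for i in range(4):
    c.write(0, i)        # four writes to one line, no replacement yet
print(c.bus_writes)      # -> 0: main memory untouched so far
c.write(8, 99)           # address 8 maps to the same line: dirty write-back
print(c.bus_writes)      # -> 1
print(memory[0])         # -> 3: only the final value reached memory
```

Four CPU writes produced a single bus write, illustrating why the actual write count can be far lower than the CPU's write-cycle count.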
Compared with write-through, write-back saves the immediate memory writes that are unnecessary in many cases: even if a cache line has been updated, as long as it is not about to be replaced by new data there is no need to write main memory at once. In other words, the number of actual writes to main memory may be smaller than the number of write cycles the CPU executes. The price is a more complex structure: the cache must also spend extra capacity storing the flag bits.
Because of this efficiency, most modern caches operate in write-back mode.