I. Concept
Buffer: An area through which data passes when it is transferred between devices that are not synchronized or that have different priority levels.
Cache: A small but high-speed memory located between the CPU and main memory. It holds a portion of the data that the CPU has recently used or is likely to use again.
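To make the two concepts concrete, here is a minimal Python sketch: `io.BufferedWriter` plays the role of a buffer (small writes are held until flushed to the target), while `functools.lru_cache` plays the role of a cache (repeated lookups skip the expensive work). The names `raw`, `buffered`, and `expensive` are illustrative, not from any particular system.

```python
import functools
import io

# Buffering: BufferedWriter collects small writes before passing them on.
raw = io.BytesIO()
buffered = io.BufferedWriter(raw, buffer_size=1024)
buffered.write(b"small write")   # stays in the buffer for now
assert raw.getvalue() == b""     # nothing has reached the target yet
buffered.flush()                 # the buffered data reaches the target

# Caching: lru_cache keeps results of recent calls for reuse.
@functools.lru_cache(maxsize=128)
def expensive(x):
    return x * x                 # stand-in for a slow computation

expensive(3)   # computed (cache miss)
expensive(3)   # served from the cache (cache hit)
```

The buffer smooths the *transfer* of data toward a destination; the cache keeps *already-obtained* data around for reuse.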
II. Application Scenarios
buffer: allocated by various processes and used, for example, in input queues.
cache: used for disk I/O requests. If more than one process accesses the same file, the file is kept in the cache so that the next access is faster, improving system performance.
III. Role
Buffer: Designed around how disks read and write. Scattered write operations are collected and issued together, which reduces disk fragmentation and repeated seeking. Buffers also reduce how long processes wait on one another: a fast device can keep operating without interruption while data is read from a slow device.
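The batching behavior described above can be sketched as a small write buffer that collects pending writes and hands them to the device in one combined operation. `WriteBuffer` and its `device` parameter (any object with a `write()` method) are hypothetical names for illustration, not a real OS interface.

```python
class WriteBuffer:
    """Collect small writes and flush them to the device as one batch."""

    def __init__(self, device, capacity=4):
        self.device = device      # any object with a write(bytes) method
        self.capacity = capacity  # how many writes to batch before flushing
        self.pending = []

    def write(self, chunk):
        self.pending.append(chunk)
        if len(self.pending) >= self.capacity:
            self.flush()

    def flush(self):
        if self.pending:
            # One large write instead of many scattered small ones.
            self.device.write(b"".join(self.pending))
            self.pending.clear()
```

With a capacity of 4, eight one-byte writes reach the device as only two physical writes, mimicking how the kernel turns scattered writes into fewer, larger disk operations.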
Cache: Saves data that has been read. On a repeated read, a cache hit means the hard disk is not touched; on a miss, the data is read from disk. Cached data is organized by how often it is read: the most frequently read content is kept where it is found most easily, while content that is no longer read drifts toward the back until it is evicted.
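A minimal sketch of this hit/miss and frequency-based eviction behavior, assuming a hypothetical `ReadCache` wrapper around a slow `read_fn` (standing in for a disk read); when the cache is full, the least frequently read entry is evicted first:

```python
from collections import Counter

class ReadCache:
    """Cache read results; hits skip the slow read.
    When full, evict the least frequently read entry."""

    def __init__(self, read_fn, capacity=2):
        self.read_fn = read_fn    # the slow read (e.g. from disk)
        self.capacity = capacity
        self.data = {}            # key -> cached value
        self.hits = Counter()     # key -> read frequency

    def read(self, key):
        if key in self.data:              # cache hit: no disk read
            self.hits[key] += 1
            return self.data[key]
        value = self.read_fn(key)         # cache miss: read the slow device
        if len(self.data) >= self.capacity:
            coldest = min(self.data, key=lambda k: self.hits[k])
            del self.data[coldest]        # evict the least-read entry
            del self.hits[coldest]
        self.data[key] = value
        self.hits[key] = 1
        return value
```

Real page caches use more refined policies (e.g. LRU variants), but the principle is the same: frequently read content stays resident, rarely read content is pushed out.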
Learning Notes---The difference between buffer and cache