Differences between cache and Buffer

Source: Internet
Author: User

1. A buffer is a temporary staging area for data in transit.

2. A cache is high-speed storage. In Oracle, the SGA's caches include the library cache, the data dictionary cache, and the database buffer cache.

The database buffer cache holds copies of data blocks read from disk, reducing disk I/O.

3. The shared pool contains the shared SQL area and the PL/SQL area, while the database buffer cache maintains its own independent subcaches.

4. The shared pool stores recently executed statements.

5. Cache:
A cache is a smaller, higher-speed component used to speed up access to commonly used data stored in a lower-speed, higher-capacity component.

Database buffer cache:
The database buffer cache is the portion of the SGA that holds copies of data blocks read from data files. All user processes concurrently connected to the instance share access to the database buffer cache.

The buffer cache is read and written in blocks.
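Block-oriented I/O can be sketched in Python. This is only an illustration: the 4096-byte block size and the in-memory stream are assumptions for the example, not anything the text prescribes.

```python
import io

BLOCK_SIZE = 4096  # illustrative block size; real systems use the device/page size

def read_in_blocks(stream):
    """Read a stream block by block, the way a buffer cache moves data."""
    blocks = []
    while True:
        block = stream.read(BLOCK_SIZE)
        if not block:
            break
        blocks.append(block)
    return blocks

# Usage with an in-memory stream standing in for a data file:
data = b"x" * 10000
blocks = read_in_blocks(io.BytesIO(data))
print([len(b) for b in blocks])  # [4096, 4096, 1808]
```

Note that the last block is partial; a real buffer cache still transfers a whole block and tracks how much of it is valid.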

The cache stores data that has been read. When the same data is requested again and found in the cache (a hit), the disk is not accessed; on a miss, the data is read from disk. Cached data is organized by access frequency: the most frequently read content is kept in the fastest-to-reach position, while content that is not read again ages out until it is evicted.
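The hit/miss and eviction behaviour described above can be sketched with a tiny least-recently-used cache in Python. This is a simplification chosen for clarity; real kernels and databases use more elaborate replacement policies.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal read cache: hits avoid the 'disk', cold entries are evicted."""
    def __init__(self, capacity, backing_store):
        self.capacity = capacity
        self.backing_store = backing_store  # stands in for the disk
        self.entries = OrderedDict()
        self.hits = 0
        self.misses = 0

    def read(self, key):
        if key in self.entries:
            self.hits += 1
            self.entries.move_to_end(key)  # mark as most recently used
        else:
            self.misses += 1
            self.entries[key] = self.backing_store[key]  # "disk" read
            if len(self.entries) > self.capacity:
                self.entries.popitem(last=False)  # evict least recently used
        return self.entries[key]

disk = {"a": 1, "b": 2, "c": 3}
cache = LRUCache(2, disk)
for key in ["a", "b", "a", "c", "a"]:
    cache.read(key)
print(cache.hits, cache.misses)  # 2 3
```

The two hits are the repeated reads of "a"; reading "c" evicts "b", the least recently used entry.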

The buffer exists to optimize the disk's read and write operations: scattered write operations are collected and performed together, which reduces disk fragmentation and improves system performance. In Linux, a daemon periodically flushes buffer contents to disk, and you can also flush them manually with the sync command. For example, if I cp a 3 MB MP3 file to an ext2 USB flash drive, the drive's activity light does not blink at first; after a while (or after manually running sync), the light starts blinking as the data is actually written. The buffer is also flushed when the device is unmounted, which is why unmounting can take several seconds.
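The same flush can be forced from code for a single file rather than the whole system. A minimal Python sketch, using a temporary file instead of a USB drive:

```python
import os
import tempfile

# Writes first land in user-space and kernel buffers; fsync forces the kernel
# to write its buffers for this file to disk, much as sync does system-wide.
fd, path = tempfile.mkstemp()
try:
    with os.fdopen(fd, "wb") as f:
        f.write(b"3 MB of MP3 data would go here")
        f.flush()              # push Python's user-space buffer to the kernel
        os.fsync(f.fileno())   # ask the kernel to write its buffers to disk
    with open(path, "rb") as f:
        content = f.read()
    print(content)
finally:
    os.remove(path)
```

Without the fsync, the data would still be readable (it is served from the kernel's buffers), but it might not yet survive a power loss.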

Edit the value of vm.swappiness in /etc/sysctl.conf to adjust the swap usage policy at the next boot. The value ranges from 0 to 100; the larger the number, the more the kernel tends to use swap. The default value is 60. You can experiment with different values.
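On a running system the current value can be read from /proc/sys/vm/swappiness; the sketch below instead parses a sysctl.conf-style line so it runs anywhere. The sample line is an assumption for the example.

```python
def parse_swappiness(line):
    """Parse a 'vm.swappiness = N' line from sysctl.conf and validate the range."""
    key, _, value = line.partition("=")
    if key.strip() != "vm.swappiness":
        raise ValueError("not a vm.swappiness setting")
    n = int(value.strip())
    if not 0 <= n <= 100:
        raise ValueError("vm.swappiness must be between 0 and 100")
    return n

print(parse_swappiness("vm.swappiness = 60"))  # 60, the default
```

sysctl.conf settings only take effect at the next boot (or when `sysctl -p` is run), which is why the text says "at the next boot".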
Both are data held in RAM. In short, a buffer holds data that is about to be written to disk, while a cache holds data that has been read from disk.

Buffers are allocated by various processes and used for input queues. A simple example: when a process reads multiple fields, it stores the fields read so far in a buffer until all of them have been read.

A cache is often used for disk I/O requests. If multiple processes need to access the same file, the file is kept in the cache to speed up subsequent accesses, which improves system performance.

A buffer is something that has yet to be "written" to disk. A cache is something that has been "read" from the disk and stored for later use.
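The two roles can be contrasted in a toy model. This is pure illustration, not how the kernel is implemented: writes accumulate in a buffer until flushed, while reads populate a cache.

```python
class ToyDisk:
    """Toy model: a write buffer in front of a 'disk', a read cache behind it."""
    def __init__(self):
        self.disk = {}
        self.write_buffer = {}   # data not yet written to disk
        self.read_cache = {}     # data already read from disk

    def write(self, key, value):
        self.write_buffer[key] = value       # buffered, not on disk yet

    def flush(self):
        self.disk.update(self.write_buffer)  # like sync: buffer -> disk
        self.write_buffer.clear()

    def read(self, key):
        if key not in self.read_cache:
            self.read_cache[key] = self.disk[key]  # miss: go to disk
        return self.read_cache[key]

d = ToyDisk()
d.write("song.mp3", b"data")
print("song.mp3" in d.disk)        # False: still only in the buffer
d.flush()
print("song.mp3" in d.disk)        # True: flushed to disk
d.read("song.mp3")
print("song.mp3" in d.read_cache)  # True: cached after the read
```

The asymmetry matches the summary above: the buffer sits on the write path, the cache on the read path.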

For more details, see difference between buffer and cache.

Shared memory is mainly used to share data between different processes in UNIX environments and is a method of inter-process communication. Most applications do not explicitly request shared memory, and I have not verified its effect on the figures above. If you are interested, see: What is shared memory?

Differences between cache and buffer:

Cache: a high-speed cache is a small-capacity but fast memory located between the CPU and main memory. Because the CPU is much faster than main memory, the CPU must wait for some time when it accesses data from memory directly. The cache keeps a copy of recently used data, so when the CPU needs that data again it can be fetched directly from the cache. This reduces CPU wait time and improves system efficiency. Caches are divided into Level 1 cache (L1 cache) and Level 2 cache (L2 cache). L1 cache is integrated into the CPU; L2 cache was usually soldered onto the motherboard in the early days, but it is now also integrated into the CPU. Common L2 cache capacities were 256 KB or 512 KB.
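The benefit of CPU caching shows up in how code traverses memory: row-major traversal of a 2-D array touches memory sequentially and is cache-friendly, while column-major traversal jumps around and causes more cache misses. The sketch below only verifies that both orders compute the same sum; actual timing differences depend on the machine and language runtime, so none are claimed here.

```python
N = 200
matrix = [[i * N + j for j in range(N)] for i in range(N)]

def sum_row_major(m):
    # visits elements in memory order: good cache locality
    return sum(m[i][j] for i in range(N) for j in range(N))

def sum_column_major(m):
    # jumps to a different row on every access: poor cache locality
    return sum(m[i][j] for j in range(N) for i in range(N))

print(sum_row_major(matrix) == sum_column_major(matrix))  # True, same result
```

In lower-level languages such as C, the row-major version is typically measurably faster for large arrays, precisely because of the L1/L2 caches described above.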

Buffer: a buffer is used to transfer data between devices with different storage speeds or different priority levels. Through the buffer, mutual waiting between processes is reduced, so that while a slow device is still reading data, the operation of the fast device is not interrupted.

Buffer and cache in the output of free (both occupy memory):

Buffer: memory used as the buffer cache, i.e. the read/write buffer for block devices.

Cache: memory used as the page cache, i.e. the file system cache.

If the cache value is large, many files are held in the cache. If frequently accessed files can be served from the cache, the disk read I/O (the "bi" figure) will be very small.
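The Buffers and Cached figures that free reports come from /proc/meminfo. The sketch below parses that format, using a hard-coded sample instead of the live file so it runs anywhere; the sample numbers are made up for illustration.

```python
SAMPLE_MEMINFO = """\
MemTotal:       16384000 kB
MemFree:         2048000 kB
Buffers:          512000 kB
Cached:          4096000 kB
"""

def parse_meminfo(text):
    """Return a dict of field name -> size in kB from /proc/meminfo-style text."""
    info = {}
    for line in text.splitlines():
        name, _, rest = line.partition(":")
        info[name] = int(rest.split()[0])
    return info

info = parse_meminfo(SAMPLE_MEMINFO)
print(info["Buffers"], info["Cached"])  # 512000 4096000
```

On a real Linux system you could pass `open("/proc/meminfo").read()` instead of the sample text.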

