In the database tests of our virtualization series, we saw that the database testing workload uses a great deal of memory and can easily hit the limit of 32-bit Windows: a single process is limited to 2GB of memory, while servers with 4GB or more of RAM are now commonplace. So how is that memory actually used, and what role does it play in database applications? The tests that follow answer these questions.
SQL Server 2005 is a popular relational database system
The restriction that a program can only use 2GB of memory comes from the 32-bit operating system architecture. A 32-bit operating system uses 32-bit memory addresses, so the addressable range is limited to 4GB (4GB is 2 to the 32nd power in bytes). For safety, however, operating systems are usually designed so that the application's address space and the kernel's address space are kept separate from each other, which means the application and the kernel each get 2GB of address space. The exact split differs between implementations, but most current operating systems are consistent on this point.
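To make the split concrete, here is a minimal sketch in C (assuming a Windows build environment with windows.h) that asks the operating system for the user-mode address range of the current process; on a default 32-bit Windows install the highest user-mode address comes out just under 2GB, and with the /3GB switch described below it rises toward 3GB:

    /* Sketch: print the user-mode virtual address range granted to a process. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        SYSTEM_INFO si;
        GetSystemInfo(&si);   /* fills in the address range among other fields */

        printf("Lowest  user-mode address: %p\n", si.lpMinimumApplicationAddress);
        printf("Highest user-mode address: %p\n", si.lpMaximumApplicationAddress);

        /* 2^32 bytes = 4GB of addresses in total; whatever is not handed to
         * user mode stays reserved for the kernel. */
        return 0;
    }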
To let programs break through the 2GB addressing limit, the modern Windows NT kernel provides a workaround: the 4GB memory tuning feature, which extends the user-mode address space to 3GB and squeezes the kernel address space down to 1GB. Applications that need large amounts of memory, such as SQL Server, can gain performance from this feature. To use it, the /3GB switch is added to the startup parameters of the Windows Server operating system. This switch is commonly discussed alongside PAE (Physical Address Extension), which is what DEP (Data Execution Prevention) and access to more than 4GB of physical memory rely on.
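As a rough illustration only (the ARC path, partition numbers, and OS description below are placeholders and will differ per machine), a Windows Server 2003 boot.ini entry with the /3GB switch, here paired with /PAE, looks roughly like this:

    [boot loader]
    timeout=30
    default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003" /fastdetect /3GB /PAE

After editing boot.ini and rebooting, an application still has to be linked with the large-address-aware flag to actually use the extra gigabyte; SQL Server ships with that flag set, which is why it is a typical beneficiary of this tuning.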