The dimension cache in Analysis Services 2005
Analysis Services in SQL Server 2000 and SQL Server 2005 differ greatly in how they handle dimension members. In SQL Server 2000, all dimension members in all databases are loaded into the server's address space at startup. That memory is then unavailable to other consumers, such as the data cache, which is a significant limitation. On a 32-bit system (which provides only 3 GB of virtual address space to Analysis Services), it means the maximum number of dimension members is in the low millions. If you limit the number of member properties and keep the names of those properties short, you can reach roughly 3 to 4 million members. Beyond that limit, you must move to a 64-bit server to obtain a larger virtual address space. In Project REAL, the Item dimension contains about 7 million members and the Customer dimension approximately 5 million records, which far exceeds the capacity of SQL Server 2000 Analysis Services on a 32-bit system.
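As a rough illustration of why the 32-bit limit lands in the low millions, consider a back-of-envelope calculation. The per-member byte cost and reserved overhead below are assumptions chosen for illustration, not figures published for Analysis Services:

```python
# Back-of-envelope estimate of how many dimension members fit in a
# 32-bit address space when every member is held in memory at once.
VIRTUAL_ADDRESS_SPACE = 3 * 1024**3   # ~3 GB usable on 32-bit
OTHER_OVERHEAD = 1 * 1024**3          # assumed: code, data cache, etc.
ASSUMED_BYTES_PER_MEMBER = 500        # assumed: name, properties, indexes

max_members = (VIRTUAL_ADDRESS_SPACE - OTHER_OVERHEAD) // ASSUMED_BYTES_PER_MEMBER
print(f"~{max_members / 1e6:.1f} million members")  # → ~4.3 million members
```

Under these assumed costs the ceiling comes out near the 3-to-4-million-member range described above; richer members (more properties, longer names) push the ceiling lower.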
In SQL Server 2005, Analysis Services uses a dynamic dimension cache; members are no longer statically mapped into memory. The system loads members into memory when they are required, and when other dimension members are requested, previously loaded members are freed. We successfully built the entire Project REAL system on both 32-bit and 64-bit hardware (in fact, there were many iterations of the system), and even at this size the 32-bit system performed well.
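Conceptually, this dynamic cache behaves like a bounded least-recently-used (LRU) cache: members are loaded on demand and the least recently used ones are evicted when the cache is full. A minimal sketch of that policy follows; it is a toy model of the idea, not the actual Analysis Services implementation, and the class and parameter names are invented for illustration:

```python
from collections import OrderedDict

class DimensionMemberCache:
    """Toy LRU cache illustrating on-demand member loading (not SSAS code)."""

    def __init__(self, capacity, load_member):
        self.capacity = capacity
        self.load_member = load_member  # fetches a member from storage
        self.cache = OrderedDict()      # insertion order tracks recency

    def get(self, member_key):
        if member_key in self.cache:
            self.cache.move_to_end(member_key)  # mark as recently used
            return self.cache[member_key]
        member = self.load_member(member_key)   # load on demand
        self.cache[member_key] = member
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)      # evict least recently used
        return member

# Usage: a cache holding at most 2 members at a time
cache = DimensionMemberCache(2, load_member=lambda k: f"member-{k}")
cache.get(1); cache.get(2); cache.get(3)  # loading member 3 evicts member 1
print(list(cache.cache))                  # → [2, 3]
```

The key property this models is that memory consumption is bounded by the cache capacity rather than by the total number of dimension members, which is why dimensions far larger than physical memory can still be queried.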
There is still a dimension size beyond which a 32-bit system cannot build the dimension, but it is far larger than the SQL Server 2000 limit. The limit now depends largely on the hash tables used to process the attribute hierarchies of the dimension's key attribute. If you use too many attributes (or the attributes are too large), processing eventually exhausts the available memory and the dimension cannot be processed.
For dimensions that are too large to be processed within the available memory, there are two ways to reduce the memory needed to handle them.