Recently, some colleagues and I were working on a small program. The data volume was not particularly large: the table held around 70 million (7000W) records. But when we ran a query against it, we found it was slow, and the most obvious symptom was that the database server's CPU sat at 100%.
The SQL statement (a longitude/latitude bounding-box query; the min/max bounds are parameters):

SELECT * FROM Finger_lib_server
WHERE LNG > @min_lng AND LNG < @max_lng
  AND Lat > @min_lat AND Lat < @max_lat;
To locate the problem, I first used the following statement to check whether there were any locks in the database:
-- View locked tables:
SELECT request_session_id AS spid,
       OBJECT_NAME(resource_associated_entity_id) AS tableName
FROM sys.dm_tran_locks
WHERE resource_type = 'OBJECT'
-- spid: the session (process) holding the lock
-- tableName: the name of the locked table

-- Unlock (kill the blocking session):
DECLARE @spid INT
SET @spid = 0  -- replace 0 with the blocking spid found above
DECLARE @sql VARCHAR(100)
SET @sql = 'KILL ' + CAST(@spid AS VARCHAR)
EXEC (@sql)
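Before killing anything, it can help to see what the blocking sessions are actually running. This step is not in the original post; it is a small sketch using the standard sys.dm_exec_requests and sys.dm_exec_sql_text DMVs:

-- Show sessions that are being blocked, who is blocking them,
-- and the SQL text of each blocked request.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;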
When I ran that lock-check query, I found that nearly 100 locks were all pointing at the same table: Finger_lib_server.
Looking into query optimization on that SQL Server, I then discovered that the Finger_lib_server table had no index matching the query above, so I created one:
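The post does not show the exact index definition. A minimal sketch, assuming a composite index on the two columns from the WHERE clause (the index name is made up):

-- Hypothetical covering index for the LNG/Lat range query;
-- column names come from the query above, the index name is an assumption.
CREATE NONCLUSTERED INDEX IX_Finger_lib_server_LNG_Lat
    ON Finger_lib_server (LNG, Lat);

A composite index on (LNG, Lat) lets SQL Server seek on the LNG range and then filter Lat within it, instead of scanning all 70 million rows for every query.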
After the index was created, not only did CPU usage drop below 30%, but the query time also settled at roughly 0~60 ms.
SQL Server optimization: when the amount of data being queried is not particularly large but the database server's CPU stays at 100%, how do you optimize? In this case, the answer was to check for blocking locks and add the missing index.
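Beyond this specific case, one general way to confirm the same diagnosis is to ask SQL Server which indexes it believes are missing for the recent workload, using the standard missing-index DMVs. This is a common technique, not something shown in the original post:

-- Top missing-index suggestions, ordered by estimated impact.
SELECT TOP 10
       d.statement        AS table_name,
       d.equality_columns,
       d.inequality_columns,
       d.included_columns,
       s.user_seeks,
       s.avg_user_impact
FROM sys.dm_db_missing_index_details AS d
JOIN sys.dm_db_missing_index_groups AS g
     ON d.index_handle = g.index_handle
JOIN sys.dm_db_missing_index_group_stats AS s
     ON g.index_group_handle = s.group_handle
ORDER BY s.avg_user_impact * s.user_seeks DESC;

These suggestions are only hints based on past query plans, so they still need to be reviewed before creating any index.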