How to query the ASP.NET 2.0 Membership tables without bringing down the site
A query like the following will run just fine in your development environment:
Select * from aspnet_users where UserName = 'blabla'
Or you can fetch a particular user's profile without any problem:
Select * from aspnet_profile where userID = '...'
You can just as easily update a user's email address in the aspnet_membership table:
Update aspnet_membership SET Email = 'newemailaddress@somewhere.com' Where Email = '…'
But run these queries against the huge database on your production server, and they can bring the server down. The reason is that although these look like perfectly ordinary statements, none of the fields in their WHERE clauses is indexed. So each of the queries above results in a table scan, the worst kind of query, over millions of rows.
What did this mean for us? At Pageflakes we use fields like UserName, Email, UserID, and IsAnonymous in many marketing reports. Some of these reports are available only to the marketing team, which runs them several times a day. Whenever they did, users would call us saying, "The site is really slow!", and some pages would even time out.
Usually, when they called, we would say, "Hold on, checking right now," and start inspecting the whole site. We ran SQL Profiler to look for misbehaving queries but could find nothing wrong: Profiler showed queries executing quickly, CPU load was within normal parameters, and the site itself felt fast and smooth. We would call back and say, "We can't find any problem. What's going on?"
So why was the site genuinely slow for users while we could see no sign of a slowdown whenever we investigated?
The marketing team ran their analysis reports several times a day, and whenever those queries executed, because the fields involved were not indexed, they drove the server's I/O throughput and CPU usage sky-high.
We had 15,000 RPM SCSI drives, expensive but fast, and dual 64-bit dual-core Xeon CPUs. Even with that hardware, the sheer size of the database overwhelmed the server whenever those queries ran.
Yet the problem never showed up while the marketing team had us on the phone. Naturally: while they were talking to us, they were not running any reports, so nothing was hammering the server, and as they browsed the rest of the site they saw exactly what ordinary users saw, a perfectly healthy site.
Let's take a look at the following indexes:
Table: aspnet_Users
◆ Clustered index = ApplicationID, LoweredUserName
◆ Nonclustered index = ApplicationID, LastActivityDate
◆ Primary key = UserID
Table: aspnet_Membership
◆ Clustered index = ApplicationID, LoweredEmail
◆ Nonclustered index = UserID
Table: aspnet_Profile
◆ Clustered index = UserID
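If you want to verify these indexes on your own aspnetdb database, SQL Server's built-in sp_helpindex procedure lists them (a quick check, not part of the original article):

EXEC sp_helpindex 'dbo.aspnet_Users'
EXEC sp_helpindex 'dbo.aspnet_Membership'
EXEC sp_helpindex 'dbo.aspnet_Profile'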
Notice that most of the indexes start with ApplicationID. Unless you put ApplicationID = '...' in the WHERE clause, none of these indexes is used, and every query degenerates into a full table scan. As soon as you add ApplicationID to the WHERE clause (look up your ApplicationID in the aspnet_Applications table), all of these queries become very fast.
Also, do not use Email or UserName in the WHERE clause; they are not part of any index. Use the LoweredUserName and LoweredEmail fields instead, and make sure every query has ApplicationID in its WHERE clause.
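For example, here is a sketch of the first lookup rewritten to hit the clustered index; the application name 'myapp' is a placeholder, so substitute the LoweredApplicationName of your own application:

SELECT u.*
FROM dbo.aspnet_Users u
WHERE u.ApplicationId = (SELECT ApplicationId
                         FROM dbo.aspnet_Applications
                         WHERE LoweredApplicationName = 'myapp') -- placeholder name
  AND u.LoweredUserName = 'blabla'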
Our administrative site contained many reports, and each report contained many such queries over the aspnet_Users, aspnet_Membership, and aspnet_Profile tables. As a result, whenever the marketing team generated reports, the queries consumed all of the CPU and disk resources, and the site became very slow or even unresponsive.
So make sure you always check the WHERE and JOIN clauses of queries against these tables; otherwise you are in for serious trouble once your site goes live.
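One quick way to catch such problems before going live is to compare logical reads with and without the indexed columns; this is a sketch that reuses the placeholder values from above:

SET STATISTICS IO ON

-- Filter on the unindexed column: expect a full scan and a huge number of reads
SELECT * FROM dbo.aspnet_Users WHERE UserName = 'blabla'

-- Filter on the indexed columns: expect an index seek and only a few reads
SELECT * FROM dbo.aspnet_Users
WHERE ApplicationId = (SELECT ApplicationId FROM dbo.aspnet_Applications
                       WHERE LoweredApplicationName = 'myapp')
  AND LoweredUserName = 'blabla'

SET STATISTICS IO OFF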
Block denial-of-service (DoS) attacks
Web services are the most attractive target for hackers, because even an entry-level hacker can kill a server just by calling an expensive web service repeatedly. An Ajax start page like Pageflakes is a prime candidate for a DoS attack: if you simply request the home page over and over without preserving cookies, every hit creates a brand-new user, a new page configuration, and new widgets, and that first visit is by far the most expensive one.
That also makes it the easiest way to disrupt the site. You can try it yourself; all it takes is this simple code:
// Requires System.Net for WebClient
for (int i = 0; i < 100000; i++)
{
    WebClient client = new WebClient();
    client.DownloadString("http://www.pageflakes.com/default.aspx");
}
To your surprise, you will notice that after a couple of calls you no longer get a valid response. It's not that you have brought the server down; it's that your requests are being rejected. You can be happy that you get no service at all: you have performed a denial of service on yourself (call it a DYOS).
The simple technique I use is to remember how many requests come from each IP address and reject further requests once the count crosses a threshold. The caller's IP address is kept in the ASP.NET cache along with a request count per IP. When the count exceeds a predefined limit, requests from that IP are rejected for a period of time, say 10 minutes. After 10 minutes, requests from that IP address are allowed again.
I have a class named ActionValidator that keeps a count per IP for specific activities, such as first visit, revisit, postback, adding a new widget, and adding a new page, and checks whether the count for a given activity has crossed its threshold.
public static class ActionValidator
{
    private const int DURATION = 10; // 10 minute period

    public enum ActionTypeEnum
    {
        FirstVisit = 100,   // The most expensive one; choose the value wisely
        ReVisit = 1000,     // Welcome to revisit as many times as the user likes
        Postback = 5000,    // Not much of a problem for us
        AddNewWidget = 100,
        AddNewPage = 100,
    }
The enum values double as thresholds: each activity type is allowed that many hits per IP within a fixed duration, here 10 minutes.
A static method named IsValid does the check. It returns true while the request is within its limit and false when the request should be rejected. Once you get false, call Response.End() to prevent ASP.NET from processing the request any further. You can also redirect to a page that says, "Congratulations! You have succeeded in a denial-of-service attack, on yourself."
    public static bool IsValid(ActionTypeEnum actionType)
    {
        HttpContext context = HttpContext.Current;

        // Reject requests from crawlers outright
        if (context.Request.Browser.Crawler) return false;

        // One counter per action type per client IP
        string key = actionType.ToString() + context.Request.UserHostAddress;
        var hit = (HitInfo)(context.Cache[key] ?? new HitInfo());

        // The enum value itself is the threshold for this action type
        if (hit.Hits > (int)actionType) return false;
        else hit.Hits++;

        // Store only on the first hit so the absolute expiration is not reset
        if (hit.Hits == 1)
            context.Cache.Add(key, hit, null, DateTime.Now.AddMinutes(DURATION),
                System.Web.Caching.Cache.NoSlidingExpiration,
                System.Web.Caching.CacheItemPriority.Normal, null);

        return true;
    }
}
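The HitInfo class is not shown in the original code; a minimal sketch that satisfies the usage above needs nothing more than a counter:

public class HitInfo
{
    // Number of requests seen from one IP for one action type within DURATION
    public int Hits;
}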
The cache key combines the activity type and the client IP address. The code first checks whether an entry for that activity and IP already exists in the cache. If not, it starts counting and remembers the IP for the specified duration; the absolute expiration on the cache item guarantees the entry is cleared after that period, so counting starts over. If an entry already exists, it reads the current hit count, checks whether the limit has been crossed, and increments the counter if it has not.
There is no need to store the updated value back with Cache[key] = hit: since hit is a reference type, changing it changes the object inside the cache too. In fact, if you did add it to the cache again, the expiration timer would restart, which would break the logic of counting over a fixed duration.
Usage is very simple. Here is the code on the default.aspx page:
protected override void OnInit(EventArgs e)
{
    base.OnInit(e);

    // Check whether this visit is valid or not
    if (!IsPostBack)
    {
        // Block cookieless visit attempts
        if (Profile.IsFirstVisit)
        {
            if (!ActionValidator.IsValid(ActionValidator.ActionTypeEnum.FirstVisit))
                Response.End();
        }
        else
        {
            if (!ActionValidator.IsValid(ActionValidator.ActionTypeEnum.ReVisit))
                Response.End();
        }
    }
    else
    {
        // Limit the number of postbacks
        if (!ActionValidator.IsValid(ActionValidator.ActionTypeEnum.Postback))
            Response.End();
    }
}
Here I check the specific scenarios: first visit, revisit, and postback.
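The same guard works inside the expensive operations themselves. As a sketch, a hypothetical click handler for adding a widget (the handler name and body are invented for illustration; only the ActionValidator call comes from the article) could look like this:

// Hypothetical handler: only the ActionValidator check is from the article
protected void AddWidgetButton_Click(object sender, EventArgs e)
{
    if (!ActionValidator.IsValid(ActionValidator.ActionTypeEnum.AddNewWidget))
    {
        // Threshold for this IP crossed: stop processing the request
        Response.End();
    }
    // ... normal widget-creation logic goes here ...
}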
Of course, you can also block DoS attacks with hardware such as a Cisco firewall, and your hosting provider's network most likely already protects against network-level DoS and DDoS attacks such as TCP SYN floods and malformed packet floods. But that is all such hardware guards against. It cannot inspect packets and discover that a particular IP address keeps loading the site without keeping cookies, or keeps adding too many widgets. Those are application-level DoS attacks; hardware cannot block them, and you must handle them in your own code.
Very few websites take precautions against application-level DoS attacks today, so it is easy to write a few lines of code that bring a server down simply by hitting its expensive pages or web services over and over from a broadband connection. I hope this small amount of work helps you protect your web application from DoS attacks effectively.
Summary
You have learned how to get more ASP.NET performance out of the same hardware, how to use Ajax techniques to make your site load faster and feel smoother, how to defend it against heavy and abusive traffic, and how to serve static content from a content delivery network to absorb traffic spikes. All of these techniques help your site load faster, run more smoothly, and carry a higher load at lower cost.
- 10 tips for ASP.NET application design
- Secrets of ASP.NET performance improvement: pipeline and process optimization
- For ASP.NET programmers: security issues in websites