Problems with IP Address Checks and Traffic-Flooding Software
I used to think that counting each IP address only once per day would be enough to stop users from repeatedly refreshing a promotion address. I have to admit that assumption was naive: we had been ignoring just how powerful today's traffic-flooding software is. Our own project fell victim to it, which generated a great deal of junk data and even affected the accuracy of our statistics.
To get to the bottom of the problem, we downloaded two well-known traffic-flooding tools, "Traffic Treasure" and "Traffic Genie". Even if you have never heard of them, they are, for their purpose, remarkably effective.
Both tools work on the same principle, and I suspect other such software is similar: they rely on mutual visiting among their users, exploiting the geographic spread of network nodes and the randomness of real users to make each visit look genuine. While your computer runs the traffic-flooding client, your address is visited by every other user who is also running it; naturally, while others generate traffic for you, your machine generates traffic for them, and the software handles all of this silently in the background. Within a few minutes you can watch the traffic slowly climb. But enough rambling; the countermeasures are described below.
Solutions
Since some readers do not like to read an article in full, let me emphasize up front: every record submitted to the backend is deduplicated by IP address (one count per IP per day). The rest of this article discusses how to keep that IP-based check meaningful under these circumstances.
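For context, here is a minimal sketch of that kind of per-day IP deduplication. It is written in JavaScript for a Node.js-style backend purely for illustration; the original article never names its server stack, and the in-memory store is a stand-in for a real database or cache.

// Keys of the form "ip|YYYY-MM-DD"; a real deployment would persist these.
const seen = new Set();

// Returns true the first time an IP is seen on a given day, false afterwards.
function shouldCount(ip) {
    const today = new Date().toISOString().slice(0, 10); // e.g. "2013-05-01"
    const key = ip + "|" + today;
    if (seen.has(key)) {
        return false; // this IP has already been counted today
    }
    seen.add(key);
    return true;
}

// Example: shouldCount("1.2.3.4") -> true, then false for the rest of the day.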
Solution 1: Submit data asynchronously through Ajax (invalid)
Originally, as soon as the promotion address was clicked, the backend page resolving it would record the visitor's IP address, time, and other information. That approach obviously cannot stop traffic-flooding software on its own, so we considered submitting the record data asynchronously through Ajax instead.
At first I underestimated these rogue tools, assuming that software which merely simulates HTTP requests would never trigger JavaScript. The first solution was therefore to submit the record request asynchronously via Ajax after the page finished loading. The result: invalid. Experiments show this method only filters out low-level bots.
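A minimal sketch of Solution 1, in the same jQuery style as the script later in this article (the "/record" endpoint name is my own placeholder; the original does not give one):

<script src="jquery-1.4.1.min.js" type="text/javascript"></script>
<script type="text/javascript">
$(document).ready(function () {
    // Only after the page has loaded and this script runs
    // does the server get asked to record the visit.
    $.post("/record", { ref: document.referrer });
});
</script>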
Solution 2: Check the width and height of the requesting client's browser window (invalid)
From Solution 1 we can infer that these traffic tools do not simply simulate HTTP requests; they issue them through a real browser. Yet while my machine hung there "helping" to generate traffic for others, I never saw a single web page open; all I could see was a stream of requests in the packet-capture tool. My guess was that the software hides the browser window, or shrinks it to a tiny size... So the next idea was to use JavaScript to check whether the browser viewport currently open on the client exceeds a certain minimum width and height in pixels (surely no one visits a website in a window that small~), and to submit the data through Ajax only when that threshold is exceeded.
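A minimal sketch of Solution 2; the 300×200 threshold is illustrative only (the article's original values did not survive), and "/record" is again a placeholder:

<script src="jquery-1.4.1.min.js" type="text/javascript"></script>
<script type="text/javascript">
$(document).ready(function () {
    var minWidth = 300, minHeight = 200; // illustrative thresholds
    // Only record the visit if the viewport looks like a real, visible window.
    if ($(window).width() > minWidth && $(window).height() > minHeight) {
        $.post("/record");
    }
});
</script>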
The result was still invalid. To investigate, I wrote a small page specifically to record the browser type and window size of every request... and the result left me speechless: the reported visible area of each requesting browser was perfectly normal, and the resolution was even much higher than that of my own monitor...
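That logging page amounted to something like the following sketch (the "/log" endpoint is another placeholder of mine):

<script src="jquery-1.4.1.min.js" type="text/javascript"></script>
<script type="text/javascript">
$(document).ready(function () {
    // Report what this "browser" claims about itself.
    $.post("/log", {
        agent: navigator.userAgent,   // browser type
        width: $(window).width(),     // visible viewport width
        height: $(window).height(),   // visible viewport height
        screenW: screen.width,        // reported screen resolution
        screenH: screen.height
    });
});
</script>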
Solution 3: Use mouse events as evidence of normal access (valid)
Several experiments make it clear that these bots are not simple, but they are still bots, so we can use mouse events such as mousemove, mousedown, and mouseover to decide whether the visitor is a robot. You could also require the user to click a button and treat that as the deciding action (weighing the cost to the user experience, of course). Below is a simple script:
<script src="jquery-1.4.1.min.js" type="text/javascript"></script>
<script type="text/javascript">
$(document).ready(function () {
    var movetimes = 0; // number of mousemove events seen so far
    $(document).mousemove(function (event) {
        movetimes++;
        if (movetimes > 100) { // conservative threshold before trusting the visitor
            $(document).unbind("mousemove"); // unbind mousemove so this fires only once
            // Execute the asynchronous data submission here
            alert("Asynchronous submission record request!");
        }
    });
});
</script>
Summary:
After several attempts, the third solution finally produced results, though we cannot rule out that these bots will keep improving and defeat such a simple check; after all, bots are bots. I recommend combining several tests of human interaction when deciding whether to count a visit, the better to resist this kind of malicious IP-traffic software. Given the particulars of our project, these solutions will not necessarily suit every situation. Most of this traffic-flooding software is used by small-time webmasters, usually to inflate their own sites :). I also hope fellow readers will offer suggestions and share their own experience in dealing with these tools.