Capturing HTTP requests with Fiddler.
Packet capture is the most basic use of Fiddler. Taking this blog as an example: after starting Fiddler, enter http://blog.csdn.net/ohmygirl in the browser and press Enter. The HTTP requests captured in Fiddler's Web Sessions pane are as follows:
Each field has already been described in detail, so it is not repeated here. Note the icons in the # column: each represents a different session type, including the following:
Also note the Host field of the requests. You can see responses coming from multiple subdomains of csdn.net, which shows that the architecture of a large web site usually involves many subdomains: some may exist only to cache static resources, some may be dedicated to media resources, and some may be responsible for data statistics (such as pingback).
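Requests do not have to come from a browser; traffic generated by a script also shows up in the Web Sessions list as long as it goes through Fiddler's proxy. A minimal Python sketch, assuming Fiddler is listening on its default address 127.0.0.1:8888 (adjust if you changed the port in Fiddler's connection options):

```python
import requests

# Route this script's traffic through Fiddler's proxy.
# 127.0.0.1:8888 is Fiddler's default listening address (an assumption here).
proxies = {
    "http": "http://127.0.0.1:8888",
    "https": "http://127.0.0.1:8888",
}

# This request will appear as a new session in Fiddler's Web Sessions pane.
resp = requests.get("http://blog.csdn.net/ohmygirl", proxies=proxies)
print(resp.status_code, len(resp.text))
```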
Right-click one of the requests to see the available actions, including Save (which saves the session's messages: the request, the response, or both). For example, a request header we saved looks like this:
Fiddler can save not only a single session but all captured sessions (and can import them again later), which is helpful for capturing suspicious requests, saving them, and analyzing them at any time afterwards.
If you want to resend some requests, select them and click Replay on the toolbar to resend the selected requests.
Left-click a single HTTP request and you can see the following information in the tab panel on the right:
1. Statistics.
Performance and other statistics for the HTTP request:
We can see some basic performance data, such as DNS resolution taking 8 ms, TCP/IP connection establishment taking 8 ms, and so on.
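To get an intuition for what those two numbers measure, the sketch below (plain Python, independent of Fiddler) times DNS resolution and TCP connection establishment separately for the blog's host; the values will differ from Fiddler's, since they depend on network conditions and caching:

```python
import socket
import time

host, port = "blog.csdn.net", 80

# Time DNS resolution on its own.
t0 = time.perf_counter()
ip = socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)[0][4][0]
dns_ms = (time.perf_counter() - t0) * 1000

# Time TCP connection establishment to the already-resolved address.
t0 = time.perf_counter()
with socket.create_connection((ip, port), timeout=5):
    tcp_ms = (time.perf_counter() - t0) * 1000

print(f"DNS lookup: {dns_ms:.1f} ms, TCP connect: {tcp_ms:.1f} ms")
```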
2. Inspectors.
It is divided into two parts: the upper part shows the request and the lower part shows the response. For each part, several formats are available for viewing the content. For example, a JPG image can be viewed with ImageView, and HTML/JS/CSS responses can be viewed with TextView. The Raw tab shows the original, HTTP-compliant request and response. The Auth tab shows the Proxy-Authorization and Authorization headers. The Cookies tab shows the request's Cookie header and the response's Set-Cookie header.
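The same information can also be checked from a script and compared with what the inspectors show. A small illustrative sketch using the requests library (whether a Set-Cookie header appears depends on the server):

```python
import requests

resp = requests.get("http://blog.csdn.net/ohmygirl")

# Headers actually sent by the client (compare with the request's Raw/Headers view).
print(resp.request.headers)

# The response's Set-Cookie header, if any (compare with the Cookies inspector).
print(resp.headers.get("Set-Cookie"))

# Cookies parsed from the response.
print(resp.cookies.get_dict())
```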
3. AutoResponder
This is one of Fiddler's more important and powerful features. It can intercept a request and redirect it to a local resource, or answer it with one of Fiddler's built-in responses. This lets you debug server-side behavior without modifying the server's code or configuration, because after interception and redirection you are actually reading a local file or receiving a Fiddler built-in response. With the AutoResponder enabled and the corresponding rule set (the rule in this example intercepts requests to http://blog.csdn.net/ohmygirl and serves the local file layout.html), as shown:
Then visit http://blog.csdn.net/ohmygirl in the browser, and the result is actually:
This is exactly the content of the local layout.html, which shows that the request was successfully intercepted and served locally. Of course, you can also use one of Fiddler's built-in responses. The following are the interception and redirection options Fiddler supports:
So if you want to debug a server-side script, you can intercept it to a local copy, modify and debug the script locally, and only then apply the change on the server side. This keeps debugging as close to the real environment as possible and minimizes the chance of introducing bugs.
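One way to confirm that a rule is really being answered from the local file is to fetch the URL through Fiddler from a script and compare the body with the file. A rough sketch, assuming Fiddler listens on its default 127.0.0.1:8888 and the rule maps the blog URL to a layout.html in the current directory:

```python
import requests

# Assumes Fiddler's proxy is at its default 127.0.0.1:8888.
proxies = {"http": "http://127.0.0.1:8888", "https": "http://127.0.0.1:8888"}

resp = requests.get("http://blog.csdn.net/ohmygirl", proxies=proxies)

# With the AutoResponder rule enabled, the response body should be the
# content of the local layout.html rather than the real blog page.
with open("layout.html", encoding="utf-8") as f:
    print(resp.text == f.read())
```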
Fiddler is not limited to matching a single URL; it supports several URL matching methods:
I. String matching (the default)
For example, the rule example matches both http://www.example.com and http://example.com.cn
II. Exact match
Exact matches are indicated by the EXACT: prefix, as in the example above:
EXACT:http://blog.csdn.net/ohmygirl
III. Regular expression matching
Rules starting with the regex: prefix use a regular expression to match the URL.
For example, regex:(?insx).*\.(css|js|php)$ matches all request URLs ending in css, js, or php.
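The (?insx) prefix sets .NET inline regex options (ignore case, explicit capture, single-line, ignore pattern whitespace). As an illustration of what such a rule matches, here is roughly the same expression in Python; Python has no n (explicit-capture) flag, but this pattern does not need it:

```python
import re

# Approximate Python equivalent of the .NET pattern (?insx).*\.(css|js|php)$
pattern = re.compile(r".*\.(css|js|php)$",
                     re.IGNORECASE | re.DOTALL | re.VERBOSE)

urls = [
    "http://blog.csdn.net/skin/default/style.CSS",   # matches (case-insensitive)
    "http://blog.csdn.net/scripts/common.js",        # matches
    "http://blog.csdn.net/index.php",                # matches
    "http://blog.csdn.net/ohmygirl",                 # does not match
]

for url in urls:
    print(url, bool(pattern.match(url)))
```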
4. Composer.
In older versions of Fiddler this tab was called the Request Builder. As the name implies, it lets you build a request by hand. There are two common ways to build one:
(1) Parsed: enter the request URL and click Execute; before executing you can also edit the header fields (for example, adding common headers such as Accept, Host, Referer, Cookie, and Cache-Control).
A common application of this feature is "vote brushing" (not train tickets!!), such as inflating page view counts (for ethical and security reasons, if you really go and brush votes or page views, this blog takes no responsibility).
(2) Raw: construct the HTTP request directly from raw header text. It is similar to the above, so it needs no further description.
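For comparison, a request like the one Composer builds can also be assembled in a script. An illustrative sketch with the requests library, sent through Fiddler so it appears alongside the other sessions (the header values are placeholders, and Fiddler's default 127.0.0.1:8888 proxy address is assumed):

```python
import requests

proxies = {"http": "http://127.0.0.1:8888", "https": "http://127.0.0.1:8888"}

# Headers comparable to what you would fill in on the Parsed tab;
# the values below are illustrative placeholders only.
headers = {
    "Accept": "text/html",
    "Referer": "http://blog.csdn.net/",
    "Cache-Control": "no-cache",
}

resp = requests.get("http://blog.csdn.net/ohmygirl",
                    headers=headers, proxies=proxies)
print(resp.status_code)
```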
5. Filters
This is another of Fiddler's more powerful features. Fiddler provides filtering rules along several dimensions, enough to meet day-to-day development and debugging needs. As shown:
The filter rules are:
A. Host and zone filtering. You can show only intranet or only Internet HTTP requests,
and you can also show only the HTTP requests for specific host names.
B. Client process filtering: you can capture only the requests of a specified process.
This is useful when debugging a single application's requests.
Additional settings can be found in the official Fiddler documentation.
"HTTP" Fiddler (ii)-use Fiddler to do packet Capture analysis (RPM)