Making Good Use of Web Debugging Proxies
Original article: Using Web Debugging Proxies
When we debug front-end code, we usually spend a lot of time inspecting how CSS and JavaScript render the page, but understanding the impact of network requests is equally important. Because we often develop locally, we can easily overlook the huge effect that page size, latency, and script loading and blocking have on the user experience. A handy tool for inspecting network traffic is therefore essential.
Fortunately, every mainstream browser provides debugging tools for viewing network traffic, and third-party tools such as Fiddler and Charles not only let you view network requests but also provide extended features for interacting with the site.
The sections below cover both types of tools.
Browser-based traffic sniffing
As I mentioned, every mainstream browser has built-in debugging tools. They include:
- F12 Developer Tools in Internet Explorer
- Firefox Web Developer Tools and the Firebug add-on
- Chrome Developer Tools
- Opera Dragonfly
- Safari Web Inspector
(Translator's note: for more about these tools, see another of the translator's articles.)
Each toolset has its own unique features, but all of them can collect network traffic information. Looking at the timeline screenshots below, you will find that although the UIs differ, the data collected and displayed is very similar.
The end result is a list of the network requests the browser generated while downloading resources or data. The network tools intercept these requests and display the key data for you (the sketch after this list shows how to pull similar data from the page itself):
- The request type (GET, POST, etc.)
- What was requested
- The URI
- The status
- The size
- The time taken to complete the request
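Incidentally, modern browsers expose much of this same data to the page itself through the Resource Timing API. As a minimal sketch, assuming a browser that supports `performance.getEntriesByType`, you could log each resource's URI, duration, and transfer size from the devtools console:

```javascript
// List every resource the page loaded, with data similar to what the
// network panel shows. Run this in the devtools console.
performance.getEntriesByType('resource').forEach(function (entry) {
  console.log(
    entry.name,                           // the URI that was requested
    Math.round(entry.duration) + ' ms',   // time taken to complete the request
    (entry.transferSize || 0) + ' bytes'  // size on the wire (0 for cached or opaque cross-origin responses)
  );
});
```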
So if we look at the results in Firebug, we can see the requests that pulled back the home page and its related CSS and JavaScript files, including resources hosted on Amazon AWS. Because of the image size, I cannot show you everything that was loaded, but images and Flash SWF files were returned as well.
Digging Deeper
With this information you can dig into a specific request, determine whether the correct data was received, and investigate why a request took so long. Suppose I want to look at the request for the WebTrends JavaScript file. It took 1.2 seconds to download, and I want to see how the request was handled. I can expand the request to check whether it was gzip-compressed (it was):
and whether it was minified:
In this case the file was not minified, so I can follow up with the developers about whether that is intentional. Granted, it is only a 2 KB file, but every byte really does matter, and this kind of information helps us optimize the site.
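If you want to double-check compression from the console rather than the panel, here is a minimal sketch using the standard `fetch` API with a hypothetical same-origin path; note that cross-origin responses only expose this header when the server permits it via `Access-Control-Expose-Headers`:

```javascript
// Ask for a script and report whether the server compressed it.
fetch('/scripts/webtrends.js') // hypothetical same-origin path
  .then(function (response) {
    var encoding = response.headers.get('content-encoding');
    console.log('Content-Encoding:', encoding || '(none)');
  });
```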
Network Timing
Network latency can be a killer, especially for single-page applications that rely on external APIs or multiple script files for their functionality. Most browsers load resources as asynchronously as they can, but some resources, such as JavaScript files, can block. Keeping those files to a minimum and optimizing how resources load is important. Looking at this figure, we can see that this file took 1.4 seconds to load:
Hovering over the timeline brings up a dialog that breaks down how the request was handled:
Part of the reason is that the request was blocked for 760 milliseconds. If this turns out to be a recurring problem, you can try a script loader such as RequireJS to better manage script loading and dependencies; a minimal configuration sketch follows.
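For reference, a bare-bones RequireJS setup looks something like this; the module path below is an assumption for illustration. Scripts declared this way are fetched asynchronously instead of blocking the parser:

```javascript
// main.js - loaded via <script data-main="main" src="require.js"></script>
requirejs.config({
  paths: {
    // Map the module name 'jquery' to a script URL (hypothetical CDN path,
    // with no trailing .js per RequireJS convention).
    jquery: 'https://ajax.googleapis.com/ajax/libs/jquery/1.8.3/jquery.min'
  }
});

// jQuery is downloaded asynchronously; the callback runs once it is ready.
requirejs(['jquery'], function ($) {
  $(function () {
    console.log('jQuery ' + $.fn.jquery + ' loaded without blocking');
  });
});
```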
Ajax Requests
With dynamic applications everywhere, being able to view XHR calls is critical. You saw above how many network requests a page generates, and sifting through them all to pick out the XHR calls is not efficient. That is why most tools let you filter by request type. Here I filter down to XHR requests so that I can evaluate the request and the response:
Drilling into a request, I can see its important details, such as the request headers, status, request method, and cookies. More importantly, I can see the data the request returned:
In this example HTML came back, but the response could be anything, including plain text, JSON, or XML. Even better, when I run into a problem I can inspect the full details of the request.
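To make the round trip concrete, here is a minimal sketch of the kind of XHR call you would see in that filtered view; the endpoint is hypothetical. Everything it sends and receives is exactly what the tools above let you inspect:

```javascript
// A typical XHR call: the headers, status, and response body below are
// precisely what shows up under the tools' XHR filter.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/api/comments?post=42'); // hypothetical endpoint
xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
xhr.onload = function () {
  if (xhr.status === 200) {
    // The response could be HTML, text, JSON, or XML; here we assume JSON.
    console.log('Received', JSON.parse(xhr.responseText));
  } else {
    console.error('Request failed with status', xhr.status);
  }
};
xhr.send();
```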
Cookies
Cookies are tremendously useful, and because we rely on them so heavily, a convenient way to view cookie values makes life easier. The developer tools handle this nicely, showing which cookies were sent and which were received.
If you have ever done server-side development without client-side tooling, you will know what a relief this is.
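On the client side, the same cookies are visible to script, unless they are flagged HttpOnly; here is a quick sketch for dumping them from the console:

```javascript
// Log each cookie visible to JavaScript as a name/value pair.
// HttpOnly cookies are sent with requests but hidden from script,
// which is exactly why the tools' cookie view is so handy.
if (document.cookie) {
  document.cookie.split('; ').forEach(function (pair) {
    var index = pair.indexOf('=');
    console.log(pair.slice(0, index) + ' = ' + pair.slice(index + 1));
  });
}
```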
Best of all, these features are built into your browser, so firing up the tools and inspecting the details is extremely convenient. Sometimes, though, you need a little more horsepower.
Third-Party HTTP Proxy Tools
HTTP proxy applications such as Fiddler and the Charles Web Debugging Proxy go a step beyond browser-based traffic sniffers: they can intercept network requests not only from the browser but from other programs on your machine, which makes them broadly useful for debugging. They also offer richer features, such as (a FiddlerScript sketch after this list illustrates the throttling and scripting items):
- Bandwidth throttling
- Automatically responding to specific requests
- Replacing resources in transit
- SSL proxying
- An add-on ecosystem
- Customizable scripting
- Recording and replaying test scenarios
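As a taste of the throttling and scripting features, here is a hedged FiddlerScript sketch. FiddlerScript is JScript.NET, edited via Rules > Customize Rules..., and the trickle-delay flags below mirror Fiddler's own modem-simulation sample, though treat the exact values as an assumption to verify against its documentation:

```javascript
// In CustomRules.js (Rules > Customize Rules...), inside the Handlers class.
// FiddlerScript is JScript.NET, so the syntax is JavaScript-like.
static function OnBeforeRequest(oSession: Session) {
  // Roughly simulate a slow link: delay uploads by ~300 ms per KB sent
  // and downloads by ~150 ms per KB received (illustrative values).
  oSession["request-trickle-delay"] = "300";
  oSession["response-trickle-delay"] = "150";
}
```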
I often use the Windows-based, feature-rich Fiddler (it's free!). Thanks to its powerful feature set, it is widely used within Microsoft. Its developer, Eric Lawrence, previously worked at Microsoft and continues to maintain the software.
Looking at the interface, you will find output similar to that of the browser tools: every network request is displayed with its key information.
Drilling into a request shows more detail, including the minified jQuery source code:
Most of this information is also available in the browser-based tools, but what if you want to determine whether a newer library breaks your website? You could swap the libraries out by hand, but a better route is to create an AutoResponder rule that intercepts requests for the production files and replaces them: Fiddler receives the request and serves up a local file instead. Let's walk through it.
First I need to identify the URI to replace. In this example, I can see that my blog theme uses jQuery v1.2.6. Crazy, I know, but before I ditch it I want to check whether jQuery v1.8.3 works as expected.
Clicking the jQuery v1.2.6 entry, I open the AutoResponder tab in Fiddler's right-hand pane and check "Enable automatic responses". I can drag the URI straight into the rule editor; the resulting rule starts by matching on the URI and, when it matches, responds however I choose.
Since I want to test jQuery 1.8.3, I set the rule to replace the production version with my local copy of jQuery.
I save the rule and refresh my page. The net result is that, although the URI looks the same, inspecting it confirms that jQuery v1.8.3 is what was actually served. I can now test my site against the new version without changing the site itself.
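Under the hood, the saved rule is just a match/respond pair. Conceptually it looks like the following; the URI and local path are made up for illustration, and `EXACT:` is one of the AutoResponder's match prefixes:

```
Match:   EXACT:http://www.example.com/wp-content/themes/mytheme/js/jquery-1.2.6.min.js
Respond: C:\Users\me\Desktop\jquery-1.8.3.min.js
```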
From a debugging standpoint this feature is extremely practical, especially when you want to track down a bug hiding in an old version of a framework or library.
Add-on Ecosystem
Fiddler benefits greatly from its add-on ecosystem, which extends Fiddler through the IFiddlerExtension interface. Add-ons are currently available for:
- Stress Testing
- Security Audit
- Traffic comparison
- JavaScript formatting
Fiddler itself has far too many features to list one by one in this article. That is why there is a 330-page book devoted to making good use of it; at only $10, it lets you master this great tool inside and out.
OS X and Linux
If you are on OS X or Linux, your best choice is the Charles Web Debugging Proxy. It is well supported and commercially mature, and worth every penny. I have looked for web-development-focused alternatives, but Charles stands out.
Its interface is similar to Fiddler's, but it offers two different ways to view network traffic:
Which style you use is entirely up to you. I prefer the structure view because it feels more organized, though it makes hunting down a specific URI slightly less convenient.
Like Fiddler, Charles provides an auto-response feature, called "Map Local...", which you can invoke by right-clicking a URI. It lets you choose a local file to serve in place of the remote one.
After refreshing the page, jQuery v1.2.6 has been replaced by the jQuery v1.9 copy on my local machine.
Another excellent Charles feature is throttling network requests to simulate specific bandwidth speeds. I still remember the thrilling days of my 56k modem, and this feature lets you relive them:
Charles also offers a cross-platform interface, so it works on Windows as well.
Which one should I use?
I use all of these tools all the time, because I need to test in every mainstream browser, and having them all at hand makes problems easier to pin down. Whether you choose a browser-based sniffer or an application-based proxy, of course, depends entirely on your debugging needs.
If you just need to inspect some traffic and look over the results, a browser-based sniffer is probably your best choice.
On the other hand, if you need precise control over how URIs respond, or the flexibility of custom scripting, then a tool like Fiddler or Charles is what you need. It is reassuring that we have solid options for all of this, especially as our projects grow more complex.