Burp Suite is one of the best tools for web application testing, with features that help us perform a variety of tasks: intercepting and modifying requests, scanning web applications for vulnerabilities, brute-forcing login forms, testing the randomness of session tokens, and many other checks. This article is a complete walkthrough of Burp Suite, focusing on the following features.
1. Proxy – Burp Suite ships with an intercepting proxy that listens on port 8080 by default. Using this proxy, we can intercept and modify the traffic passing between the client and the web application.
2. Spider – Burp Suite's spider is used to crawl the links and content of a web application, and it can automatically submit login forms (using user-defined input). By crawling all the links on a site, the spider helps discover vulnerabilities in the web application through a detailed scan of that content.
3. Scanner – used to scan web applications for vulnerabilities. Some false positives may occur during testing; it is important to remember that the results of an automated scan are never 100% accurate.
4. Intruder – this feature can be used for a variety of purposes, such as exploiting vulnerabilities, fuzzing web applications, and brute-force guessing.
5. Repeater – used to modify and resend the same request any number of times and analyze the responses under different conditions.
6. Sequencer – primarily used to check the randomness of the session tokens issued by the web application, performing various statistical tests.
7. Decoder – used to decode data back to its original form, or to encode and encrypt data.
8. Comparer – used to compare two requests, responses, or any other pairs of data.
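To make the proxy feature in item 1 concrete, here is a minimal sketch of routing a request through Burp's intercepting proxy from Python. It assumes Burp is listening on its default 127.0.0.1:8080; the target URL is only a placeholder, and the helper name is our own:

```python
import urllib.request

# Burp's intercepting proxy listens on 127.0.0.1:8080 by default.
proxy_handler = urllib.request.ProxyHandler({
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
})
opener = urllib.request.build_opener(proxy_handler)

def get_via_burp(url: str) -> bytes:
    """Fetch url, routing the request through the Burp proxy so it
    can be intercepted and modified in the Proxy tab."""
    with opener.open(url) as resp:
        return resp.read()

# Example (requires Burp to be running, and for HTTPS you must trust
# Burp's CA certificate):
# body = get_via_burp("http://example.com/login")
```

Any HTTP client with proxy support (curl, requests, a browser) can be pointed at the same address; Burp then sits in the middle of every request and response.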
It has the following modules:
1. Target – shows the structure of the target site's directory.
2. Proxy – an intercepting HTTP/S proxy server that sits between the browser and the target application, allowing you to intercept, inspect, and modify the raw traffic in both directions.
3. Spider – a web crawler that intelligently enumerates the content and functionality of an application.
4. Scanner – an advanced tool that automates the discovery of web application security vulnerabilities.
5. Intruder – a customizable, highly configurable tool that automates attacks against web applications, such as enumerating identifiers, harvesting useful data, and fuzzing for common vulnerabilities.
6. Repeater – a tool for manually reissuing individual HTTP requests and analyzing the application's responses.
7. Sequencer – a tool for analyzing the randomness of session tokens and other important data items that are meant to be unpredictable.
8. Decoder – a tool for manually or intelligently decoding and encoding application data.
9. Comparer – a tool for visualizing the "diff" between two pieces of data, usually pairs of related requests and responses.
10. Extender – lets you load Burp Suite extensions and use your own or third-party code to extend Burp Suite's functionality.
11. Options – various settings for Burp Suite.
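The Decoder module's round-tripping is easy to reproduce outside Burp. This sketch shows Base64 and URL encoding/decoding, two of the transformations Decoder supports; the payload string is purely illustrative:

```python
import base64
import urllib.parse

payload = "user=admin&redirect=/home"  # illustrative data only

# Base64 encode, then decode back to the original form,
# as Burp's Decoder does.
b64 = base64.b64encode(payload.encode()).decode()
assert base64.b64decode(b64).decode() == payload

# URL-encode every character outside the unreserved set, then decode.
url_encoded = urllib.parse.quote(payload, safe="")
assert urllib.parse.unquote(url_encoded) == payload

print(b64)
print(url_encoded)
```

In practice Decoder is most useful when a parameter has been encoded several layers deep (e.g. URL-encoded Base64), since it can apply transformations in sequence or guess them automatically.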
Test workflow
Burp supports manual web application testing activities. It allows you to combine manual and automated techniques effectively, giving you complete control over every action Burp Suite performs and providing detailed information and analysis about the application you are testing. Let's walk through Burp Suite's testing workflow.
Brief analysis
The Proxy tool is the heart of Burp Suite's testing workflow: it lets you browse the application in your browser while capturing all relevant traffic, and makes it easy to launch further actions from what you capture. In a typical test, the reconnaissance and analysis phase includes the following tasks:
Manually mapping the application – with the browser working through the Burp Suite proxy, map the application by hand: follow links, submit forms, and step through multi-step processes. This populates the proxy history and the target site map with everything that was requested, and the passive spider adds to the site map any further content it can infer from the application's responses (via links, forms, etc.). You can also visit any unrequested items (shown in gray in the site map) with your browser.
Performing automated mapping where necessary – Burp Suite offers several ways to automate the mapping process: you can spider the application automatically and request the unrequested items in the site map. Be sure to review all spider settings before using this tool.
Use the content discovery feature to find further content and functionality that is not linked from the visible content you can browse or spider.
Use Burp Suite's Intruder to perform customized discovery, looping through common file and directory lists and identifying hits.
Note that before performing any automated actions, it may be necessary to update various aspects of Burp Suite's configuration, such as the target scope and session handling.
Analyzing the application's attack surface – the mapping process fills the proxy history and target site map with all the information Burp Suite has captured about the application. Both repositories contain features to help you analyze that information and assess the attack surface the application exposes. Additionally, you can use Burp Suite's Target Analyzer to report on the extent of the attack surface and the different types of URLs the application uses.
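The Intruder-style discovery step above (looping through a common file and directory list and recording hits) can be sketched in a few lines. This is an illustration only: the wordlist is a tiny placeholder, the base URL is hypothetical, and such probing must only be run against targets you are authorized to test:

```python
import urllib.request
import urllib.error

# Placeholder wordlist; real tests use large common-content lists.
WORDLIST = ["admin/", "backup/", "login.php", "robots.txt"]

def discover(base_url, words):
    """Request each candidate path and record those that do not 404."""
    hits = []
    for word in words:
        url = base_url.rstrip("/") + "/" + word
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                hits.append((url, resp.status))
        except urllib.error.HTTPError as err:
            if err.code != 404:       # 401/403 etc. still reveal content
                hits.append((url, err.code))
        except urllib.error.URLError:
            pass                      # host unreachable; skip
    return hits

# Example (authorized targets only):
# for url, status in discover("http://example.com", WORDLIST):
#     print(status, url)
```

Burp's Intruder does the same thing far more flexibly: payload positions, multiple payload sets, and response analysis (length, status, grep matches) for spotting interesting hits.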
For more detailed information, refer to the following two websites:
nxadmin: http://www.nxadmin.com/tools/689.html
WooYun: http://drops.wooyun.org/tools/1548
Using Burp Suite