BurpLogFilter (https://github.com/tony1016/BurpLogFilter): a small program written in Python 3 for filtering Burp Suite logs.
Why?
Why write this program? The powerful sqlmap supports batch analysis of Burp Suite logs, but a Burp Suite log records all traffic that passes through the proxy, including static resources and duplicate submissions, which hurts the efficiency of sqlmap's analysis. So I decided to write a small program that can:
- Filter requests by domain name
- Automatically filter out requests for static resources
- Automatically deduplicate URLs by pattern: of requests with the same URL and parameter names, only one copy is kept (parameter values have no effect on sqlmap)

Usage

1. Enable Burp Suite's output log (check the logging option)
2. Use burplogfilter.py to filter the log file
```
Usage: python3 burplogfilter.py [options]

Options:
  -h                              Show this help
  -f filepath                     The Burp Suite log to analyze
  --host keyword, --host=keyword  Host name filter
  -v                              Show debug messages

Example:
  python3 burplogfilter.py -f /tmp/burp.log --host='google.com' > burp-proxy.log
```
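Under the hood, a filter like this has to split the log into individual requests and drop static resources. I don't have the exact Burp log format in front of me, so the separator pattern, helper names, and extension list below are assumptions, not code from this repo; treat it as a minimal sketch of the idea:

```python
# Sketch: split a Burp Suite proxy log into entries and drop static
# resources. ASSUMPTION: entries are separated by lines of '=' characters;
# adjust SEPARATOR if your Burp version writes a different format.
import re

SEPARATOR = re.compile(r"^=+\s*$", re.MULTILINE)
STATIC_EXTENSIONS = (".js", ".css", ".png", ".jpg", ".gif", ".ico", ".svg", ".woff")

def split_entries(log_text):
    """Yield the non-empty chunks between separator lines."""
    for chunk in SEPARATOR.split(log_text):
        chunk = chunk.strip()
        if chunk:
            yield chunk

def is_static(request_line):
    """True if the request path looks like a static resource."""
    try:
        path = request_line.split()[1].split("?")[0].lower()
    except IndexError:
        return False
    return path.endswith(STATIC_EXTENSIONS)

sample = """\
======================================================
GET /app/login?user=a HTTP/1.1
Host: example.com
======================================================
GET /static/site.css HTTP/1.1
Host: example.com
======================================================
"""

# Keep only entries whose request line is not a static resource.
kept = [e for e in split_entries(sample) if not is_static(e.splitlines()[0])]
print(len(kept))  # -> 1 (only the dynamic /app/login request survives)
```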
3. Use sqlmap to batch analyze the log: `sqlmap -l burp-proxy.log --batch --smart`
4. View the analysis results: on Linux, check the output directory under /usr/local/Cellar/sqlmap/; on Windows, look under the sqlmap directory.
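The URL-pattern deduplication from the feature list above can be sketched in a few lines. The function names and the exact pattern key are my own illustration rather than this repo's code; the point is that parameter *values* are ignored and only the URL plus the set of parameter names matters:

```python
# Sketch: keep one request per "URL pattern", where a pattern is the
# scheme/host/path plus the sorted parameter NAMES (values ignored,
# since sqlmap only needs the parameter names to test for injection).
from urllib.parse import urlsplit, parse_qsl

def url_pattern(url):
    """Dedup key that ignores query-parameter values."""
    parts = urlsplit(url)
    names = tuple(sorted(n for n, _ in parse_qsl(parts.query, keep_blank_values=True)))
    return (parts.scheme, parts.netloc, parts.path, names)

def dedupe(urls):
    """Keep the first request seen for each distinct URL pattern."""
    seen, kept = set(), []
    for u in urls:
        key = url_pattern(u)
        if key not in seen:
            seen.add(key)
            kept.append(u)
    return kept

urls = [
    "http://example.com/item?id=1",
    "http://example.com/item?id=2",        # same pattern as above -> dropped
    "http://example.com/item?id=1&page=3", # different parameter set -> kept
]
print(dedupe(urls))  # keeps the first and third URLs
```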
- Next up: security research on a concurrent penetration-testing framework, and writing an automated scanning script