Web Vulnerability Scanning Tool (Python)

Source: Internet
Author: User
Tags: http, cookie, sql injection, xss

This is a small web vulnerability scanner I wrote last year, mainly targeting simple SQL injection vulnerabilities, blind SQL injection, and XSS vulnerabilities. The code is based on the source of two small tools by a GitHub author (said to be one of the authors of sqlmap), rewritten according to my own ideas. The usage instructions and source code follow.

First, instructions for use:

1. Operating Environment:

Linux command-line environment with Python 2.7
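If in doubt, you can confirm the interpreter version first (any Python 2.7.x should do):

python -V    // should report Python 2.7.x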

2. Program Source code:

vim scanner    // create a file named scanner

chmod a+x scanner    // make the file executable

3. Run the program:

python scanner    // run the program


If you do not supply a target URL, the program prints help information reminding you which parameters can be entered (a sample help screen is shown after the parameter list).

Parameters include:

-h, --help print help information

--url target URL to scan

--data parameters for the POST request method

--cookie HTTP Cookie request header value

--user-agent HTTP User-Agent request header value

--random-agent use a randomly chosen browser User-Agent (browser camouflage)

--referer HTTP Referer request header value (the page that links to the target URL)

--proxy HTTP proxy address
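With these options, the help screen generated by optparse looks roughly like this (the exact wording comes from the add_option help strings in the source code below, so treat this as an approximation):

Usage: scanner [options]

Options:
  -h, --help           show this help message and exit
  --url=URL            Target URL
  --data=DATA          POST data
  --cookie=COOKIE      HTTP Cookie header value
  --user-agent=UA      HTTP User-Agent header value
  --random-agent       Use randomly selected HTTP User-Agent header value
  --referer=REFERER    HTTP Referer header value
  --proxy=PROXY        HTTP proxy address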

For example, to scan "http://127.0.0.1/dvwa/vulnerabilities/sqli/?id=&Submit=Submit":

python scanner --url="http://127.0.0.1/dvwa/vulnerabilities/sqli/?id=&Submit=Submit" --cookie="security=low; PHPSESSID=menntb9b2isj7qha739ihg9of1"
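If the target page takes its parameters from a POST form instead, they go into --data rather than the query string, and --random-agent and --proxy can be combined freely. The URL and field names below are hypothetical, for illustration only:

python scanner --url="http://127.0.0.1/dvwa/vulnerabilities/sqli/" --data="id=1&Submit=Submit" --cookie="security=low; PHPSESSID=..." --random-agent --proxy="http://127.0.0.1:8080"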

The output scan results show the following:

There is an XSS vulnerability: the response matches the ">.xss.<" entry in the signature library, meaning the payload is reflected outside of any tag.

There is a SQL injection vulnerability, and the error messages identify the target web server's database as MySQL.

There is a blind SQL injection vulnerability.
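To make the last two verdicts concrete: error-based injection is reported when a DBMS error signature appears in the tampered response, and blind injection is inferred by sending a logically TRUE payload (AND 1=1) and a logically FALSE payload (AND 1=2), then comparing both responses to the original page with difflib. A minimal standalone sketch of that logic, using hypothetical response bodies (the signatures and thresholds are the ones from the source code below):

# -*- coding: utf-8 -*-
import difflib, re

# Error-based check: a typical MySQL error fragment (example text only)
error_page = "You have an error in your SQL syntax; check the manual for your MySQL server version"
if re.search(r"SQL syntax.*MySQL", error_page, re.I):  # signature from DBMS_ERRORS["MySQL"]
    print "error-based SQLi suspected (MySQL)"

# Boolean-blind check: hypothetical responses for the AND 1=1 / AND 1=2 payloads
original   = "<html><title>User</title>First name: admin</html>"
resp_true  = "<html><title>User</title>First name: admin</html>"  # TRUE behaves like the original
resp_false = "<html><title>User</title></html>"                   # FALSE drops the record

ratios = dict((name, difflib.SequenceMatcher(None, original, resp).quick_ratio())
              for name, resp in (("true", resp_true), ("false", resp_false)))
# Same thresholds as scan_page_sql: TRUE nearly identical to the original, FALSE clearly different
if ratios["true"] > 0.95 and ratios["false"] < 0.95:
    print "blind SQLi suspected"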


Second, the source code:

The code has been verified to run; I personally recommend testing it against DVWA.

#!/usr/bin/env python
# -*- coding: utf-8 -*-

import optparse, random, re, string, urllib, urllib2, difflib, itertools, httplib

NAME = "Scanner for RXSS and SQLI"
AUTHOR = "lishuze"

PREFIXES = (" ", ") ", "' ", "') ", "\" ")
SUFFIXES = ("", "-- -", "#")
BOOLEAN_TESTS = ("AND %d=%d", "OR NOT (%d=%d)")
TAMPER_SQL_CHAR_POOL = ('(', ')', '\'', '"')
TAMPER_XSS_CHAR_POOL = ('\'', '"', '>', '<', ';')

GET, POST = "GET", "POST"
COOKIE, UA, REFERER = "Cookie", "User-Agent", "Referer"
TEXT, HTTPCODE, TITLE, HTML = xrange(4)

_headers = {}

USER_AGENTS = (
    "Mozilla/5.0 (X11; Linux i686; rv:38.0) Gecko/20100101 Firefox/38.0",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.101 Safari/537.36",
    "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_7_0; en-US) AppleWebKit/534.21 (KHTML, like Gecko) Chrome/11.0.678.0 Safari/534.21",
)

# (regex template, human-readable context description, regex of content to strip first)
XSS_PATTERNS = (
    (r"<!--[^>]*%(chars)s|%(chars)s[^<]*-->", "\"<!--.'.xss.'.-->\", inside the comment", None),
    (r"(?s)<script[^>]*>[^<]*?'[^<']*%(chars)s|%(chars)s[^<']*'[^<]*</script>", "\"<script>.'.xss.'.</script>\", enclosed by <script> tags, inside single-quotes", None),
    (r'(?s)<script[^>]*>[^<]*?"[^<"]*%(chars)s|%(chars)s[^<"]*"[^<]*</script>', "'<script>.\".xss.\".</script>', enclosed by <script> tags, inside double-quotes", None),
    (r"(?s)<script[^>]*>[^<]*?%(chars)s|%(chars)s[^<]*</script>", "\"<script>.xss.</script>\", enclosed by <script> tags", None),
    (r">[^<]*%(chars)s[^<]*(<|\Z)", "\">.xss.<\", outside of tags", r"(?s)<script.+?</script>|<!--.*?-->"),
    (r"<[^>]*'[^>']*%(chars)s[^>']*'[^>]*>", "\"<.'.xss.'.>\", inside the tag, inside single-quotes", r"(?s)<script.+?</script>|<!--.*?-->"),
    (r'<[^>]*"[^>"]*%(chars)s[^>"]*"[^>]*>', "'<.\".xss.\".>', inside the tag, inside double-quotes", r"(?s)<script.+?</script>|<!--.*?-->"),
    (r"<[^>]*%(chars)s[^>]*>", "\"<.xss.>\", inside the tag, outside of quotes", r"(?s)<script.+?</script>|<!--.*?-->"),
)

# DBMS error signatures for error-based SQL injection detection
DBMS_ERRORS = {
    "MySQL": (r"SQL syntax.*MySQL", r"Warning.*mysql_.*", r"valid MySQL result", r"MySqlClient\."),
    "Microsoft SQL Server": (r"Driver.* SQL[\-\_\ ]*Server", r"OLE DB.* SQL Server", r"(\W|\A)SQL Server.*Driver", r"Warning.*mssql_.*", r"(\W|\A)SQL Server.*[0-9a-fA-F]{8}", r"(?s)Exception.*\WSystem\.Data\.SqlClient\.", r"(?s)Exception.*\WRoadhouse\.Cms\."),
    "Microsoft Access": (r"Microsoft Access Driver", r"JET Database Engine", r"Access Database Engine"),
    "Oracle": (r"ORA-[0-9][0-9][0-9][0-9]", r"Oracle error", r"Oracle.*Driver", r"Warning.*\Woci_.*", r"Warning.*\Wora_.*"),
}

def _retrieve_content_xss(url, data=None):
    surl = ""
    for i in xrange(len(url)):  # percent-encode spaces in the query string only
        if i > url.find('?'):
            surl += url[i].replace(' ', "%20")
        else:
            surl += url[i]
    try:
        req = urllib2.Request(surl, data, _headers)
        retval = urllib2.urlopen(req, timeout=30).read()
    except Exception, ex:
        retval = getattr(ex, "message", "")
    return retval or ""

def _retrieve_content_sql(url, data=None):
    retval = {HTTPCODE: httplib.OK}
    surl = ""
    for i in xrange(len(url)):  # percent-encode spaces in the query string only
        if i > url.find('?'):
            surl += url[i].replace(' ', "%20")
        else:
            surl += url[i]
    try:
        req = urllib2.Request(surl, data, _headers)
        retval[HTML] = urllib2.urlopen(req, timeout=30).read()
    except Exception, ex:
        retval[HTTPCODE] = getattr(ex, "code", None)
        retval[HTML] = getattr(ex, "message", "")
    match = re.search(r"<title>(?P<result>[^<]+)</title>", retval[HTML], re.I)
    retval[TITLE] = match.group("result") if match else None
    retval[TEXT] = re.sub(r"(?si)<script.+?</script>|<!--.+?-->|<style.+?</style>|<[^>]+>|\s+", " ", retval[HTML])
    return retval

def scan_page_xss(url, data=None):
    print "Start scanning RXSS:\n"
    retval, usable = False, False
    url = re.sub(r"=(&|\Z)", r"=1\g<1>", url) if url else url
    data = re.sub(r"=(&|\Z)", r"=1\g<1>", data) if data else data
    try:
        for phase in (GET, POST):
            current = url if phase is GET else (data or "")
            for match in re.finditer(r"((\A|[?&])(?P<parameter>[\w]+)=)(?P<value>[^&]+)", current):
                found, usable = False, True
                print "Scanning %s parameter '%s'" % (phase, match.group("parameter"))
                prefix = "".join(random.sample(string.ascii_lowercase, 5))
                suffix = "".join(random.sample(string.ascii_lowercase, 5))
                if not found:
                    # inject prefix + shuffled special characters + suffix into the parameter
                    tampered = current.replace(match.group(0), "%s%s" % (match.group(0), urllib.quote("%s%s%s%s" % ("'", prefix, "".join(random.sample(TAMPER_XSS_CHAR_POOL, len(TAMPER_XSS_CHAR_POOL))), suffix))))
                    content = _retrieve_content_xss(tampered, data) if phase is GET else _retrieve_content_xss(url, tampered)
                    for sample in re.finditer("%s([^ ]+?)%s" % (prefix, suffix), content, re.I):
                        #print sample.group()
                        for regex, info, content_removal_regex in XSS_PATTERNS:
                            context = re.search(regex % {"chars": re.escape(sample.group(0))}, re.sub(content_removal_regex or "", "", content), re.I)
                            if context and not found and sample.group(1).strip():
                                print "!!! %s parameter '%s' appears to be XSS vulnerable (%s)" % (phase, match.group("parameter"), info)
                                found = retval = True
        if not usable:
            print "(x) no usable GET/POST parameters found"
    except KeyboardInterrupt:
        print "\r(x) Ctrl-C pressed"
    return retval

def scan_page_sql(url, data=None):
    print "Start scanning SQLI:\n"
    retval, usable = False, False
    url = re.sub(r"=(&|\Z)", r"=1\g<1>", url) if url else url
    data = re.sub(r"=(&|\Z)", r"=1\g<1>", data) if data else data
    try:
        for phase in (GET, POST):
            current = url if phase is GET else (data or "")
            for match in re.finditer(r"((\A|[?&])(?P<parameter>\w+)=)(?P<value>[^&]+)", current):
                vulnerable, usable = False, True
                original = None
                print "Scanning %s parameter '%s'" % (phase, match.group("parameter"))
                # error-based check: inject SQL-breaking characters, look for DBMS error messages
                tampered = current.replace(match.group(0), "%s%s" % (match.group(0), urllib.quote("".join(random.sample(TAMPER_SQL_CHAR_POOL, len(TAMPER_SQL_CHAR_POOL))))))
                content = _retrieve_content_sql(tampered, data) if phase is GET else _retrieve_content_sql(url, tampered)
                for (dbms, regex) in ((dbms, regex) for dbms in DBMS_ERRORS for regex in DBMS_ERRORS[dbms]):
                    if not vulnerable and re.search(regex, content[HTML], re.I):
                        print "!!! %s parameter '%s' could be error SQLi vulnerable (%s)" % (phase, match.group("parameter"), dbms)
                        retval = vulnerable = True
                # boolean-based blind check: compare TRUE/FALSE payload responses to the original
                vulnerable = False
                original = original or (_retrieve_content_sql(current, data) if phase is GET else _retrieve_content_sql(url, current))
                for prefix, boolean, suffix in itertools.product(PREFIXES, BOOLEAN_TESTS, SUFFIXES):
                    if not vulnerable:
                        template = "%s%s%s" % (prefix, boolean, suffix)
                        payloads = dict((_, current.replace(match.group(0), "%s%s" % (match.group(0), urllib.quote(template % (1 if _ else 2, 1), safe='%')))) for _ in (False, True))
                        contents = dict((_, _retrieve_content_sql(payloads[_], data) if phase is GET else _retrieve_content_sql(url, payloads[_])) for _ in (False, True))
                        if all(_[HTTPCODE] for _ in (original, contents[True], contents[False])) and (any(original[_] == contents[True][_] != contents[False][_] for _ in (HTTPCODE, TITLE))):
                            vulnerable = True
                        else:
                            ratios = dict((_, difflib.SequenceMatcher(None, original[TEXT], contents[_][TEXT]).quick_ratio()) for _ in (True, False))
                            vulnerable = all(ratios.values()) and ratios[True] > 0.95 and ratios[False] < 0.95
                        if vulnerable:
                            print "!!! %s parameter '%s' could be blind SQLi vulnerable" % (phase, match.group("parameter"))
                            retval = True
        if not usable:
            print "(x) no usable GET/POST parameters found"
    except KeyboardInterrupt:
        print "\r(x) Ctrl-C pressed"
    return retval

def init_options(proxy=None, cookie=None, ua=None, referer=None):
    global _headers
    _headers = dict(filter(lambda _: _[1], ((COOKIE, cookie), (UA, ua or NAME), (REFERER, referer))))
    urllib2.install_opener(urllib2.build_opener(urllib2.ProxyHandler({'http': proxy})) if proxy else None)

if __name__ == "__main__":
    print "----------------------------------------------------------------------------------"
    print "%s\nBy: %s" % (NAME, AUTHOR)
    print "----------------------------------------------------------------------------------"
    parser = optparse.OptionParser()
    parser.add_option("--url", dest="url", help="Target URL")
    parser.add_option("--data", dest="data", help="POST data")
    parser.add_option("--cookie", dest="cookie", help="HTTP Cookie header value")
    parser.add_option("--user-agent", dest="ua", help="HTTP User-Agent header value")
    parser.add_option("--random-agent", dest="randomAgent", action="store_true", help="Use randomly selected HTTP User-Agent header value")
    parser.add_option("--referer", dest="referer", help="HTTP Referer header value")
    parser.add_option("--proxy", dest="proxy", help="HTTP proxy address")
    options, _ = parser.parse_args()
    if options.url:
        init_options(options.proxy, options.cookie, options.ua if not options.randomAgent else random.choice(USER_AGENTS), options.referer)
        result_xss = scan_page_xss(options.url if options.url.startswith("http") else "http://%s" % options.url, options.data)
        print "\nScan results: %s vulnerabilities found" % ("possible" if result_xss else "no")
        print "----------------------------------------------------------------------------------"
        result_sql = scan_page_sql(options.url if options.url.startswith("http") else "http://%s" % options.url, options.data)
        print "\nScan results: %s vulnerabilities found" % ("possible" if result_sql else "no")
        print "----------------------------------------------------------------------------------"
    else:
        parser.print_help()
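As a standalone illustration of how scan_page_xss decides the reflection context, the snippet below runs the ">.xss.<" (outside of tags) signature against a hypothetical response in which the random probe came back unencoded; the content-removal step for scripts and comments is omitted for brevity, and the probe value is made up for this example:

# -*- coding: utf-8 -*-
import re

# Hypothetical probe, shaped like scan_page_xss builds it: prefix + special chars + suffix
probe = "v5kqa'\";<>xw2rt"

# Hypothetical response that reflects the probe unencoded between two tags
content = "<html><body>Hello %s</body></html>" % probe

# The "outside of tags" entry from XSS_PATTERNS
regex = r">[^<]*%(chars)s[^<]*(<|\Z)" % {"chars": re.escape(probe)}
if re.search(regex, content, re.I):
    print "probe reflected outside of tags -> \">.xss.<\" signature matched"

If the application had HTML-encoded the probe (e.g. &gt; instead of >), the literal match would fail and no vulnerability would be reported, which is exactly the distinction the scanner relies on.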


Finally, since this was written as a graduation design project, you can add QQ 29027646 to buy the accompanying paper and read about the underlying principles.



