For a penetration test I was handed a fairly large project with hundreds of sites, so the first step was to figure out which sites were alive and which were not. I put together a small script so this would be easy to reuse later.
The implementation is as follows:
#!/usr/bin/python
# -*- coding: utf-8 -*-
'''
@Author: Joy_nick
@Blog: http://byd.dropsec.xyz/
'''
import requests
import sys

# Read the target URLs, one per line
f = open('url.txt', 'r')
url = f.readlines()
length = len(url)
url_result_success = []
url_result_failed = []
for i in range(0, length):
    try:
        # verify=False skips TLS certificate checks; timeout avoids hanging on dead hosts
        response = requests.get(url[i].strip(), verify=False, allow_redirects=True, timeout=5)
        if response.status_code != 200:
            raise requests.RequestException(u"status code error: {}".format(response.status_code))
    except requests.RequestException as e:
        # Unreachable site or non-200 status: record the failure and move on
        url_result_failed.append(url[i])
        continue
    url_result_success.append(url[i])
f.close()

result_len = len(url_result_success)
for i in range(0, result_len):
    print 'url %s' % url_result_success[i].strip() + ' open success'
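As a quick illustration of how the script is driven (the check_alive.py file name and the url.txt contents below are assumptions for the example, not from the original post): put one target per line in url.txt next to the script and run it.

# url.txt (hypothetical contents):
#   https://example.com
#   http://this-host-does-not-exist.invalid

$ python check_alive.py
url https://example.com open success

The unreachable host ends up in url_result_failed and is simply not printed.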
The test results are as follows:
The problem encountered:
When I first started testing, any site that was unreachable or did not exist made the program error out and stop on the spot. The culprit turned out to be the line where response.status_code != 200 reads the status code: a site that cannot be opened at all never returns a status code, so requests.get() raises an exception before the != 200 check can run, and the program had no way to handle it.
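To see the failure mode in isolation, here is a minimal sketch (the domain is made up) showing that requests raises a ConnectionError, a subclass of requests.RequestException, before any response or status code exists:

import requests

try:
    # An unresolvable host never produces a response object at all
    requests.get('http://this-host-does-not-exist.invalid', timeout=5)
except requests.RequestException as e:
    print 'request itself failed, no status code: %s' % e

This is why the bare status-code comparison crashes the loop: the exception fires first.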
Solution:
Use try ... except ... else to catch the exception.
The specific code is:
try:
    response = requests.get(url[i].strip(), verify=False, allow_redirects=True, timeout=5)
    if response.status_code != 200:
        raise requests.RequestException(u"status code error: {}".format(response.status_code))
except requests.RequestException as e:
    # Unreachable site or non-200 status: record the failure and move on
    url_result_failed.append(url[i])
    continue
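Since the fix is described as try/except/else, the success bookkeeping can equally live in an else branch, which runs only when the try block raised nothing. A minimal sketch of the same loop body written that way (same variables as the script):

try:
    response = requests.get(url[i].strip(), verify=False, allow_redirects=True, timeout=5)
    if response.status_code != 200:
        raise requests.RequestException(u"status code error: {}".format(response.status_code))
except requests.RequestException as e:
    url_result_failed.append(url[i])
else:
    # Reached only if no exception was raised above
    url_result_success.append(url[i])

With the else clause, the continue statement is no longer needed.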
That covers using a Python script for batch site liveness detection, along with the problem I ran into and its solution. I hope it helps; if you have any questions, leave me a message and I will reply as soon as I can.