Example of a common port scanner
The principle of a port scanner is very simple: use a socket to attempt a connection, and determine from the result whether a port on the host is open.
import socket

def scan(port):
    s = socket.socket()
    if s.connect_ex(('localhost', port)) == 0:
        print(port, 'open')
This is the basic code of a socket scanner.
However, if you run it directly it will sit unresponsive for a long time. This is because the socket is blocking: each connection attempt must time out before the next one can start.
Add a timeout to this code:

s.settimeout(0.1)
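As a rough check of what the timeout buys, one can time a connection attempt to an unroutable address (10.255.255.1 is used here purely as an illustrative address that typically drops packets):

```python
import socket
import time

s = socket.socket()
s.settimeout(0.1)  # without this, a dropped SYN can block for many seconds
start = time.time()
s.connect_ex(('10.255.255.1', 80))  # illustrative black-hole address
s.close()
print('elapsed: %.2fs' % (time.time() - start))
```

With the timeout set, the attempt returns after at most roughly 0.1 seconds instead of hanging for the operating system's default connect timeout.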
The complete code is as follows:

import socket

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    if s.connect_ex(('localhost', port)) == 0:
        print(port, 'open')
    s.close()

if __name__ == '__main__':
    for port in range(1, 65536):
        scan(port)
The focus of this article is not the scanner functionality itself, but improving the quality of the code so that it runs more efficiently.
Multithreaded version:

import socket
import threading

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    if s.connect_ex(('localhost', port)) == 0:
        print(port, 'open')
    s.close()

if __name__ == '__main__':
    threads = [threading.Thread(target=scan, args=(i,)) for i in range(1, 65536)]
    for t in threads:
        t.start()
Run it and the speed is indeed fast, but it throws an exception: RuntimeError: can't start new thread.
This program tries to open 65,535 threads, and there are two possible causes for the error: exceeding the maximum number of threads, or exceeding the maximum number of socket handles. On Linux, these limits can be raised with ulimit.
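The current limits can also be inspected from inside Python with the standard resource module (Linux/Unix only; a sketch, the printed values vary by system):

```python
import resource

# soft/hard limits; every socket counts against the open-file limit
nofile_soft, nofile_hard = resource.getrlimit(resource.RLIMIT_NOFILE)
nproc_soft, nproc_hard = resource.getrlimit(resource.RLIMIT_NPROC)
print('open files:', nofile_soft, nofile_hard)
print('processes/threads:', nproc_soft, nproc_hard)

# the soft limit can be raised up to the hard limit without root, e.g.:
# resource.setrlimit(resource.RLIMIT_NOFILE, (nofile_hard, nofile_hard))
```

This is the programmatic equivalent of ulimit -n and ulimit -u for the current process.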
If we don't raise the system limits, how can we use multithreading without hitting this error?
Add a queue, switch to the producer-consumer pattern, and start a fixed number of threads.
Multithreading + queue version:

import socket
import threading
from queue import Queue

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    if s.connect_ex(('localhost', port)) == 0:
        print(port, 'open')
    s.close()

def worker():
    while not q.empty():
        port = q.get()
        try:
            scan(port)
        finally:
            q.task_done()

if __name__ == '__main__':
    q = Queue()
    for port in range(1, 65536):
        q.put(port)
    threads = [threading.Thread(target=worker) for i in range(500)]
    for t in threads:
        t.start()
    q.join()
Start 500 threads, each repeatedly taking a task from the queue and executing it.
Multiprocessing + queue version:
We can't open 65,535 processes either, so again we use the producer-consumer model.
import socket
import multiprocessing

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    if s.connect_ex(('localhost', port)) == 0:
        print(port, 'open')
    s.close()

def worker(q):
    while not q.empty():
        port = q.get()
        try:
            scan(port)
        finally:
            q.task_done()

if __name__ == '__main__':
    q = multiprocessing.JoinableQueue()
    for port in range(1, 65536):
        q.put(port)
    jobs = [multiprocessing.Process(target=worker, args=(q,)) for i in range(100)]
    for j in jobs:
        j.start()
Note that here the queue is passed to the worker as a parameter: between processes you need a process-safe queue, otherwise it will raise an error.
Also used here is JoinableQueue(), which, as the name implies, supports join().
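A minimal, self-contained sketch of the JoinableQueue semantics on their own (the item values are illustrative; it uses get() with a timeout rather than an empty() check, since items put into a multiprocessing queue may not be visible to the child process immediately):

```python
import multiprocessing
from queue import Empty

def drain(q):
    # get() with a timeout avoids racing the feeder thread that
    # transfers items from the parent to the child process
    while True:
        try:
            item = q.get(timeout=0.5)
        except Empty:
            break
        try:
            print('got', item)
        finally:
            q.task_done()  # tells join() this item is fully processed

if __name__ == '__main__':
    q = multiprocessing.JoinableQueue()
    for i in range(5):
        q.put(i)
    p = multiprocessing.Process(target=drain, args=(q,))
    p.start()
    q.join()   # blocks until task_done() has been called for every item
    p.join()
    print('all items processed')
```

join() on the queue unblocks only after every item has been matched by a task_done() call, which is exactly what the scanner uses to know all ports have been processed.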
Gevent spawn version:

from gevent import monkey
monkey.patch_all()
import gevent
import socket

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    if s.connect_ex(('localhost', port)) == 0:
        print(port, 'open')
    s.close()

if __name__ == '__main__':
    jobs = [gevent.spawn(scan, i) for i in range(1, 65536)]
    gevent.joinall(jobs)
Note that the monkey patch must be applied before the modules it patches are imported, otherwise you get a KeyError. For example, you cannot import threading first and then apply the monkey patch.
Gevent pool version:

from gevent import monkey
monkey.patch_all()
import socket
from gevent.pool import Pool

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    if s.connect_ex(('localhost', port)) == 0:
        print(port, 'open')
    s.close()

if __name__ == '__main__':
    pool = Pool(500)
    pool.map(scan, range(1, 65536))
concurrent.futures version:

import socket
from queue import Queue
from concurrent.futures import ThreadPoolExecutor

...  # scan() and worker() as in the multithreading + queue version, with worker taking q as a parameter

if __name__ == '__main__':
    q = Queue()
    for port in range(1, 65536):
        q.put(port)
    with ThreadPoolExecutor(max_workers=500) as executor:
        for i in range(500):
            executor.submit(worker, q)
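Because ThreadPoolExecutor already maintains an internal work queue, an alternative sketch drops the explicit Queue and hands the port range straight to map() (the worker count is illustrative):

```python
import socket
from concurrent.futures import ThreadPoolExecutor

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    try:
        if s.connect_ex(('localhost', port)) == 0:
            print(port, 'open')
    finally:
        s.close()

if __name__ == '__main__':
    # the with-block shuts the pool down and waits for all scans to finish
    with ThreadPoolExecutor(max_workers=500) as executor:
        executor.map(scan, range(1, 65536))
```

Each task is pulled from the executor's internal queue by an idle worker thread, so the producer-consumer structure is the same, just hidden inside the library.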