The principle of a port scanner is simple: open a socket and try to connect; if the connection succeeds, the port is considered open. socket.connect_ex() returns 0 on success, so a zero result means the port is open.
import socket

def scan(port):
    s = socket.socket()
    if s.connect_ex(('localhost', port)) == 0:
        print port, 'open'
    s.close()

if __name__ == '__main__':
    map(scan, range(1, 65536))
And with that, the simplest possible port scanner is done.
Wait a minute: it runs for ages with no output. That is because the sockets are blocking, and each connection waits a long time before timing out.
Let's set the timeout ourselves:
s.settimeout(0.1)
Run it again, and it feels much faster.
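Putting the two pieces together, here is a sketch of the single-threaded scanner with the per-socket timeout in place (the same code as above, only with settimeout added inside scan):

# Simple scanner with a per-socket timeout so that closed/filtered
# ports fail fast instead of blocking for the default timeout.
import socket

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)                      # give up on a port after 100 ms
    if s.connect_ex(('localhost', port)) == 0:
        print port, 'open'
    s.close()

if __name__ == '__main__':
    map(scan, range(1, 65536))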
Multi-threaded version
import socket
import threading

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    if s.connect_ex(('localhost', port)) == 0:
        print port, 'open'
    s.close()

if __name__ == '__main__':
    threads = [threading.Thread(target=scan, args=(i,)) for i in xrange(1, 65536)]
    map(lambda x: x.start(), threads)
Run it and... oops, it throws an error: thread.error: can't start new thread.
Think about it: this process tries to open 65,535 threads, so there are two possible causes. Either the maximum number of threads is exceeded, or the maximum number of open socket handles is exceeded. On Linux both limits can be raised with ulimit.
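If you want to check the current limits from inside Python rather than with ulimit, the standard resource module (Unix only) can report them. This is just a quick sanity check, not part of the scanner:

# Inspect the per-process limits that the thread explosion can run into.
# Each call returns a (soft limit, hard limit) tuple.
import resource

print resource.getrlimit(resource.RLIMIT_NOFILE)   # max open file descriptors (sockets)
print resource.getrlimit(resource.RLIMIT_NPROC)    # max processes/threads (where available)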
Suppose we don't change those limits: how can we use multiple threads without hitting the error?
Add a queue, switch to the producer-consumer pattern, and start only a fixed number of threads.
Multithreading + Queue version
import socket
import threading
from Queue import Queue

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    if s.connect_ex(('localhost', port)) == 0:
        print port, 'open'
    s.close()

def worker():
    while not q.empty():
        port = q.get()
        try:
            scan(port)
        finally:
            q.task_done()

if __name__ == '__main__':
    q = Queue()
    map(q.put, xrange(1, 65536))
    threads = [threading.Thread(target=worker) for i in xrange(500)]
    map(lambda x: x.start(), threads)
    q.join()
Here 500 threads are started, and each keeps taking tasks from the queue until it is empty.
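One caveat: checking q.empty() and then calling q.get() is not atomic, so in general a worker could block forever if other threads drain the queue in between. It works here because the queue is fully loaded before any thread starts, but a more defensive worker (an alternative sketch, not the original code) would use a non-blocking get:

# Alternative worker: take tasks with get_nowait() and exit cleanly
# when the queue runs dry, avoiding the empty()/get() race.
from Queue import Empty

def worker():
    while True:
        try:
            port = q.get_nowait()
        except Empty:
            return
        try:
            scan(port)
        finally:
            q.task_done()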
multiprocessing + Queue version
We can hardly open 65,535 processes either, so again we use the producer-consumer pattern.
import socket
import multiprocessing

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    if s.connect_ex(('localhost', port)) == 0:
        print port, 'open'
    s.close()

def worker(q):
    while not q.empty():
        port = q.get()
        try:
            scan(port)
        finally:
            q.task_done()

if __name__ == '__main__':
    q = multiprocessing.JoinableQueue()
    map(q.put, xrange(1, 65536))
    jobs = [multiprocessing.Process(target=worker, args=(q,)) for i in xrange(100)]
    map(lambda x: x.start(), jobs)
Note that here the queue is passed to the worker as an argument, because it must be a process-safe queue; otherwise it raises an error.
Also handy is JoinableQueue(), which, as the name implies, supports join().
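Concretely, join() on a JoinableQueue blocks until every item that was put() has had a matching task_done() call. A minimal toy illustration of that behaviour (hypothetical example, separate from the scanner):

# Toy demonstration of JoinableQueue: q.join() returns only after
# task_done() has been called once for every item that was put().
import multiprocessing

def consumer(q):
    while True:
        item = q.get()
        print 'handled', item
        q.task_done()

if __name__ == '__main__':
    q = multiprocessing.JoinableQueue()
    p = multiprocessing.Process(target=consumer, args=(q,))
    p.daemon = True              # let the consumer die with the parent
    p.start()
    map(q.put, range(5))
    q.join()                     # blocks until all five items are marked done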
gevent spawn version
from gevent import monkey; monkey.patch_all()
import gevent
import socket
...

if __name__ == '__main__':
    threads = [gevent.spawn(scan, i) for i in xrange(1, 65536)]
    gevent.joinall(threads)
Note that the monkey patch must be applied before the modules it patches are imported, or you get a KeyError. For example, you cannot import threading first and then apply the monkey patch.
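In other words, the safe pattern is to make the patch the very first statement in the file, before anything that might pull in socket or threading; a minimal sketch:

# Correct ordering: patch first, then import everything else.
from gevent import monkey; monkey.patch_all()

import socket      # now the gevent-patched socket
import threading   # safe only because patch_all() already ran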
gevent pool version
from gevent import monkey; monkey.patch_all()
import socket
from gevent.pool import Pool
...

if __name__ == '__main__':
    pool = Pool(500)
    pool.map(scan, xrange(1, 65536))
    pool.join()
concurrent.futures version
import socket
from Queue import Queue
from concurrent.futures import ThreadPoolExecutor
...

if __name__ == '__main__':
    q = Queue()
    map(q.put, xrange(1, 65536))
    with ThreadPoolExecutor(max_workers=500) as executor:
        for i in range(500):
            executor.submit(worker, q)
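Submitting queue-draining worker functions works, but concurrent.futures can also divide the work by itself: submit one port per task and let the pool schedule them. A sketch of that variant, assuming the same scan function as in the earlier versions:

# Alternative: let the executor hand out the ports itself,
# one scan call per task; the with-block waits for completion.
import socket
from concurrent.futures import ThreadPoolExecutor

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    if s.connect_ex(('localhost', port)) == 0:
        print port, 'open'
    s.close()

if __name__ == '__main__':
    with ThreadPoolExecutor(max_workers=500) as executor:
        executor.map(scan, xrange(1, 65536))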