A port scanner is conceptually very simple: it is little more than socket work. If the connection succeeds, the port is judged to be open.
```python
import socket

def scan(port):
    s = socket.socket()
    if s.connect_ex(('localhost', port)) == 0:
        print port, 'open'
    s.close()

if __name__ == '__main__':
    map(scan, range(1, 65536))
```
This is the simplest port scanner.
Wait a minute. The program hangs: the sockets are blocking, and each connection attempt takes a long time to time out.
So we set a timeout on the socket:

```python
s.settimeout(0.1)
```

Run it again, and it is much faster.
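To see why the timeout matters so much, here is a small timing sketch (the try_connect helper and the unroutable test address are my own illustration, not part of the original scanner): a default blocking connect to an unresponsive host can hang for many seconds while the OS retries, whereas settimeout caps the wait.

```python
import socket
import time

def try_connect(addr, timeout=None):
    # Attempt one TCP connection and return how long it took.
    s = socket.socket()
    if timeout is not None:
        s.settimeout(timeout)
    start = time.time()
    s.connect_ex(addr)  # returns an error code instead of raising
    s.close()
    return time.time() - start

if __name__ == '__main__':
    # 10.255.255.1 is commonly unroutable; without a timeout this can
    # block for a long time, with a 0.1s timeout it returns quickly.
    print(try_connect(('10.255.255.1', 80), timeout=0.1))
```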
Multi-threaded version
```python
import socket
import threading

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    if s.connect_ex(('localhost', port)) == 0:
        print port, 'open'
    s.close()

if __name__ == '__main__':
    threads = [threading.Thread(target=scan, args=(i,)) for i in xrange(1, 65536)]
    map(lambda x: x.start(), threads)
```
Run it. Wow, it is fast. Almost immediately, though, it throws an error: thread.error: can't start new thread.
Think about it: this single process tried to start 65535 threads. There are two limits we could be hitting: the maximum number of threads, and the maximum number of open socket handles (file descriptors). On Linux, both can be raised with ulimit.
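For reference, here is how you would inspect and raise those limits in a shell (the exact values vary by system; this is illustrative, not part of the original article):

```shell
# Show the current per-process limits:
ulimit -n    # maximum open file descriptors (each socket uses one)
ulimit -u    # maximum user processes/threads

# Raise the soft descriptor limit up to the hard limit for this shell session:
ulimit -S -n "$(ulimit -H -n)"
```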
But without raising those limits, how can we use many threads without hitting the error?
Use the producer-consumer pattern: put the ports in a queue and start a fixed number of worker threads.
Multi-thread + queue version
```python
import socket
import threading
from Queue import Queue, Empty

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    if s.connect_ex(('localhost', port)) == 0:
        print port, 'open'
    s.close()

def worker():
    while True:
        try:
            # Non-blocking get: avoids a worker blocking forever if
            # another thread drains the queue between empty() and get().
            port = q.get(block=False)
        except Empty:
            break
        try:
            scan(port)
        finally:
            q.task_done()

if __name__ == '__main__':
    q = Queue()
    map(q.put, xrange(1, 65536))
    threads = [threading.Thread(target=worker) for i in xrange(500)]
    map(lambda x: x.start(), threads)
    q.join()
```
Here we start 500 threads that continuously pull tasks from the queue.
Multiprocessing + queue version
We certainly cannot start 65535 processes, so again we use the producer-consumer model.
```python
import socket
import multiprocessing
from Queue import Empty

def scan(port):
    s = socket.socket()
    s.settimeout(0.1)
    if s.connect_ex(('localhost', port)) == 0:
        print port, 'open'
    s.close()

def worker(q):
    while True:
        try:
            port = q.get(block=False)
        except Empty:
            break
        try:
            scan(port)
        finally:
            q.task_done()

if __name__ == '__main__':
    q = multiprocessing.JoinableQueue()
    map(q.put, xrange(1, 65536))
    jobs = [multiprocessing.Process(target=worker, args=(q,)) for i in xrange(100)]
    map(lambda x: x.start(), jobs)
    q.join()
```
Note that the queue is passed into the worker as an argument here: with multiprocessing, the workers cannot share an ordinary module-level queue, and a process-safe queue must be used or an error is raised. We also use JoinableQueue(), which, as the name suggests, can be join()ed (together with task_done()).
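The join()/task_done() handshake works like this: join() blocks until every put() item has had a matching task_done() call. A minimal sketch (run_demo, the sentinel value, and the item count are my own illustration, written in version-agnostic syntax):

```python
import multiprocessing

def worker(q):
    while True:
        item = q.get()       # blocks until an item is available
        if item is None:     # sentinel: no more work
            q.task_done()
            return
        # ... real work would go here (e.g. scan the port) ...
        q.task_done()        # mark this item as finished

def run_demo(n=10):
    q = multiprocessing.JoinableQueue()
    for i in range(n):
        q.put(i)
    q.put(None)              # one sentinel for the single worker
    p = multiprocessing.Process(target=worker, args=(q,))
    p.start()
    q.join()                 # returns once every put() has a task_done()
    p.join()
    return p.exitcode

if __name__ == '__main__':
    print(run_demo())
```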
gevent spawn version
```python
from gevent import monkey; monkey.patch_all()
import gevent
import socket

...

if __name__ == '__main__':
    threads = [gevent.spawn(scan, i) for i in xrange(1, 65536)]
    gevent.joinall(threads)
```
Note that monkey.patch_all() must run before the modules it patches are imported; otherwise a KeyError is raised. For example, you cannot import threading first and then apply the monkey patch.
Gevent pool version
```python
from gevent import monkey; monkey.patch_all()
import socket
from gevent.pool import Pool

...

if __name__ == '__main__':
    pool = Pool(500)
    pool.map(scan, xrange(1, 65536))
    pool.join()
```
concurrent.futures version
```python
import socket
from Queue import Queue
from concurrent.futures import ThreadPoolExecutor

...

if __name__ == '__main__':
    q = Queue()
    map(q.put, xrange(1, 65536))
    with ThreadPoolExecutor(max_workers=500) as executor:
        for i in range(500):
            executor.submit(worker, q)
```
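As an aside, with concurrent.futures the manual queue is not strictly needed: the executor can distribute the ports itself via executor.map. A sketch of that alternative (the scan_ports helper, the return-the-port-or-None convention, and the well-known-port range are my own illustration, in Python 3 syntax):

```python
import socket
from concurrent.futures import ThreadPoolExecutor

def scan(port):
    # Return the port number if it is open, None otherwise.
    s = socket.socket()
    s.settimeout(0.1)
    try:
        return port if s.connect_ex(('localhost', port)) == 0 else None
    finally:
        s.close()

def scan_ports(ports, workers=500):
    # The executor hands out ports to worker threads; no explicit queue.
    with ThreadPoolExecutor(max_workers=workers) as executor:
        return [p for p in executor.map(scan, ports) if p is not None]

if __name__ == '__main__':
    # Scan only the well-known ports here to keep the example quick.
    print(scan_ports(range(1, 1025)))
```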
That concludes our port scanner and the various ways to write it.