Ninth article: Network programming supplement and processes


The content of this article
    1. UDP protocol Sockets
    2. How to open a process
    3. Multi-process implementation of concurrent socket communication
    4. Join method
    5. Daemon process
    6. Sync Lock
    7. Process queue
    8. Producer Consumer Model
    9. Process Pool
    10. Paramiko Module

One, the UDP protocol socket

1. The difference between TCP and UDP at the transport layer:
UDP is a connectionless, unreliable datagram protocol, while TCP provides a connection-oriented, reliable byte stream.
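A minimal sketch of this difference at the socket level (my own illustration, not from the original; it assumes an echo service already listening locally on port 8080 for each protocol): a TCP client must connect before exchanging data, while a UDP client simply addresses every datagram.

# Illustration only: contrast TCP (connection + byte stream) with UDP (addressed datagrams).
from socket import socket, AF_INET, SOCK_STREAM, SOCK_DGRAM

# TCP: establish a connection first, then exchange data over the stream
tcp_client = socket(AF_INET, SOCK_STREAM)
tcp_client.connect(('127.0.0.1', 8080))
tcp_client.send(b'hello')
print(tcp_client.recv(1024))
tcp_client.close()

# UDP: no connection; every datagram carries the destination address explicitly
udp_client = socket(AF_INET, SOCK_DGRAM)
udp_client.sendto(b'hello', ('127.0.0.1', 8080))
data, addr = udp_client.recvfrom(1024)
print(data)
udp_client.close()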

2. Common applications that use UDP:
DNS (Domain Name System), NFS (Network File System), SNMP (Simple Network Management Protocol).

3. Code application:

Service side:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
import socketserver

class MyUDPHandler(socketserver.BaseRequestHandler):
    def handle(self):
        print(self.request)
        # for UDP services, self.request is a (data, socket) pair
        self.request[1].sendto(self.request[0].upper(), self.client_address)

if __name__ == '__main__':
    s = socketserver.ThreadingUDPServer(('127.0.0.1', 8080), MyUDPHandler)
    s.serve_forever()

Client:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
from socket import *

udp_client = socket(AF_INET, SOCK_DGRAM)
while True:
    msg = input('>>: ').strip()
    udp_client.sendto(msg.encode('utf-8'), ('127.0.0.1', 8080))
    data, server_addr = udp_client.recvfrom(1024)
    print(data.decode('utf-8'))

Note: this approach does not limit client concurrency; once the number of concurrent clients reaches a certain level, the server will go down. A solution is provided later.

Two, ways to open a process

There are two ways to open a process:

(1) Open a process by instantiating the Process class with a target function:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
from multiprocessing import Process
import time

def work(name):
    print('Task <%s> is running' % name)
    time.sleep(2)
    print('Task <%s> is done' % name)

if __name__ == '__main__':
    p1 = Process(target=work, args=('Xiaolan',))
    p2 = Process(target=work, args=('Xiaohong',))
    p1.start()
    p2.start()
    print('main program')

(2) Open a process by subclassing Process and overriding run:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
from multiprocessing import Process
import time

class MyProcess(Process):
    def __init__(self, name):
        super().__init__()
        self.name = name

    def run(self):
        print('Task <%s> is running' % self.name)
        time.sleep(2)
        print('Task <%s> is done' % self.name)

if __name__ == '__main__':
    p = MyProcess('Xiaolan')
    p.start()
    print('main program')
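A small supplementary sketch (my own illustration, not from the original): a started Process object exposes useful attributes such as pid and name, plus an is_alive() method for checking whether it is still running.

# Supplementary illustration: inspecting a running Process object.
from multiprocessing import Process
import time

def work():
    time.sleep(1)

if __name__ == '__main__':
    p = Process(target=work, name='worker-1')
    p.start()
    print(p.pid, p.name, p.is_alive())   # e.g. 12345 worker-1 True
    p.join()
    print(p.is_alive())                  # False once the process has finished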

Three, multi-process implementation of concurrent socket communication

Building on the ways of opening a process we just learned, we now handle each network connection in a separate process.

Service side:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
from multiprocessing import Process
from socket import *

s = socket(AF_INET, SOCK_STREAM)
s.setsockopt(SOL_SOCKET, SO_REUSEADDR, 1)
s.bind(('127.0.0.1', 8080))
s.listen(5)

def talk(conn, addr):
    while True:
        try:
            data = conn.recv(1024)
            if not data:
                break
            conn.send(data.upper())
        except Exception:
            break
    conn.close()

if __name__ == '__main__':
    while True:
        conn, addr = s.accept()
        p = Process(target=talk, args=(conn, addr))
        p.start()
    s.close()

Client:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
from socket import *

c = socket(AF_INET, SOCK_STREAM)
c.connect(('127.0.0.1', 8080))
while True:
    msg = input('>>: ').strip()
    if not msg:
        continue
    c.send(msg.encode('utf-8'))
    data = c.recv(1024)
    print(data.decode('utf-8'))
c.close()

Four, the join method

1. Definition:

(1) The join method blocks the main process (it cannot execute the statements that follow join) until the joined child process has finished; it is used to wait for the execution of multiple processes.

(2) With multiple processes and multiple join calls, the joins execute one after another: the next join does not start until the previous one has returned.

(3) Called without an argument, join waits until the process ends before moving on to the join of the next process.

2. Code:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
from multiprocessing import Process
import time

def work(name):
    print('Task <%s> is running' % name)
    time.sleep(3)
    print('Task <%s> is done' % name)

if __name__ == '__main__':
    p1 = Process(target=work, args=('Xiaolan',))
    p2 = Process(target=work, args=('Xiaohong',))
    p3 = Process(target=work, args=('Xiaolv',))
    p_list = [p1, p2, p3]
    for p in p_list:
        p.start()
    for p in p_list:
        p.join()
    print('main process')
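A small supplementary sketch (my own illustration, not part of the original) shows why this pattern is still concurrent: starting all processes first and only then joining them costs roughly the duration of the longest task, not the sum of all tasks.

# Supplementary sketch: time the start-all-then-join-all pattern.
from multiprocessing import Process
import time

def work(seconds):
    time.sleep(seconds)

if __name__ == '__main__':
    start = time.time()
    procs = [Process(target=work, args=(n,)) for n in (1, 2, 3)]
    for p in procs:
        p.start()    # all three child processes run concurrently
    for p in procs:
        p.join()     # each join returns as soon as that particular process finishes
    # roughly 3 seconds (the longest task), not 1 + 2 + 3 = 6
    print('elapsed: %.1f seconds' % (time.time() - start))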

Five, the daemon process

1. Definition:

(1) A daemon process is created inside the main process.

(2) A daemon process terminates as soon as the main process's code finishes executing.

(3) A daemon process is not allowed to open child processes of its own; otherwise this exception is thrown:

AssertionError: daemonic processes are not allowed to have children.

2. Code:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
from multiprocessing import Process
import time

def work(name):
    print('Task <%s> is running' % name)
    time.sleep(2)
    print('Task <%s> is done' % name)

if __name__ == '__main__':
    p1 = Process(target=work, args=('Xiaolan',))
    p1.daemon = True  # must be set before start()
    p1.start()
    print('main program')
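Point (3) above can be reproduced with a minimal sketch (my own illustration, not from the original): a daemonized process that tries to start a child of its own triggers the AssertionError.

# Supplementary sketch: a daemon process that tries to spawn a child raises AssertionError.
from multiprocessing import Process
import time

def child():
    print('child is running')

def work():
    p = Process(target=child)
    p.start()          # fails: daemonic processes are not allowed to have children

if __name__ == '__main__':
    p1 = Process(target=work)
    p1.daemon = True   # daemon must be set before start()
    p1.start()
    time.sleep(1)      # keep the main process alive long enough to see the traceback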

Six, the synchronization lock

1. Definition:

A lock is typically used to synchronize access to a shared resource: create a Lock object for each shared resource, call its acquire method before accessing the resource (if the lock is already held by another process, the current process waits until it is released), and call its release method once the access is finished.

2. Code:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
from multiprocessing import Process, Lock
import time

def work(name, mutex):
    mutex.acquire()
    print('Task <%s> is running' % name)
    time.sleep(2)
    print('Task <%s> is done' % name)
    mutex.release()

if __name__ == '__main__':
    mutex = Lock()
    p1 = Process(target=work, args=('Xiaolan', mutex))
    p2 = Process(target=work, args=('Xiaohong', mutex))
    p1.start()
    p2.start()
    print('main program')
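As a side note (my own sketch, not from the original): multiprocessing.Lock also supports the context-manager protocol, so acquire/release can be written as a with block, which releases the lock even if the task raises an exception.

# Equivalent variant using the lock as a context manager.
from multiprocessing import Process, Lock
import time

def work(name, mutex):
    with mutex:        # acquire() on entry, release() on exit, even on error
        print('Task <%s> is running' % name)
        time.sleep(2)
        print('Task <%s> is done' % name)

if __name__ == '__main__':
    mutex = Lock()
    for name in ('Xiaolan', 'Xiaohong'):
        Process(target=work, args=(name, mutex)).start()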

3. Code application:

Simulating a ticket-grabbing scenario

Python code:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
import json
import os
import time
from multiprocessing import Process, Lock

def search():
    dic = json.load(open('db.txt'))
    print('\033[32m[%s] sees remaining tickets <%s>\033[0m' % (os.getpid(), dic['count']))

def get_ticket():
    dic = json.load(open('db.txt'))
    time.sleep(0.5)  # simulate the network delay of reading the database
    if dic['count'] > 0:
        dic['count'] -= 1
        time.sleep(0.5)  # simulate the network delay of writing the database
        json.dump(dic, open('db.txt', 'w'))
        print('\033[31m%s bought a ticket successfully\033[0m' % os.getpid())

def task(mutex):
    search()
    mutex.acquire()
    get_ticket()
    mutex.release()

if __name__ == '__main__':
    mutex = Lock()
    for i in range(10):  # 10 concurrent buyers (assumed count)
        p = Process(target=task, args=(mutex,))
        p.start()

The db.txt file:

{"count": 0}

4. Disadvantages:

(1) Low running efficiency, because the locked section turns concurrent execution into serial execution.

(2) The locking has to be handled manually, which is cumbersome and error-prone.

Seven, the process queue

1. Definition:

(1) Queue([maxsize]): creates a shared process queue. Queue is a multi-process-safe queue that can be used to pass data between multiple processes.

(2) maxsize is the maximum number of items allowed in the queue; if omitted, there is no size limit.

2. Code:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
from multiprocessing import Queue

q = Queue(3)
q.put('first')
q.put('second')
q.put('third')
print(q.get())
print(q.get())
print(q.get())
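A small supplement (my own sketch, not from the original): with maxsize set, put blocks while the queue is full and get blocks while it is empty; the non-blocking variants raise the queue.Full and queue.Empty exceptions instead.

# Supplementary sketch of the non-blocking queue methods.
from multiprocessing import Queue
import queue   # the Full / Empty exceptions come from the standard queue module

q = Queue(2)
q.put('first')
q.put('second')
try:
    q.put_nowait('third')   # the queue already holds maxsize items
except queue.Full:
    print('queue is full')

print(q.get())
print(q.get())
try:
    q.get_nowait()          # nothing left to take
except queue.Empty:
    print('queue is empty')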

Eight, the producer-consumer model

1. Definition:

At work you may encounter a situation where one module produces data and another module processes it (a module in the broad sense: it can be a class, function, thread, process, and so on). The module that produces the data is called the producer, and the module that processes the data is called the consumer. A buffer is added between producer and consumer; picture it as a warehouse: the producer puts goods into the warehouse and the consumer takes goods out of it, and together they form the producer-consumer model.

2. Advantages:

(1) Decoupling.

(2) Supports concurrency.

(3) Smooths out the mismatch when the producer and consumer work at different speeds.

3. Code:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
from multiprocessing import Process, JoinableQueue
import time, os

def producer(q, name):
    for i in range(3):
        time.sleep(1)
        res = '%s %s' % (name, i)
        q.put(res)
        print('\033[45m<%s> produced [%s]\033[0m' % (os.getpid(), res))
    q.join()  # wait until every item this producer put has been marked task_done()

def consumer(q):
    while True:
        res = q.get()
        time.sleep(1.5)
        print('\033[34m<%s> ate [%s]\033[0m' % (os.getpid(), res))
        q.task_done()

if __name__ == '__main__':
    q = JoinableQueue()
    p1 = Process(target=producer, args=(q, 'braised pork'))
    p2 = Process(target=producer, args=(q, 'fish-fragrant shredded pork'))
    p3 = Process(target=producer, args=(q, 'pot meat'))
    c1 = Process(target=consumer, args=(q,))
    c2 = Process(target=consumer, args=(q,))
    # consumers are daemons: they die with the main process once all producers are done
    c1.daemon = True
    c2.daemon = True
    p1.start()
    p2.start()
    p3.start()
    c1.start()
    c2.start()
    p1.join()
    p2.join()
    p3.join()
    print('main program')

Nine, Process pool

1. Definition:

A Pool provides a specified number of processes for the user to call. When a new request is submitted to the pool, a new process is created to execute it if the pool is not yet full; if the number of processes in the pool has already reached the specified maximum, the request waits until a process in the pool finishes and becomes available to handle it.

2. Code:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
from multiprocessing import Pool
import os
import time

def work(n):
    print('Task <%s> is running' % os.getpid())
    time.sleep(2)
    return n ** 2

if __name__ == '__main__':
    p = Pool(4)
    res_l = []
    for i in range(10):  # submit 10 tasks (assumed count)
        res = p.apply_async(work, args=(i,))
        res_l.append(res)
    p.close()
    p.join()
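A small follow-up sketch (my own illustration, not from the original): each apply_async call returns an AsyncResult object, and the computed values can be read back with get() once the tasks are done.

# Supplementary sketch: reading results back from apply_async.
from multiprocessing import Pool

def square(n):
    return n ** 2

if __name__ == '__main__':
    with Pool(4) as p:
        results = [p.apply_async(square, args=(i,)) for i in range(5)]
        print([r.get() for r in results])   # [0, 1, 4, 9, 16]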

3. The callback function of the process pool:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
import requests
import os, time
from multiprocessing import Pool

def get_page(url):
    print('<%s> get: %s' % (os.getpid(), url))
    response = requests.get(url)
    if response.status_code == 200:
        return {'url': url, 'text': response.text}

def parse_page(dic):
    print('<%s> parse: %s' % (os.getpid(), dic['url']))
    time.sleep(0.5)
    res = 'url: %s size: %s\n' % (dic['url'], len(dic['text']))
    with open('db.txt', 'a') as f:
        f.write(res)

if __name__ == '__main__':
    p = Pool(4)
    urls = [
        'https://www.baidu.com',
        'https://www.qq.com',
        'https://www.163.com',
        'https://www.sina.com',
        'https://www.jd.com',
        'https://www.taobao.com',
        'https://www.sohu.com',
    ]
    for url in urls:
        p.apply_async(get_page, args=(url,), callback=parse_page)
    p.close()
    p.join()
    print('main process pid: ', os.getpid())

Ten, Paramiko module

1. Definition:

Paramiko is a module written in Python that implements the SSH2 protocol and supports connecting to remote servers with encryption and authentication.

Because Python runs on all the platforms it supports, such as Linux, Solaris, BSD, macOS, Windows, and so on, Paramiko can be used across platforms, making it one of the best tools when you need to connect from one platform to another over SSH and perform a series of operations.

2. Installation:

Since Paramiko is a third-party module, we need to install it separately.

pip3 install paramiko

3. Code:

(1) How to connect using a password:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname='192.168.0.1', port=22, username='root', password='root123456')
stdin, stdout, stderr = ssh.exec_command('df -h')
result = stdout.read()
print(result.decode('utf-8'))
ssh.close()
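A supplementary sketch (my own illustration, reusing the host and credentials assumed above): the exit status of the remote command can be read from the channel, and stderr checked on failure.

# Supplementary sketch: check the exit status and stderr of a remote command.
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname='192.168.0.1', port=22, username='root', password='root123456')
stdin, stdout, stderr = ssh.exec_command('ls /no/such/path')
exit_status = stdout.channel.recv_exit_status()   # blocks until the command finishes
if exit_status != 0:
    print('command failed:', stderr.read().decode('utf-8'))
ssh.close()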

(2) How to connect using a private key:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
import paramiko

private_key = paramiko.RSAKey.from_private_key_file('id_rsa')
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname='192.168.0.1', port=22, username='root', pkey=private_key)
stdin, stdout, stderr = ssh.exec_command('df')
result = stdout.read()
print(result.decode('utf-8'))
ssh.close()

(3) Upload or download files:

#!/usr/bin/env python
# encoding: utf-8
# author: yanglei
import paramiko

transport = paramiko.Transport(('192.168.0.1', 22))  # port assumed to be the default SSH port 22
transport.connect(username='root', password='root123456')
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put('test.txt', '/tmp/test.txt')   # upload a local file to the remote host
sftp.get('/tmp/test.txt', 'test.txt')   # download a remote file to the local host
transport.close()
