Python Development thread: Thread & Daemon Thread & Global Interpreter Lock

Source: Internet
Author: User
Tags: mutex

From:https://www.cnblogs.com/jokerbj/p/7460260.html

One: Introduction to the threading module

The multiprocessing module closely imitates the interface of the threading module, so the two are very similar at the usage level; it will not be described in detail here.

Website Link: https://docs.python.org/3/library/threading.html?highlight=threading#

Two: Two ways to start a thread (way one; way two)
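The two ways can be sketched as follows (the function and class names, such as sayhi and MyThread, are illustrative choices, not from the original post):

```python
from threading import Thread
import time

# Way one: pass a target function to Thread
def sayhi(name):
    time.sleep(0.1)
    print('%s say hello' % name)

# Way two: subclass Thread and override run()
class MyThread(Thread):
    def __init__(self, name):
        super().__init__()
        self.name = name  # Thread exposes a settable name property

    def run(self):
        time.sleep(0.1)
        print('%s say hello' % self.name)

if __name__ == '__main__':
    t1 = Thread(target=sayhi, args=('Egon',))  # note: args must be a tuple
    t2 = MyThread('Alex')
    t1.start()
    t2.start()
    t1.join()
    t2.join()
    print('main thread')
```

Way one is the more common form; way two is useful when a thread carries its own state.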

Three: The differences between starting multiple threads in one process and starting multiple child processes under one process: 1) which starts faster; 2) compare the PIDs; 3) do threads in the same process share the process's data? Four: Exercises

Practice One:

a multi-threaded concurrent socket server and client
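A minimal sketch of this exercise, with the server and a demo client in one script (the address 127.0.0.1:8090 and the upper-casing echo handler are assumptions for the sketch):

```python
from socket import socket, AF_INET, SOCK_STREAM
from threading import Thread
import time

HOST, PORT = '127.0.0.1', 8090  # assumed address for the sketch

def handle(conn):
    # one thread per client: echo back whatever arrives, upper-cased
    with conn:
        while True:
            data = conn.recv(1024)
            if not data:
                break
            conn.sendall(data.upper())

def server():
    with socket(AF_INET, SOCK_STREAM) as s:
        s.bind((HOST, PORT))
        s.listen(5)
        while True:
            conn, addr = s.accept()
            Thread(target=handle, args=(conn,), daemon=True).start()

if __name__ == '__main__':
    Thread(target=server, daemon=True).start()
    time.sleep(0.2)  # give the server time to bind before connecting
    with socket(AF_INET, SOCK_STREAM) as c:
        c.connect((HOST, PORT))
        c.sendall(b'hello')
        print(c.recv(1024))
```

Because each connection gets its own thread, several clients can be served concurrently even though accept() itself is serial.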

Practice 2: three tasks — one receives user input, one formats the input to uppercase, and one saves the formatted result to a file.
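One way to sketch Practice 2 is with three threads cooperating through queues (the queue names, the sentinel convention, and the file path db.txt are assumptions; real user input is stood in for by a list):

```python
from threading import Thread
from queue import Queue

raw_q, fmt_q = Queue(), Queue()

def collect(lines):
    # stand-in for reading user input
    for line in lines:
        raw_q.put(line)
    raw_q.put(None)  # sentinel: no more input

def formatter():
    # upper-case each line and pass it downstream
    while True:
        line = raw_q.get()
        if line is None:
            fmt_q.put(None)
            break
        fmt_q.put(line.upper())

def save(path='db.txt'):
    # write formatted lines to a file until the sentinel arrives
    with open(path, 'w') as f:
        while True:
            line = fmt_q.get()
            if line is None:
                break
            f.write(line + '\n')

if __name__ == '__main__':
    threads = [
        Thread(target=collect, args=(['hello', 'world'],)),
        Thread(target=formatter),
        Thread(target=save),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(open('db.txt').read())
```

Queue handles the locking internally, so the three threads never touch shared state directly.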

Five: Other thread-related methods
Methods on a Thread instance:
  # is_alive(): returns whether the thread is alive (isAlive() is the legacy alias)
  # getName(): returns the thread name
  # setName(): sets the thread name
Some of the functions provided by the threading module:
  # threading.current_thread(): returns the current Thread object
  # threading.enumerate(): returns a list of running threads — "running" means started and not yet terminated
  # threading.active_count(): returns the number of running threads, with the same result as len(threading.enumerate())
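The methods listed above can be demonstrated in a few lines (this example uses the modern snake_case names; the thread name 'worker-1' is an arbitrary choice):

```python
import threading
import time

def work():
    time.sleep(0.2)

t = threading.Thread(target=work, name='worker-1')
t.start()

print(t.is_alive())                     # True while work() is still sleeping
print(t.name)                           # 'worker-1' (t.getName() in legacy code)
print(threading.current_thread().name)  # 'MainThread'
print(threading.active_count())         # same as len(threading.enumerate())
print(threading.enumerate())            # the list of running Thread objects
t.join()
print(t.is_alive())                     # False after join() returns
```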

The main thread waits for the child thread to end

from threading import Thread
import time

def sayhi(name):
    time.sleep(2)
    print('%s say hello' % name)

if __name__ == '__main__':
    t = Thread(target=sayhi, args=('Egon',))  # note the trailing comma: args must be a tuple
    t.start()
    t.join()
    print('main thread')
    print(t.is_alive())
    '''
    Egon say hello
    main thread
    False
    '''
Six: Daemon threads

Whether for processes or threads, the rule is the same: a daemon x waits for the main x to finish running, and is then destroyed.

It should be emphasized that "finish running" is not the same as "be terminated".

#1. For the main process, "run complete" means the main process's code has finished running. For the main thread, "run complete" means that all non-daemon threads in the main thread's process have finished running; only then is the main thread considered finished.

Detailed Explanation:

#1 The main process is "done running" once its code finishes (the daemon process is reclaimed at that point), but the main process then waits until all non-daemon child processes have finished and reclaims their resources (otherwise zombie processes result) before it actually ends. #2 The main thread is "done running" only after all other non-daemon threads have finished (the daemon thread is reclaimed at that point). Because the end of the main thread means the end of the process, and the process's resources are reclaimed as a whole, the process must ensure all non-daemon threads have finished before it ends.
from threading import Thread
import time

def sayhi(name):
    time.sleep(2)
    print('%s say hello' % name)

if __name__ == '__main__':
    t = Thread(target=sayhi, args=('Egon',))
    t.daemon = True  # must be set before t.start(); t.setDaemon(True) is the legacy form
    t.start()
    print('main thread')
    print(t.is_alive())
    '''
    main thread
    True
    '''
An example that confuses people
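The confusing example referred to here was omitted from this copy of the post; it is commonly reconstructed as follows. The point: the daemon foo is only killed once the main thread's code and the non-daemon bar have both finished, so end123 still gets printed because bar keeps the process alive for 3 seconds.

```python
from threading import Thread
import time

def foo():
    print(123)
    time.sleep(1)
    print('end123')  # still printed: the non-daemon bar keeps the process alive

def bar():
    print(456)
    time.sleep(3)
    print('end456')

if __name__ == '__main__':
    t1 = Thread(target=foo)
    t2 = Thread(target=bar)
    t1.daemon = True  # only foo is a daemon
    t1.start()
    t2.start()
    print('main-------')
    # typical output: 123, 456, main-------, end123, end456
```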

Seven: Global Interpreter Lock (GIL) introduction
Definition: In CPython, the global interpreter lock, or GIL, is a mutex that prevents multiple native threads from executing Python bytecodes at once. This lock is necessary mainly because CPython's memory management is not thread-safe. (However, since the GIL exists, other features have grown to depend on the guarantees that it enforces.) Conclusion: in the CPython interpreter, among the threads started under the same process, only one can execute at any given moment, so multithreading cannot take advantage of multiple cores.

The first thing to be clear about is that the GIL is not a feature of the Python language; it is a concept introduced by one implementation of the Python interpreter (CPython). An analogy: C++ is a language (syntax) standard that can be compiled into executable code by different compilers, such as GCC, Intel C++, or Visual C++. Python is the same: the same code can be executed by different Python runtimes such as CPython, PyPy, or Psyco, and Jython, for one, has no GIL. However, because CPython is the default Python runtime in most environments, many people equate CPython with Python and take it for granted that the GIL is a defect of the Python language. So let's be clear: the GIL is not a Python language feature, and Python can exist entirely without it.

This article thoroughly analyzes the GIL's effect on Python multithreading and is highly recommended: http://www.dabeaz.com/python/UnderstandingGIL.pdf

7.1 GIL introduction

The GIL is essentially a mutex, and like every mutex it turns concurrent execution into serial execution, so that at any given time the shared data can be modified by only one task, thereby guaranteeing data safety.

One thing is certain: to protect different pieces of data, you should use different locks.

To understand the GIL, first establish one point: every time you execute a Python program, you create a separate process. For example, python test.py, python aaa.py, and python bbb.py produce three different Python processes.

Verify that python test.py produces only a single process

Within a single Python process there is not only the main thread of test.py (and any other threads it starts) but also interpreter-level threads such as the interpreter's garbage-collection thread; in short, all of these threads run inside this one process.

#1 All data is shared, and code, as a form of data, is also shared by all threads (all of test.py's code and all of the CPython interpreter's code). For example, test.py defines a function work; all threads in the process can access its code, so we can start three threads whose target all point to that code. Being able to access it means being able to execute it. #2 Every thread's task requires passing the task's code as an argument to the interpreter's code for execution; that is, for any thread to run its own task, it must first obtain access to the interpreter's code.

Putting this together:

If multiple threads all have target=work, the execution flow is:

Multiple threads first compete for access to the interpreter's code, i.e., they acquire execute permission (the GIL), and then hand their target code to the interpreter's code to execute.

The interpreter's code is shared by all threads, so the garbage-collection thread may also execute it. This leads to a problem: for the same piece of data, say 100, thread 1 might execute x=100 at the same moment garbage collection is reclaiming 100. There is no clever way to solve this problem other than locking, which is what the GIL does: it guarantees that the Python interpreter executes only one task's code at a time.

7.2 GIL and Lock

The GIL protects data at the interpreter level; to protect your own shared data, you must add locks of your own.
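A sketch of why (the counter n and the sleep are illustrative): the GIL does not make user code atomic. Here 100 threads each decrement n through a read-sleep-write sequence; with the Lock the result is 0, and without it the lost updates typically leave n at 99.

```python
from threading import Thread, Lock
import time

n = 100
lock = Lock()

def task():
    global n
    with lock:            # remove the lock to see the lost-update bug
        temp = n
        time.sleep(0.01)  # force a thread switch between the read and the write
        n = temp - 1

def run():
    global n
    n = 100
    threads = [Thread(target=task) for _ in range(100)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return n

if __name__ == '__main__':
    print(run())  # 0 with the lock; typically 99 without it
```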

7.3 GIL and multithreading

With the GIL present, only one thread in the same process executes at any given moment.

Hearing this, some students immediately object: processes can take advantage of multiple cores but carry a large overhead, while Python's threads have low overhead but cannot use multiple cores. Does that mean Python is useless, and PHP is the greatest language?

Don't worry, we're not finished yet.

To solve this problem, we need to agree on several points:

#1. Is the CPU being used for computation, or for I/O? #2. Multiple CPUs mean multiple cores can compute in parallel, so multicore improves computational performance. But once a CPU hits I/O blocking it still has to wait, so extra cores are of no help for I/O operations.

A worker is equivalent to a CPU: computation is the worker working, and I/O blocking is the process of supplying the worker with raw materials. If no raw materials are on hand, the worker must stop until they arrive.

If most of your factory's tasks involve waiting for raw materials (I/O-intensive), then hiring more workers doesn't help much; it would be better to have one person who does other work while the materials are on the way.

Conversely, if your factory always has a full supply of raw materials, then the more workers, the more efficient it is.

Conclusion:

For computation, the more CPUs the better; but for I/O, more CPUs are useless.

Of course, running any program benefits somewhat from more CPU performance (regardless of the scale of the improvement, there is always some), because a program is almost never purely computational or purely I/O. So we can only judge, relatively, whether a program is compute-intensive or I/O-intensive, and from there analyze whether Python's multithreading is actually useful.

# Analysis: we have four tasks to handle, and we want a concurrent effect. The options are: Scheme one: start four processes. Scheme two: one process, start four threads.
# Single-core case:
If the four tasks are compute-intensive, there is no multicore to exploit; scheme one merely adds process-creation overhead, so scheme two wins.
If the four tasks are I/O-intensive, scheme one's process-creation cost is large and process switching is far slower than thread switching, so scheme two wins.
# Multi-core case:
If the four tasks are compute-intensive, multicore means parallel computation; but in Python only one thread per process executes at a time, so threads cannot use multiple cores, and scheme one wins.
If the four tasks are I/O-intensive, more cores cannot solve the I/O waiting problem, so scheme two wins.
# Conclusion: computers today are basically all multicore. For compute-intensive tasks, Python's multithreading brings little performance gain; it may even be worse than serial execution (which has no heavy switching). But for I/O-intensive tasks, it brings a significant efficiency improvement.
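The analysis above can be checked empirically (the worker count of 4 and the loop/sleep sizes are arbitrary; absolute timings vary by machine): time 4 threads against 4 processes on a compute-bound task and an I/O-bound task. On a multicore machine, processes should win the first race, while threads should win or tie the second.

```python
from multiprocessing import Process
from threading import Thread
import time

def compute():
    # compute-intensive: a pure CPU loop
    res = 0
    for i in range(5_000_000):
        res += i

def io():
    # I/O-intensive: simulated by sleeping
    time.sleep(1)

def timed(worker, task, n=4):
    # start n workers on the task and return the wall-clock time
    ws = [worker(target=task) for _ in range(n)]
    start = time.time()
    for w in ws:
        w.start()
    for w in ws:
        w.join()
    return time.time() - start

if __name__ == '__main__':
    print('compute, 4 threads  :', timed(Thread, compute))
    print('compute, 4 processes:', timed(Process, compute))
    print('I/O,     4 threads  :', timed(Thread, io))
    print('I/O,     4 processes:', timed(Process, io))
```

The I/O rows should both land near 1 second, since threads release the GIL while blocked in sleep(), and processes simply sleep in parallel.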

