Questions about concurrent programming: concurrency

Source: Internet
Author: User
Tags: mutex, semaphore
1. What do threads in the same process share?
   They share the process's memory space (the code segment, data segment, heap, and so on) and process-level resources such as open files and signals.

2. What advantages does multithreading have over a single-threaded process? Why use multithreading?
   - An operation may block for a long time, and a single waiting thread goes to sleep and makes no progress. A typical example is waiting for a network response, which can take seconds or even tens of seconds; with multiple threads, the waiting time can be put to use.
   - An operation (often a computation) may consume a lot of time; with only one thread, interaction between the program and the user is interrupted. With multiple threads, one thread can handle interaction while another computes.
   - The program logic itself may require concurrent operations, as in multi-connection download software such as BitTorrent.
   - Multi-CPU or multi-core computers (essentially the mainstream now) can execute several threads truly simultaneously, so a single thread cannot exploit the machine's full computing power.
   - Sharing data between threads is far more efficient than sharing data between processes.

3. From a C programmer's point of view, which data is thread-private and which is shared?
   Thread-private data includes the stack (local variables and function arguments), thread-local storage, and the registers of the thread's execution context; global variables, heap data, and static variables are shared by all threads.

4. What is the relationship between threads and processors?
   Threads always execute concurrently, whether on a multiprocessor computer or on a single-processor one. When the number of threads is less than or equal to the number of processors (and the operating system supports multiple processors), the concurrency is genuine: different threads run on different processors, independent of one another.
   However, when there are more threads than processors, at least one processor must run more than one thread, and the concurrency on that processor is simulated. The operating system lets the threads take turns, each executing for only a short time (usually tens to hundreds of milliseconds) at a stretch, so that every thread "appears" to run at the same time. This constant switching of threads on a processor is called thread scheduling.

5. What states does a thread pass through under scheduling?
   - Running: the thread is currently executing.
   - Ready: the thread could run immediately, but the CPU is occupied.
   - Waiting: the thread is waiting for an event (usually I/O or synchronization) to occur and cannot execute.
   A running thread is given a period of time in which it may execute, called its time slice. When the time slice is exhausted, the thread enters the ready state; if it starts waiting for an event before the time slice runs out, it enters the waiting state. Whenever a thread leaves the running state, the scheduler picks another ready thread to execute; when the awaited event occurs, the waiting thread returns to the ready state.

6. In thread scheduling, in what ways can a thread's priority change?
   - The user can specify the priority.
   - The system can raise or lower the priority according to how frequently the thread enters the waiting state.
   - The priority can be boosted when the thread has gone a long time without being executed.

7. What is the relationship between synchronization and locks?
   Synchronization means that while one thread's access to a piece of data is unfinished, no other thread may access the same data; access to the data is thereby made atomic. The most common means of synchronization is the lock: each thread attempts to acquire the lock before accessing the data or resource and releases it when the access is complete. If a thread tries to acquire a lock that is already held, it waits until the lock becomes available again.

8. What are the common ways to synchronize threads?
   (1) Binary semaphore: it has only two states, occupied and free, and suits resources that only a single thread may access exclusively.
   (2) Mutex: like a binary semaphore, it allows only one thread to access the resource at a time. The difference is that a semaphore can be acquired and released by any thread in the system (one thread may acquire it and another release it), whereas a mutex must be released by the very thread that acquired it; a release attempted by any other thread is invalid.
   (3) Critical section: a stricter means of synchronization than the mutex. Acquiring its lock is called entering the critical section, and releasing it is called leaving the critical section. The difference from mutexes and semaphores is that those are visible to any process in the system: one process may create a mutex or semaphore, and it is legal for another process to acquire it.
   A critical section's scope, however, is limited to the creating process; other processes cannot acquire its lock. In all other respects a critical section behaves like a mutex.
   (4) Read-write lock: suited to a more specific synchronization scenario. Multiple threads reading the same piece of data simultaneously is always safe, but if the operations are not atomic, then as soon as any thread tries to modify the data, synchronization must be used to avoid errors. A semaphore, mutex, or critical section would keep the program correct, but for data that is read frequently and written only occasionally it would be very inefficient. A read-write lock avoids this problem. It can be acquired in two ways: shared or exclusive. When the lock is free, acquiring it in either way succeeds and puts the lock into the corresponding state. If the lock is in the shared state, other threads can still acquire it in shared mode, so the lock may be held by several readers at once; a thread that tries to acquire it exclusively must wait until every holder has released it. Conversely, an exclusively held lock blocks every other thread from acquiring it, in either mode. The behaviour of a read-write lock can be summed up as follows:

   Lock state | Acquire in shared mode | Acquire in exclusive mode
   Free       | succeeds               | succeeds
   Shared     | succeeds               | waits
   Exclusive  | waits                  | waits

   (5) Condition variable: as a means of synchronization, it acts rather like a barrier. A thread can perform two operations on a condition variable. First, it can wait on the condition variable, and one condition variable can be waited on by many threads.
   Second, a thread can wake the condition variable, at which point one or all of the threads waiting on it are awakened and continue executing. In other words, a condition variable lets many threads wait together for an event to occur; when the event occurs (the condition variable is signalled), the threads can resume execution together.




