Multithreaded Programming (iv) GCD

Source: Internet
Author: User
Tags: gcd

In the previous article, we introduced the basic concepts of multithreading and two ways to implement multithreaded programming. In this article, we introduce the last and most important multithreading tool: GCD.

1. GCD Introduction

1.1 What is GCD

GCD stands for Grand Central Dispatch, which can be loosely translated as "the grand central scheduler". It is a pure C API that provides many powerful functions and makes concurrent code much more pleasant to write. It is Apple's solution for taking advantage of multiple cores on multi-core devices, and it plays an important role in many other technologies, such as the run loop. With GCD you do not manage threads yourself; thread management is handed over entirely to GCD. Think of a thread as a train: we only submit our shipping task, and GCD runs the task on the right train. Multithreaded programming thus becomes very simple, although GCD still presents some difficulty for beginners: because it is written in pure C, there are no objects, so when learning GCD you can set aside the concepts of encapsulation, inheritance, and polymorphism.

1.2 The main advantages of GCD

GCD offers many advantages beyond traditional multithreaded programming:

    • Ease of use: GCD is easy to use; programmers simply tell GCD what tasks they want to perform, without writing any thread-management code. Because GCD is based on units of work rather than on primitives like threads, it can handle chores such as waiting for task completion, monitoring file descriptors, executing code periodically, and suspending work. Its block-based design makes it extremely simple to pass context between different scopes.

    • Efficiency: GCD is Apple's solution for multi-core parallel computing, and it automatically takes advantage of additional CPU cores (dual-core, quad-core, and so on). GCD is implemented so lightly and elegantly that in many places it is more practical and faster than creating dedicated, resource-intensive threads. This ties back to ease of use: part of the reason GCD is so usable is that you can largely stop worrying about efficiency and just use it.

    • Performance: GCD automatically manages the life cycle of threads (creating threads, scheduling tasks, destroying threads). It adjusts the number of threads based on system load, which reduces context switching and increases computational efficiency.

1.3 Points to be aware of when using GCD
    • GCD lives in the library libdispatch.dylib, which contains all of GCD. Every iOS program loads this library by default; it is linked dynamically while the program runs, so there is no need to import it manually.

    • GCD is pure C, so when we write GCD-related code we deal with functions, not methods.
    • Most functions in GCD start with dispatch.
1.4 Two core concepts in GCD
    • Queue: used to store tasks
    • Task: what action to perform
1.5 Two steps for using GCD

(1) Create (or obtain) a queue
(2) Submit a task describing what you want to do
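
As a minimal sketch of these two steps (the queue label "demo.queue" is just an illustrative name):

```objc
// Step 1: create a serial queue
dispatch_queue_t queue = dispatch_queue_create("demo.queue", DISPATCH_QUEUE_SERIAL);

// Step 2: submit a task to it asynchronously
dispatch_async(queue, ^{
    NSLog(@"task runs on %@", [NSThread currentThread]);
});
```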

2. Create a queue

An important part of GCD is the queue. We submit our tasks to a queue, and the queue dispatches them to threads based on its own type and the current state of the system. The creation and management of threads is handled by GCD itself and does not require our participation. The system provides several predefined queues: the main queue (main_queue) bound to the main thread, and the global concurrent queues (global_queue). We can also create our own custom queues.

GCD queues fall into 2 broad types:
1. Concurrent queue (Concurrent Dispatch Queue): lets multiple tasks execute concurrently (at the same time), automatically opening multiple threads to run them. Concurrency only takes effect under the asynchronous function (dispatch_async).
2. Serial queue (Serial Dispatch Queue): executes tasks one after another (one task finishes before the next begins); there is a single line of execution, and tasks run in order.

The queue type is dispatch_queue_t.
The creation function is dispatch_queue_create(const char *label, dispatch_queue_attr_t attr).
The first parameter is the queue's name; note that it is a C string (char pointer), not an NSString.
The second parameter is the queue type:

    • DISPATCH_QUEUE_SERIAL creates a serial queue.
    • DISPATCH_QUEUE_CONCURRENT creates a concurrent queue.
2.1 Serial Queue

There are two ways to get a serial queue in GCD:

    • Use the main queue (the queue associated with the main thread)
      The main queue is a special serial queue that comes with GCD; tasks placed on it execute on the main thread. A task submitted to the main queue may not be executed immediately; it runs when the main thread's run loop detects that a task has been dispatched.
      Use the dispatch_get_main_queue() function to get the main queue.
      Note: if a task is placed on the main queue, no new thread is opened, regardless of whether it was submitted synchronously or asynchronously.
      Example: dispatch_queue_t queue = dispatch_get_main_queue();

    • Create a serial queue using the dispatch_queue_create function
      Signature: dispatch_queue_t dispatch_queue_create(const char *label, dispatch_queue_attr_t attr); the parameters are the queue name and the queue attribute, which for a serial queue is usually NULL or DISPATCH_QUEUE_SERIAL.
      Example: dispatch_queue_t queue = dispatch_queue_create("queue", NULL);
      Note: under non-ARC, a manually created queue must be released with dispatch_release(queue);

2.2 Concurrent queues

GCD also provides two ways to get a concurrent queue:

    • Use a global queue
      By default GCD provides global concurrent queues for use by the entire application; they do not need to be created manually. We can submit tasks directly to such a queue, and the tasks execute on threads other than the main thread.
      Use the dispatch_get_global_queue function to obtain a global concurrent queue.
      Signature: dispatch_queue_t dispatch_get_global_queue(dispatch_queue_priority_t priority, unsigned long flags);
      Example: dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
      The first parameter is the priority; selecting the default here returns the global concurrent queue of default priority.
      The priorities of the global concurrent queues are defined as:
      #define DISPATCH_QUEUE_PRIORITY_HIGH 2                 // high
      #define DISPATCH_QUEUE_PRIORITY_DEFAULT 0              // default (medium)
      #define DISPATCH_QUEUE_PRIORITY_LOW (-2)               // low
      #define DISPATCH_QUEUE_PRIORITY_BACKGROUND INT16_MIN   // background
      The second parameter is reserved for future use and is not currently used; pass 0.

Notes on the global concurrent dispatch queues:
1. A concurrent dispatch queue can execute multiple tasks in parallel, but it still dequeues tasks in FIFO order: the queue dequeues and starts the next task before the previous task has completed. The number of tasks executed simultaneously varies dynamically with the application and the system, including the number of available cores, the amount of work being done by other processes, and the number and priority of tasks in other serial dispatch queues.
2. The system provides these concurrent dispatch queues for each application, shared globally within the app and differing only in priority. You do not need to create them explicitly; use the dispatch_get_global_queue function to obtain them. The first parameter specifies the priority; for example, the DISPATCH_QUEUE_PRIORITY_HIGH and DISPATCH_QUEUE_PRIORITY_LOW constants obtain the higher- and lower-priority queues. The second parameter is not currently used; pass the default of 0.
3. Although a dispatch queue is a reference-counted object, you do not need to retain and release the global concurrent queues. Because these queues are global to the application, retain and release calls on them are ignored. You also do not need to store references to them; just call dispatch_get_global_queue whenever you need one.

    • Create a concurrent queue using the dispatch_queue_create function
      As when creating a serial queue, we can use the dispatch_queue_create function to create a concurrent queue; the only difference from the serial case is the second parameter.
      Signature: dispatch_queue_t dispatch_queue_create(const char *label, dispatch_queue_attr_t attr); the parameters are the queue name and the queue attribute, here DISPATCH_QUEUE_CONCURRENT.
      Example: dispatch_queue_t queue = dispatch_queue_create("queue", DISPATCH_QUEUE_CONCURRENT);
      Note: under non-ARC, a manually created queue must be released with dispatch_release(queue);
2.3 Memory management for queues

In an ARC environment, you do not need to write any memory-management code for queues. In a non-ARC environment, use dispatch_retain() and dispatch_release() to manage a queue's reference count. The principle is the same as for an NSObject object: when the count reaches 0, the queue is destroyed.
code example:

dispatch_queue_t q_1 = dispatch_queue_create("task3.queue.1", DISPATCH_QUEUE_SERIAL);
dispatch_release(q_1);
3. Submit a Task

GCD has 2 functions for executing tasks; both submit the task in the right-hand parameter to the queue in the left-hand parameter for execution.

1. Execute a task synchronously: dispatch_sync(dispatch_queue_t queue, dispatch_block_t block);
2. Execute a task asynchronously: dispatch_async(dispatch_queue_t queue, dispatch_block_t block);
Parameters: queue, the queue the task executes on; block, the task to execute.

The difference between synchronous and asynchronous:

    • Synchronous: executes the task in the current thread; waits for the task to complete and blocks the current thread
    • Asynchronous: executes the task in another thread; does not wait for the task to complete and does not block the current thread
3.1 Synchronous Commit

The synchronous-commit function is dispatch_sync(), with two parameters:

    • The first parameter is the queue the task is submitted to
    • The second parameter is the task itself

      Here a block is used to describe the task. The reason is simple: blocks are also a pure C feature, whereas the usual NSInvocation or target+selector approaches are object-oriented.
      Note that after a task is submitted synchronously, the block executes before dispatch_sync() returns, so synchronously submitting an overly long task from the main thread will freeze it.

3.2 Asynchronous Commit

The asynchronous-commit function is dispatch_async(), with two parameters:

    • The first parameter is the queue the task is submitted to
    • The second parameter is the task itself

      After a task is submitted asynchronously, the dispatch_async() function returns immediately without waiting for the block to finish executing, so it does not freeze the main thread.
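
A minimal sketch contrasting the two commit functions on a background queue:

```objc
dispatch_queue_t q = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

dispatch_sync(q, ^{ NSLog(@"1"); });   // returns only after "1" has printed
dispatch_async(q, ^{ NSLog(@"3"); });  // returns immediately; "3" prints later
NSLog(@"2");                           // "2" may print before or after "3"
```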

3.3 Submit multiple tasks at the same time

Submitting the same task to a queue multiple times at once is also simple:
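
The snippet itself appears to have been lost in extraction; a minimal sketch using GCD's dispatch_apply function, which submits a block to a queue a given number of times and passes it the iteration index, would be:

```objc
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

// Invoke the block 5 times; the index parameter tells us which iteration this is.
dispatch_apply(5, queue, ^(size_t index) {
    NSLog(@"task %zu on %@", index, [NSThread currentThread]);
});
```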


Notice that the block describing the task is invoked multiple times, each time receiving a parameter that indicates the task's order. The order in which the tasks execute depends on the queue they are added to: if the target queue is serial, the tasks execute sequentially; if the queue is concurrent, the tasks execute concurrently and the printed order is scrambled.

3.4 Dead Lock

A synchronous commit can cause a deadlock in some cases, that is, the thread freezes. Example 1:

dispatch_queue_t mainQ = dispatch_get_main_queue();
dispatch_sync(mainQ, ^{
    NSLog(@"----");
});
NSLog(@"OK");

Example 2:

dispatch_queue_t q_1 = dispatch_queue_create("task3.queue.1", DISPATCH_QUEUE_SERIAL);
dispatch_async(q_1, ^{
    NSLog(@"current is in q_1");
    // Inside a block already running on q_1, synchronously submit another task to q_1: deadlock
    dispatch_sync(q_1, ^{
        NSLog(@"this is sync ");
    });
    NSLog(@"this is sync ????");
});

Combining the two samples above, the conclusion is obvious: if code executing in a serial queue synchronously submits a task to that same queue, a deadlock results. To avoid deadlocks, never use a synchronous commit to submit a task from a serial queue back to that same serial queue.

3.5 Execution effects for the various queues

    • dispatch_sync + serial queue: no new thread; tasks execute sequentially
    • dispatch_sync + concurrent queue: no new thread; tasks execute sequentially (the queue's concurrency is lost)
    • dispatch_async + serial queue: opens 1 new thread; tasks execute sequentially
    • dispatch_async + concurrent queue: opens multiple new threads; tasks execute concurrently
    • main queue: never opens a new thread; tasks execute on the main thread

3.6 Additional Instructions

Synchronous versus asynchronous determines whether a new thread is opened:

    • Synchronous: performs the task in the current thread; cannot open a new thread
    • Asynchronous: performs the task in a new thread; can open new threads

Concurrent versus serial determines how tasks are executed:

    • Concurrent: multiple tasks execute concurrently (simultaneously)
    • Serial: one task completes before the next is performed
3.7 Code Examples

(1) Adding a task to the concurrent queue with an asynchronous function

// 1. Obtain a global concurrent queue
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
// 2. Add tasks to the queue for execution
// Asynchronous function: able to open new threads
dispatch_async(queue, ^{ NSLog(@"Download picture 1----%@", [NSThread currentThread]); });
dispatch_async(queue, ^{ NSLog(@"Download picture 2----%@", [NSThread currentThread]); });
dispatch_async(queue, ^{ NSLog(@"Download picture 3----%@", [NSThread currentThread]); });
// Print the main thread
NSLog(@"Main thread----%@", [NSThread mainThread]);

Summary: three child threads are opened at the same time.

(2) Adding tasks to the serial queue with an asynchronous function

// Print the main thread
NSLog(@"Main thread----%@", [NSThread mainThread]);
// Create a serial queue
dispatch_queue_t queue = dispatch_queue_create("wendingding", NULL);
// The first parameter is the serial queue's name, a C string
// The second parameter is the queue's attribute; a serial queue generally needs no attributes, so NULL is usually passed
// 2. Add tasks to the queue for execution
dispatch_async(queue, ^{ NSLog(@"Download picture 1----%@", [NSThread currentThread]); });
dispatch_async(queue, ^{ NSLog(@"Download picture 2----%@", [NSThread currentThread]); });
dispatch_async(queue, ^{ NSLog(@"Download picture 3----%@", [NSThread currentThread]); });

Summary: a new thread is opened, but only one.

(3) Adding a task to the concurrent queue with a synchronous function

// Print the main thread
NSLog(@"Main thread----%@", [NSThread mainThread]);
// Obtain a global concurrent queue
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
// 2. Add tasks to the queue for execution
dispatch_sync(queue, ^{ NSLog(@"Download picture 1----%@", [NSThread currentThread]); });
dispatch_sync(queue, ^{ NSLog(@"Download picture 2----%@", [NSThread currentThread]); });
dispatch_sync(queue, ^{ NSLog(@"Download picture 3----%@", [NSThread currentThread]); });

Summary: no new threads are opened, and the concurrent queue loses its concurrency.

(4) Adding a task to the serial queue with a synchronous function

NSLog(@"Add tasks to a serial queue with the synchronous function");
// Print the main thread
NSLog(@"Main thread----%@", [NSThread mainThread]);
// Create a serial queue
dispatch_queue_t queue = dispatch_queue_create("wendingding", NULL);
// 2. Add tasks to the queue for execution
dispatch_sync(queue, ^{ NSLog(@"Download picture 1----%@", [NSThread currentThread]); });
dispatch_sync(queue, ^{ NSLog(@"Download picture 2----%@", [NSThread currentThread]); });
dispatch_sync(queue, ^{ NSLog(@"Download picture 3----%@", [NSThread currentThread]); });

Summary: New threads are not opened

(5) Supplement
The role of the queue name:
when debugging later, you can see which queue a task is executing in.

4. Suspension and continuation of the queue

We can use the dispatch_suspend function to suspend a queue, preventing it from executing further block objects, and the dispatch_resume function to resume dispatching on the queue. Suspension and resumption are asynchronous and take effect only between blocks, for example before or after a block executes; suspending a queue does not stop a block that is already executing. In particular, we must make sure that calls to the suspend and resume functions are paired.
Note: calling dispatch_suspend increments the queue's suspension count, and calling dispatch_resume decrements it. While the count is greater than 0, the queue remains suspended, so suspend and resume calls must correspond one to one.

// Suspend and resume a queue
dispatch_suspend(globe_queue);
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
    dispatch_resume(globe_queue);
});
5. Dispatch Group
   What is a dispatch group? Much like adding dependencies between NSOperations, when one of our tasks must wait for some other tasks to finish before it can run, a dispatch group is the easiest solution.

Using a queue group, the downloads of pictures 1 and 2 can proceed simultaneously, and when both download tasks have completed we return to the main thread to display them.

5.1 Setting task execution order
   GCD sets task execution order through a dispatch group. Taking eating hot pot as an example:
    • First create the group and the task queue

      dispatch_group_t group = dispatch_group_create();
      dispatch_queue_t globleQ = dispatch_get_global_queue(0, 0);

    • Submit tasks, adding them to the group

      dispatch_group_async(group, globleQ, ^{ });
      Note: tasks cannot be submitted to a group synchronously, only asynchronously.

    • Submit the final task to the group, likewise as an asynchronous commit

      dispatch_group_notify(group, dispatch_get_global_queue(0, 0), ^{ });
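
Putting the three steps together with the picture-download scenario from above (the log messages are illustrative):

```objc
dispatch_group_t group = dispatch_group_create();
dispatch_queue_t globleQ = dispatch_get_global_queue(0, 0);

// The two downloads run concurrently as members of the group.
dispatch_group_async(group, globleQ, ^{
    NSLog(@"Download picture 1----%@", [NSThread currentThread]);
});
dispatch_group_async(group, globleQ, ^{
    NSLog(@"Download picture 2----%@", [NSThread currentThread]);
});

// Runs only after every task in the group has finished; back on the main queue for display.
dispatch_group_notify(group, dispatch_get_main_queue(), ^{
    NSLog(@"Both downloads finished; update the UI here");
});
```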

5.2 Waiting with a timeout
    • dispatch_group_wait(group, DISPATCH_TIME_FOREVER);

      Blocks the current thread and waits for all of the group's tasks to finish.

    • dispatch_time_t time = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(5 * NSEC_PER_SEC));
      dispatch_group_wait(group, time);

      You can also specify a custom timeout; after waiting that long, the call returns even if the group's tasks have not yet finished.

6. Common GCD techniques

6.1 Deferred execution

dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(5.0 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
    // code to execute after the delay
});
Description: after 5 seconds, the code snippet in the block is executed.
Parameters: the first is the time to fire (built with dispatch_time), the second is the queue to submit the block to, and the third is the block itself.

6.2 One-time execution (commonly used for singletons)

Use dispatch_once for run-once code.
The dispatch_once function guarantees that a piece of code executes only once during the program's lifetime, and it is thread-safe by default:

static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
    // this code executes only once for the entire run of the program
});
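
As a sketch of the common singleton pattern the heading mentions (the class name Tool is illustrative):

```objc
@implementation Tool

+ (instancetype)sharedTool {
    static Tool *shared = nil;
    static dispatch_once_t onceToken;
    // The block runs exactly once, even if sharedTool is called from many threads.
    dispatch_once(&onceToken, ^{
        shared = [[Tool alloc] init];
    });
    return shared;
}

@end
```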

7. Summary

Description: the synchronous function cannot open threads; it opens no thread regardless of the queue. The asynchronous function can open threads, and how many depends on the queue (a serial queue opens only one new thread; a concurrent queue opens several).
Synchronous function:
(1) Concurrent queue: no new thread
(2) Serial queue: no new thread
Asynchronous function:
(1) Concurrent queue: can open n threads
(2) Serial queue: opens 1 thread
Note: in non-ARC projects, any function whose name contains create, copy, new, retain, and so on returns an object that you must release (for example with dispatch_release) when you no longer need it.
GCD data types do not need to be released in an ARC environment.
Core Foundation (CF) data types still need to be released even in an ARC environment.
An asynchronous function has the ability to open a thread, but does not necessarily open one (for example, when targeting the main queue).

