iOS multi-thread series (3)


At WWDC, Apple introduced GCD, which adds a new way to do multithreading. GCD requires iOS 4 or later, or OS X 10.6 or later. GCD, short for Grand Central Dispatch, is a set of C interfaces for concurrent programming. GCD is built on the block feature of C/Objective-C. The basic model is similar to NSOperation: you add a task to a queue, and the system takes care of creating and scheduling threads. Because blocks are used directly, it is very convenient and lowers the barrier to multithreaded development.

Let's look at the code first. It implements the same example as in part (1) of this multi-thread series; the GCD version is as follows:

- (void)viewDidLoad
{
    [super viewDidLoad];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self downloadImage:IMAGE_URL];
    });
}
The GCD call interface is very simple: you submit a task to a queue.

The dispatch_async function is asynchronous and non-blocking: it returns immediately after the call, and the system assigns a thread from the thread pool to execute the work. Where there is asynchronous there is also synchronous: dispatch_sync is the blocking, synchronous API, and it does not return until the submitted task has completed.
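A minimal sketch of the difference between the two calls (the NSLog messages and the sleeps are only illustrative, not part of the original example):

    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    // dispatch_async returns at once; the block runs later on a pool thread.
    dispatch_async(queue, ^{
        [NSThread sleepForTimeInterval:2];   // stand-in for long-running work
        NSLog(@"async block finished");
    });
    NSLog(@"dispatch_async already returned");

    // dispatch_sync does not return until its block has finished executing.
    dispatch_sync(queue, ^{
        [NSThread sleepForTimeInterval:2];
        NSLog(@"sync block finished");
    });
    NSLog(@"dispatch_sync returned only after the block completed");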

Implementing multithreading with GCD really is that simple: you do not need to know many of the details of multithreading, and efficiency is high. However, dispatch_queue has some special characteristics that are worth understanding for actual use. A dispatch_queue can run in serial mode or in concurrent (parallel) mode. As the name suggests, serial means tasks are executed in sequence: the next task starts only after the previous one has finished, and only one task runs at a time. Concurrent means tasks can run at the same time, and the number of tasks running concurrently is decided by the system based on its current load; the developer does not need to care about it.
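A rough sketch of the difference, using custom queues created with dispatch_queue_create (described further below); the queue labels are only examples:

    // Serial queue: blocks run one at a time, in the order they were submitted.
    dispatch_queue_t serialQueue = dispatch_queue_create("com.example.serial", DISPATCH_QUEUE_SERIAL);
    // Concurrent queue: several blocks may run at the same time; the system decides how many.
    dispatch_queue_t concurrentQueue = dispatch_queue_create("com.example.concurrent", DISPATCH_QUEUE_CONCURRENT);

    for (int i = 0; i < 5; i++) {
        dispatch_async(serialQueue, ^{ NSLog(@"serial %d", i); });         // always prints 0,1,2,3,4 in order
        dispatch_async(concurrentQueue, ^{ NSLog(@"concurrent %d", i); }); // order is not guaranteed
    }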

The system provides three types of dispatch queue:

1. Main queue

This is the queue of the main thread, so it is obviously a serial queue. All tasks added to the main queue run on the main thread, so do not add long-running tasks to it.

2. Global queue

This is the queue used most often in day-to-day development, and it is a concurrent queue. There are three priorities: high, default, and low (each priority corresponds to an independent queue). You obtain a global queue through the dispatch_get_global_queue API.

3. Custom queue

You can also create your own dispatch queue with the dispatch_queue_create API. In dispatch_queue_create(const char *label, dispatch_queue_attr_t attr), the first parameter is the queue name, which should be unique; as in Java, a reverse-DNS name is commonly recommended. The second parameter is the type of queue to create. Before iOS 4.3 only serial queues could be created, by passing DISPATCH_QUEUE_SERIAL; from iOS 4.3 on, concurrent queues can be created as well, by passing DISPATCH_QUEUE_CONCURRENT.

As you can see, a create call brings memory management with it. GCD objects are also reference counted, but they are not handled by Cocoa's retain/release or ARC, so the developer has to manage them manually with dispatch_retain/dispatch_release, whether ARC is in use or not.
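A minimal sketch of creating, using, and releasing a custom queue under these rules (the queue label is an example, not from the article):

    // Create a serial custom queue.
    dispatch_queue_t myQueue = dispatch_queue_create("com.example.myqueue", DISPATCH_QUEUE_SERIAL);

    dispatch_async(myQueue, ^{
        NSLog(@"running on my custom queue");
    });

    // GCD objects are reference counted separately from Objective-C objects,
    // so release the queue manually; a pending block keeps the queue alive until it runs.
    dispatch_release(myQueue);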

Because there are different kinds of queues, dispatch_async calls can also be nested. The same example can therefore be written as follows:

- (void)viewDidLoad
{
    [super viewDidLoad];

    __block UIImage *_image;

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSData *data = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:IMAGE_URL]];
        _image = [[UIImage alloc] initWithData:data];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = _image;
        });
    });
}
In this way, everything is implemented in one piece of code: downloading in the background and refreshing the UI once the download finishes. Simple and clear.

Some common APIs are introduced as follows:

dispatch_get_current_queue(): returns the queue on which the current code is running.

dispatch_queue_get_label(): returns the queue's name (label); if the queue has no name, NULL is returned.

dispatch_set_target_queue(): sets the target queue of the given object.

dispatch_main(): parks (blocks) the main thread and executes the blocks submitted to the main queue; this function never returns.
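A small sketch of two of these calls (the queue label "com.example.worker" is just an illustrative name, not from the original article):

    dispatch_queue_t myQueue = dispatch_queue_create("com.example.worker", DISPATCH_QUEUE_SERIAL);

    // Read the label back; this returns the string passed to dispatch_queue_create.
    NSLog(@"queue label: %s", dispatch_queue_get_label(myQueue));

    // Route the custom queue's blocks onto the low-priority global queue.
    dispatch_set_target_queue(myQueue, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0));

    dispatch_release(myQueue);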


Sometimes we have a series of tasks and, only when all of them are complete, one more special task should run. If we ran every task serially with dispatch_sync we could guarantee the order, but efficiency would drop a lot. dispatch_async, on the other hand, is asynchronous and non-blocking, so writing code like the following is useless: there is no guarantee that the special task runs only after all the other tasks have completed.

    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    for (id obj in array) {
        dispatch_async(queue, ^{
            [self doWork:obj];
        });
    }
    [self doneWork];

For this case, GCD provides dispatch groups, which aggregate a set of tasks and let you wait until all of them have completed before continuing. For the scenario above, the code should be written as follows:

    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_group_t group = dispatch_group_create();
    for (id obj in array) {
        dispatch_group_async(group, queue, ^{
            [self doWork:obj];
        });
    }
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
    dispatch_release(group);
    [self doneWork];

The approach is very simple: the concurrent tasks are added asynchronously to a group and a global queue with dispatch_group_async, and dispatch_group_wait does not return until all the work in the group has completed, so doneWork is guaranteed to run last. However, dispatch_group_wait blocks the calling thread, so it cannot be called on the main thread. What then?

    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_group_t group = dispatch_group_create();
    for (id obj in array) {
        dispatch_group_async(group, queue, ^{
            [self doWork:obj];
        });
    }
    dispatch_group_notify(group, queue, ^{
        [self doneWork];
    });
    dispatch_release(group);
The answer is quite simple: use another API, dispatch_group_notify. It schedules the final block on the given queue once every task in the group has finished, without blocking the caller.


Sometimes we need to perform an operation on each element of an array and not continue until all of them are done. GCD provides the simple dispatch_apply API for this:

    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_apply([array count], queue, ^(size_t index) {
        [self doWork:[array objectAtIndex:index]];
    });
    [self doneWork];

When parallel tasks are submitted with dispatch_async, their execution order cannot be determined. Sometimes, however, we do need certain work to run only after another piece of work has finished; for that, the Dispatch Barrier API can be used.

    dispatch_async(queue, block1);
    dispatch_async(queue, block2);
    dispatch_barrier_async(queue, block3);
    dispatch_async(queue, block4);
    dispatch_async(queue, block5);
dispatch_barrier_async is asynchronous and returns immediately after the call. Written this way, block1 and block2 run in parallel; block3 executes only after both of them have finished; and only after block3 completes do block4 and block5 run in parallel.

Note that the queue here must be a custom concurrent queue, i.e., one created with DISPATCH_QUEUE_CONCURRENT; see the sketch below.
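A self-contained sketch of the barrier pattern with the queue creation included; the block bodies are placeholders:

    dispatch_queue_t queue = dispatch_queue_create("com.example.barrier", DISPATCH_QUEUE_CONCURRENT);

    dispatch_async(queue, ^{ NSLog(@"block1"); });
    dispatch_async(queue, ^{ NSLog(@"block2"); });

    // The barrier block waits for block1 and block2 to finish, runs alone,
    // and only then are block4 and block5 allowed to run in parallel.
    dispatch_barrier_async(queue, ^{ NSLog(@"block3 (barrier)"); });

    dispatch_async(queue, ^{ NSLog(@"block4"); });
    dispatch_async(queue, ^{ NSLog(@"block5"); });

    dispatch_release(queue);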


As Apple's multithreading workhorse, GCD offers far more than this, but what we have covered are the most commonly used parts. GCD provides well-considered solutions for different needs, and it is easy for developers to use; what really requires attention is how to divide up the tasks and how they should run, serially or in parallel.

Apple's Grand Central Dispatch (GCD) Reference documentation covers the rest; see it for more information.






