First, Introduction
GCD should be the first choice for most multithreaded work on iOS, since GCD is the solution Apple itself proposes for multi-core parallel computing. GCD automatically uses additional processor cores when they are available, so it takes advantage of more powerful machines. GCD stands for Grand Central Dispatch and is a C-based API. When you use GCD, threads are managed entirely by the system; you do not write any thread code yourself. You simply define the tasks you want to perform and add them to an appropriate dispatch queue. GCD takes care of creating threads and scheduling your tasks, and the system provides thread management directly.
Second, Dispatch queues
1. An important concept in GCD is the queue. Its core philosophy: split long-running work into multiple units of work and add them to dispatch queues; the system manages these dispatch queues for us and executes the units of work on multiple threads. We never need to start or manage background threads directly.
2. The system provides several predefined dispatch queues, including the main dispatch queue, which guarantees that work is always performed on the main thread. You can also create your own dispatch queues, as many as you need. GCD's dispatch queues strictly follow the FIFO (first-in, first-out) principle: units of work added to a dispatch queue are always started in the order in which they were added.
3. A dispatch queue executes tasks either serially or concurrently, in FIFO order:
1> A serial dispatch queue can perform only one task at a time; the current task must complete before the queue dequeues and starts the next one.
2> A concurrent dispatch queue starts as many tasks as it can and executes them concurrently.
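As an illustration (a minimal sketch, not from the original text; the queue label is invented), the difference looks like this in code:

// Serial queue: tasks start strictly one after another, in FIFO order
dispatch_queue_t serialQueue = dispatch_queue_create("cn.itcast.serial", NULL);
dispatch_async(serialQueue, ^{ NSLog(@"serial task 1"); });
dispatch_async(serialQueue, ^{ NSLog(@"serial task 2"); });  // starts only after task 1 finishes

// Concurrent queue: tasks are dequeued in FIFO order but may run at the same time
dispatch_queue_t concurrentQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
dispatch_async(concurrentQueue, ^{ NSLog(@"concurrent task 1"); });
dispatch_async(concurrentQueue, ^{ NSLog(@"concurrent task 2"); });  // may run alongside task 1

dispatch_release(serialQueue);  // the global concurrent queue never needs to be released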
Third, Creating and managing dispatch queues
1. Getting a global concurrent dispatch queue
1> A concurrent dispatch queue can execute multiple tasks in parallel, but it still starts tasks in FIFO order: the concurrent queue dequeues the next task and begins executing it before the previous task has finished. The number of tasks a concurrent queue runs at any moment varies dynamically with the application and the system, including the number of available cores, the amount of work being done by other processes, and the number and priority of tasks in other serial dispatch queues.
2> The system provides three concurrent dispatch queues for each application. They are global to the application and differ only in priority. You do not need to create these queues explicitly; use the dispatch_get_global_queue function to obtain them:
// Get the default-priority global concurrent dispatch queue
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
The first parameter specifies the priority; use the DISPATCH_QUEUE_PRIORITY_HIGH and DISPATCH_QUEUE_PRIORITY_LOW constants to obtain the high-priority and low-priority queues. The second parameter is currently unused; just pass 0.
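For example (a minimal sketch using only the constants named above):

// High-priority and low-priority global concurrent queues
dispatch_queue_t highQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
dispatch_queue_t lowQueue  = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0);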
3> Although dispatch queues are reference-counted objects, you do not need to retain and release the global concurrent queues. Because these queues are global to the application, retain and release calls on them are ignored. You also do not need to store references to these three queues; simply call dispatch_get_global_queue each time you need one.
2. Creating a serial dispatch queue
1> When your application's tasks need to execute in a specific order, you need a serial dispatch queue; a serial queue performs only one task at a time. You can use a serial queue instead of a lock to protect a shared resource or mutable data structure (see the sketch after the creation example below). Unlike a lock, a serial queue guarantees that tasks execute in a predictable order, and as long as you submit tasks to the serial queue asynchronously, it can never deadlock.
2> You must explicitly create and manage every serial queue you use. An application can create any number of serial queues, but you should not create extra serial queues just to run more tasks at the same time. If you need to perform a large number of tasks concurrently, submit them to a global concurrent queue instead.
3> Use the dispatch_queue_create function to create a serial queue. Its two parameters are the queue name and a set of queue attributes:
dispatch_queue_t queue;
queue = dispatch_queue_create("cn.itcast.queue", NULL);
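As mentioned in point 1> above, a serial queue can replace a lock for protecting shared data. A minimal sketch of that idea (the counter and the queue label are invented for illustration):

// Protect a shared counter with a serial queue instead of a lock
static NSInteger sharedCounter = 0;
dispatch_queue_t counterQueue = dispatch_queue_create("cn.itcast.counterQueue", NULL);

// All writes are funneled through the one serial queue, so they can never overlap
dispatch_async(counterQueue, ^{ sharedCounter++; });
dispatch_async(counterQueue, ^{ sharedCounter++; });

// A synchronous read waits for the pending writes and returns an up-to-date value
dispatch_sync(counterQueue, ^{ NSLog(@"counter = %ld", (long)sharedCounter); });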
3. Getting common queues at run time
GCD provides functions that let the application access several common dispatch queues:
1> Use the dispatch_get_current_queue function for debugging, or to test the identity of the current queue. Calling this function from inside a block returns the queue the block was submitted to (which should be the queue executing it). Calling it outside of any block returns the application's default concurrent queue.
2> Use the dispatch_get_main_queue function to get the serial dispatch queue associated with the application's main thread.
3> Use dispatch_get_global_queue to get one of the shared concurrent queues.
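Putting the three accessors side by side (purely illustrative):

dispatch_queue_t mainQueue    = dispatch_get_main_queue();                                      // serial queue bound to the main thread
dispatch_queue_t globalQueue  = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);  // shared concurrent queue
dispatch_queue_t currentQueue = dispatch_get_current_queue();                                   // identity check / debugging only
NSLog(@"current queue label: %s", dispatch_queue_get_label(currentQueue));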
4. Memory management for dispatch queues
1> Dispatch queues and other dispatch objects (including dispatch sources) are reference-counted data types. When you create a serial dispatch queue its initial reference count is 1; you can use the dispatch_retain and dispatch_release functions to increment and decrement that count. When the reference count reaches 0, the system asynchronously destroys the queue.
2> It is important to retain and release dispatch objects such as dispatch queues to ensure they stay in memory while they are being used. As with Objective-C objects, the general rule is that if you use a queue that was passed into your code, you should retain it before using it and release it when you no longer need it (see the sketch at the end of this list).
3> You do not need to retain or release the global dispatch queues, which include the global concurrent dispatch queues and the main dispatch queue.
4> Even if your application uses garbage collection, you still need to retain and release the dispatch queues and other dispatch objects you create; GCD does not support the garbage-collection model for reclaiming memory.
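A minimal sketch of the retain/release rule from 2>, assuming a setter method and an instance variable _workerQueue that are invented for this example:

// Retain a queue that was handed to us, because it must stay alive while we hold it
- (void)setWorkerQueue:(dispatch_queue_t)queue {
    dispatch_retain(queue);              // retain before use
    if (_workerQueue) {
        dispatch_release(_workerQueue);  // balance the retain taken for the old queue
    }
    _workerQueue = queue;
}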
Fourth, Adding tasks to a queue
To execute a task, you add it to an appropriate dispatch queue. You can add tasks individually or in groups, and you can dispatch a task either synchronously or asynchronously. Once a task is in a queue, the queue is responsible for executing it as soon as possible. You typically use a block to encapsulate the work of a task.
1. Add a single task to the queue
1> Adding tasks asynchronously
You can add a task to a queue either asynchronously or synchronously. Whenever possible, dispatch tasks asynchronously using the dispatch_async or dispatch_async_f function. Because the task is added to a queue, there is no way to know exactly when the code will execute; dispatching a block or function asynchronously lets you schedule the work immediately and then allows the calling thread to continue doing other things. In particular, the application's main thread should dispatch tasks asynchronously so it can keep responding to user events in a timely manner.
2> Adding tasks synchronously
Occasionally you may want to dispatch a task synchronously to avoid race conditions or other synchronization errors. Use the dispatch_sync and dispatch_sync_f functions to add a task to a queue synchronously; these functions block the current calling thread until the corresponding task finishes executing. Note: never call dispatch_sync or dispatch_sync_f from within a task to synchronously dispatch a new task to the queue that is currently executing. This is especially important for serial queues, because doing so is guaranteed to deadlock, and it should be avoided for concurrent queues as well.
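To make the warning concrete, a minimal sketch of the anti-pattern (do not actually run this; it is the guaranteed-deadlock case for a serial queue):

dispatch_queue_t serialQueue = dispatch_queue_create("cn.itcast.serialQueue", NULL);
dispatch_async(serialQueue, ^{
    // DEADLOCK: dispatch_sync waits for the inner block to finish,
    // but the serial queue cannot start it until this outer block returns
    dispatch_sync(serialQueue, ^{
        NSLog(@"never reached");
    });
});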
3> Code Demo
// Before dispatching, look at the current thread
NSLog(@"Current calling thread: %@", [NSThread currentThread]);

// Create a serial queue
dispatch_queue_t queue = dispatch_queue_create("cn.itcast.queue", NULL);

dispatch_async(queue, ^{
    NSLog(@"Started an asynchronous task, current thread: %@", [NSThread currentThread]);
});

dispatch_sync(queue, ^{
    NSLog(@"Started a synchronous task, current thread: %@", [NSThread currentThread]);
});

// Release the queue
dispatch_release(queue);
Output:
2013-02-03 09:03:37.348 thread[6491:c07] Current calling thread: <NSThread: 0x714fa80>{name = (null), num = 1}
2013-02-03 09:03:37.349 thread[6491:1e03] Started an asynchronous task, current thread: <NSThread: 0x74520a0>{name = (null), num = 3}
2013-02-03 09:03:37.350 thread[6491:c07] Started a synchronous task, current thread: <NSThread: 0x714fa80>{name = (null), num = 1}
2. Performing loop iterations concurrently
If you use a loop to perform a fixed number of iterations, a concurrent dispatch queue may improve performance.
For example, the following for loop:
int i;
int count = 10;
for (i = 0; i < count; i++) {
    printf("%d ", i);
}
1> You can call the dispatch_apply or dispatch_apply_f function to replace the loop if each iteration's work is independent of the other iterations and the order in which the iterations execute does not matter. These two functions submit the specified block or function to a queue once for each loop iteration. When dispatched to a concurrent queue, multiple loop iterations can run at the same time. You can specify either a serial or a concurrent queue with dispatch_apply or dispatch_apply_f; a concurrent queue lets iterations run simultaneously, while using a serial queue offers little benefit.
The following code replaces the for loop with dispatch_apply. The block you pass must take a single parameter of type size_t that identifies the current loop iteration; its value is 0 for the first iteration and count - 1 for the last.
// Get the default-priority global concurrent queue
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

size_t count = 10;
dispatch_apply(count, queue, ^(size_t i) {
    printf("%zd ", i);
});
// No release needed: retain/release calls on a global queue are ignored
Output:
1 2 0 3 4 5 6 7 8 9
As you can see, the iterations execute concurrently.
Like an ordinary for loop, the dispatch_apply and dispatch_apply_f functions do not return until all iterations have completed, so both functions block the current thread. Be careful when calling them from the main thread: the event-processing loop may be blocked and user events cannot be handled in time. If the loop code takes a noticeable amount of time to execute, consider calling these functions from another thread. Also, if the queue you pass in is a serial queue and it is the same queue that is executing the current code, a deadlock results.
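A minimal sketch of that suggestion: wrap the dispatch_apply call in dispatch_async so the main thread stays responsive (illustrative only):

dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
// Move the blocking dispatch_apply off the main thread
dispatch_async(queue, ^{
    dispatch_apply(10, queue, ^(size_t i) {
        printf("%zd ", i);  // iterations may run on several threads at once
    });
    // dispatch_apply has returned, so every iteration is done at this point
    NSLog(@"all iterations finished");
});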
3. Performing tasks on the main thread
1> GCD provides a special dispatch queue that executes tasks on the application's main thread. This queue is created automatically, and its tasks are drained automatically as long as the application's main thread sets up a run loop (managed by a CFRunLoopRef or an NSRunLoop object). A non-Cocoa application that does not explicitly set up a run loop must call the dispatch_main function to explicitly drain this dispatch queue; otherwise you can still add tasks to the queue, but they will never execute.
2> Call the dispatch_get_main_queue function to get the dispatch queue for the application's main thread; tasks added to this queue are performed serially on the main thread.
3> Code example: download an image asynchronously, then return to the main thread to display it.
// Download the image asynchronously
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSURL *url = [NSURL URLWithString:@"http://car0.autoimg.cn/upload/spec/9579/u_20120110174805627264.jpg"];
    UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:url]];

    // Return to the main thread to display the image
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image;
    });
});
4. Using Objective-C objects in tasks
GCD supports Cocoa memory management, so you can freely use Objective-C objects in the blocks you submit to queues. Each dispatch queue maintains its own autorelease pool to ensure that autoreleased objects are eventually released, but a queue makes no guarantee about when those objects are actually released. If your application is memory-intensive and creates a large number of autoreleased objects, you should create your own autorelease pool inside the block so that objects that are no longer needed are released promptly.
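A minimal sketch of that advice under manual reference counting (the loop body is a placeholder):

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Local pool so autoreleased objects from this task are freed promptly,
    // instead of waiting for the queue to drain its own pool
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    for (int i = 0; i < 1000; i++) {
        NSString *temp = [NSString stringWithFormat:@"item %d", i];  // autoreleased
        // ... use temp ...
    }
    [pool drain];
});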
Fifth, Suspending and resuming queues
We can use the dispatch_suspend function to suspend a queue and prevent it from executing block objects, and the dispatch_resume function to resume dispatching on the queue. Calling dispatch_suspend increments the queue's suspension reference count, and calling dispatch_resume decrements it; while this count is greater than 0, the queue remains suspended, so you must balance your suspend and resume calls. Suspending and resuming are asynchronous and take effect only between blocks, for example before or after a new block starts executing. Suspending a queue does not stop a block that is already running.
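A minimal sketch of balanced suspend/resume calls (the queue label is invented):

dispatch_queue_t queue = dispatch_queue_create("cn.itcast.pausableQueue", NULL);

dispatch_suspend(queue);  // blocks that have not started yet will not begin
dispatch_async(queue, ^{
    NSLog(@"runs only after the queue is resumed");
});
dispatch_resume(queue);   // must balance the suspend, or the queue stays suspended

dispatch_release(queue);  // safe: GCD keeps the queue alive until the pending block runs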
Sixth, Using dispatch groups
Suppose you need to download two different images from the network and then display them in two different UIImageViews. A straightforward implementation looks like this:
// Get a UIImage from a URL
- (UIImage *)imageWithURLString:(NSString *)urlString {
    NSURL *url = [NSURL URLWithString:urlString];
    NSData *data = [NSData dataWithContentsOfURL:url];
    return [UIImage imageWithData:data];
}

- (void)downloadImages {
    // Download the images asynchronously
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Download the first image
        NSString *url1 = @"http://car0.autoimg.cn/upload/spec/9579/u_20120110174805627264.jpg";
        UIImage *image1 = [self imageWithURLString:url1];

        // Download the second image
        NSString *url2 = @"http://hiphotos.baidu.com/lvpics/pic/item/3a86813d1fa41768bba16746.jpg";
        UIImage *image2 = [self imageWithURLString:url2];

        // Return to the main thread to display the images
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView1.image = image1;
            self.imageView2.image = image2;
        });
    });
}
This solves the problem, but the two image downloads do not need to run one after the other; executing them concurrently would improve speed. The one thing to watch is that both images must finish downloading before we return to the main thread to display them. A dispatch group can help us improve performance in exactly this situation. Let's look at what a dispatch group does:
We can use the dispatch_group_async function to associate multiple tasks with a single dispatch group and a queue; the group then executes these tasks concurrently. A dispatch group can also be used to block a thread until all of the tasks associated with the group have finished, which is useful when you must wait for their results before continuing.
The code above, optimized with a dispatch group:
// Get a UIImage from a URL
- (UIImage *)imageWithURLString:(NSString *)urlString {
    NSURL *url = [NSURL URLWithString:urlString];
    NSData *data = [NSData dataWithContentsOfURL:url];
    // Do not autorelease the UIImage object here
    return [[UIImage alloc] initWithData:data];
}

- (void)downloadImages {
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    // Download the images asynchronously
    dispatch_async(queue, ^{
        // Create a group
        dispatch_group_t group = dispatch_group_create();

        __block UIImage *image1 = nil;
        __block UIImage *image2 = nil;

        // Associate a task with the group
        dispatch_group_async(group, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            // Download the first image
            NSString *url1 = @"http://car0.autoimg.cn/upload/spec/9579/u_20120110174805627264.jpg";
            image1 = [self imageWithURLString:url1];
        });

        // Associate another task with the group
        dispatch_group_async(group, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            // Download the second image
            NSString *url2 = @"http://hiphotos.baidu.com/lvpics/pic/item/3a86813d1fa41768bba16746.jpg";
            image2 = [self imageWithURLString:url2];
        });

        // After all tasks in the group have finished, run this block on the main thread
        dispatch_group_notify(group, dispatch_get_main_queue(), ^{
            self.imageView1.image = image1;
            self.imageView2.image = image2;

            // The UIImage objects are not autoreleased on the worker thread, because that
            // thread's autorelease pool is drained when the thread ends and the images
            // would be destroyed; release them here instead
            [image1 release];
            [image2 release];
        });

        // Release the group
        dispatch_release(group);
    });
}
The dispatch_group_notify function specifies an additional block that will be executed after all of the tasks in the group have completed.
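If, instead of a callback, you want to block the current (background) thread until the group's tasks finish, as mentioned above, dispatch_group_wait does that; a minimal sketch:

// Wait on a background thread (never the main thread) for all group tasks to finish
dispatch_group_t group = dispatch_group_create();
dispatch_group_async(group, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSLog(@"task 1");
});
dispatch_group_async(group, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSLog(@"task 2");
});
dispatch_group_wait(group, DISPATCH_TIME_FOREVER);  // blocks until both tasks are done
NSLog(@"all group tasks finished");
dispatch_release(group);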