iOS concurrent programming


iOS has three multithreaded programming techniques: (i) NSThread, (ii) Cocoa NSOperation, and (iii) GCD (Grand Central Dispatch). From top to bottom, their level of abstraction goes from low to high; the higher the abstraction, the simpler it is to use, and GCD is the approach Apple recommends.

1. Asynchronous calls and Concurrency:

Concurrency mechanisms are often used to implement asynchronous calls, but not all asynchronous calls are built on concurrency; some rely on other mechanisms, such as interrupts.

GCD is a technology that optimizes applications for multi-core processors and other symmetric multiprocessing systems. It is based on the thread pool pattern, with tasks executed in parallel. It was first released in Mac OS X 10.6 and is available in iOS 4 and later. GCD works like this: the program queues specific tasks, and GCD schedules them to execute on any available processor core according to the available processing resources. A task can be a function or a block. Underneath, GCD is still implemented with threads, but it lets the programmer ignore those implementation details. The FIFO queue in GCD is called a dispatch queue.

1. An important concept in GCD is the dispatch queue. The core idea is to split long-running tasks into multiple units of work and add them to a dispatch queue; the system manages the dispatch queue for us and executes the units of work on multiple threads, so we never need to start or manage background threads directly.

2. The system provides many predefined dispatch queues, including the main dispatch queue, which guarantees that work is always performed on the main thread. You can also create your own dispatch queues, as many as you need. GCD's dispatch queues strictly follow the FIFO (first in, first out) principle: units of work added to a dispatch queue are always started in the order in which they were added.

3. A dispatch queue performs tasks serially or concurrently, in FIFO order:

1> A serial dispatch queue performs only one task at a time; the current task must finish before the queue dequeues and starts the next one.

2> A concurrent dispatch queue starts as many tasks as it can and executes them concurrently.

2. The three types of queues and how they are managed:

Serial queues, also known as private dispatch queues, perform only one task at a time. Serial queues are typically used to synchronize access to a specific resource or piece of data. When you create multiple serial queues, each queue is serial with respect to its own tasks, but the queues themselves execute concurrently with respect to one another.

1> A serial queue can only perform one task at a time, so you can use a serial queue instead of a lock to protect shared resources or mutable data structures. Unlike a lock, a serial queue guarantees that tasks are performed in a predictable order, and as long as you submit tasks to the serial queue asynchronously, it will never deadlock.

2> You must explicitly create and manage all the serial queues you use. An app can create any number of serial queues, but do not create extra serial queues just to run more tasks at the same time; if you need to perform a large number of tasks concurrently, submit them to a global concurrent queue instead.

3> Use the dispatch_queue_create function to create a serial queue; its two parameters are the queue name and a set of queue attributes.

Concurrent queues, also known as global dispatch queues, are not created by you; you retrieve them at run time. The system provides three of them for each application. These three concurrent dispatch queues are global and differ only in priority. They can perform multiple tasks concurrently, but the order in which those tasks complete is not deterministic.

GCD provides functions for the application to access these public dispatch queues:

1> The dispatch_get_current_queue function is intended for debugging or for testing the identity of the current queue. Called inside a block object, it returns the queue the block was submitted to (which should be the queue executing it). Called outside of any block object, it returns the application's default concurrent queue.

2> Use dispatch_get_global_queue to obtain one of the concurrent dispatch queues, as follows:

let globalQ = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)

The first parameter specifies the priority; use the DISPATCH_QUEUE_PRIORITY_HIGH and DISPATCH_QUEUE_PRIORITY_LOW constants to obtain the high- and low-priority queues. The second parameter is currently unused; just pass 0.
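For example, the other priority levels can be requested the same way. A minimal sketch using the same pre-Swift-3 C-style API as the snippet above (the variable names are illustrative):

        let highQ = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0)   // scheduled before default-priority work
        let lowQ  = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0)    // scheduled after default-priority work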

Main dispatch queue: a globally available serial queue that performs tasks on the application's main thread. 1> Use the dispatch_get_main_queue function to get the serial dispatch queue associated with the application's main thread; tasks added to this queue are executed serially on the main thread.
// Asynchronously download a picture
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSURL *url = [NSURL URLWithString:@"http://car0.autoimg.cn/upload/spec/9579/u_20120110174805627264.jpg"];
    UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:url]];
    // Back on the main thread, show the picture
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image;
    });
});

3. Create a queue:

let queue = dispatch_queue_create("gcdtest.rongfzh.yc", nil)                         // serial
let queue = dispatch_queue_create("gcdtest.rongfzh.yc", DISPATCH_QUEUE_SERIAL)       // serial
let queue = dispatch_queue_create("gcdtest.rongfzh.yc", DISPATCH_QUEUE_CONCURRENT)   // concurrent
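As noted in section 2, a serial queue you create can stand in for a lock when protecting shared mutable state, because only one task touches the data at a time. A minimal sketch in the same pre-Swift-3 C-style API (the queue label and the counter variable are illustrative, not from the original article):

        let counterQueue = dispatch_queue_create("com.example.counter", DISPATCH_QUEUE_SERIAL)
        var counter = 0
        // Writers submit asynchronously; the serial queue runs one task at a time, so no lock is needed
        dispatch_async(counterQueue, {
            counter += 1
        })
        // A reader submits synchronously (from a different queue) to get a consistent value back
        dispatch_sync(counterQueue, {
            print("counter = \(counter)")
        })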

4. Adding tasks to a queue: to perform a task, you add it to an appropriate dispatch queue. Tasks can be added individually or as a group, and they can be executed synchronously or asynchronously. Once a task is in a queue, the queue is responsible for executing it as soon as possible. You typically use a block to encapsulate the work of a task.

1. Add a single task to the queue

1> Adding tasks asynchronously

You can add a task to a queue asynchronously or synchronously; whenever possible, dispatch it asynchronously using the dispatch_async or dispatch_async_f function. Because the task is merely added to the queue, you cannot determine exactly when the code will run (GCD assigns work to the available processor cores automatically and optimizes the program). Dispatching blocks or functions asynchronously therefore lets you schedule the work immediately while the calling thread continues with other things. In particular, the application's main thread should dispatch tasks asynchronously so that it can respond to user events in a timely manner.

        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), {
            // time-consuming operation
            dispatch_async(dispatch_get_main_queue(), {
                // update the interface
            })
        })

2> Adding tasks synchronously

At times you might want to dispatch a task synchronously to avoid race conditions or other synchronization errors. Use the dispatch_sync and dispatch_sync_f functions to add tasks to a queue synchronously; both functions block the current calling thread until the corresponding task finishes executing. Note: never call dispatch_sync or dispatch_sync_f from within a task to synchronously dispatch a new task to the queue that is currently executing. This matters especially for serial queues, where doing so is guaranteed to deadlock, but concurrent queues should avoid it as well.

        // Before calling, look at the current thread
        NSLog("Current calling thread: %@", NSThread.currentThread())
        // Create a serial queue
        let queue = dispatch_queue_create("cn.itcast.queue", nil)
        dispatch_async(queue, {
            NSLog("Open an asynchronous task, current thread: %@", NSThread.currentThread())
        })
        dispatch_sync(queue, {
            NSLog("Open a synchronization task, current thread: %@", NSThread.currentThread())
        })

Printing results:

2015-05-09 00:49:27.539 imageloaderexample[2122:81150] current calling Thread: <nsthread:0x7fcbdad27890>{number = 1, name = main}

2015-05-09 00:49:27.541 imageloaderexample[2122:81222] an asynchronous task is turned on, the current thread: <nsthread:0x7fcbdae66a30>{number = 2, name = (null)}

2015-05-09 00:49:27.541 imageloaderexample[2122:81150] A synchronization task is turned on, the current thread: <nsthread:0x7fcbdad27890>{number = 1, name = main}

5. Suspending and resuming a queue

We can use the dispatch_suspend function to suspend a queue and prevent it from executing block objects, and the dispatch_resume function to resume dispatching. Calling dispatch_suspend increments the queue's suspension reference count, and calling dispatch_resume decrements it; as long as the count is greater than zero, the queue stays suspended, so every suspend call must be balanced by a matching resume call. Suspending and resuming take effect asynchronously and only between block executions, for example before or after a new block starts; suspending a queue does not stop a block that is already running.
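A minimal sketch of the suspend/resume pairing, again in the pre-Swift-3 C-style API (the queue label is illustrative):

        let queue = dispatch_queue_create("com.example.suspendable", DISPATCH_QUEUE_SERIAL)
        dispatch_suspend(queue)                  // blocks submitted from now on will not start
        dispatch_async(queue, {
            NSLog("runs only after dispatch_resume is called")
        })
        // ... later, balance every suspend with exactly one resume
        dispatch_resume(queue)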

6. Use of dispatch_group_async

dispatch_group_async lets you observe whether a group of tasks has finished and receive a notification when they have, so you can perform other operations. This is useful when, for example, you run three download tasks and want to be notified to update the interface once all three downloads are complete. Here is some example code:

        let queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)
        let group = dispatch_group_create()
        dispatch_group_async(group, queue, {
            NSThread.sleepForTimeInterval(1)
            NSLog("group1")
        })
        dispatch_group_async(group, queue, {
            NSThread.sleepForTimeInterval(2)
            NSLog("group2")
        })
        dispatch_group_async(group, queue, {
            NSThread.sleepForTimeInterval(3)
            NSLog("group3")
        })
        dispatch_group_async(group, queue, {
            NSThread.sleepForTimeInterval(3)
            NSLog("group34")
        })
        dispatch_group_notify(group, dispatch_get_main_queue(), {
            NSLog("UpdateUi")
        })
dispatch_group_async is asynchronous; you can see this from the print results when you run it:

2015-05-08 23:23:40.344 imageloaderexample[1504:48633] group1

2015-05-08 23:23:41.340 imageloaderexample[1504:48631] Group2

2015-05-08 23:23:42.340 imageloaderexample[1504:48638] Group34

2015-05-08 23:23:42.340 imageloaderexample[1504:48634] Group3

2015-05-08 23:23:42.341 imageloaderexample[1504:48547] UpdateUi

One line is printed each second, and once the last tasks have finished executing, UpdateUi is printed.
A more realistic example downloads two pictures in a group and updates the interface when both have finished (this code uses manual reference counting):

// Get a UIImage from a URL
- (UIImage *)imageWithURLString:(NSString *)urlString {
    NSURL *url = [NSURL URLWithString:urlString];
    NSData *data = [NSData dataWithContentsOfURL:url];
    // Do not return an autoreleased UIImage object here
    return [[UIImage alloc] initWithData:data];
}

- (void)downloadImages {
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    // Download the pictures asynchronously
    dispatch_async(queue, ^{
        // Create a group
        dispatch_group_t group = dispatch_group_create();

        __block UIImage *image1 = nil;
        __block UIImage *image2 = nil;

        // Associate a task with the group
        dispatch_group_async(group, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            // Download the first picture
            NSString *url1 = @"http://car0.autoimg.cn/upload/spec/9579/u_20120110174805627264.jpg";
            image1 = [self imageWithURLString:url1];
        });

        // Associate another task with the group
        dispatch_group_async(group, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            // Download the second picture
            NSString *url2 = @"http://hiphotos.baidu.com/lvpics/pic/item/3a86813d1fa41768bba16746.jpg";
            image2 = [self imageWithURLString:url2];
        });

        // After the tasks in the group have finished, return to the main thread to run the callback block
        dispatch_group_notify(group, dispatch_get_main_queue(), ^{
            self.imageView1.image = image1;
            self.imageView2.image = image2;

            // The UIImages were not autoreleased on the background thread, because that thread's
            // autorelease pool is destroyed when the thread ends and the images would be destroyed
            // with it; release the image resources here instead
            [image1 release];
            [image2 release];
        });

        // Release the group
        dispatch_release(group);
    });
}

7. Use of dispatch_barrier_async: a block submitted with dispatch_barrier_async executes only after all tasks submitted before it have completed, and tasks submitted after it do not start until the barrier block itself has finished.
        NSLog("Begin")
        let queue = dispatch_queue_create("gcdtest.rongfzh.yc", DISPATCH_QUEUE_CONCURRENT)
        dispatch_async(queue, {
            NSThread.sleepForTimeInterval(2)
            NSLog("dispatch_async1")
        })
        dispatch_async(queue, {
            NSThread.sleepForTimeInterval(4)
            NSLog("dispatch_async2")
        })
        dispatch_barrier_async(queue, {
            NSLog("dispatch_barrier_async")
            NSThread.sleepForTimeInterval(4)
        })
        dispatch_async(queue, {
            NSThread.sleepForTimeInterval(1)
            NSLog("dispatch_async3")
        })
        dispatch_async(queue, {
            NSThread.sleepForTimeInterval(1)
            NSLog("dispatch_async4")
        })
Printing results:

2015-05-08 23:39:11.729 imageloaderexample[1635:55195] Begin

2015-05-08 23:39:13.731 imageloaderexample[1635:55268] Dispatch_async1

2015-05-08 23:39:15.730 imageloaderexample[1635:55267] Dispatch_async2

2015-05-08 23:39:15.731 imageloaderexample[1635:55267] Dispatch_barrier_async

2015-05-08 23:39:20.742 imageloaderexample[1635:55268] Dispatch_async4

2015-05-08 23:39:20.742 imageloaderexample[1635:55267] Dispatch_async3

8. dispatch_apply: performing loop iterations concurrently

If you use a loop to perform a fixed number of iterations, dispatching the iterations to a concurrent queue may improve performance. Either a serial or a concurrent queue can be specified.

Like an ordinary for loop, the dispatch_apply and dispatch_apply_f functions do not return until all iterations have completed, so both functions block the current thread. If the queue you pass in is a serial queue and it is the very queue executing the current code, a deadlock results.

let queue = dispatch_queue_create("gcdtest.rongfzh.yc", DISPATCH_QUEUE_CONCURRENT)
dispatch_apply(30, queue, { print("|*\($0)*|") })

Printing results:

|*0*| | *4*| | *5*| | | | | 6312****| | | | *|7| | |8| 9**|*1|*|01|*1|*|**1| | | 12**3*15*| | *1|*16|4*|**|1| | | **|711**98|2**|0| | **|| 2|*|1*2**2223|*4*|*| | | | **|2|2*56*2**2|7|8*| | **|29*|

let queue = dispatch_queue_create("gcdtest.rongfzh.yc", DISPATCH_QUEUE_SERIAL)
dispatch_apply(30, queue, { print("|*\($0)*|") })

Printing results:

|*0*| | *1*| | *2*| | *3*| | *4*| | *5*| | *6*| | *7*| | *8*| | *9*| | *10*| | *11*| | *12*| | *13*| | *14*| | *15*| | *16*| | *17*| | *18*| | *19*| | *20*| | *21*| | *22*| | *23*| | *24*| | *25*| | *26*| | *27*| | *28*| | *29*|
