iOS multithreading: GCD


1. GCD serial queue + asynchronous dispatch (very useful). Asynchronous means a new thread is opened outside the main thread, but because the queue is serial, only one new thread is opened. dispatch_async returns immediately after the block is submitted and never waits for it to run, so the main thread continues downward; the blocks submitted to the queue execute one by one in FIFO order. Whether blocks run serially or concurrently is decided by the queue, not by the dispatch call. GCD copies each block when it is submitted and releases it after it has run (Block_copy() / Block_release()).

From the program's output we can see that with serial queue + asynchronous dispatch, only one new thread is opened, and all of the added blocks execute on that thread one after another, in order.
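A minimal sketch of this case, following the same pattern as gcd_serial_async in the full listing at the end; the queue label, function name, and log text are placeholders, not from the original project:

#import <Foundation/Foundation.h>

// Serial queue + asynchronous dispatch: exactly one worker thread, FIFO order.
static void demoSerialAsync(void) {
    dispatch_queue_t queue = dispatch_queue_create("demo.serial", DISPATCH_QUEUE_SERIAL);
    for (int i = 0; i < 10; i++) {
        // dispatch_async returns immediately; the block runs later,
        // on the queue's single worker thread, after the previous block.
        dispatch_async(queue, ^{
            NSLog(@"block %d on %@", i, [NSThread currentThread]);
        });
    }
    // The calling thread reaches this line before most blocks have run.
    NSLog(@"all 10 blocks submitted");
}

This is also the shape of the download -> filter -> save pipeline mentioned in the listing's comment: each step is one block on the same serial queue.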

2. GCD serial queue + synchronous dispatch (rarely used). Synchronous means no new thread is opened; everything runs on the main thread, and because the queue is serial, the blocks run one by one in order.
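A minimal sketch (cf. gcd_serial_sync in the listing; Foundation imported as in the previous sketch, names are placeholders):

// Serial queue + synchronous dispatch: no new thread; each dispatch_sync
// blocks the calling (main) thread until its block has finished.
static void demoSerialSync(void) {
    dispatch_queue_t queue = dispatch_queue_create("demo.serial.sync", DISPATCH_QUEUE_SERIAL);
    for (int i = 0; i < 10; i++) {
        dispatch_sync(queue, ^{
            NSLog(@"block %d on %@", i, [NSThread currentThread]); // main thread
        });
    }
}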



3. Concurrent queue + asynchronous dispatch (easy to get out of control)

Multiple new threads are opened (how many cannot be controlled), the blocks execute in no particular order, and the programmer cannot control the execution order.
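A minimal sketch (cf. gcd_concurrent_async in the listing; names are placeholders):

// Concurrent queue + asynchronous dispatch: GCD opens as many worker
// threads as it sees fit; blocks may run simultaneously, in any order.
static void demoConcurrentAsync(void) {
    dispatch_queue_t queue = dispatch_queue_create("demo.concurrent", DISPATCH_QUEUE_CONCURRENT);
    for (int i = 0; i < 10; i++) {
        dispatch_async(queue, ^{
            NSLog(@"block %d on %@", i, [NSThread currentThread]);
        });
    }
}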



4. Concurrent queue + synchronous dispatch

Key point: because the dispatch is synchronous, no new thread is opened; the main thread is used directly.

Although the queue is concurrent, there is only one path of execution, so the blocks run sequentially.
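A minimal sketch (cf. gcd_concurrent_sync in the listing; names are placeholders):

// Concurrent queue + synchronous dispatch: each dispatch_sync waits for its
// block, so only one block is ever in flight; no new thread is needed and
// the blocks run in order on the calling (main) thread.
static void demoConcurrentSync(void) {
    dispatch_queue_t queue = dispatch_queue_create("demo.concurrent.sync", DISPATCH_QUEUE_CONCURRENT);
    for (int i = 0; i < 10; i++) {
        dispatch_sync(queue, ^{
            NSLog(@"block %d on %@", i, [NSThread currentThread]);
        });
    }
}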




5. Thought experiment ---> A serial queue first receives ten blocks dispatched synchronously, then ten blocks dispatched asynchronously. The ten synchronously dispatched blocks execute on the main thread. The asynchronously dispatched blocks need a new thread, but because the queue is serial, only one new thread is opened and the blocks run on it one after another (cf. gcd_serial_sync_async in the listing below; a minimal sketch follows).
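A minimal sketch of this thought experiment (the same shape as gcd_serial_sync_async in the listing; names are placeholders):

// Serial queue: 10 blocks dispatched synchronously run on the calling (main)
// thread; the next 10, dispatched asynchronously, run on one worker thread,
// still in FIFO order.
static void demoSerialSyncThenAsync(void) {
    dispatch_queue_t queue = dispatch_queue_create("demo.serial.mixed", DISPATCH_QUEUE_SERIAL);
    for (int i = 0; i < 10; i++) {
        dispatch_sync(queue,  ^{ NSLog(@"sync  %d on %@", i, [NSThread currentThread]); });
    }
    for (int i = 0; i < 10; i++) {
        dispatch_async(queue, ^{ NSLog(@"async %d on %@", i, [NSThread currentThread]); });
    }
}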


6. Thought experiment ---> In a concurrent queue, 10 blocks are dispatched synchronously, then 10 blocks are dispatched asynchronously. The synchronously dispatched blocks run only on the main thread, in order. The asynchronously dispatched blocks then open new threads; because the queue is concurrent, N new threads are opened, where N cannot be controlled, and the execution order of those blocks cannot be controlled either (cf. gcd_concurrent_sync_async in the listing below; a minimal sketch follows).
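A minimal sketch of this thought experiment (the same shape as gcd_concurrent_sync_async in the listing; names are placeholders):

// Concurrent queue: the first 10 blocks, dispatched synchronously, run in
// order on the main thread; the next 10, dispatched asynchronously, are
// spread across N uncontrollable worker threads in no particular order.
static void demoConcurrentSyncThenAsync(void) {
    dispatch_queue_t queue = dispatch_queue_create("demo.concurrent.mixed", DISPATCH_QUEUE_CONCURRENT);
    for (int i = 0; i < 10; i++) {
        dispatch_sync(queue,  ^{ NSLog(@"sync  %d on %@", i, [NSThread currentThread]); });
    }
    for (int i = 0; i < 10; i++) {
        dispatch_async(queue, ^{ NSLog(@"async %d on %@", i, [NSThread currentThread]); });
    }
}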


7. Thought experiment ---> In a concurrent queue, 10 blocks are dispatched asynchronously, then 10 blocks are dispatched synchronously.
Because the queue is concurrent and the first dispatches are asynchronous, multiple threads are opened and the main thread continues downward as soon as each block is submitted. The synchronously dispatched blocks execute only on the main thread, so the N worker threads and the main thread execute interleaved (cf. gcd_concurrent_async_sync in the listing below; a minimal sketch follows).
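A minimal sketch of this thought experiment (the same shape as gcd_concurrent_async_sync in the listing; names are placeholders):

// Concurrent queue: the first 10 blocks, dispatched asynchronously, fan out
// onto worker threads while the main thread keeps going; the next 10,
// dispatched synchronously, run on the main thread, so worker threads and
// the main thread appear interleaved in the log.
static void demoConcurrentAsyncThenSync(void) {
    dispatch_queue_t queue = dispatch_queue_create("demo.concurrent.mixed2", DISPATCH_QUEUE_CONCURRENT);
    for (int i = 0; i < 10; i++) {
        dispatch_async(queue, ^{ NSLog(@"async %d on %@", i, [NSThread currentThread]); });
    }
    for (int i = 0; i < 10; i++) {
        dispatch_sync(queue,  ^{ NSLog(@"sync  %d on %@", i, [NSThread currentThread]); });
    }
}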


8. The global queue is provided by the system and shared by all apps. The similarities and differences between the global queue and a self-created concurrent queue are as follows (a minimal sketch follows the list):


Three differences between the global queue and a concurrent queue:

1. The global queue does not need to be created; it is obtained with a get call (dispatch_get_global_queue).

2. The global queue has no name (label), so it is less convenient to identify while debugging.

3. The execution behavior of the global queue and a concurrent queue is identical.
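A minimal sketch (cf. gcd_global_queue in the listing; DISPATCH_QUEUE_PRIORITY_DEFAULT is the standard priority constant):

// The global queue is fetched, never created, and behaves like a concurrent queue.
static void demoGlobalQueue(void) {
    dispatch_queue_t globalQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    for (int i = 0; i < 10; i++) {
        dispatch_async(globalQueue, ^{
            NSLog(@"block %d on %@", i, [NSThread currentThread]);
        });
    }
    // Note: unlike a queue from dispatch_queue_create, there is no label
    // to look for while debugging.
}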





9. Main queue + synchronous dispatch [deadlock]. The main queue is always running, waiting for user input, and does not exit until the program ends. Therefore, if a block is dispatched synchronously to the main queue from the main thread, it is never executed: dispatch_sync blocks the main thread while waiting for the block, the block waits for the main thread to become free, and the code after the dispatch never runs either.
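A minimal sketch of the deadlock (cf. gcd_main_queue_sync in the listing); it assumes the function is called from the main thread, e.g. from viewDidLoad:

// dispatch_sync on the main queue, called FROM the main thread, deadlocks:
// dispatch_sync waits for the block, and the block waits for the main
// thread to become free -- which it never does, because it is stuck here.
static void demoMainQueueSyncDeadlock(void) {
    NSLog(@"before");
    dispatch_sync(dispatch_get_main_queue(), ^{
        NSLog(@"never executed");
    });
    NSLog(@"after"); // never reached either
}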
10. Main queue + asynchronous dispatch. No matter how a block is dispatched, once it is added to the main queue it is scheduled by the main queue and executed on the main thread, in order.
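The listing's gcd_main_queue_async simply queues 10 NSLog blocks; as a minimal sketch of the most common use of the same rule, hopping back to the main queue after background work (names are placeholders, and the background work is left as a comment):

// Any block added to the main queue, however it got there, runs on the main
// thread in the order the main queue schedules it.
static void demoBackToMainQueue(void) {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // ... long-running background work ...
        dispatch_async(dispatch_get_main_queue(), ^{
            NSLog(@"back on the main thread: %@", [NSThread currentThread]);
        });
    });
}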

















The complete demo code, ViewController.m:

//
//  ViewController.m
//  GCD
//
//  Created by xss on 14-11-23.
//  Copyright (c) 2014 beyond. All rights reserved.
//

#import "ViewController.h"

@interface ViewController ()
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // [self gcd_serial_async];
    // [self gcd_serial_sync];
    // [self gcd_concurrent_async];
    // [self gcd_concurrent_sync];
    // [self gcd_concurrent_sync_async];
    // [self gcd_concurrent_async_sync];
    // [self gcd_serial_sync_async];
    // [self gcd_global_queue];
    // [self gcd_main_queue_sync];
    [self gcd_main_queue_async];
}

/* 10. Main queue + asynchronous dispatch.
   No matter how a block is dispatched, once it is added to the main queue
   it is scheduled by the main queue and executed serially, in order. */
- (void)gcd_main_queue_async {
    dispatch_queue_t mainQueue = dispatch_get_main_queue();
    // Quickly add 10 blocks
    for (int i = 0; i < 10; i++) {
        dispatch_async(mainQueue, ^{
            NSLog(@"\n----------> asynchronous dispatch: %@ -- block %d in the main queue is executing", [NSThread currentThread], i);
        });
    }
}

/* 9. Main queue + synchronous dispatch = never executed.
   The main queue is busy listening for and handling UI events; it blocks
   and never exits, so a block dispatched synchronously to it from the main
   thread never runs, and neither does the code after it. */
- (void)gcd_main_queue_sync {
    dispatch_queue_t mainQueue = dispatch_get_main_queue();
    NSLog(@"before");
    dispatch_sync(mainQueue, ^{
        NSLog(@"will never be executed");
    });
    NSLog(@"after");
}

/* 8. The global queue is shared by all apps and differs only slightly from a
   self-created concurrent queue:
   1. It does not need to be created; it is obtained with a get call.
   2. It has no name, so it is less convenient to identify while debugging.
   3. Its execution behavior is exactly the same as a concurrent queue's. */
- (void)gcd_global_queue {
    dispatch_queue_t globalQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    // Quickly add 10 blocks synchronously
    for (int i = 0; i < 10; i++) {
        dispatch_sync(globalQueue, ^{
            NSLog(@"\n----------> synchronous dispatch: %@ -- block %d in the global queue is executing", [NSThread currentThread], i);
        });
    }
    // Quickly add 10 blocks asynchronously
    for (int i = 0; i < 10; i++) {
        dispatch_async(globalQueue, ^{
            NSLog(@"\n----------> asynchronous dispatch: %@ -- block %d in the global queue is executing", [NSThread currentThread], i);
        });
    }
}

/* 7. Concurrent queue: first dispatch 10 blocks asynchronously,
   then dispatch 10 blocks synchronously: interleaved execution. */
- (void)gcd_concurrent_async_sync {
    dispatch_queue_t queue = dispatch_queue_create("concurrent queue", DISPATCH_QUEUE_CONCURRENT);
    // Asynchronously add 10 blocks to the queue
    for (int i = 0; i < 10; i++) {
        dispatch_async(queue, ^{
            NSLog(@"\n----------> asynchronous dispatch: %@ -- block %d in the concurrent queue is executing", [NSThread currentThread], i);
        });
    }
    // Synchronously add 10 blocks to the queue
    for (int i = 0; i < 10; i++) {
        dispatch_sync(queue, ^{
            NSLog(@"\n----------> synchronous dispatch: %@ -- block %d in the concurrent queue is executing", [NSThread currentThread], i);
        });
    }
}

/* 6. Concurrent queue: first dispatch 10 blocks synchronously,
   then dispatch 10 blocks asynchronously. */
- (void)gcd_concurrent_sync_async {
    dispatch_queue_t queue = dispatch_queue_create("concurrent queue", DISPATCH_QUEUE_CONCURRENT);
    // Synchronously add 10 blocks to the queue
    for (int i = 0; i < 10; i++) {
        dispatch_sync(queue, ^{
            NSLog(@"\n----------> synchronous dispatch: %@ -- block %d in the concurrent queue is executing", [NSThread currentThread], i);
        });
    }
    // Asynchronously add 10 blocks to the queue
    for (int i = 0; i < 10; i++) {
        dispatch_async(queue, ^{
            NSLog(@"\n----------> asynchronous dispatch: %@ -- block %d in the concurrent queue is executing", [NSThread currentThread], i);
        });
    }
}

/* 5. Serial queue: 1. first dispatch 10 blocks synchronously,
   2. then dispatch 10 blocks asynchronously. */
- (void)gcd_serial_sync_async {
    dispatch_queue_t queue = dispatch_queue_create("serial queue", DISPATCH_QUEUE_SERIAL);
    // Quickly add 10 blocks to the queue synchronously
    for (int i = 0; i < 10; i++) {
        dispatch_sync(queue, ^{
            NSLog(@"----------> synchronous dispatch: %@ -- block %d in the serial queue is executing", [NSThread currentThread], i);
        });
    }
    // Quickly add 10 blocks to the queue asynchronously
    for (int i = 0; i < 10; i++) {
        dispatch_async(queue, ^{
            NSLog(@"----------> asynchronous dispatch: %@ -- block %d in the serial queue is executing", [NSThread currentThread], i);
        });
    }
}

/* 4. Concurrent queue + synchronous dispatch.
   Key point: synchronous dispatch opens no new thread and uses the main
   thread directly. Although the queue is concurrent, there is only one
   path of execution, so the blocks run in order. */
- (void)gcd_concurrent_sync {
    // Parameter 1: a C string label, used for debugging.
    // Parameter 2: a dispatch queue that may invoke blocks concurrently and
    // supports barrier blocks submitted with the dispatch barrier API.
    dispatch_queue_t queue = dispatch_queue_create("concurrent queue", DISPATCH_QUEUE_CONCURRENT);
    // Quickly add 10 blocks to the queue
    for (int i = 0; i < 10; i++) {
        dispatch_sync(queue, ^{
            NSLog(@"\n----------> synchronous dispatch: %@ -- block %d in the concurrent queue is executing", [NSThread currentThread], i);
        });
    }
}

/* 3. Concurrent queue + asynchronous dispatch.
   Key point: multiple new threads are opened (how many cannot be controlled),
   the blocks execute in no particular order, and the programmer cannot
   control the execution order. */
- (void)gcd_concurrent_async {
    // Parameter 1: a C string label, used for debugging.
    // Parameter 2: a dispatch queue that may invoke blocks concurrently and
    // supports barrier blocks submitted with the dispatch barrier API.
    dispatch_queue_t queue = dispatch_queue_create("concurrent queue", DISPATCH_QUEUE_CONCURRENT);
    // Quickly add 10 blocks to the queue
    for (int i = 0; i < 10; i++) {
        dispatch_async(queue, ^{
            NSLog(@"\n----------> asynchronous dispatch: %@ -- block %d in the concurrent queue is executing", [NSThread currentThread], i);
        });
    }
}

/* 2. Serial queue + synchronous dispatch [rarely used].
   Since the dispatch is synchronous, no new thread is opened;
   the blocks run one by one on the main thread. */
- (void)gcd_serial_sync {
    // Parameter 1: a C string label, used for debugging.
    // Parameter 2: a dispatch queue that invokes blocks serially in FIFO order.
    dispatch_queue_t queue = dispatch_queue_create("serial queue", DISPATCH_QUEUE_SERIAL);
    // Quickly add 10 blocks to the queue
    for (int i = 0; i < 10; i++) {
        dispatch_sync(queue, ^{
            NSLog(@"\n----------> thread: %@ -- block %d in the serial queue is executing", [NSThread currentThread], i);
        });
    }
}

/* 1. Asynchronous tasks in a serial queue are very useful. For example,
   in one serial queue: first a block that downloads an image, then blocks
   that apply filters (red-eye removal, highlights, feathering), then a
   block that saves the result. */
- (void)gcd_serial_async {
    // Parameter 1: a C string label, used for debugging.
    // Parameter 2: a dispatch queue that invokes blocks serially in FIFO order.
    dispatch_queue_t queue = dispatch_queue_create("serial queue", DISPATCH_QUEUE_SERIAL);
    // Quickly add 10 blocks to the queue
    for (int i = 0; i < 10; i++) {
        // Submits a block for asynchronous execution on a dispatch queue.
        // Calls to dispatch_async() always return immediately after the block
        // has been submitted, and never wait for the block to be invoked.
        dispatch_async(queue, ^{
            NSLog(@"\n----------> thread: %@ -- block %d in the serial queue is executing", [NSThread currentThread], i);
        });
    }
}

@end


Serial queue: a synchronous dispatch nested inside another synchronous dispatch. The result: [the inner block is never executed, deadlock].

Reason: the first block starts executing on the calling thread. Inside it, a second block is dispatched synchronously to the same serial queue, so the outer block waits for the inner block to finish. But a serial queue runs only one block at a time, and the inner block cannot start until the outer block has finished. Each waits on the other, so the inner block is never executed (a minimal sketch follows).
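A minimal sketch of the nested-synchronous deadlock; the queue label and function name are placeholders:

// The outer dispatch_sync occupies the serial queue and waits for the inner
// block; the inner dispatch_sync cannot start until the outer block has
// finished. Each waits for the other: the inner block is never executed.
static void demoNestedSyncDeadlock(void) {
    dispatch_queue_t queue = dispatch_queue_create("demo.serial.deadlock", DISPATCH_QUEUE_SERIAL);
    dispatch_sync(queue, ^{
        NSLog(@"outer block started");
        dispatch_sync(queue, ^{
            NSLog(@"inner block: never executed");
        });
        NSLog(@"outer block end: never reached");
    });
}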

