Multi-Threading for iOS Development: All Aspects of GCD

Source: Internet
Author: User
Tags: gcd

Preface: This article on GCD is the result of reading many excellent posts about GCD, both domestic and foreign, combined with my own modest understanding of it. I have tried to take the essence of those posts, digest it thoroughly, and present the theory to the reader in a way that is both correct and as easy to understand as possible, turning some of the denser theoretical passages from those blogs into something more approachable, so that readers can understand and master multithreading and the techniques of using GCD. In the appendix I give links to all the articles on multithreading and GCD that I read; anyone interested can consult them for reference and further study. Then again, perhaps this one article is enough, since I drew on all of them.

This post is being updated continuously; it is not finished yet.

Content outline:

Related Basic Concepts

1. What is GCD?
2. Tasks (Task)
3. Queues (Queue)
4. Types of queues (Queue Types)
5. Threads
6. Using queues and threads together: a summary

A brief discussion of GCD techniques

1. Delayed execution
2. Creating dispatch_time_t correctly
3. dispatch_suspend does not mean "stop the queue immediately"
4. Avoiding deadlocks
5. GCD semaphores
6. The GCD timer

Appendix: Very good GCD learning links

------------------------------------------

In addition, I have written a simple wrapper around the GCD APIs I use most often; it is easy to use and greatly improves code readability. The GitHub URL is: https://github.com/HeYang123456789/HYGCD.

If you find it useful, remember to star it on GitHub.

All right, enough advertising for myself. Let's study the material below carefully.

Related Basic Concepts

1. What is GCD?

iOS offers several multithreading solutions: NSThread, NSOperation, NSInvocationOperation, pthread, and GCD.

Of these, GCD is probably the most attractive and the easiest to use, because GCD is the solution Apple proposed for multi-core parallel computing.

GCD is short for Grand Central Dispatch and is based on the C language. With GCD we neither write thread code ourselves nor manage thread lifecycles by hand; we define the tasks we want performed and add them to an appropriate dispatch queue. GCD takes care of creating threads and scheduling tasks, and the system provides thread management directly.

2. Tasks (Task)

A task is a chunk of code, a piece of business logic that is carried out when it executes. It is usually wrapped in a closure or a block. To have a task performed, you hand the closure or block to a queue; tasks are then taken from the queue in FIFO order and placed on a thread for execution.

3. Queues (Queue)

We first need to understand the concept of a queue. GCD provides dispatch queues to hold the blocks of code handed to it, managing those tasks and executing them in FIFO order. This guarantees that the first task added to the queue is the first to start, the second task added is the second to start, and so on to the end of the queue.

Serial queues

Tasks in a serial queue are taken out in the FIFO (first in, first out) order the queue defines: one task is taken out and executed, and only when it finishes is the next one taken out, one by one.

Serial queue diagram: (image not preserved in this copy)

Concurrent queues

Tasks in a concurrent queue are also taken out in the FIFO order the queue defines, but unlike a serial queue, the concurrent queue does not wait for a task running on a thread to finish: it takes out a task, starts it on a thread, and immediately takes the next task from the queue and hands it to another thread. Because dequeuing is fast enough to be negligible, it looks as though all the tasks start at the same time; hence the name "concurrent queue". Tasks run for different lengths of time, however, so the order in which they finish is not determined; the shortest task may well finish first.

Concurrent queue diagram: (image not preserved in this copy)

Supplement: different parts of concurrent code can execute "simultaneously". But how this happens, or whether it happens at all, is up to the system. A multi-core device executes multiple threads genuinely in parallel; for a single-core device to do so, it must first run one thread, perform a context switch, and then run the other thread or process. This usually happens quickly enough to give us the illusion of concurrent execution, as the diagram (not preserved in this copy) illustrated.

4. Types of queues (Queue Types)

Provided by the system:

The main queue (main queue), which is a serial queue.

Like any other serial queue, it executes only one task at a time. However, it guarantees that all of its tasks execute on the main thread, and the main thread is the only thread allowed to update the UI. This queue is the one to use for messaging UIViews or posting notifications.

When using the main queue we simply obtain it through dispatch_get_main_queue(); we never need to manage its memory.

Four global dispatch queues (Global Dispatch Queues), all concurrent queues

The four global queues have different priorities: background, low, default, and high. Be aware that Apple's own APIs use these queues too, so any tasks you add will not be the only tasks in them.

When using the global concurrent queues we simply obtain them through dispatch_get_global_queue(); we never need to manage their references.

You can also create your own serial or concurrent queues.

Summary: in other words, there are at least five queues at your disposal: the main queue, the four global dispatch queues, and any queues you create yourself.

That is the big picture of dispatch queues!

5. Threads

Here we distinguish two styles of submission: synchronous versus asynchronous (sync vs. async).

GCD creates a synchronous or an asynchronous task by calling one of the following two functions, which place the task on a queue from which a thread later takes it out and executes it:

void dispatch_sync(dispatch_queue_t queue, dispatch_block_t block);
void dispatch_async(dispatch_queue_t queue, dispatch_block_t block);

Two terms need explaining here: synchronous functions and asynchronous functions. From our C fundamentals we know that a function holds a block of code in curly braces and returns when it finishes executing, whether it returns void or a concrete value; returning also marks the function's completion. In threading, the function carries the task, which is a closure or a block. The difference is this: a synchronous function returns only after the task it scheduled has finished, while an asynchronous function returns immediately; that is, it schedules its task but does not wait for that task to finish before returning.

In addition, a synchronous submission has no ability to create child threads; in general a synchronously submitted task executes on the current thread, and in the simplest case that is the one and only main thread. An asynchronous submission can create child threads; in general asynchronously submitted tasks run on threads other than the main thread, and not on any single fixed one; there may be several, and exactly which threads are used is decided entirely by GCD when GCD is used. As for the maximum number of child threads that can be created, that depends on the device's CPU capabilities. For example, I used a real device, created several asynchronous tasks with GCD, added them to a concurrent queue, and observed the results:

(code and output screenshot not preserved in this copy)

As you can see, I created 8 asynchronous tasks, and GCD, within the device's limits, created three child threads, namely threads 2, 3, and 4, and executed the tasks on them in no fixed order. Thread 4 executed four of the tasks, and threads 2 and 3 executed two tasks each.

Then, synchronous or asynchronous, these two terms need to be discussed in comparison between multiple function tasks, which is better understood. For example, synchronization, when the thread executes a synchronous function task, the line routines stop, that is, the so-called "current thread blocking", the current thread needs to wait for the function to complete the return value, also means that the function of the complete execution of the task, the thread will continue to execute, To perform the next function task, so such a thread is also called a synchronization thread . While the async function is in a different thread, the thread starts executing an asynchronous function task, because the async function returns immediately, although the function may not have finished, but returns, the thread will continue to perform the next function task, because the process is fast, It is almost possible to ignore the order in which the task execution begins, and it is almost as if the " asynchronous thread is performing multiple tasks at the same time".

Note:

Do not put time-consuming operations on the main thread; all UI-related operations must be handled on the main thread.

Time-consuming operations belong on child threads (background threads, i.e. non-main threads).

6. Using queues and threads together: a summary

OK, take a look at the diagram below; it may differ slightly from versions you have seen elsewhere, having been adjusted to match the theory above:

(diagram not preserved in this copy)

6-1. Serial queue + sync:

Suppose a bunch of tasks are placed in a serial queue.

Because the serial queue is a FIFO (first in, first out) queue, and a serial queue waits for the task it removed to finish before removing the next one,

and because sync has no ability to open a new thread, the synchronous submission blocks the current thread,

this pile of tasks involves the serial queue and the main thread:

a task is taken from the serial queue and submitted to the main thread; once it has finished executing, the next task is taken from the serial queue and submitted to the main thread, and so the tasks execute one after another.

Verification code:

(code screenshot not preserved in this copy)

Printing results:

2016-03-13 15:51:04.970 Multithreading[25106:632356] The current thread is: <NSThread: 0x7fe28bc025b0>{number = 1, name = main}
2016-03-13 15:51:06.974 Multithreading[25106:632356] The current thread is: <NSThread: 0x7fe28bc025b0>{number = 1, name = main}
2016-03-13 15:51:08.979 Multithreading[25106:632356] The current thread is: <NSThread: 0x7fe28bc025b0>{number = 1, name = main}
2016-03-13 15:51:10.980 Multithreading[25106:632356] The current thread is: <NSThread: 0x7fe28bc025b0>{number = 1, name = main}

Analysis of the output: one task executes every 2 seconds, showing that each submission blocked until its task finished, and the executing thread is the main thread.

6-2. Concurrent queue + sync:

Suppose a bunch of tasks are placed in a concurrent queue.

Because the concurrent queue is also a FIFO (first in, first out) queue, it removes a task and, without waiting for that task to finish, is ready to remove the next one immediately.

Although the concurrent queue is FIFO, dequeuing is fast enough to be negligible, as if all the tasks were taken out at the same time; hence the name "concurrent queue".

But because sync has no ability to open a new thread, the synchronous submission blocks the current thread.

So this pile of tasks involves the concurrent queue and the main thread:

each task is submitted to the main thread and must finish before the next one is submitted, so the tasks still execute one after another.

Verification code: (code screenshot not preserved in this copy)

Printing results:

2016-03-13 16:02:51.566 Multithreading[25327:637330] The current thread is: <NSThread: 0x7fe53b6039c0>{number = 1, name = main}
2016-03-13 16:02:53.570 Multithreading[25327:637330] The current thread is: <NSThread: 0x7fe53b6039c0>{number = 1, name = main}
2016-03-13 16:02:55.575 Multithreading[25327:637330] The current thread is: <NSThread: 0x7fe53b6039c0>{number = 1, name = main}
2016-03-13 16:02:57.577 Multithreading[25327:637330] The current thread is: <NSThread: 0x7fe53b6039c0>{number = 1, name = main}

Analysis of the output: one task executes every 2 seconds, showing that each submission blocked until its task finished, and the executing thread is the main thread.

6-3. Serial queue + async:

Suppose a bunch of tasks are placed in a serial queue.

Because the serial queue is a FIFO (first in, first out) queue, and a serial queue waits for the task it removed to finish before removing the next one,

and because async has the ability to open a new thread, the asynchronous submission does not block the current thread,

this pile of tasks involves the serial queue and one newly opened thread:

a task is taken from the serial queue and submitted to the new thread; after it has finished executing, the next task is taken from the serial queue and submitted to that same new thread, and so the tasks execute one after another.

Verification code:

(code screenshot not preserved in this copy)

Printing results:

2016-03-13 16:09:23.744 Multithreading[25460:640764] The current thread is: <NSThread: 0x7fb08b638e00>{number = 2, name = (null)}
2016-03-13 16:09:25.750 Multithreading[25460:640764] The current thread is: <NSThread: 0x7fb08b638e00>{number = 2, name = (null)}
2016-03-13 16:09:27.753 Multithreading[25460:640764] The current thread is: <NSThread: 0x7fb08b638e00>{number = 2, name = (null)}
2016-03-13 16:09:29.757 Multithreading[25460:640764] The current thread is: <NSThread: 0x7fb08b638e00>{number = 2, name = (null)}

Analysis of the output: one task executes every 2 seconds, showing that the serial queue waited for the previously dequeued task to finish before dequeuing the next, and the executing thread is a new (non-main) thread.

6-4. Concurrent queue + async:

Suppose a bunch of tasks are placed in a concurrent queue.

Because the concurrent queue is also a FIFO (first in, first out) queue, it removes a task and, without waiting for that task to finish, immediately removes the next one.

Although the concurrent queue is FIFO, dequeuing is fast enough to be negligible, as if all the tasks were taken out at the same time; hence the name "concurrent queue".

Because async has the ability to open new threads, the asynchronous submission does not block the current thread.

So this pile of tasks involves the concurrent queue and multiple new threads:

the tasks are quickly handed to a new thread, and if that thread is busy, the concurrent queue can hand the next task to yet another new thread, so several threads execute the tasks quickly and asynchronously.

Verification code:

(code screenshot not preserved in this copy)

Printing results:

2016-03-13 16:17:24.685 Multithreading[25648:645355] The current thread is: <NSThread: 0x7f93a0605b70>{number = 1, name = main}
2016-03-13 16:17:28.693 Multithreading[25648:645477] The current thread is: <NSThread: 0x7f93a0611a40>{number = 3, name = (null)}
2016-03-13 16:17:28.693 Multithreading[25648:645480] The current thread is: <NSThread: 0x7f93a0620220>{number = 5, name = (null)}
2016-03-13 16:17:28.693 Multithreading[25648:645476] The current thread is: <NSThread: 0x7f93a0706f60>{number = 2, name = (null)}
2016-03-13 16:17:28.693 Multithreading[25648:645478] The current thread is: <NSThread: 0x7f93a0409ee0>{number = 4, name = (null)}

Analysis of the output: the main thread and the child threads each sleep for 2 seconds, so the child-thread tasks all print at the same moment about 4 seconds in; the queue handed the tasks to the individual threads almost instantly, and no thread blocked another thread's task.

A brief discussion of GCD techniques

1. Delayed execution

Method 1: use the NSObject API for delayed execution (this is delayed execution, not delayed submission):

[self performSelector:@selector(myFunction) withObject:nil afterDelay:5.0];

Addendum: this kind of delayed call can be cancelled:

[NSObject cancelPreviousPerformRequestsWithTarget:self]; // the argument is the object on which performSelector was called, here self

Method 2: use an NSTimer (again delayed execution, not delayed submission).

Details omitted. (One aside: NSTimer can be made accurate to about 0.01 seconds, i.e. it can fire a statement every 0.01 s; this of course ignores the screen refresh rate.)

Method 3: use dispatch_after for asynchronous delayed execution (delayed submission):

CGFloat time = 5.0f;
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(time * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    // code here executes asynchronously `time` seconds later ...
});

Addendum: dispatch_after is delayed submission, not delayed execution.

Official note:

Enqueue a block for execution at the specified time.

"Enqueue" means joining the queue: the block is added to the specified queue after the given delay; it is not guaranteed to run at that exact moment!

2. Creating dispatch_time_t correctly

When using dispatch_after you need a dispatch_time_t value, but how do you create the right one? The answer is the dispatch_time function, whose prototype is:

dispatch_time_t dispatch_time(dispatch_time_t when, int64_t delta);
The first parameter is usually DISPATCH_TIME_NOW, meaning "starting from now".
The second parameter is the actual length of the delay.

It is important to note that the delta parameter is in nanoseconds! That is, a delay of 1 second means a delta of 1,000,000,000, which is far too long to write out, so of course the system provides constants, as follows:

#define NSEC_PER_SEC  1000000000ull  /* nanoseconds per second */
#define NSEC_PER_MSEC 1000000ull     /* nanoseconds per millisecond */
#define USEC_PER_SEC  1000000ull     /* microseconds per second */
#define NSEC_PER_USEC 1000ull        /* nanoseconds per microsecond */

Keyword glossary:
NSEC: nanoseconds.
USEC: microseconds.
SEC: seconds.
PER: per.

So:
NSEC_PER_SEC: how many nanoseconds per second.
USEC_PER_SEC: how many microseconds per second.
NSEC_PER_USEC: how many nanoseconds per microsecond.

Note again that the second parameter of dispatch_time is in nanoseconds, so passing 1000000000ull, i.e. 1 * NSEC_PER_SEC, as the second parameter means one second.

3. dispatch_suspend does not mean "stop the queue immediately"

dispatch_suspend and dispatch_resume provide the ability to "suspend" and "resume" a queue; put simply, they pause and resume the tasks on a queue.

But the "suspend" here does not guarantee that a block already running on the queue can be stopped immediately; see the following example:

(code screenshot not preserved in this copy)

Printing results:

(screenshot not preserved in this copy)

The essence:

Suspending pauses the tasks still sitting in the queue; a task that had already been taken off the queue before the suspend runs to completion and is not paused.

With a concurrent queue, if the suspend comes even slightly late, all of the tasks may already have been taken off the queue and handed to threads. Readers can type out the code and verify this themselves.

4. Avoiding deadlocks

What is a deadlock?

Tasks end up waiting for each other.

The code below looks simple, but it deadlocks; if your app contains anything like it, it will freeze solid:

1. Synchronous tasks nested inside each other cause a deadlock

(code screenshot not preserved in this copy)

This is easy to follow once spelled out. The synchronous task updateUI1 is certainly submitted before updateUI2, because only while the updateUI1 task is executing is the updateUI2 task created and added to the queue.

So, first, we can be sure that updateUI1 starts executing first and that updateUI2 is supposed to execute after it.

However, because updateUI1 contains updateUI2 as a nested task, updateUI1 can only finish once updateUI2 has finished; if updateUI2 never finishes, updateUI1 never finishes.

Yet because the synchronous submission blocks, updateUI2 must wait for updateUI1 to finish before updateUI2 can start executing.

And there it is: each waits for the other, so we have a deadlock.

2. Submitting a block to the main queue "synchronously" from the main thread is guaranteed to deadlock.

(code screenshot not preserved in this copy)

This is really the same as the previous case. The program's logic is executing on the currently running thread, i.e. the main thread, and the task is submitted to the main queue with a synchronous function.

By definition, a synchronous submission is code you do not allow to execute concurrently with the caller; if it did execute concurrently, your code's logic would be scrambled, and you could not know which code runs first and which runs after.

If you cannot see this directly, add NSLog(@"Current thread is: %@", [NSThread currentThread]); to prove that the currently executing code is on the main thread.

So you are on the main thread, synchronously submitting to the main queue a task that only the main thread itself can drain.

This is the same situation as the first case: caller and task wait for each other, and we have a deadlock.

So, use sync sparingly.

5. GCD semaphores

GCD semaphore example one:

/**

* When we deal with a large number of threads, we might once have reached for NSOperationQueue to control concurrency; but how do we control concurrency conveniently in GCD? The answer is dispatch_semaphore. To people who do UNIX development regularly, what I introduce here will look like entry-level material, since semaphores are common in their multithreaded work.

* A semaphore holds an integer count with an initial value and supports two operations: signal and wait. Signaling a semaphore increments its count. When a thread waits on a semaphore, the thread blocks (if necessary) until the count is greater than zero, then decrements the count.

* GCD has three semaphore functions, namely:

* dispatch_semaphore_create  creates a semaphore

* dispatch_semaphore_signal  sends a signal

* dispatch_semaphore_wait    waits for a signal

* Briefly: the first function takes an integer parameter, which we can understand as the total signal count; dispatch_semaphore_signal sends a signal, adding 1 to the total; dispatch_semaphore_wait subtracts 1 from the total and, if the result is negative, waits until a signal arrives, otherwise execution simply continues. On this principle we can quickly build concurrency control to synchronize tasks and to limit access to finite resources.

*/

// Create a group
dispatch_group_t group = dispatch_group_create();

// Initial semaphore count is 10
dispatch_semaphore_t semaphore = dispatch_semaphore_create(10);

// Get a global concurrent queue
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

for (int i = 0; i < 100; i++)   // the loop bound was lost in this copy; 100 is assumed
{
    // Each wait decrements the count: at first 10 - 1 = 9, and execution continues.
    // Once the wait has executed 10 times the count is 0; on the 11th wait the
    // count would become -1, so the current thread is stuck here, waiting.
    dispatch_semaphore_wait(semaphore, DISPATCH_TIME_FOREVER);

    // Associate a concurrent task with the group
    dispatch_group_async(group, queue, ^{
        // Each of the first 10 asynchronous tasks prints its loop value i
        NSLog(@"%i", i);

        // then stalls, sleeping for 2 seconds
        sleep(2);

        // After the 2-second sleep the task signals, adding 1 to the count;
        // if the count before the +1 was below 1, a thread blocked at the
        // wait above is released and continues from there.
        dispatch_semaphore_signal(semaphore);
    });
}

// Wait for all tasks associated with the group to finish before moving on
dispatch_group_wait(group, DISPATCH_TIME_FOREVER);

dispatch_group_notify(group, queue, ^{
    NSLog(@"complete ......");   // runs once all the group's tasks have executed
});

NSLog(@"... finish ......");     // this line will print first

GCD semaphore example two:

/**

* Here is an example of what a semaphore can do; it comes from an interviewer's question.

* The problem: two asynchronous tasks are nested; how do you guarantee that the inner asynchronous task executes first?

* My analysis: the question does not say which kind of queue the tasks are taken from, so first suppose a serial queue.

* With a serial queue, FIFO means the task created first must execute first, and only then the next one.

* With two nested asynchronous tasks on a serial queue, the outer task necessarily executes first, because it is created first and enters the queue first, and the queue is FIFO.

* Conclusion: to nest two asynchronous tasks and guarantee that the inner one executes first, a concurrent queue is required.

*/

To solve this interview question I immediately thought of GCD semaphores; here is the solution I coded up on the spot:

// Start with a semaphore count of 0; when a wait would drive the count below 0, the current thread stops and waits
dispatch_semaphore_t dispatchSemaphore = dispatch_semaphore_create(0);

// This must be a concurrent queue, per the analysis above: with a serial queue
// the outer block would occupy the queue, the inner block could never start,
// and the wait below would deadlock. (The source listing mistakenly had
// DISPATCH_QUEUE_SERIAL here.)
dispatch_queue_t queue = dispatch_queue_create("Heyang", DISPATCH_QUEUE_CONCURRENT);

dispatch_async(queue, ^{

    dispatch_async(queue, ^{
        sleep(2);
        NSLog(@"--1--");
        NSLog(@"%@", [NSThread currentThread]);
        // Send a signal, adding 1 to the semaphore's count
        dispatch_semaphore_signal(dispatchSemaphore);
    });
    // Wait for the signal; the wait subtracts 1 from the count
    dispatch_semaphore_wait(dispatchSemaphore, DISPATCH_TIME_FOREVER);
    NSLog(@"--2--");
    NSLog(@"%@", [NSThread currentThread]);
});

If you understood semaphores above, this code is easy to follow.

A point where it is easy to go wrong when using semaphores:

Here is code from a time I made exactly that mistake:

- (void)viewDidLoad {
    [super viewDidLoad];

    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    dispatch_semaphore_t semaphore = dispatch_semaphore_create(0);

    dispatch_async(queue, ^{
        dispatch_async(queue, ^{
            // +1 -- this is the misplaced line discussed below
            dispatch_semaphore_signal(semaphore);
            NSLog(@"Hello-2");
        });
        // wait, -1
        dispatch_semaphore_wait(semaphore, DISPATCH_TIME_FOREVER);
        NSLog(@"Hello-1");
    });
}

The mistake: the signal line, dispatch_semaphore_signal(semaphore), should be placed after NSLog(@"Hello-2") (the original highlighted it in red, which is lost in this copy). With the erroneous placement above, the inner task sends its signal the moment it starts,

and since the semaphore governs the threads globally, that +1 means the outer wait can return at once, so neither thread is stalled any longer.

The two statements NSLog(@"Hello-2") and NSLog(@"Hello-1") then proceed to execute concurrently, and their order becomes random

instead of fixed.

To guarantee that the inner task's work executes first, the signal-sending line must be the last line of the inner task's block; that ensures everything before it has executed before the outer task proceeds.

6. The GCD timer

Code example:

(code screenshot not preserved in this copy)

Timer events are slightly different from other dispatch sources. They do not use the handle/mask parameters; instead, a separate function, dispatch_source_set_timer, configures the timer. It takes three parameters that control when the timer fires:

The start parameter controls when the timer first fires. Its type is dispatch_time_t, an opaque type that we cannot manipulate directly; we create values with the dispatch_time or dispatch_walltime functions. The constants DISPATCH_TIME_NOW and DISPATCH_TIME_FOREVER are also often useful.

The interval parameter needs no explanation: it is the repeat interval.

The leeway parameter is more interesting. It tells the system how precise the timer needs to be. No timer is guaranteed to be 100% accurate; this parameter tells the system how hard you want it to try. If you want a timer that fires every five seconds as precisely as possible, pass 0. For a recurring task such as checking email every 10 minutes, accuracy hardly matters, so you can pass 60 to tell the system that an error of 60 seconds is acceptable.

What is the point of this? Simply to reduce resource consumption. The system is more efficient if it can let the CPU sleep for longer stretches and handle a batch of work each time it wakes, instead of waking, sleeping, and waking again for every task. If you pass a larger leeway to your timer, you are allowing the system to delay it slightly so that your timer's work can be combined with other work.

Reproduced with the source noted: http://www.cnblogs.com/goodboy-heyang/p/5271513.html

Appendix: Very good GCD learning links

1. GCD itself has been open-sourced; address: http://libdispatch.macosforge.org

2. Tang Qiao's technical blog, "Using GCD": http://blog.devtang.com/2012/02/22/use-gcd/

3. A translation of "Grand Central Dispatch In-Depth: Part 1/2", from an excellent foreign iOS learning site: https://github.com/nixzhu/dev-blog

4. "GCD learning": http://www.henishuo.com/gcd-multiple-thread-learn/

5. "Experience and tips on using GCD" from tutuge.me: http://tutuge.me/2015/04/03/something-about-gcd/

6. YouXianMing's "Grand Central Dispatch (GCD) Reference": http://www.cnblogs.com/YouXianMing/p/3600763.html

7. "About iOS multithreading, reading me is enough": http://www.jianshu.com/p/0b0d9b1f1f19

8. "How to use GCD (Grand Central Dispatch)": http://www.jianshu.com/p/fbe6a654604c

9. "iOS GCD deadlock": http://www.brighttj.com/ios/ios-gcd-deadlock.html

10. A very classy, very good site: http://www.dreamingwish.com/article/gcdgrand-central-dispatch-jiao-cheng.html

