1, Swift continues to use the same threading technologies as Objective-C, including three multithreaded programming techniques:
(1) Thread
(2) Cocoa Operation (Operation and OperationQueue)
(3) Grand Central Dispatch (GCD)
2, This article focuses on Grand Central Dispatch (GCD)
GCD is a multi-core programming solution developed by Apple. Its basic concept is the dispatch queue: an object that accepts tasks and executes them in first-in, first-out order. A dispatch queue can be serial or concurrent. Under the hood GCD still uses threads, but we don't have to worry about those implementation details. Its advantages are as follows:
(1) Ease of use: GCD is easier to use than Thread. Its block-based design makes it very simple to pass context between different code scopes.
(2) Efficiency: GCD is lightweight and elegant, and in many places it is more practical and faster than creating dedicated threads, which consume resources.
(3) Performance: GCD automatically increases or decreases the number of threads based on system load, reducing context switches and improving computational efficiency.
(4) Safety: there is no need for locks or other synchronization mechanisms.
The code in this article has been updated to Swift 3.
3, Three ways to create a queue in GCD
(1) Create a queue yourself
The first parameter is the queue's label and can be any name; the second parameter indicates whether the queue executes tasks serially or concurrently. A serial queue executes only one task at a time. It is typically used for synchronized, sequential access, but we can create any number of serial queues, and those serial queues run concurrently with one another. A concurrent queue starts tasks in the order they were added, but multiple tasks can execute at the same time, so the order in which they finish is random (see the short sketch after the creation code below).
// Create a serial queue
let serial = DispatchQueue(label: "serialQueue1")
// Create a concurrent queue
let concurrent = DispatchQueue(label: "concurrentQueue1", attributes: .concurrent)
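To make the difference concrete, here is a minimal sketch (reusing the serial and concurrent queues created above): the serial queue prints its tasks strictly in submission order, while the concurrent queue's completion order may vary from run to run.
for i in 1...3 {
    serial.async {
        // Serial queue: tasks run one at a time, in the order submitted
        print("serial \(i)")
    }
}
for i in 1...3 {
    concurrent.async {
        // Concurrent queue: tasks may run simultaneously; completion order is random
        print("concurrent \(i)")
    }
}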
(2) Get the system's existing global queues
The global dispatch queues have 4 execution priorities: .userInitiated (high), .default (normal), .utility (low), and .background (very low; this priority is only for true background tasks that don't care when they finish).
let globalQueue = DispatchQueue.global(qos: .default)
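For reference, the four QoS levels listed above can be requested explicitly. This is just an illustrative sketch; the queue variable names are made up.
// The four execution priorities, from high to low
let userInitiatedQueue = DispatchQueue.global(qos: .userInitiated)  // high
let defaultQueue = DispatchQueue.global(qos: .default)              // normal
let utilityQueue = DispatchQueue.global(qos: .utility)              // low
let backgroundQueue = DispatchQueue.global(qos: .background)        // very low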
(3) Main dispatch queue, which runs on the main thread
As the name "main" suggests, this is the queue that executes on the main thread. Since there is only one main thread, it is naturally a serial queue. All UI-related operations must be performed on the main thread.
let mainQueue = DispatchQueue.main
4, Two ways to add a task to a queue
(1) async appends a block asynchronously (the async function does not wait for it to finish)
DispatchQueue.global(qos: .default).async {
    // Code block that performs a time-consuming operation...
    print("do work")
    // When the operation completes, have the main thread refresh the interface
    DispatchQueue.main.async {
        print("main refresh")
    }
}
(2) sync appends a block synchronously
In contrast to the above, the sync function waits: the appended block runs only after all tasks ahead of it in the queue have completed, and the calling thread does not continue until the appended block itself has finished executing.
// Add a synchronous code block to the global queue
// This will not deadlock, but it will wait for the code block to finish
DispatchQueue.global(qos: .default).sync {
    print("sync1")
}
print("end1")
// Add a synchronous code block to the main queue
// This will cause a deadlock:
// we add a task to the main queue, and because the call is synchronous, the main thread must wait
// for the added task to finish before it can continue. But the new task sits at the end of the queue,
// so it cannot run until the work ahead of it (including the code that is now waiting) finishes,
// which brings us back to the first step: the program hangs.
DispatchQueue.main.sync {
    print("sync2")
}
print("end2")
5, Pause or resume a queue
Both functions take effect asynchronously and only between blocks; they have no effect on a task that is already executing. After suspend(), tasks that have been appended to the dispatch queue but have not yet started are paused, and resume() allows those tasks to continue executing.
// Create a concurrent queue
let conQueue = DispatchQueue(label: "concurrentQueue1", attributes: .concurrent)
// Pause the queue
conQueue.suspend()
// Resume the queue
conQueue.resume()
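A slightly fuller sketch of the behavior described above (reusing conQueue): blocks appended while the queue is suspended do not start until resume() is called.
// Suspend the queue first, then append tasks: none of them starts yet
conQueue.suspend()
for i in 1...3 {
    conQueue.async {
        print("task \(i)")  // runs only after resume() is called
    }
}
print("queue suspended, no task has started yet")
// Resume the queue: the pending tasks now begin executing
conQueue.resume()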
6, Execute only once
In the past, the block passed to dispatch_once was executed only once for the lifetime of the application, even in a multithreaded environment, so it could be used to implement the singleton pattern safely, concisely, and conveniently.
// Execute a code block only once (the pre-Swift 3 dispatch_once API)
var predicate: dispatch_once_t = 0
dispatch_once(&predicate, { () -> Void in
    // Executed only once; can be used to create a singleton
    println("work")
})
In Swift 3, dispatch_once has been removed; we replace it with global or static variables and constants (or lazy properties), whose initialization runs only once.
private var once1: Void = {
    // Executed only once
    print("once1")
}()
private lazy var once2: String = {
    // Executed only once; can be used to create a singleton
    print("once2")
    return "once2"
}()
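Since the text above mentions that one-time initialization can be used for singletons, here is a common Swift 3 sketch (the DataManager name is just an example): a static constant is initialized lazily, thread-safely, and exactly once by the runtime.
// A singleton in Swift 3: the static constant is initialized only once
class DataManager {
    static let shared = DataManager()
    private init() {
        print("DataManager initialized")  // printed only once
    }
}
// Every access returns the same instance
let manager1 = DataManager.shared
let manager2 = DataManager.shared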
7, asyncAfter delayed execution
asyncAfter does not execute the task after the specified time; it appends the task to the queue after the specified time, so there can be a slight additional delay. Note that we cannot (directly) cancel code that has already been submitted with asyncAfter.
// Execute after a 2-second delay
DispatchQueue.global(qos: .default).asyncAfter(deadline: DispatchTime.now() + 2.0) {
    print("after!")
}
If we need to cancel a block that is waiting to be executed, we can first wrap the block in a DispatchWorkItem object and then call cancel() on it to cancel it while it is still waiting to execute.
// Wrap the operation to be executed in a DispatchWorkItem
let task = DispatchWorkItem { print("after!") }
// Execute after a 2-second delay
DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + 2, execute: task)
// Cancel the task
task.cancel()
8, Perform final processing after multiple tasks have all completed
async(group:): appends a block to a queue and associates it with a group, so that the completion of the whole set of blocks can be monitored, either synchronously or asynchronously.
notify(): used to gather the results once all tasks have finished; it does not block the current thread.
wait(): waits until all tasks have finished executing; it cannot be cancelled midway and it blocks the current thread.
// Get the system's global queue
let queue = DispatchQueue.global(qos: .default)
// Define a group
let group = DispatchGroup()
// Append tasks to the group; they may finish in any order
queue.async(group: group) {
    sleep(2)
    print("block1")
}
queue.async(group: group) {
    print("block2")
}
queue.async(group: group) {
    print("block3")
}
// 1. Runs after all tasks have finished; does not block the current thread
group.notify(queue: .global(), execute: {
    print("group done")
})
// 2. Waits until all tasks have finished; cannot be cancelled midway; blocks the current thread
group.wait()
print("All tasks executed")
9, concurrentPerform: append a block to a queue a specified number of times
The DispatchQueue.concurrentPerform function combines the behavior of the sync function and a dispatch group: it submits the specified block a specified number of times, running the iterations concurrently, and waits until all of them have finished executing. Because concurrentPerform, like sync, waits for processing to finish, it is recommended to call it from within an async block. concurrentPerform enables high-performance loop iteration.
// Get the system's global queue
let queue = DispatchQueue.global(qos: .default)
// Define an asynchronous code block
queue.async {
    // Use concurrentPerform to run the loop body 6 times concurrently
    DispatchQueue.concurrentPerform(iterations: 6) { (index) -> Void in
        print(index)
    }
    // When all iterations have finished, update on the main thread
    DispatchQueue.main.async {
        print("done")
    }
}
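As an example of the "high-performance loop iteration" use mentioned above, here is a small sketch (the values array is illustrative) that processes each element of an array concurrently by index:
// Process each element concurrently; the index parameter selects the element
let values = [10, 20, 30, 40, 50, 60]
DispatchQueue.concurrentPerform(iterations: values.count) { index in
    print("element \(index): \(values[index] * 2)")
}
// concurrentPerform returns only after every iteration has finished
print("all iterations finished")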
10, Semaphores
DispatchSemaphore(value:): creates a semaphore with the given initial count value; here we use 1.
semaphore.wait(): checks the semaphore; if the count is 1 it decrements it and execution continues, and if it is 0 the call waits.
semaphore.signal(): marks the end of the protected work by adding 1 to the count, allowing a waiting task to continue executing.
// Get the system's global queue
let queue = DispatchQueue.global(qos: .default)
// When tasks running in parallel update shared data, the results can differ from run to run
for i in 1...10 {
    queue.async {
        print("\(i)")
    }
}
// Use a semaphore to guarantee correctness
// Create a semaphore with an initial count value of 1
let semaphore = DispatchSemaphore(value: 1)
for i in 1...10 {
    queue.async {
        // Wait until the semaphore's count value is >= 1, then decrement it
        semaphore.wait()
        print("\(i)")
        // Signal, adding 1 back to the count value
        semaphore.signal()
    }
}
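To illustrate the "updates the data" case mentioned above, here is a minimal sketch (reusing queue from above; sharedNumbers and dataSemaphore are illustrative names) in which the semaphore guarantees that only one task mutates the shared array at a time:
// Shared data protected by a semaphore
var sharedNumbers: [Int] = []
let dataSemaphore = DispatchSemaphore(value: 1)
for i in 1...10 {
    queue.async {
        dataSemaphore.wait()       // wait for exclusive access (count goes from 1 to 0)
        sharedNumbers.append(i)    // only one task is in this section at a time
        dataSemaphore.signal()     // release access (count goes back to 1)
    }
}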
Original source: www.hangge.com. When reprinting, please keep the original link: http://www.hangge.com/blog/cache/detail_745.html