iOS Multithreading: Dispatch Queues

Source: Internet
Author: User
Tags: gcd

Reprinted from: http://blog.sina.com.cn/s/blog_6dce99b10101atsu.html (with respect to the original author).

This article gives a detailed overview of dispatch queues, the concurrency mechanism used in iOS application development. Almost any task that can be done with threads can be done with dispatch queues, and queue-based code is simpler, easier to use, and more efficient than thread code. The sections below summarize dispatch queues and how to use them to perform tasks in an application.

1. About Dispatch Queues

All dispatch queues are FIFO queues, so tasks start executing in the same order in which they were added to the queue. GCD automatically provides several dispatch queues, and we can also create new ones for specific purposes.

The following lists the types of dispatch queues available and how to use them.

(1) Serial queues, also known as private dispatch queues, are typically used to synchronize access to a specific resource. We can create as many serial queues as we need; each serial queue executes only one task at a time, but different serial queues run concurrently with respect to one another.

(2) Concurrent queues, also known as global dispatch queues, can execute multiple tasks in parallel, although tasks still start executing in the order in which they joined the queue. We cannot create concurrent dispatch queues ourselves; there are only three global concurrent queues available.

(3) The main dispatch queue is a globally available serial queue that executes tasks on the application's main thread. Its tasks are interleaved with the other event sources attached to the application's main run loop. Because it runs on the application's main thread, the main queue is often used as a synchronization point for the application.

2. Queue-Related Technologies

In addition to dispatch queues, GCD provides several useful technologies to help us manage our code:

    Dispatch groups, dispatch semaphores, and dispatch sources

3. Implementing Tasks with Blocks

Block objects are a C-language feature that can be used in C, C++, and Objective-C. Although a block is somewhat similar to a function pointer, it is actually represented by an underlying data structure, similar to an object, that the compiler creates and manages.

One advantage of blocks is that they can use variables from outside their own scope: for example, a block can read the value of a variable from its parent scope, which is copied into the block's heap data structure. When a block is added to a dispatch queue, these captured values are usually read-only.

The declaration of a block is similar to that of a function pointer, except that the * is replaced by ^; we can pass parameters to a block and receive its return value.

4. Creating and Managing Dispatch Queues

(1) Getting the global concurrent dispatch queues

The system provides three concurrent dispatch queues for each application. These queues are global and differ only in priority. Because they are global, we do not need to create them; we simply get a queue with the function dispatch_get_global_queue, as follows:

    dispatch_queue_t aQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

In addition to the default concurrent queue, you can pass DISPATCH_QUEUE_PRIORITY_HIGH or DISPATCH_QUEUE_PRIORITY_LOW as the first parameter to get the high- or low-priority queue. (The second parameter is reserved for future expansion.)

Although dispatch queues are reference-counted objects, these queues are global, so we do not need to retain or release them; we simply call dispatch_get_global_queue whenever we need one.

(2) Creating a serial dispatch queue

Serial queues are useful when you want tasks to execute in a particular order. A serial queue performs only one task at a time. We can use serial queues instead of locks to protect shared data. Unlike locks, a serial queue guarantees that the task executes in a predictable order.

Unlike concurrent queues, we create and manage serial queues ourselves, and we can create any number of them. When we create a serial queue, it should be for a specific purpose, such as protecting a resource or synchronizing some key behavior of the application.

The following code shows how to create a custom serial queue. The function dispatch_queue_create takes two parameters: the queue's name and the queue's attributes. The queue name, which the debugger and performance tools display, helps us track how tasks are executing; the attributes parameter is reserved for future use and should be NULL.

    dispatch_queue_t queue;
    queue = dispatch_queue_create("com.example.MyQueue", NULL);

In addition to the custom queues we create ourselves, the system automatically creates a serial queue and binds it to the application's main thread. The next section shows how to obtain it.

(3) Getting the common queues at run time

GCD provides several functions that let us easily access the common dispatch queues:

Use dispatch_get_current_queue, for debugging or testing, to get the identity of the current queue.

Use dispatch_get_main_queue to get the serial dispatch queue bound to the application's main thread.

(4) Memory management of dispatch queues

Dispatch queues are reference-counted objects; when we create a serial dispatch queue, we are responsible for releasing it. Use the functions dispatch_retain and dispatch_release to increment or decrement the reference count.

(5) Storing custom context information with a queue

All dispatch objects allow us to associate custom context data with them, using the functions dispatch_set_context and dispatch_get_context. The system does not use our custom data in any way; we allocate and release it at the appropriate times.

For queues, context data is typically used to store a pointer to an object or some other data structure; we can release the context in the queue's finalizer function. An example is given below.

(6) Providing a clean-up function for a queue

After creating a serial dispatch queue, we can attach a finalizer function to it to clean up data associated with the queue. Use the dispatch_set_finalizer_f function to set a function that is called automatically when the queue's reference count reaches zero. The finalizer cleans up the context data associated with the queue, and it is called only when the context pointer is not NULL.

The listing below shows a custom finalizer function and a function that creates a queue and installs that finalizer. The queue uses the finalizer function to release the data stored in the queue's context pointer. (The myInitializeDataContextFunction and myCleanupDataContextFunction functions referenced from the code are custom functions that you would provide to initialize and clean up the contents of the data structure itself.) The context pointer passed to the finalizer function contains the data object associated with the queue.

    void myFinalizerFunction(void *context)
    {
        MyDataContext *theData = (MyDataContext *)context;

        // Clean up the contents of the structure
        myCleanupDataContextFunction(theData);

        // Now release the structure itself.
        free(theData);
    }

    dispatch_queue_t createMyQueue()
    {
        MyDataContext *data = (MyDataContext *)malloc(sizeof(MyDataContext));
        myInitializeDataContextFunction(data);

        // Create the queue and set the context data.
        dispatch_queue_t serialQueue = dispatch_queue_create("com.example.CriticalTaskQueue", NULL);
        if (serialQueue)
        {
            dispatch_set_context(serialQueue, data);
            dispatch_set_finalizer_f(serialQueue, &myFinalizerFunction);
        }

        return serialQueue;
    }

5. Adding Tasks to a Queue

(1) There are two ways to add a task to a queue: synchronously or asynchronously. Prefer the dispatch_async and dispatch_async_f functions over their synchronous counterparts wherever possible. When we add a block object or function to a queue asynchronously, we have no way of knowing when that code will execute.

Adding tasks asynchronously this way does not block the calling thread.

Although asynchronous submission is preferred, in some cases adding a task synchronously can prevent synchronization errors. The functions dispatch_sync and dispatch_sync_f add a task synchronously: they block the calling thread until the specified task finishes executing.

(2) Executing a completion block when a task finishes

When a task completes, the application needs to be notified so it can incorporate the results. In traditional asynchronous programming we might use a callback function; with dispatch queues, we use a completion block.

    void average_async(int *data, size_t len,
                       dispatch_queue_t queue, void (^block)(int))
    {
        // Retain the queue provided by the user to make
        // sure it does not disappear before the completion
        // block can be called.
        dispatch_retain(queue);

        // Do the work on the default concurrent queue and then
        // call the user-provided block with the results.
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            int avg = average(data, len);
            dispatch_async(queue, ^{ block(avg); });

            // Release the user-provided queue when done.
            dispatch_release(queue);
        });
    }

(3) Performing loop iterations concurrently

For a for loop whose iterations do not affect one another, the iterations can be executed concurrently using the function dispatch_apply or dispatch_apply_f.

Like a normal loop, dispatch_apply and dispatch_apply_f do not return until all loop iterations are complete.

For example:

    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_apply(count, queue, ^(size_t i) {
        printf("%u\n", (unsigned int)i);
    });

(4) Performing tasks on the main thread

We can get the main thread's dispatch queue by calling the function dispatch_get_main_queue.

Summary: this has been a detailed look at concurrent dispatch queues in iOS application development. I hope this overview proves helpful!

