Explanation of Concurrent Dispatch Queues for iOS App Development

Source: Internet
Author: User

This document introduces how almost all of the work you would normally perform on threads can instead be performed with dispatch queues. Dispatch queues are simpler, easier to use, and more efficient than hand-written thread code. The following describes how to use dispatch queues to execute tasks in an application.

1. About dispatch queues

All dispatch queues are first-in, first-out (FIFO) queues, so tasks start in the same order in which they were added to the queue. GCD automatically provides some dispatch queues for us, and we can also create new ones for specific purposes.

The following lists the available types of dispatch queues and how they are used.

1) A serial queue, also known as a private dispatch queue, executes one task at a time and is generally used to synchronize access to a specific resource. We can create as many serial queues as we need, and each serial queue runs concurrently with respect to the others.

2) A concurrent queue, also known as a global dispatch queue, can run multiple tasks at the same time; however, tasks still start in the order in which they were added to the queue. We cannot create concurrent dispatch queues ourselves; the system provides only three global concurrent queues.

3) The main dispatch queue is a globally available serial queue that executes tasks on the application's main thread. Tasks in this queue are interleaved with the application's main run loop. Because it runs on the main thread, the main queue is often used as a synchronization point for the application.

2. Queue Technologies

In addition to dispatch queues, GCD provides several useful technologies that help us manage our code: dispatch groups, dispatch semaphores, and dispatch sources.
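As a minimal sketch (not from the original article), a dispatch group lets us submit several independent tasks and then wait until all of them have completed; the printed messages are just placeholders:

    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_group_t group = dispatch_group_create();

    // Submit two independent tasks as members of the group.
    dispatch_group_async(group, queue, ^{ printf("task 1 done\n"); });
    dispatch_group_async(group, queue, ^{ printf("task 2 done\n"); });

    // Block the current thread until every task in the group has finished.
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
    printf("all tasks done\n");

    dispatch_release(group);   // matches the manual retain/release model used elsewhere in this article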

3. Use blocks to implement tasks

Block objects are a C-language feature that can be used in C, C++, and Objective-C code. Although a block is similar to a function pointer, it is actually backed by an underlying data structure that resembles an object and is created and managed by the compiler.

One advantage of blocks is that they can use variables from outside their own scope. For example, a block can read the value of a variable defined in its enclosing scope; that value is copied into the block's data structure on the heap. When a block is added to a dispatch queue, these captured values are generally treated as read-only.

Declaring a block is similar to declaring a function pointer, except that the * is replaced with a ^. We can pass parameters to a block and receive a return value from it.
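As a small illustration (the names myBlock and multiplier are just placeholders), a block that takes an int and returns an int can be declared and called like this:

    int multiplier = 7;                      // captured by the block below and read-only inside it
    int (^myBlock)(int) = ^(int num) {       // the ^ takes the place of the * in a function pointer
        return num * multiplier;
    };
    printf("%d\n", myBlock(3));              // prints 21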

4. Create and manage dispatch queues

1) Obtain a global concurrent dispatch queue

The system provides three concurrent dispatch queues for each application. These queues are global and differ only in priority. Because they are global, we do not need to create them; we simply call the dispatch_get_global_queue function to get one, as shown below:

    dispatch_queue_t aQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

In addition to the default concurrent queue, you can pass DISPATCH_QUEUE_PRIORITY_HIGH or DISPATCH_QUEUE_PRIORITY_LOW as the first argument to get a high-priority or low-priority queue. (The second parameter is reserved for future expansion.)
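A minimal sketch of obtaining all three global queues (the variable names are just placeholders):

    dispatch_queue_t highQueue    = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
    dispatch_queue_t defaultQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_queue_t lowQueue     = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0);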

Although a dispatch queue is a reference-counted object, the global queues do not need to be retained or released; we can simply call dispatch_get_global_queue whenever we need one.

2) Create a serial dispatch queue

A serial queue is useful when you want tasks to execute in a specific order. A serial queue executes only one task at a time, so we can use serial queues instead of locks to protect shared data; unlike locks, a serial queue also ensures that tasks execute in a predictable order.

Unlike the concurrent queues, we need to create and manage serial queues ourselves. We can create any number of serial queues, but each one should be created for a specific purpose, such as protecting a resource or synchronizing some key behavior of the application.

The following code shows how to create a custom serial queue. The dispatch_queue_create function takes two parameters: a queue name and queue attributes. The debugger and performance tools display the queue name to help us track how tasks are being executed. The attributes parameter is reserved for future use and should be NULL.

    dispatch_queue_t queue;
    queue = dispatch_queue_create("com.example.MyQueue", NULL);
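As a usage sketch (not from the original article), continuing with the queue created above, tasks submitted to the serial queue touch the shared data one at a time and in order; the counter is purely illustrative:

    __block int counter = 0;   // shared data; only blocks running on this serial queue touch it

    dispatch_async(queue, ^{ counter += 1; });
    dispatch_async(queue, ^{ counter += 1; });
    dispatch_async(queue, ^{ printf("counter = %d\n", counter); });   // prints 2: the tasks ran one at a time, in order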

In addition to the custom queues we create ourselves, the system automatically creates a serial queue bound to the application's main thread. The following describes how to obtain it.

3) Obtain common queues at run time

GCD provides several functions that let us easily access the common dispatch queues:

Use the dispatch_get_current_queue function, for debugging or testing purposes, to obtain the queue on which the current code is executing.

Use the dispatch_get_main_queue function to obtain the serial dispatch queue associated with the application's main thread.
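A minimal sketch of these two calls (the variable names are just placeholders):

    dispatch_queue_t mainQueue    = dispatch_get_main_queue();
    dispatch_queue_t currentQueue = dispatch_get_current_queue();   // intended for debugging and testing only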

4) Memory management of dispatch queues

Dispatch queues are reference-counted objects. When we create a serial dispatch queue, we are responsible for releasing it. The dispatch_retain and dispatch_release functions increase and decrease the reference count, respectively.
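A minimal sketch of this manual reference counting (the queue label is just a placeholder):

    dispatch_queue_t queue = dispatch_queue_create("com.example.RefCountedQueue", NULL);

    dispatch_retain(queue);    // take an extra reference, e.g. before handing the queue to other code
    // ... use the queue ...
    dispatch_release(queue);   // balance the extra retain
    dispatch_release(queue);   // balance the create; the queue can be destroyed once its tasks finish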

5) Store custom context information in a queue

All dispatch objects allow us to associate custom context data with them, using the dispatch_set_context and dispatch_get_context functions. The system never touches this custom data; we are responsible for allocating and releasing it at the appropriate times.

For queues, the context data is usually a pointer to an object or some other data structure. We can release the context data in the queue's finalizer function; the example in the next section shows how.

6) Provide a cleanup function for the queue

After creating a serial dispatch queue, we can attach a finalizer function to it to clean up any data stored with the queue. We use the dispatch_set_finalizer_f function to set the finalizer, which is called automatically when the queue's reference count drops to zero, and only if the queue's context pointer is not NULL. Use this function to clean up the context data associated with the queue.

 
 
The following example shows a custom finalizer function and a function that creates a queue and installs that finalizer. The queue uses the finalizer function to release the data stored in the queue's context pointer. (The myInitializeDataContextFunction and myCleanUpDataContextFunction functions referenced from the code are custom functions that you would provide to initialize and clean up the contents of the data structure itself.) The context pointer passed to the finalizer function contains the data object associated with the queue.
 
 
    void myFinalizerFunction(void *context)
    {
        MyDataContext *theData = (MyDataContext *)context;
        // Clean up the contents of the structure
        myCleanUpDataContextFunction(theData);
        // Now release the structure itself.
        free(theData);
    }

    dispatch_queue_t createMyQueue()
    {
        MyDataContext *data = (MyDataContext *)malloc(sizeof(MyDataContext));
        myInitializeDataContextFunction(data);

        // Create the queue and set the context data.
        dispatch_queue_t serialQueue = dispatch_queue_create("com.example.CriticalTaskQueue", NULL);
        if (serialQueue)
        {
            dispatch_set_context(serialQueue, data);
            dispatch_set_finalizer_f(serialQueue, &myFinalizerFunction);
        }

        return serialQueue;
    }

5. Add a task to the queue

1) There are two ways to add a task to a queue: synchronously or asynchronously. Whenever possible, asynchronous execution with the dispatch_async and dispatch_async_f functions is preferred over synchronous execution. When we add a block object or function to a queue, we have no way of knowing exactly when that code will execute.

Adding tasks asynchronously in this way does not block the main thread.

In a few cases, however, adding a task synchronously can help prevent certain synchronization errors. Synchronous submission uses the dispatch_sync and dispatch_sync_f functions, which block the current thread until the specified task has finished executing.

The following is an example of code:
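(The sketch below is illustrative rather than from the original article; the queue label and the printed messages are placeholders.)

    dispatch_queue_t myQueue = dispatch_queue_create("com.example.MyQueue", NULL);

    // Asynchronous: dispatch_async returns immediately; the block runs later on myQueue.
    dispatch_async(myQueue, ^{
        printf("asynchronous task\n");
    });

    // Synchronous: dispatch_sync blocks the calling thread until the block has finished.
    dispatch_sync(myQueue, ^{
        printf("synchronous task\n");
    });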

2) Execute a completion block when a task finishes

When a task finishes, the application usually needs to be notified so it can incorporate the results. In traditional asynchronous programming this is done with a callback function; with dispatch queues, we use a completion block.

 
 
    void average_async(int *data, size_t len,
                       dispatch_queue_t queue, void (^block)(int))
    {
        // Retain the queue provided by the user to make
        // sure it does not disappear before the completion
        // block can be called.
        dispatch_retain(queue);

        // Do the work on the default concurrent queue and then
        // call the user-provided block with the results.
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            int avg = average(data, len);
            dispatch_async(queue, ^{ block(avg); });

            // Release the user-provided queue when done
            dispatch_release(queue);
        });
    }

3) Perform loop iterations concurrently

For a for loop whose iterations are independent of one another, the iterations can be executed concurrently using the dispatch_apply or dispatch_apply_f function.

Like a normal loop, dispatch_apply and dispatch_apply_f do not return until all of the loop iterations are complete.

The following code shows an example:

    size_t count = 10;   // the number of iterations to perform (any value you need)
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_apply(count, queue, ^(size_t i) {
        printf("%zu\n", i);
    });

4) Execute tasks on the main thread

We can call the dispatch_get_main_queue function to obtain the dispatch queue associated with the application's main thread and then submit tasks to it like any other queue.
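A minimal sketch (not from the original article; the printed message stands in for whatever main-thread work, such as a UI update, you need to do):

    dispatch_async(dispatch_get_main_queue(), ^{
        // Update the user interface or other main-thread-only state here.
        printf("running on the main thread\n");
    });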

Summary: I hope this article on concurrent dispatch queues in iOS development helps you!
