GCD Introduction (I): Basic Concepts and Dispatch Queues


What is GCD?

Grand Central Dispatch, or GCD, is a low-level API that provides a new way to write concurrent programs. In spirit, GCD resembles NSOperationQueue: both let a program split work into individual tasks and submit them to queues for concurrent or serial execution. GCD is more efficient than NSOperationQueue, and it is not part of the Cocoa framework.

Beyond parallel code execution, GCD also provides a tightly integrated event-handling system. You can install handlers that respond to file descriptors, Mach ports (used for inter-process communication on OS X), processes, timers, signals, and user-generated events, and these handlers execute concurrently through GCD.

The GCD API is largely block-based. GCD can also be used without blocks, through the traditional C mechanism of function pointers plus context pointers, but in practice GCD is far simpler to use, and shows its full power, when used with blocks.

You can run the "man dispatch" command on a Mac to read the GCD documentation.

Why?

GCD provides many advantages over traditional multi-threaded programming:

  1. Ease of use: GCD is easier to use than raw threads. Because GCD is based on work units rather than on threads of execution, it can handle things like waiting for a task to finish, monitoring file descriptors, executing code periodically, and suspending work on your behalf. Block-based closures make it trivially easy to pass context between different scopes of code.
  2. Efficiency: GCD is implemented so lightly and elegantly that in many places it is more practical and faster than creating dedicated, resource-consuming threads. This ties into ease of use: part of what makes GCD easy to use is that you can mostly just use it without worrying much about efficiency.
  3. Performance: GCD automatically scales the number of threads up and down according to system load, which reduces context switching and increases computational efficiency.

Dispatch objects

Although GCD is written in pure C, it is organized in an object-oriented style. A GCD object is called a dispatch object. Dispatch objects are reference-counted like Cocoa objects; use the dispatch_retain and dispatch_release functions to manipulate a dispatch object's reference count for memory management. Unlike Cocoa objects, however, dispatch objects do not participate in the garbage collection system, so even with GC enabled you must manage the memory of GCD objects manually.
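
Since dispatch objects are reference-counted, retains and releases must balance just as with Cocoa objects. A minimal sketch (the queue label here is hypothetical):

    dispatch_queue_t queue = dispatch_queue_create("com.example.work", NULL);
    dispatch_retain(queue);    // reference count +1
    dispatch_release(queue);   // reference count -1; queue still alive
    dispatch_release(queue);   // balances the create; the queue can now be destroyed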

Dispatch queues and dispatch sources (which will be introduced later) can be suspended and resumed, can have an associated context pointer, and can have an associated finalizer function that is called on completion. For more information about these features, see man dispatch_object.
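
As a sketch of suspension (queue label hypothetical), blocks submitted while a queue is suspended simply wait until it is resumed:

    dispatch_queue_t queue = dispatch_queue_create("com.example.suspendable", NULL);
    dispatch_suspend(queue);                          // newly submitted blocks will not start
    dispatch_async(queue, ^{ NSLog(@"deferred"); });  // queued, but held back
    dispatch_resume(queue);                           // the pending block can now run

Note that suspending a queue does not interrupt a block that is already running.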

Dispatch queues

The fundamental concept in GCD is the dispatch queue. A dispatch queue is an object that accepts tasks and runs them in first-in, first-out order. Dispatch queues can be concurrent or serial. Concurrent queues run tasks in parallel based on system load, much as NSOperationQueue does; serial queues execute only one task at a time.

There are three queue types in GCD:

  1. The main queue: This queue is tied to the main thread; in fact, tasks submitted to the main queue execute on the main thread. The main queue is obtained by calling dispatch_get_main_queue. Because it is tied to the main thread, the main queue is a serial queue.
  2. Global queues: The global queues are concurrent queues shared by the entire process. The process has three of them: high, default, and low priority. You access them by calling dispatch_get_global_queue with the desired priority.
  3. User queues: User queues (GCD does not use this term, but there is no better name for them, so we call them user queues) are queues created with the function dispatch_queue_create. These queues are serial. Because of that, they can be used as a synchronization mechanism, a bit like a mutex in traditional threading.
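
For example, the main queue and the three global queues can be obtained like this (variable names are illustrative):

    dispatch_queue_t mainQueue    = dispatch_get_main_queue();
    dispatch_queue_t highQueue    = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
    dispatch_queue_t defaultQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_queue_t lowQueue     = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0);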

Create a queue

To use a user queue, we must first create one with the dispatch_queue_create function. The first parameter is a label, used purely for debugging. Apple recommends naming queues with a reverse-DNS style, such as "com.dreamingwish.subsystem.task". These names show up in crash logs and can be queried by the debugger, which is very helpful when debugging. The second parameter is not currently used; just pass NULL.
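
Putting that together, creating and later releasing a user queue looks like this (using the label from the text; the comment marks where your own work would go):

    dispatch_queue_t queue = dispatch_queue_create("com.dreamingwish.subsystem.task", NULL);
    // ... submit work to the queue ...
    dispatch_release(queue);   // balance the create once the queue is no longer needed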

Submit a job

Submitting a job to a queue is easy: call the dispatch_async function, passing it a queue and a block. The queue will run the block's code when that block's turn comes. The following example shows how to run a long task in the background:

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self goDoSomethingLongAndInvolved];
        NSLog(@"Done doing something long and involved");
    });

The dispatch_async function returns immediately, and the block executes asynchronously in the background.

Of course, in real code the NSLog message is not usually the whole job. In a typical Cocoa program you will want to update the interface when the task completes, which means running some code on the main thread. This is easy to accomplish with a nested dispatch: run the background task in the outer block, and dispatch the interface update to the main queue in the inner block:

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self goDoSomethingLongAndInvolved];
        dispatch_async(dispatch_get_main_queue(), ^{
            [textField setStringValue:@"Done doing something long and involved"];
        });
    });

There is also a function called dispatch_sync. It does the same thing as dispatch_async, except that it waits for the code in the block to finish executing before returning. Combined with the __block type qualifier, it can be used to fetch a value out of the executing block. For example, code running in the background may need to get a value from an interface control, which must happen on the main thread; dispatch_sync makes this simple:

    __block NSString *stringValue;
    dispatch_sync(dispatch_get_main_queue(), ^{
        // __block variables aren't automatically retained
        // so we'd better make sure we have a reference we can keep
        stringValue = [[textField stringValue] copy];
    });
    [stringValue autorelease];
    // use stringValue in the background now

We can also accomplish this in a better, more "asynchronous" style. Instead of blocking the background thread while it fetches a value from the interface layer, we can use nested blocks: hop onto the main thread to grab the value, then submit the follow-up processing back to the background queue:

    dispatch_queue_t bgQueue = myQueue;
    dispatch_async(dispatch_get_main_queue(), ^{
        NSString *stringValue = [[[textField stringValue] copy] autorelease];
        dispatch_async(bgQueue, ^{
            // use stringValue in the background now
        });
    });

Depending on your needs, myQueue can be a user queue or a global queue.

 

No more locks

User queues can be used in place of locks for synchronization. In traditional multi-threaded programming, you might have an object that is used from multiple threads and a lock to protect it:

    NSLock *lock;

The access code will look like this:

    - (id)something
    {
        id localSomething;
        [lock lock];
        localSomething = [[something retain] autorelease];
        [lock unlock];
        return localSomething;
    }

    - (void)setSomething:(id)newSomething
    {
        [lock lock];
        if(newSomething != something)
        {
            [something release];
            something = [newSomething retain];
            [self updateSomethingCaches];
        }
        [lock unlock];
    }

With GCD, you can use a queue instead:

    dispatch_queue_t queue;

To use the queue for synchronization it must be a user queue, not a global queue, so initialize one with dispatch_queue_create. You can then wrap the code that accesses the shared data in dispatch_async or dispatch_sync:

    - (id)something
    {
        __block id localSomething;
        dispatch_sync(queue, ^{
            localSomething = [something retain];
        });
        return [localSomething autorelease];
    }

    - (void)setSomething:(id)newSomething
    {
        dispatch_async(queue, ^{
            if(newSomething != something)
            {
                [something release];
                something = [newSomething retain];
                [self updateSomethingCaches];
            }
        });
    }

It is worth noting that dispatch queues are very lightweight, so you can create them liberally, just as you created locks before.

At this point you may be asking: "That's nice, but so what? I just changed some code around to do the same thing."

In fact, there are several advantages to using GCD:

  1. Parallelism: Notice how, in the second version of the code, -setSomething: uses dispatch_async. A call to -setSomething: returns immediately, and the work is carried out in the background. This is a big win if updateSomethingCaches is an expensive operation and the caller is about to do processor-intensive work of its own.
  2. Safety: With GCD, we cannot accidentally write code with unbalanced locking. In ordinary lock-based code, a path might return before unlocking. With GCD, the queue simply keeps running and control always returns cleanly.
  3. Control: With GCD, we can suspend and resume dispatch queues, which lock-based approaches cannot do. We can also retarget a user queue at another dispatch queue so that it inherits that queue's attributes. Queue priority can be adjusted this way, by targeting the queue at a different global queue, and the queue can even be made to execute code on the main thread if necessary.
  4. Integration: GCD's event system integrates with dispatch queues. Any events or timers an object needs can be targeted at that object's queue, so that their handlers automatically run on the queue and are thereby automatically synchronized with the object.
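
The retargeting mentioned in point 3 is done with dispatch_set_target_queue. A minimal sketch (queue label hypothetical):

    dispatch_queue_t queue = dispatch_queue_create("com.example.lowpriority", NULL);
    dispatch_set_target_queue(queue, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0));
    // blocks submitted to queue now run at low priority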

Summary

You now know the basic concepts of GCD: how to create a dispatch queue, how to submit jobs to a dispatch queue, and how to use a queue for thread synchronization. Next I will show you how to use GCD to write parallel code that takes full advantage of multi-core systems, and I will discuss more of GCD, including its event system and queue targeting.
