Accelerating image display on iOS devices with pre-rendering


In a recent UITableView example, I found that scrolling performance was generally good. But when scrolling back and forth, the first display of an image was not as smooth as its later displays; there was a slight pause before the image appeared.
So I guessed that the displayed images must have been cached, and after checking the documentation I found this was indeed the case.
Later, the article "Improving Image Drawing Performance on iOS" gave some hints: when an image is first displayed, decompression and resampling consume a lot of CPU time; if you pre-draw the image into a bitmap context and cache the resulting image, you can avoid this heavy work on later draws.

Then I wrote an example program to verify this:

//  ImageView.h

#import <UIKit/UIKit.h>

@interface ImageView : UIView {
    UIImage *image;
}

@property (retain, nonatomic) UIImage *image;

@end
//  ImageView.m

#include <mach/mach_time.h>
#import "ImageView.h"

@implementation ImageView

#define LABEL_TAG 1

static const CGRect imageRect = {{0, 0}, {100, 100}};
static const CGPoint imagePoint = {0, 0};

@synthesize image;

- (void)awakeFromNib {
    if (!self.image) {
        self.image = [UIImage imageNamed:@"random.jpg"];
    }
}

- (void)drawRect:(CGRect)rect {
    if (CGRectEqualToRect(rect, imageRect)) {
        uint64_t start = mach_absolute_time();
        [image drawAtPoint:imagePoint];
        uint64_t drawTime = mach_absolute_time() - start;
        NSString *text = [[NSString alloc] initWithFormat:@"%lld", drawTime];
        UILabel *label = (UILabel *)[self viewWithTag:LABEL_TAG];
        label.text = text;
        [text release];
    }
}

- (void)dealloc {
    [image release];
    [super dealloc];
}

@end

I won't list the controller code; it simply updates the view when a button is tapped (by calling [self.view setNeedsDisplayInRect:imageRect]), which draws the image and shows the elapsed time in the label.
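For completeness, here is roughly what such a controller might look like (the class name and action name below are my own placeholders, not code from the original project):

#import <UIKit/UIKit.h>

// Hypothetical controller for this example; only the button action matters.
@interface ImageViewController : UIViewController
- (IBAction)redraw:(id)sender;
@end

@implementation ImageViewController

- (IBAction)redraw:(id)sender {
    // Invalidate just the image area (the same rect as imageRect in ImageView.m);
    // drawRect: then redraws the image and updates the time label.
    [self.view setNeedsDisplayInRect:CGRectMake(0, 0, 100, 100)];
}

@end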
It is worth mentioning that the clock() function gives microsecond-level resolution in the simulator, but only about 10-millisecond resolution on an actual iOS device. So I used mach_absolute_time() instead, which has nanosecond precision on both Mac and iOS devices.

The test image was a 200x200-pixel JPEG named with the @2x suffix. On an iPhone 4 it took about 300 microseconds to draw the first time, and about 65 microseconds on later draws.

Now it's time to witness the miracle. Add this code to the program:

static const CGSize imageSize = {100, 100};

- (void)awakeFromNib {
    if (!self.image) {
        self.image = [UIImage imageNamed:@"random.jpg"];
        if (NULL != UIGraphicsBeginImageContextWithOptions)
            UIGraphicsBeginImageContextWithOptions(imageSize, YES, 0);
        else
            UIGraphicsBeginImageContext(imageSize);
        [image drawInRect:imageRect];
        self.image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    }
}


Here you need to check whether UIGraphicsBeginImageContextWithOptions is NULL, because that function was only added in iOS 4.0.
Because the JPEG image is opaque, the second parameter is set to YES.
The third parameter is the scale: 2.0 on the iPhone 4 and 1.0 on other devices. Although it can be obtained via [UIScreen mainScreen].scale, passing 0 lets the system fill in the correct scale automatically.
It is also worth mentioning that images themselves have a scale: a normal image is 1.0 (apart from [UIImage imageNamed:], most of the API returns only images at this scale, unchanged), while a high-resolution (@2x) image is 2.0. How image points map to screen pixels depends on both scales; for example, one point of a normal image covers 4 pixels on a Retina display, while an @2x image maps 1:1.
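As a quick illustration of these scale values (a hedged sketch; the log statement is mine, not from the original article):

CGFloat screenScale = [UIScreen mainScreen].scale;   // 2.0 on iPhone 4, 1.0 on earlier devices
UIImage *hd = [UIImage imageNamed:@"random.jpg"];    // picks up random@2x.jpg on a Retina screen, so hd.scale == 2.0
NSLog(@"screen scale %.1f, image scale %.1f, size %@",
      screenScale, hd.scale, NSStringFromCGSize(hd.size));   // size is in points: 100x100 for a 200x200 @2x image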

The subsequent drawInRect: call draws the image into the current image context, which performs the decompression and resampling right there. We then get a new image back from the image context, and its scale correctly matches the device.
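The same pre-rendering step can be pulled out into a small helper. The sketch below follows the approach described above; the function name decodedImage is mine, not from the original article:

static UIImage *decodedImage(UIImage *original) {
    CGSize size = original.size;   // in points; passing scale 0 below keeps the pixel dimensions correct
    if (NULL != UIGraphicsBeginImageContextWithOptions)
        UIGraphicsBeginImageContextWithOptions(size, YES, 0);   // opaque, device scale
    else
        UIGraphicsBeginImageContext(size);                      // fallback before iOS 4.0
    [original drawInRect:CGRectMake(0, 0, size.width, size.height)];   // forces decompression and resampling now
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();     // autoreleased; scale matches the device
    UIGraphicsEndImageContext();
    return result;
}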

Tapping the button again shows the first draw now takes about 12 microseconds, and later draws stabilize at about 15 microseconds.

Can it be faster? Let's try Core Graphics.
First define a global CGImageRef variable:

static CGImageRef imageRef;

Then set its value in awakeFromNib:

imageRef = self.image.CGImage;

Finally, do the drawing in drawRect:

CGContextRef context = UIGraphicsGetCurrentContext();
CGContextDrawImage(context, imageRect, imageRef);

Running it, I found the time actually increased to about 33 microseconds, and the image is displayed upside down!

The reason is that the y-axes of UIKit and Core Graphics point in opposite directions; adding two lines of code fixes it:

CGContextRef context = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(context, 0, 100);
CGContextScaleCTM(context, 1, -1);
CGContextDrawImage(context, imageRect, imageRef);

With that, the image finally displays correctly, and the time drops to about 14 microseconds. The improvement is minor, so it seems that drawing directly with -drawAtPoint: or -drawInRect: is already good enough.

Of course, in a real app the proper place for this is viewDidLoad or loadView, but since I did not bother to list the controller code, I put it in awakeFromNib.
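For reference, a hedged sketch of doing the pre-render in the controller instead, continuing the hypothetical ImageViewController from the controller sketch above and assuming it has an imageView outlet of type ImageView:

// In the hypothetical ImageViewController, assuming an ImageView *imageView outlet.
- (void)viewDidLoad {
    [super viewDidLoad];
    UIImage *original = [UIImage imageNamed:@"random.jpg"];
    if (NULL != UIGraphicsBeginImageContextWithOptions)
        UIGraphicsBeginImageContextWithOptions(CGSizeMake(100, 100), YES, 0);
    else
        UIGraphicsBeginImageContext(CGSizeMake(100, 100));
    [original drawInRect:CGRectMake(0, 0, 100, 100)];   // pre-render once, up front
    self.imageView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}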


September 22, 2011 update (errata):
I just saw the "Mach Absolute Time Units" Q&A and found that the unit of mach_absolute_time() is Mach absolute time units, not nanoseconds. The conversion factor between the two is CPU dependent, not a constant.
The simplest approach is to convert with the AbsoluteToNanoseconds and AbsoluteToDuration functions from the CoreServices framework. You can also use the mach_timebase_info() function to obtain the ratio yourself.
On the iPhone 4 I measured numer = 125 and denom = 3, a ratio of about 42, so the times quoted in this article need to be multiplied by roughly 42.
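A small sketch of that conversion using mach_timebase_info() (the helper name is mine; the numer/denom values in the comment are the ones measured above):

#include <mach/mach_time.h>

static uint64_t NanosecondsFromMachTime(uint64_t machElapsed) {
    static mach_timebase_info_data_t timebase;
    if (timebase.denom == 0) {
        mach_timebase_info(&timebase);   // e.g. numer = 125, denom = 3 on the iPhone 4
    }
    return machElapsed * timebase.numer / timebase.denom;   // Mach absolute time units -> nanoseconds
}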
