iOS Development: Performance Optimization for MKMapView

Our most recent project is an LBS (location-based service) feature whose core is member positioning. Our UI design looks like this:

At first glance it looks pretty nice: each person is shown with their own avatar. But when people cluster together, the problem appears.

When there are many people on screen, sliding the map is visibly janky, and the stuttering is painful. Naturally, we have to fix this. (As for why we don't simply use annotation clustering: the map is already shown at its largest scale in this scenario, so clustering is not a fit for this problem.)

Analysis

First, let's look at how I implemented this annotation view. It is special in that its shape cannot be achieved simply by setting rounded corners, and the avatar image differs from user to user, so the solution is a layer mask. The code is as follows:

@implementation MMAnnotationView

- (instancetype)initWithAnnotation:(id<MKAnnotation>)annotation reuseIdentifier:(NSString *)reuseIdentifier
{
    self = [super initWithAnnotation:annotation reuseIdentifier:reuseIdentifier];
    if (self)
    {
        self.frame = CGRectMake(0, 0, TRACK_ANNOTATION_SIZE.width, TRACK_ANNOTATION_SIZE.height);
        self.centerOffset = CGPointMake(0, -(TRACK_ANNOTATION_SIZE.height - 3) / 2);
        self.canShowCallout = NO;

        self.avatarView = [[UIImageView alloc] initWithFrame:self.bounds];
        [self addSubview:self.avatarView];
        self.avatarView.contentMode = UIViewContentModeScaleAspectFill;

        // Clip the avatar to the rounded-square-with-arrow shape
        CAShapeLayer *shapeLayer = [CAShapeLayer layer];
        shapeLayer.frame = self.bounds;
        shapeLayer.path = self.framePath.CGPath;
        self.avatarView.layer.mask = shapeLayer;

        // Reuse the same path for the shadow
        self.layer.shadowPath = self.framePath.CGPath;
        self.layer.shadowRadius = 1.0f;
        self.layer.shadowColor = [UIColor colorWithHex:0x666666FF].CGColor;
        self.layer.shadowOpacity = 1.0f;
        self.layer.shadowOffset = CGSizeMake(0, 0);
        self.layer.masksToBounds = NO;
    }
    return self;
}

// Mask path: a rounded square with a small arrow pointing down at the bottom
- (UIBezierPath *)framePath
{
    if (!_framePath)
    {
        CGFloat arrowWidth = 14;
        CGMutablePathRef path = CGPathCreateMutable();
        CGRect rectangle = CGRectInset(CGRectMake(0, 0, CGRectGetWidth(self.bounds), CGRectGetWidth(self.bounds)), 3, 3);
        CGPoint p[3] = {
            {CGRectGetMidX(self.bounds) - arrowWidth / 2, CGRectGetWidth(self.bounds) - 6},
            {CGRectGetMidX(self.bounds) + arrowWidth / 2, CGRectGetWidth(self.bounds) - 6},
            {CGRectGetMidX(self.bounds), CGRectGetHeight(self.bounds) - 4}
        };
        CGPathAddRoundedRect(path, NULL, rectangle, 5, 5);
        CGPathAddLines(path, NULL, p, 3);
        CGPathCloseSubpath(path);

        _framePath = [UIBezierPath bezierPathWithCGPath:path];
        CGPathRelease(path);
    }
    return _framePath;
}

@end

The shape path is generated in code and used both as the mask of the avatar's layer and as the shadowPath of the annotation view's layer.

After that, the avatar only needs to be set with SDWebImage:

[annotationView.avatarView sd_setImageWithURL:[NSURL URLWithString:avatarURL] placeholderImage:placeHolderImage];
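For context, the annotation view is created in the usual mapView:viewForAnnotation: delegate method. A minimal sketch might look like the following; the reuse identifier, the MMUserAnnotation class, and its avatarURL property are assumed names, not taken from the project:

// Sketch of how MMAnnotationView could be dequeued and fed its avatar.
// MMUserAnnotation and its avatarURL property are assumptions for illustration.
- (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(id<MKAnnotation>)annotation
{
    if ([annotation isKindOfClass:[MKUserLocation class]])
    {
        return nil;   // keep the default blue dot for the user's own location
    }

    static NSString *reuseID = @"MMAnnotationView";
    MMAnnotationView *view = (MMAnnotationView *)[mapView dequeueReusableAnnotationViewWithIdentifier:reuseID];
    if (!view)
    {
        view = [[MMAnnotationView alloc] initWithAnnotation:annotation reuseIdentifier:reuseID];
    }
    else
    {
        view.annotation = annotation;
    }

    // avatarURL comes from the app's own annotation model (assumed here);
    // placeHolderImage is the same placeholder used above.
    NSString *avatarURL = [(MMUserAnnotation *)annotation avatarURL];
    [view.avatarView sd_setImageWithURL:[NSURL URLWithString:avatarURL]
                       placeholderImage:placeHolderImage];
    return view;
}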

Next, let's analyze the problem with a tool. Which tool do we use for performance analysis? Instruments, of course (its usage is not covered here). Open the Core Animation template, run the app, and slide the map. The performance analysis looks like this:

It turns out that the average frame rate is below 30 fps, which is a long way from our 60 fps target.
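As a side note, if you want a rough in-app frame rate readout in addition to Instruments, a CADisplayLink-based counter is a common trick. A minimal sketch (the class name is illustrative):

#import <QuartzCore/QuartzCore.h>

// Minimal FPS counter sketch: logs the frame rate roughly once per second.
// CADisplayLink retains its target, so call -stop when you are done with the counter.
@interface LJCFPSCounter : NSObject
- (void)start;
- (void)stop;
@end

@implementation LJCFPSCounter
{
    CADisplayLink *_link;
    NSUInteger _frameCount;
    CFTimeInterval _lastTimestamp;
}

- (void)start
{
    _link = [CADisplayLink displayLinkWithTarget:self selector:@selector(tick:)];
    [_link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)stop
{
    [_link invalidate];
    _link = nil;
}

- (void)tick:(CADisplayLink *)link
{
    if (_lastTimestamp == 0) { _lastTimestamp = link.timestamp; return; }
    _frameCount++;
    CFTimeInterval delta = link.timestamp - _lastTimestamp;
    if (delta >= 1.0)
    {
        NSLog(@"FPS: %.0f", _frameCount / delta);   // rough on-device frame rate
        _frameCount = 0;
        _lastTimestamp = link.timestamp;
    }
}

@end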

Use the Debug Options for further analysis

For MKMapView, we mainly care about these options.

Color Blended Layers

Color Misaligned Images

Color Offscreen-Rendered Yellow

The results of enabling these options are as follows:

We can see the following:

Color Blended Layers: blending does show up, but that is expected, because the masked avatar view is not opaque.

Color Misaligned Images: everything except the default avatar is flagged, because the image size on the server differs from the displayed size, so the images have to be scaled. The default avatar already matches the displayed size, so it is fine.

Color Offscreen-Rendered Yellow: a large amount of off-screen rendering caused by the mask. This is the main cause of the performance drop.

Solution

Now that we have found the cause of the problem, how do we solve it?

First, the mask has to go.

Second, we need to pre-process the downloaded image down to its actual display size (a resizing sketch follows).
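Resizing on its own could be done with a small UIImage category along these lines (a minimal sketch; the category and method names are mine):

// Minimal sketch: scale a downloaded avatar to the exact point size it will be displayed at,
// so Core Animation does not need to rescale it at composite time (this addresses Color Misaligned Images).
@interface UIImage (LJCResize)
- (UIImage *)ljc_imageScaledToSize:(CGSize)size;
@end

@implementation UIImage (LJCResize)

- (UIImage *)ljc_imageScaledToSize:(CGSize)size
{
    // A scale of 0 means "use the main screen's scale"
    UIGraphicsBeginImageContextWithOptions(size, NO, 0);
    [self drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}

@end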

But we can go further: why not merge the downloaded image directly into the final bitmap we want to display? Let's try it:

- (void)loadAnnotationImageWithURL:(NSString *)url imageView:(UIImageView *)imageView
{
    // Cache the merged image under a separate key derived from the avatar URL
    NSString *annoImageURL = url;
    NSString *annoImageCacheURL = [annoImageURL stringByAppendingString:@"cache"];
    UIImage *cacheImage = [[SDImageCache sharedImageCache] imageFromDiskCacheForKey:annoImageCacheURL];
    if (cacheImage)
    {
        // LLLog(@"hit cache");
        imageView.image = cacheImage;
    }
    else
    {
        // LLLog(@"no cache");
        [imageView sd_setImageWithURL:[NSURL URLWithString:annoImageURL]
                     placeholderImage:placeHolderImage
                            completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, NSURL *imageURL) {
            if (!error)
            {
                // Merge the download into the final annotation bitmap and cache it
                UIImage *annoImage = [image annotationImage];
                imageView.image = annoImage;
                [[SDImageCache sharedImageCache] storeImage:annoImage forKey:annoImageCacheURL];
            }
        }];
    }
}

@implementation UIImage (LJC)

- (UIImage *)annotationImage
{
    static UIView *snapshotView = nil;
    static UIImageView *imageView = nil;
    if (!snapshotView)
    {
        snapshotView = [UIView new];
        snapshotView.frame = CGRectMake(0, 0, TRACK_ANNOTATION_SIZE.width, TRACK_ANNOTATION_SIZE.height);

        imageView = [UIImageView new];
        [snapshotView addSubview:imageView];
        imageView.clipsToBounds = YES;
        imageView.frame = snapshotView.bounds;
        imageView.contentMode = UIViewContentModeScaleAspectFill;

        // Same rounded-square-with-arrow path as in MMAnnotationView
        CGFloat arrowWidth = 14;
        CGMutablePathRef path = CGPathCreateMutable();
        CGRect rectangle = CGRectInset(CGRectMake(0, 0, CGRectGetWidth(imageView.bounds), CGRectGetWidth(imageView.bounds)), 3, 3);
        CGPoint p[3] = {
            {CGRectGetMidX(imageView.bounds) - arrowWidth / 2, CGRectGetWidth(imageView.bounds) - 6},
            {CGRectGetMidX(imageView.bounds) + arrowWidth / 2, CGRectGetWidth(imageView.bounds) - 6},
            {CGRectGetMidX(imageView.bounds), CGRectGetHeight(imageView.bounds) - 4}
        };
        CGPathAddRoundedRect(path, NULL, rectangle, 5, 5);
        CGPathAddLines(path, NULL, p, 3);
        CGPathCloseSubpath(path);

        CAShapeLayer *shapeLayer = [CAShapeLayer layer];
        shapeLayer.frame = imageView.bounds;
        shapeLayer.path = path;
        imageView.layer.mask = shapeLayer;

        snapshotView.layer.shadowPath = path;
        snapshotView.layer.shadowRadius = 1.0f;
        snapshotView.layer.shadowColor = [UIColor colorWithHex:0x666666FF].CGColor;
        snapshotView.layer.shadowOpacity = 1.0f;
        snapshotView.layer.shadowOffset = CGSizeMake(0, 0);

        CGPathRelease(path);
    }

    imageView.image = self;

    // Render the masked view hierarchy into a flat bitmap once, so the map never has to mask anything
    UIGraphicsBeginImageContextWithOptions(TRACK_ANNOTATION_SIZE, NO, 0);
    [snapshotView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *copied = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return copied;
}

@end

Then all that is needed is this simple call:

[self loadAnnotationImageWithURL:avatarURL imageView:annotationView.avatarView];

Let's see how the modified version fares in Instruments.

Color Blended Layers: blending is unavoidable here, because the image we display has transparent regions. However, given how the map behaves, each avatar stays at its position for long stretches and is rarely re-composited or animated, so this is not a problem.

Color Misaligned Images is fine, because the avatar has already been scaled to the displayed size.

Color Offscreen-Rendered Yellow is fine, because we now simply display a flat image and nothing requires off-screen rendering.

Let's also look at the frame rate.

Oh yeah! Not only does the frame rate reach our 60 fps target (it is not perfectly stable, since business-logic threads also run in the background), the average CPU time drops as well, even with dozens of people shown on the map.

Summary

This approach is not limited to MKMapView; many other places, including UITableView, can be optimized in the same way. Its core idea is merging (pre-rendering) plus caching. Of course, the merge itself costs some resources, so the technique is best suited to small images such as avatars.
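As an illustration of the same merge-and-cache idea outside of maps, a table view showing circular avatars could pre-render and cache them like this (a sketch; the cache-key suffix and method name are assumed, not from the project):

// Sketch: the merge-and-cache pattern applied to a UITableViewCell avatar.
// Assumes the image view already has its final size when this is called.
- (void)loadRoundedAvatarWithURL:(NSString *)url intoImageView:(UIImageView *)imageView
{
    NSString *cacheKey = [url stringByAppendingString:@"+rounded"];
    UIImage *cached = [[SDImageCache sharedImageCache] imageFromDiskCacheForKey:cacheKey];
    if (cached)
    {
        imageView.image = cached;
        return;
    }

    [imageView sd_setImageWithURL:[NSURL URLWithString:url]
                 placeholderImage:nil
                        completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, NSURL *imageURL) {
        if (error || !image) return;

        // Pre-render the circular avatar once instead of masking it in every cell
        CGSize size = imageView.bounds.size;
        UIGraphicsBeginImageContextWithOptions(size, NO, 0);
        [[UIBezierPath bezierPathWithOvalInRect:(CGRect){CGPointZero, size}] addClip];
        [image drawInRect:(CGRect){CGPointZero, size}];
        UIImage *rounded = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        imageView.image = rounded;
        [[SDImageCache sharedImageCache] storeImage:rounded forKey:cacheKey];
    }];
}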

For more on image performance optimization, see the linked article, which explains the Debug Options mentioned here in detail.
