Target-action design pattern
Target-action is an iOS design pattern whose main purpose is to reduce coupling in code. As the name implies, it pairs a target with an action, and it runs throughout iOS development.
Before discussing target-action, it is worth mentioning the phrase "high cohesion, low coupling", which is a common way to evaluate the quality of a piece of software: good software has high cohesion within each module and low coupling between modules.
In fact, the target-action pattern is simple: when an event occurs, call a particular method on a particular object. For example, when a button is clicked, call the click method on the controller. "That object" is the target and "that method" is the action; here the controller is the target and the click method is the action.
The target is usually a controller, and the action has a fixed signature: - (IBAction)click:(id)sender.
More formally, target-action is a design pattern in which, when an event occurs, one object sends a message carrying the necessary information to another object. The stored information consists of two items: an action selector, which identifies the method to be called, and a target, which is the object that receives the message. When the event occurs, the action message is sent. Although the target can be any object, even a framework object, it is typically a custom controller that handles the action message in an application-specific way. The event that triggers an action message can be anything; for example, a gesture recognizer may send an action message to another object once its gesture is recognized. However, the target-action paradigm is most commonly found in controls such as buttons and sliders. When the user manipulates a control object (a subclass of UIControl), it sends a message to a particular object. Both the action selector and the target object are properties of the control.
The target-action pattern is mainly used for communication between V and C in the MVC pattern. The view is only responsible for sending the corresponding action to the target; it does not care what the target actually does. This keeps the code loosely coupled.
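As a minimal sketch of this wiring with a standard control (assuming a click: action method exists on the controller), a button registers its target and action like so:

```objectivec
// Wire a UIButton to the controller via target-action.
// The view (button) only knows it must send click: to self;
// it does not know or care what click: does.
UIButton *button = [UIButton buttonWithType:UIButtonTypeRoundedRect];
button.frame = CGRectMake(20, 20, 100, 44);
[button addTarget:self
           action:@selector(click:)
 forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:button];
```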
Target/action design pattern:
The view class is the principal party, the controller acts as the agent (the target), and the method serves as the agreed-upon protocol (the action).
The .h file of the view:

```objectivec
#import <UIKit/UIKit.h>

@interface TapView : UIView

@property (nonatomic, assign) id target;
@property (nonatomic, assign) SEL action;

@end
```

The .m file of the view:

```objectivec
#import "TapView.h"

@implementation TapView

// The view is only responsible for invoking the action on the target.
// It does not need to know what the method does or how it is done.
// If the action method takes a parameter, that parameter is the TapView itself.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [_target performSelector:self.action withObject:self];
}

@end
```

The .m file of the root view controller:

```objectivec
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    TapView *tap1 = [[TapView alloc] initWithFrame:CGRectMake(100, 100, 100, 100)];
    tap1.backgroundColor = [UIColor redColor];
    [self.view addSubview:tap1];
    tap1.tag = 1001;
    tap1.target = self;                  // target = the controller
    tap1.action = @selector(randomColor:);
}
```
Delegate (proxy) design pattern
A control has certain points in time at which things happen; the controller can implement the corresponding proxy methods so that the right work is done at the right time. In other words: at this point in time, what do I want my delegate object to do?
The delegate design pattern starts by writing a protocol. By convention, the protocol name is the class name plus "Delegate".

The .h file:

```objectivec
@protocol TapViewDelegate <NSObject>
@optional
// When the view is first touched, the delegate should implement this method
- (void)tapViewBegan:(TapView *)tapView;
// While the view is being dragged, the delegate should implement this method
- (void)tapViewMove:(TapView *)tapView;
// When the touch on the view ends, the delegate should implement this method
- (void)tapViewEnd:(TapView *)tapView;
@end

@interface TapView : UIView

// The delegate property holds the object that adopts the protocol
@property (nonatomic, assign) id<TapViewDelegate> delegate;

@end
```

The .m file. At the appropriate point in time, trigger the delegate if it is not nil and actually implements the optional method:

```objectivec
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (_delegate != nil && [_delegate respondsToSelector:@selector(tapViewBegan:)]) {
        [_delegate tapViewBegan:self];
    }
}
```
UIImageView
Notation 1: create an image view.

```objectivec
UIImageView *imageView = [[UIImageView alloc]
    initWithImage:[UIImage imageNamed:@"2753441426829430"]];
imageView.frame = self.view.bounds;
[self.view addSubview:imageView];
```

Notation 2: create a GIF-style view. To make UIImageView play multiple images as a GIF-like animation:

```objectivec
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(10, 150, 350, 350)];
NSMutableArray *imageArray = [[NSMutableArray alloc] initWithCapacity:4];
for (int i = 1; i <= 4; i++) {
    NSString *imageName = [NSString stringWithFormat:@"W12%d.tiff", i];
    UIImage *image = [UIImage imageNamed:imageName];
    [imageArray addObject:image];
}
imageView.animationImages = imageArray;  // the images to animate
imageView.animationDuration = 0.5;       // duration of one animation cycle
imageView.animationRepeatCount = 0;      // number of repeats; 0 means repeat forever
[self.view addSubview:imageView];
[imageView startAnimating];              // start playing
[imageView release];
```

Notation 3: load an image via a file path.

```objectivec
// Load an image with a file path
UIImageView *imageView1 = [[UIImageView alloc] initWithImage:
    [UIImage imageWithContentsOfFile:@"/Users/Anhao/desktop/2753441426829430.png"]];
imageView1.frame = self.view.bounds;
[self.view addSubview:imageView1];
```
Gesture Recognizers
A gesture recognizer encapsulates touch events, so we no longer need to determine ourselves whether a gesture has been triggered. The recognizer does the recognizing; we only need to focus on what to do once the gesture is recognized. Very convenient.
UIGestureRecognizer is an abstract class in iOS that recognizes a gesture, i.e. a regular, recognizable pattern of touches.
The gesture recognizer has 7 concrete subclasses: the tap gesture, long-press gesture, rotation gesture, pinch gesture, pan gesture, swipe gesture, and screen-edge pan gesture.
Once the specified gesture is recognized, you can perform a custom operation.
UITapGestureRecognizer is the tap gesture recognizer; it recognizes tap operations.
UILongPressGestureRecognizer is the long-press gesture recognizer; it recognizes long-press operations.
UIRotationGestureRecognizer is the rotation gesture recognizer; it recognizes rotation operations.
UIPinchGestureRecognizer is the pinch gesture recognizer; it recognizes pinch (zoom) operations.
UIPanGestureRecognizer is the pan gesture recognizer; it recognizes drag (pan) operations.
UISwipeGestureRecognizer is the swipe gesture recognizer; it recognizes swipe operations.
UIScreenEdgePanGestureRecognizer is the screen-edge pan gesture recognizer, new in iOS 7.
How to use a recognizer:
We do not use the abstract parent class directly; instead we create an object of the specific gesture recognizer subclass we need.
1. Create a UIXxxGestureRecognizer object using the initWithTarget:action: method
2. Configure information about the gesture to be recognized
3. Add gestures to a view
4. Implement the method defined in the gesture recognizer
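The four steps above can be sketched as follows (someView and the handleTap: method are hypothetical names chosen for illustration):

```objectivec
// Step 1: create the recognizer with a target and an action
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleTap:)];
// Step 2: configure the gesture to be recognized
tap.numberOfTapsRequired = 1;
// Step 3: add the gesture to a view
[someView addGestureRecognizer:tap];
[tap release];

// Step 4: implement the action method (in the target's .m file)
- (void)handleTap:(UITapGestureRecognizer *)recognizer
{
    NSLog(@"view tapped");
}
```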
View's Transform Property
transform is an important property of a view. It changes how the view is displayed at the matrix level, and can be used to scale, rotate, translate the view, and so on.
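As a minimal sketch (myView is a hypothetical view), the transform is set with the CGAffineTransform helper functions:

```objectivec
// Rotate the view 45 degrees
myView.transform = CGAffineTransformMakeRotation(M_PI_4);
// Scale the current transform by 1.5x in both dimensions
myView.transform = CGAffineTransformScale(myView.transform, 1.5, 1.5);
// Reset the view to its original display state
myView.transform = CGAffineTransformIdentity;
```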
Create a tap gesture recognizer object. Once the tap gesture is recognized, the tap: method on self is executed. A view can have multiple gesture recognizers added to it, though we generally do not add conflicting ones.

```objectivec
UITapGestureRecognizer *tapG = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(tap:)];
tapG.numberOfTapsRequired = 2;     // how many taps
tapG.numberOfTouchesRequired = 1;  // how many fingers

// Create a rotation gesture
UIRotationGestureRecognizer *roGR = [[UIRotationGestureRecognizer alloc]
    initWithTarget:self action:@selector(rotage:)];

// Note: for UILabel and UIImageView you must add this line,
// because user interaction is disabled on them by default
imageView.userInteractionEnabled = YES;
// [imageView addGestureRecognizer:tapG];
[imageView addGestureRecognizer:roGR];
[imageView release];
```