I. The Responder Chain
In iOS development you encounter a variety of action events, and your program must respond to them.
First, when an event occurs you must know who responds to it. In iOS, events are handled by the responder chain: every responder is a subclass of UIResponder, and the chain is a hierarchy of objects in which each object in turn is given a chance to respond to the event message. When an event occurs, it is first delivered to the first responder, which is usually the view where the event took place, i.e. the view the user touched. The event then travels along the responder chain until some object accepts and handles it. In general, the first responder is a view object (or an instance of a view subclass); if it does not handle the touch, the event is passed to its view controller (UIViewController, if present), then to its superview (if present), and so on up to the top-level view. From the top-level view the event goes to the window (the UIWindow object) and then to the application (the UIApplication object). If UIApplication does not respond either, there is one last link in the chain: the application's delegate, provided it is a subclass of UIResponder. If nothing in the entire chain handles the event, it is discarded. As a rule, as soon as some object handles the event, it stops being passed along the responder chain.
A typical responder path looks like this:
First Responder --> The Window --> The Application --> App Delegate
The normal responder-chain flow is often interrupted by delegation: an object (usually a view) may delegate its response work to another object (typically its view controller). This is why, when handling events in a UIViewController, you need to adopt the corresponding protocol and implement its delegate methods. In iOS, the UIResponder class defines all the methods of a responder object. UIApplication, UIView, and other classes inherit from UIResponder, and UIWindow and the UIKit controls inherit from UIView, so they inherit from UIResponder indirectly; instances of all these classes can act as responders.
Managing Event Distribution
Whether a view responds to touch events is controlled by its userInteractionEnabled property. The default is YES; setting it to NO prevents the view from receiving and distributing touch events. In addition, a view receives no events while it is hidden (setHidden:YES) or fully transparent (alpha of 0). These switches apply per view; if you want the whole application to stop responding to events, call UIApplication's beginIgnoringInteractionEvents method to stop receiving and distributing events entirely, and its endIgnoringInteractionEvents method to resume receiving and distributing them.
If you want a view to receive multi-touch, set its multipleTouchEnabled property to YES. By default this property is NO, i.e. a view does not receive multi-touch.
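A minimal sketch of these switches, assuming a view controller whose self.view should accept multi-touch while the rest of the app is briefly frozen (the surrounding animation is hypothetical):

```objc
// Per-view switches (UIView properties)
self.view.userInteractionEnabled = YES;  // NO would block all touch delivery to this view
self.view.multipleTouchEnabled = YES;    // default is NO: only single touches are delivered

// App-wide switch (UIApplication)
[[UIApplication sharedApplication] beginIgnoringInteractionEvents]; // stop all event delivery
// ... e.g. while a transition animation runs ...
[[UIApplication sharedApplication] endIgnoringInteractionEvents];   // resume event delivery
```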
II. Touches
Before iOS 3.2, touch-screen handling on the iPhone was done mainly by overriding the following four methods provided by UIResponder:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event // reports the UITouchPhaseBegan event when a finger touches the screen
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event // reports the UITouchPhaseMoved event when a finger moves on the screen
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event // reports the UITouchPhaseEnded event when a finger leaves the screen
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event // reports the UITouchPhaseCancelled event when the touch is cancelled, e.g. by an incoming call
-- UITouch
A UITouch object is created when a finger touches the screen, moves on it, or leaves it. It has the following properties and instance methods:
phase: property; returns a phase constant indicating whether the touch began, moved, is stationary, ended, or was cancelled. It is one of the enumeration values:
· UITouchPhaseBegan (touch began)
· UITouchPhaseMoved (contact point moved)
· UITouchPhaseStationary (contact point has not moved)
· UITouchPhaseEnded (touch ended)
· UITouchPhaseCancelled (touch cancelled)
tapCount: property; the number of times the screen was tapped
timestamp: property; the time at which the touch occurred or last changed
view: property; the view in which the touch began
window: property; the window in which the touch began
locationInView:: method; the touch's current position in the given view
previousLocationInView:: method; the touch's previous position in the given view
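As an illustrative sketch, these properties and methods can be read inside any of the four touch callbacks; the NSLog output here is purely for demonstration:

```objc
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Current and previous position of this touch, in self.view's coordinates
    CGPoint current = [touch locationInView:self.view];
    CGPoint previous = [touch previousLocationInView:self.view];
    NSLog(@"phase=%d tapCount=%d timestamp=%f",
          (int)touch.phase, (int)touch.tapCount, touch.timestamp);
    NSLog(@"moved from (%f, %f) to (%f, %f)",
          previous.x, previous.y, current.x, current.y);
}
```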
-- UIEvent
A UIEvent object contains a set of related UITouch objects; UITouch objects make up a UIEvent. You can think of one complete touch operation as a UIEvent, and each point in that operation (press, move, lift) as a UITouch.
The role of UIEvent is to provide the list of related touches. If you want to recognize gestures made on the screen, you can work with these touches, which are stored in an NSSet object (a Foundation class).
But recognizing different gestures this way is a real hassle: you have to do the calculations yourself to distinguish them. Later, Apple provided a more convenient way: UIGestureRecognizer.
III. UIGestureRecognizer
The UIGestureRecognizer base class is abstract; we mainly use its concrete subclasses (each documented in the iOS Developer Library):
UITapGestureRecognizer
UIPinchGestureRecognizer
UIRotationGestureRecognizer
UISwipeGestureRecognizer
UIPanGestureRecognizer
UILongPressGestureRecognizer
As the names suggest: Tap (a click), Pinch (pinch in or out), Rotation (rotate), Swipe (a quick slide, used to detect the direction of movement), Pan (a slow drag, used to track the amount of offset), and LongPress (press and hold).
For example, you can add this in viewDidLoad:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view from its nib.
    UIPanGestureRecognizer *panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanFrom:)];
    [self.view addGestureRecognizer:panRecognizer]; // key statement: attach the gesture recognizer to self.view
    panRecognizer.maximumNumberOfTouches = 1;
    panRecognizer.delegate = self;
    [panRecognizer release];
}
The other gestures are set up similarly. The core is to set the delegate and call addGestureRecognizer: on whichever view needs gesture recognition. And of course, remember to adopt <UIGestureRecognizerDelegate> in the header of the class that acts as the delegate.
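For completeness, a possible handler for the handlePanFrom: selector registered above might look like the following sketch; moving recognizer.view along with the finger and resetting the translation each callback are choices of this example, not the only option:

```objc
- (void)handlePanFrom:(UIPanGestureRecognizer *)recognizer
{
    // Total offset of the pan so far, in self.view's coordinate system
    CGPoint translation = [recognizer translationInView:self.view];
    // Move the view the recognizer is attached to along with the finger
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    // Reset, so the next callback reports an incremental offset
    [recognizer setTranslation:CGPointZero inView:self.view];
}
```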
But some gestures are related; what then? For example, tap and long-press, swipe and pan, or a single tap versus a double tap. Gesture recognition is mutually exclusive in principle: with single-tap and double-tap, for instance, once one gesture is recognized the other will not be. So for related gestures you must add special handling to help the program decide which category the current gesture falls into. For example, when single tap and double tap coexist, without extra handling only the single-tap message can be sent. To recognize the double tap you need special logic: first determine whether the gesture is a double tap, and only treat it as a single tap once the double tap has failed. This is done with [A requireGestureRecognizerToFail:B], which specifies that even when gesture A's conditions are met, A does not fire immediately, but only after the specified gesture B has been determined to fail.
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Single-tap recognizer
    UITapGestureRecognizer *singleRecognizer;
    singleRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(singleTap:)];
    // Number of taps required
    singleRecognizer.numberOfTapsRequired = 1; // single tap
    // Attach the recognizer to self.view
    [self.view addGestureRecognizer:singleRecognizer];
    // Double-tap recognizer
    UITapGestureRecognizer *doubleRecognizer;
    doubleRecognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(doubleTap:)];
    doubleRecognizer.numberOfTapsRequired = 2; // double tap
    // Key statement: attach the recognizer to self.view
    [self.view addGestureRecognizer:doubleRecognizer];
    // Key line: the single-tap action fires only after double-tap recognition fails
    [singleRecognizer requireGestureRecognizerToFail:doubleRecognizer];
    [singleRecognizer release];
    [doubleRecognizer release];
}
- (void)singleTap:(UITapGestureRecognizer *)recognizer
{
    // handle the single tap
}
- (void)doubleTap:(UITapGestureRecognizer *)recognizer
{
    // handle the double tap
}
IV. Common iPhone Gesture Types
1. Tap
Tap is the most commonly used gesture, for pressing or selecting a control or item (similar to a normal mouse click).
2. Drag
Drag is used to scroll some pages and to move controls around.
3. Flick
Flick enables quick scrolling and paging.
4. Swipe
The swipe gesture activates the shortcut menu of a list item.
5. Double Tap
Double-tap zooms in on and centers a picture, or restores it to its original size if it is already zoomed in. Double-tapping can also activate the edit menu for text.
6. Pinch Open (zoom in)
The pinch-open gesture can open a subscription feed or open an article's details. It can also be used to enlarge a picture when viewing photos.
7. Pinch Close (zoom out)
The pinch-close gesture implements features that mirror pinch open: close a feed and return to the front page, or close an article and return to the index. When viewing a photo, the pinch-close gesture also shrinks the image.
8. Touch & Hold (long press)
On the subscriptions page, long-pressing a feed automatically enters edit mode and selects the feed under your finger; you can then drag the feed to move it. Long-pressing text brings up the magnifier; on release, the edit menu appears. Long-pressing a picture also brings up the edit menu.
9. Shake
The shake gesture brings up the undo and redo menu; it is mainly for user text input.
V. Finding a UIView's UIViewController via the Responder Chain
@interface UIView (FirstViewController)
- (UIViewController *)firstViewController;
- (id)traverseResponderChainForUIViewController;
@end

@implementation UIView (FirstViewController)
- (UIViewController *)firstViewController {
    // Convenience wrapper that casts and "masks" the recursive function
    return (UIViewController *)[self traverseResponderChainForUIViewController];
}
- (id)traverseResponderChainForUIViewController {
    id nextResponder = [self nextResponder];
    if ([nextResponder isKindOfClass:[UIViewController class]]) {
        return nextResponder;
    } else if ([nextResponder isKindOfClass:[UIView class]]) {
        return [nextResponder traverseResponderChainForUIViewController];
    } else {
        return nil;
    }
}
@end
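With a category like this in place, any view can look up the controller that owns it; for example (myButton here is a hypothetical view already installed in a view-controller hierarchy):

```objc
UIViewController *controller = [myButton firstViewController];
if (controller != nil) {
    // e.g. present an alert or push a screen from the owning controller
    NSLog(@"owning controller: %@", controller);
}
```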
VI. Event Forwarding
What do we do when a subview needs to receive a tap event and its parent view also needs to receive it? Under normal circumstances, once a subview handles a touch event it does not actively pass the event on to the next responder, so the parent no longer receives it; only if the subview does not handle the event is it passed on, potentially all the way to UIApplication. But we can have the subview handle the event and still forward it to the next responder, by writing the forwarding code ourselves.
The code in the subview looks like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Do whatever the subview needs to do here; when done, forward the event
    // so its superview, or even the view controller, also receives it
    [[self nextResponder] touchesBegan:touches withEvent:event];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [[self nextResponder] touchesEnded:touches withEvent:event];
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [[self nextResponder] touchesCancelled:touches withEvent:event];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [[self nextResponder] touchesMoved:touches withEvent:event];
}