Talking about gestures and touch in iOS



I. The Responder Chain

An iOS application encounters many kinds of events, and the program needs a way to respond to them.

To respond to an event, you first have to know which object will respond to it. In iOS this is decided by the responder chain. Every class that can respond to events is a subclass of UIResponder. The responder chain is a hierarchy of objects, each of which gets a chance, in turn, to respond to the event message. When an event occurs, it is first delivered to the first responder, which is normally the view in which the event happened, i.e. the view under the user's finger. The event is then passed along the responder chain until some object accepts and handles it.

Typically the first responder is a view object (or an instance of a view subclass); when the view is touched, it handles the event. If it does not handle the event, the event is passed to its view controller (if it has one), then to its superview (if any), and so on up to the top-level view. From the top-level view, the event moves on to the window (the UIWindow object) and then to the application (the UIApplication object). If UIApplication does not respond either, there is one last possible link in the chain: the application delegate can act as a global, final responder, provided it is a subclass of UIResponder. If nothing along this entire path responds, the event is discarded. In general, as soon as any object in the responder chain handles the event, it stops being passed along.

A typical road map is as follows:

First Responder --> The Window --> The Application --> App Delegate

A normal trip along the responder chain is often interrupted by delegation: an object (usually a view) may delegate responding to another object (usually a view controller). This is why the corresponding protocol must be implemented in the view controller for it to respond to the event. iOS provides the UIResponder class, which defines all the methods a responder object can implement. UIApplication, UIView, and other classes inherit from UIResponder; UIWindow and the UIKit controls inherit from UIView, and therefore also inherit from UIResponder indirectly. Instances of all of these classes can act as responders.
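If you ever want to see this chain for a concrete view at runtime, you can walk nextResponder yourself. The following is only a small debugging sketch (the category and method name are made up for illustration, not UIKit API):

@interface UIView (ResponderChainLogging)
- (void)logResponderChain;
@end

@implementation UIView (ResponderChainLogging)

// Print every object on this view's responder chain, starting from the view itself.
- (void)logResponderChain {
    UIResponder *responder = self;
    while (responder != nil) {
        NSLog(@"%@", NSStringFromClass([responder class]));
        responder = [responder nextResponder];
    }
}

@end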

 

Managing Event Distribution

A view's userInteractionEnabled property controls whether it responds to touch events. The default is YES; setting it to NO prevents the view from receiving and distributing touch events. A view also receives no events while it is hidden (setHidden:YES) or fully transparent (alpha of 0). This property affects only a single view, however. If you want the entire application to stop responding to events for a while, call UIApplication's beginIgnoringInteractionEvents method to stop event receipt and distribution completely, and endIgnoringInteractionEvents to resume normal delivery.

If you want a view to receive multi-touch, set its multipleTouchEnabled property to YES. The default is NO, i.e. a view does not receive multi-touch by default.
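As an example, a common pattern is to block all touches while work that must not be interrupted is running, and to flip the per-view switches described above (a minimal sketch; someView is a placeholder for any view of yours):

// Stop event receipt and distribution for the whole application.
[[UIApplication sharedApplication] beginIgnoringInteractionEvents];

// ... perform work the user must not interrupt ...

// Resume normal event delivery.
[[UIApplication sharedApplication] endIgnoringInteractionEvents];

// Per-view switches:
someView.userInteractionEnabled = NO;   // this view (and its subviews) no longer receive touches
someView.multipleTouchEnabled = YES;    // this view may now receive multi-touch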

 

II. Touch

For touch handling on the iPhone, before iOS 3.2 you mainly worked with the following four UIResponder methods:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event // fingers touch the screen; reports the UITouchPhaseBegan phase

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event // fingers move on the screen; reports the UITouchPhaseMoved phase

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event // fingers leave the screen; reports the UITouchPhaseEnded phase

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event // the touch is interrupted, e.g. by an incoming call; reports the UITouchPhaseCancelled phase

-- UITouch

A UITouch object is created when a finger touches the screen and then moves on it or leaves it. It has several properties and instance methods:

phase: property. Returns a phase constant indicating whether the touch has begun, moved, ended, or been cancelled. It is an enumeration with the following values:

· UITouchPhaseBegan (the touch began)

· UITouchPhaseMoved (the touch point moved)

· UITouchPhaseStationary (the touch point has not moved since the last event)

· UITouchPhaseEnded (the touch ended)

· UITouchPhaseCancelled (the touch was cancelled)

tapCount: property. The number of times the finger tapped at this point on the screen.

timestamp: property. The time at which the touch occurred or last changed.

view: property. The view in which the touch began.

window: property. The window in which the touch began.

locationInView: method. The touch's current position in the specified view.

previousLocationInView: method. The touch's previous position in the specified view (both location methods are used in the drag sketch below).
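Putting the four touch methods and these UITouch methods together, a UIView subclass can, for example, drag itself with the finger by comparing the current and previous touch locations (a minimal sketch inside a custom UIView subclass):

// Move the view by the finger's displacement since the last callback.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint current = [touch locationInView:self.superview];
    CGPoint previous = [touch previousLocationInView:self.superview];
    CGPoint center = self.center;
    center.x += current.x - previous.x;
    center.y += current.y - previous.y;
    self.center = center;
}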

-- UIEvent

A UIEvent object bundles a set of related UITouch objects. A complete touch interaction is one UIEvent; each individual contact within it (press, move, lift) is a UITouch.

UIEvent therefore gives you the list of related touches. If you want to look at the touches currently on the screen, use this object; the touches are stored in an NSSet, a collection class from the Foundation framework.
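For instance, with multipleTouchEnabled set to YES on a view, you can inspect every finger currently down by asking the event for all of its touches (a sketch inside a UIView subclass):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // All touches belonging to this event, not just the ones that began in this view.
    NSSet *allTouches = [event allTouches];
    NSLog(@"%lu finger(s) on the screen", (unsigned long)[allTouches count]);
    for (UITouch *touch in allTouches) {
        CGPoint point = [touch locationInView:self];
        NSLog(@"phase %ld, tapCount %lu, location %@",
              (long)[touch phase],
              (unsigned long)[touch tapCount],
              NSStringFromCGPoint(point));
    }
}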

 

However, recognizing different gestures this way is hard: you have to work out the recognition logic for every gesture yourself.

Later, Apple provided a much simpler way: UIGestureRecognizer.

III. UIGestureRecognizer

The UIGestureRecognizer base class is abstract; in practice you work with its concrete subclasses (each is documented in the iOS Developer Library):

UITapGestureRecognizer

UIPinchGestureRecognizer

UIRotationGestureRecognizer

UISwipeGestureRecognizer

UIPanGestureRecognizer

UILongPressGestureRecognizer

As the names suggest, these recognize Tap (a tap or click), Pinch (pinch to zoom), Rotation (rotate), Swipe (a quick flick, used to detect the direction of movement), Pan (a slower drag, used to track the amount of offset), and LongPress (a long press).

For example, you can add the following in viewDidLoad:


- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view from its nib.
    UIPanGestureRecognizer *panRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanFrom:)];
    panRecognizer.maximumNumberOfTouches = 1;
    panRecognizer.delegate = self;
    // Key line: attach the recognizer to self.view so it starts watching for pans.
    [self.view addGestureRecognizer:panRecognizer];
    [panRecognizer release]; // the view retains the recognizer (pre-ARC code)
}
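A matching action method for the @selector(handlePanFrom:) above could look like the following sketch; it reads the accumulated translation and drags the view the recognizer is attached to:

- (void)handlePanFrom:(UIPanGestureRecognizer *)recognizer {
    // Total offset of the pan since it began, measured in self.view's coordinates.
    CGPoint translation = [recognizer translationInView:self.view];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    // Reset so that the next callback reports only the new, incremental movement.
    [recognizer setTranslation:CGPointZero inView:self.view];
}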

Other gestures are similar.
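For example, a swipe recognizer differs only in that you configure which direction to watch (a sketch; handleSwipe: is a hypothetical action method of yours):

UISwipeGestureRecognizer *swipeRecognizer =
    [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
swipeRecognizer.direction = UISwipeGestureRecognizerDirectionLeft; // watch left swipes only
[self.view addGestureRecognizer:swipeRecognizer];
[swipeRecognizer release];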

The core is always the same: set the delegate if you need one, and call addGestureRecognizer: on the view to attach the recognizer you want it to monitor.

Of course, remember to declare <UIGestureRecognizerDelegate> in the header of the class acting as the delegate.

But what about gestures that are related to each other, for example Tap and LongPress, Swipe and Pan, or a single tap versus a double tap?

Gesture recognition is mutually exclusive: once, say, a single tap has been recognized, nothing further is recognized for that touch. Related gestures therefore need special handling so the program can decide which gesture the current touch actually belongs to.

For example, if a single-tap and a double-tap recognizer coexist and nothing special is done, only the single-tap message will ever be sent. To recognize the double tap you need extra logic: first decide whether the touch is a double tap, and only treat it as a single tap once the double tap has failed. This is what [A requireGestureRecognizerToFail:B] is for: it specifies that gesture A, even when its own conditions are already met, does not fire immediately; it fires only after gesture B has definitely failed.


- (void)viewDidLoad
{
    [super viewDidLoad];

    // Single-tap recognizer
    UITapGestureRecognizer *singleRecognizer =
        [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(singleTap:)];
    singleRecognizer.numberOfTapsRequired = 1; // one tap
    // Attach the recognizer to self.view
    [self.view addGestureRecognizer:singleRecognizer];

    // Double-tap recognizer
    UITapGestureRecognizer *doubleRecognizer =
        [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(doubleTap:)];
    doubleRecognizer.numberOfTapsRequired = 2; // two taps
    // Key line: attach the recognizer to self.view
    [self.view addGestureRecognizer:doubleRecognizer];

    // The crucial line: the single tap only fires once the double tap has definitely failed.
    [singleRecognizer requireGestureRecognizerToFail:doubleRecognizer];

    [singleRecognizer release];
    [doubleRecognizer release];
}

- (void)singleTap:(UITapGestureRecognizer *)recognizer
{
    // Handle the single tap
}

- (void)doubleTap:(UITapGestureRecognizer *)recognizer
{
    // Handle the double tap
}

IV. Common iPhone Gesture Types

1. Tap

The tap is the most commonly used gesture. It presses or selects a control or list item, much like a mouse click.

2. Drag

Dragging scrolls pages and moves controls around.

3. Flick

A flick scrolls or turns pages quickly.

4. Swipe

The swipe gesture activates the shortcut menu of a list item.

5. Double Tap

Double-tapping zooms in on an image and centers it, or restores the original size if it is already zoomed in. Double-tapping can also bring up the text editing menu.

6. Pinch open (zoom in)

The pinch-open gesture opens a subscription feed or the details of an article; when viewing a photo, it also zooms the image in.

7. Pinch close (zoom out)

The pinch-close gesture closes a subscription feed back to the home page and closes an article back to the index page; when viewing a photo, it zooms the image out.

8. Touch & Hold (long press)

On the subscriptions page, long-pressing a feed enters edit mode and selects the feed under the finger, so you can drag it to a new position directly.

Long-pressing text brings up the magnifier; when you let go, the edit menu appears.

Long-pressing an image brings up the edit menu.

9. Shake

Shaking brings up the undo and redo menu; it is mainly used for text input.
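Note that shake is not handled by UIGestureRecognizer at all; it arrives as a motion event. A view controller that is the first responder can catch it roughly like this (a minimal sketch):

// The controller must be able to become first responder to receive motion events.
- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    if (motion == UIEventSubtypeMotionShake) {
        // React to the shake here, e.g. offer undo/redo.
        NSLog(@"Shake detected");
    }
}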

 

V. Finding a UIView's UIViewController through the Responder Chain

@interface UIView (FirstViewController)

- (UIViewController *)firstViewController;
- (id)traverseResponderChainForUIViewController;

@end

@implementation UIView (FirstViewController)

- (UIViewController *)firstViewController {
    // Convenience wrapper that casts the result of the recursive lookup.
    return (UIViewController *)[self traverseResponderChainForUIViewController];
}

- (id)traverseResponderChainForUIViewController {
    id nextResponder = [self nextResponder];
    if ([nextResponder isKindOfClass:[UIViewController class]]) {
        return nextResponder;
    } else if ([nextResponder isKindOfClass:[UIView class]]) {
        return [nextResponder traverseResponderChainForUIViewController];
    } else {
        return nil;
    }
}

@end
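With the category in place, any view can look up its controller directly, for example from inside a custom UIView (a small usage sketch):

UIViewController *controller = [self firstViewController];
if (controller != nil) {
    NSLog(@"This view is managed by %@", NSStringFromClass([controller class]));
}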

VI. Event Transmission

What if a subview needs to receive a touch event and its parent view needs to receive it as well? Here is how to handle that:

Normally, once the subview has received the touch event, the event is not passed on to the next responder automatically, so the parent view never sees it. (If the subview does not handle touch events at all, the events are passed along the chain all the way up to UIApplication.)

However, we can make the subview forward the event to the next responder after it has done its own handling. This just takes a little code.

The subview's code is as follows:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Do whatever the subview itself needs to do here, then pass the event on
    // so the superview (or even its view controller) also receives it.
    [[self nextResponder] touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [[self nextResponder] touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [[self nextResponder] touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [[self nextResponder] touchesCancelled:touches withEvent:event];
}

