iOS Development: Touch Events and Gestures


Events in iOS fall into three categories: touch events, accelerometer (motion) events, and remote-control events. Only objects that inherit from UIResponder can receive and process events; these are called responder objects. UIApplication, UIViewController, and UIView all inherit from UIResponder. UIResponder provides the following methods for handling events:

Touch events: touchesBegan:withEvent:, touchesMoved:withEvent:, touchesEnded:withEvent:, touchesCancelled:withEvent:

Accelerometer (motion) events: motionBegan:withEvent:, motionEnded:withEvent:, motionCancelled:withEvent:

Remote-control events: remoteControlReceivedWithEvent:
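The rest of this article focuses on the touch methods. For completeness, here is a minimal sketch of the motion callbacks using shake detection; it is my own illustration (not from the original article) and assumes it lives in a UIViewController subclass:

// The controller must be the first responder to receive motion events.
- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self becomeFirstResponder];
}

// Called when a motion event ends
- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    if (motion == UIEventSubtypeMotionShake) {
        NSLog(@"device was shaken");
    }
}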

UIView's touch event handling methods:

// Called when a finger starts to touch the view
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);
}

// Called when a finger moves on the view
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);
}

// Called when a finger leaves the view
- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);
}

// Called when a touch event is interrupted by a system event
- (void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);
}

A complete touch action invokes the three methods touchesBegan:, touchesMoved:, and touchesEnded: in turn.

Before looking at these touch methods, it helps to understand the UITouch object. When a finger touches the screen, a UITouch object is created and associated with it; each finger corresponds to one UITouch object. The object holds information about the touch, such as its position, timestamp, and phase. As the finger moves, the system updates the same UITouch object so that it always carries the finger's current position. When the finger leaves the screen, the system destroys the corresponding UITouch object.

@interface UITouch : NSObject

@property (nonatomic, readonly) NSTimeInterval timestamp;
@property (nonatomic, readonly) UITouchPhase phase;
// Number of taps at a given point within a certain amount of time
@property (nonatomic, readonly) NSUInteger tapCount;

// majorRadius and majorRadiusTolerance are in points
// The majorRadius will be accurate to within +/- the majorRadiusTolerance
@property (nonatomic, readonly) CGFloat majorRadius NS_AVAILABLE_IOS(8_0);
@property (nonatomic, readonly) CGFloat majorRadiusTolerance NS_AVAILABLE_IOS(8_0);

@property (nullable, nonatomic, readonly, strong) UIWindow *window;
@property (nullable, nonatomic, readonly, strong) UIView *view;
@property (nullable, nonatomic, readonly, copy) NSArray<UIGestureRecognizer *> *gestureRecognizers NS_AVAILABLE_IOS(3_2);

// Get the current position in the given view
- (CGPoint)locationInView:(nullable UIView *)view;
// Get the position of the previous touch point in the given view
- (CGPoint)previousLocationInView:(nullable UIView *)view;

// Force of the touch, where 1.0 represents the force of an average touch
@property (nonatomic, readonly) CGFloat force NS_AVAILABLE_IOS(9_0);
// Maximum possible force with this input mechanism
@property (nonatomic, readonly) CGFloat maximumPossibleForce NS_AVAILABLE_IOS(9_0);

@end
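As a small illustration of these properties (my own sketch, not from the article), tapCount can be used inside a UIView subclass to tell single taps from double taps:

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // tapCount counts consecutive taps at roughly the same point
    if (touch.tapCount == 2) {
        NSLog(@"double tap at %@", NSStringFromCGPoint([touch locationInView:self]));
    } else if (touch.tapCount == 1) {
        NSLog(@"single tap");
    }
}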


Example: make a view follow the finger as it moves (implemented in a UIView subclass):

// Called when the finger moves on the view
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSLog(@"%s", __func__);

    // Get the UITouch object
    UITouch *touch = [touches anyObject];

    // Get the position of the current point
    CGPoint curP = [touch locationInView:self];

    // Get the position of the previous point
    CGPoint preP = [touch previousLocationInView:self];

    // Calculate the offset in x
    CGFloat offsetX = curP.x - preP.x;

    // Calculate the offset in y
    CGFloat offsetY = curP.y - preP.y;

    // Modify the position of the view
    self.transform = CGAffineTransformTranslate(self.transform, offsetX, offsetY);
}


Moving the view this way relies on the location information stored in the UITouch object.
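As a side note, an alternative sketch (my own, under the assumption that the view has no other transform applied) moves the view's center instead of its transform; the locations are read in the superview's coordinate system because center is expressed there:

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Work in the superview's coordinate system so the offsets match self.center
    CGPoint curP = [touch locationInView:self.superview];
    CGPoint preP = [touch previousLocationInView:self.superview];
    self.center = CGPointMake(self.center.x + (curP.x - preP.x),
                              self.center.y + (curP.y - preP.y));
}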

Generation and delivery of events:

When a touch event is generated, the system adds it to an event queue managed by UIApplication. UIApplication takes the first event from the queue and dispatches it to the application's main window. The main window then searches its view hierarchy for the most suitable view and calls that view's touches methods to handle the event. Touch events are passed from parent control to child control; if a parent control cannot receive a touch event, none of its child controls can receive it either.
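To make this delivery path visible, one hedged sketch (not part of the original article) is to subclass UIWindow, which receives every event from UIApplication through sendEvent:, and log touch events before passing them on; the class name LoggingWindow is made up for illustration:

// Assumption: this subclass is installed as the app's window.
@interface LoggingWindow : UIWindow
@end

@implementation LoggingWindow

- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeTouches) {
        NSLog(@"window delivering touch event: %@", event);
    }
    // Always hand the event on, or nothing in the app will receive touches
    [super sendEvent:event];
}

@end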

How is the most suitable control to handle the event found? First, check whether the control can receive touch events at all, and whether the touch point lies within the control. Then iterate over the subviews from back to front and repeat these two checks on each of them; if no subview qualifies, the control itself is the most suitable one to handle the event.

A control uses the hitTest:withEvent: method to find the most suitable view, and the pointInside:withEvent: method to determine whether a point lies inside the method's caller, i.e. the control itself.
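To make the role of pointInside:withEvent: concrete, here is a small sketch (my addition, assuming a custom UIView subclass) that overrides it to enlarge the view's tappable area by 20 points on each side:

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Expand the hit area by 20 points in every direction
    CGRect expandedBounds = CGRectInset(self.bounds, -20, -20);
    return CGRectContainsPoint(expandedBounds, point);
}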

The underlying implementation of the hitTest:withEvent: method:

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {

    // Determine whether the current control can receive touch events
    if (self.userInteractionEnabled == NO || self.hidden == YES || self.alpha <= 0.01) {
        return nil;
    }

    // Determine whether the touch point is on the current control
    if ([self pointInside:point withEvent:event] == NO) {
        return nil;
    }

    // Traverse the subviews from back to front
    NSInteger count = self.subviews.count;
    for (NSInteger i = count - 1; i >= 0; i--) {
        UIView *childView = self.subviews[i];

        // Convert the point from the current control's coordinate system to the subview's
        CGPoint childPoint = [self convertPoint:point toView:childView];

        // Recursively call hitTest: to find the most suitable view
        UIView *fitView = [childView hitTest:childPoint withEvent:event];

        if (fitView) {
            return fitView;
        }
    }

    // Loop finished: no subview is more suitable than self, so return self
    return self;
}
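In practice, a common reason to override hitTest:withEvent: is to let touches fall through a mostly transparent overlay view. A minimal sketch (my addition, assuming a custom overlay UIView subclass) returns nil when the hit lands on the overlay itself, so the views behind it receive the touch instead:

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hitView = [super hitTest:point withEvent:event];
    // If the hit landed on the overlay itself rather than one of its subviews,
    // return nil so that views behind the overlay receive the touch.
    if (hitView == self) {
        return nil;
    }
    return hitView;
}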
   

However, monitoring touch events through the touches methods has drawbacks, for example you have to subclass the view. So, starting with iOS 3.2, Apple introduced gesture recognition through UIGestureRecognizer. UIGestureRecognizer is an abstract class; its concrete subclasses handle specific gestures.

Specifically, there are the following gestures:

// Tap gesture
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:<#(nullable id)#> action:<#(nullable SEL)#>];

// Long press gesture (triggered twice by default)
UILongPressGestureRecognizer *longP = [[UILongPressGestureRecognizer alloc] initWithTarget:<#(nullable id)#> action:<#(nullable SEL)#>];

// Swipe gesture (the default direction is to the right)
UISwipeGestureRecognizer *swipe = [[UISwipeGestureRecognizer alloc] initWithTarget:<#(nullable id)#> action:<#(nullable SEL)#>];

// Rotation gesture
UIRotationGestureRecognizer *rotation = [[UIRotationGestureRecognizer alloc] initWithTarget:<#(nullable id)#> action:<#(nullable SEL)#>];

// Pinch gesture
UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:<#(nullable id)#> action:<#(nullable SEL)#>];

// Pan (drag) gesture
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:<#(nullable id)#> action:<#(nullable SEL)#>];

Actual use:

@interface ViewController () <UIGestureRecognizerDelegate>
@property (weak, nonatomic) IBOutlet UIImageView *imageView;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self setupPinch];
    [self setupRotation];
    [self setupPan];
}

#pragma mark - Gesture delegate methods

// Whether the gesture is allowed to begin
//- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
//    return NO;
//}

// Whether multiple gestures may be recognized at the same time (NO by default); return YES to allow it
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}

// Whether the recognizer should receive this touch
//- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
//    // Get the current touch point
//    CGPoint curP = [touch locationInView:self.imageView];
//    if (curP.x < self.imageView.bounds.size.width * 0.5) {
//        return NO;
//    } else {
//        return YES;
//    }
//}

#pragma mark - Tap gesture
- (void)setupTap {
    // Create the tap gesture
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tap:)];
    tap.delegate = self;
    [_imageView addGestureRecognizer:tap];
}

- (void)tap:(UITapGestureRecognizer *)tap {
    NSLog(@"%s", __func__);
}

#pragma mark - Long press gesture (fires twice by default: began and ended)
- (void)setupLongPress {
    UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(longPress:)];
    [self.imageView addGestureRecognizer:longPress];
}

- (void)longPress:(UILongPressGestureRecognizer *)longPress {
    if (longPress.state == UIGestureRecognizerStateBegan) {
        NSLog(@"%s", __func__);
    }
}

#pragma mark - Swipe gesture
- (void)setupSwipe {
    // The default swipe direction is to the right
    UISwipeGestureRecognizer *swipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipe)];
    swipe.direction = UISwipeGestureRecognizerDirectionUp;
    [self.imageView addGestureRecognizer:swipe];

    // A swipe gesture only supports one direction; to support several directions,
    // create one swipe gesture per direction.
    UISwipeGestureRecognizer *swipeDown = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swipe)];
    swipeDown.direction = UISwipeGestureRecognizerDirectionDown;
    [self.imageView addGestureRecognizer:swipeDown];
}

- (void)swipe {
    NSLog(@"%s", __func__);
}

#pragma mark - Rotation gesture
- (void)setupRotation {
    UIRotationGestureRecognizer *rotation = [[UIRotationGestureRecognizer alloc] initWithTarget:self action:@selector(rotation:)];
    rotation.delegate = self;
    [self.imageView addGestureRecognizer:rotation];
}

// The rotation angle passed in is relative to the start position by default
- (void)rotation:(UIRotationGestureRecognizer *)rotation {
    self.imageView.transform = CGAffineTransformRotate(self.imageView.transform, rotation.rotation);
    // Get the angle of the gesture rotation
    NSLog(@"%f", rotation.rotation);
    // Reset so the next callback delivers an incremental angle
    rotation.rotation = 0;
}

#pragma mark - Pinch gesture
- (void)setupPinch {
    UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(pinch:)];
    pinch.delegate = self;
    [self.imageView addGestureRecognizer:pinch];
}

- (void)pinch:(UIPinchGestureRecognizer *)pinch {
    self.imageView.transform = CGAffineTransformScale(self.imageView.transform, pinch.scale, pinch.scale);
    // Reset so the next callback delivers an incremental scale
    pinch.scale = 1;
}

#pragma mark - Pan (drag) gesture
- (void)setupPan {
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)];
    [self.imageView addGestureRecognizer:pan];
}

- (void)pan:(UIPanGestureRecognizer *)pan {
    // Get the touch point of the gesture
    CGPoint curP = [pan locationInView:self.imageView];

    // Get the gesture's translation, which is also relative to the start position
    CGPoint transP = [pan translationInView:self.imageView];
    self.imageView.transform = CGAffineTransformTranslate(self.imageView.transform, transP.x, transP.y);

    // Reset the translation so each callback delivers an increment
    [pan setTranslation:CGPointZero inView:self.imageView];

    NSLog(@"%@", NSStringFromCGPoint(curP));
}

@end
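One practical detail worth adding (not spelled out in the article): UIImageView has userInteractionEnabled set to NO by default, so none of the gestures above will fire unless interaction is enabled, either in the storyboard or in code, for example in viewDidLoad:

// UIImageView ignores touches by default; enable them so gesture recognizers can fire
self.imageView.userInteractionEnabled = YES;

If the storyboard already has "User Interaction Enabled" checked for the image view, this line is redundant.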

The above covers iOS touch events and gestures; I hope it helps you with iOS programming.
