In iPhone development we often need buttons or views that respond to touch. The following three pictures show the effect we want: dragging the gray button around the screen with a finger:
The following describes how to achieve this effect:
1. First, define a button class, MyButton, that inherits from UIButton.
#import <UIKit/UIKit.h>

@interface MyButton : UIButton {
    CGFloat xDistance; // x distance between the touch point and the button's center
    CGFloat yDistance; // y distance between the touch point and the button's center
}
@end
Note:
(1). CGPoint is just a typedef'd struct. It is a value type, so it is used directly, without a pointer (*):
struct CGPoint {
    CGFloat x;
    CGFloat y;
};
typedef struct CGPoint CGPoint;
(2). CGFloat is actually a float:
#define CGFLOAT_TYPE float
typedef CGFLOAT_TYPE CGFloat;
2. Add the following code to MyButton.m:
// Called when a finger first touches the button
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Get the touch coordinates in the button's superview
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.superview];
    xDistance = self.center.x - currentPoint.x;
    yDistance = self.center.y - currentPoint.y;
}

// Called repeatedly while the finger moves across the screen
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Get the touch coordinates in the button's superview
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.superview];
    // Move the button so it keeps its original offset from the finger
    CGPoint newCenter = CGPointMake(currentPoint.x + xDistance, currentPoint.y + yDistance);
    self.center = newCenter;
}
Explanation: MyButton inherits from UIButton, UIButton inherits from UIControl, UIControl inherits from UIView, and UIView inherits from UIResponder.
The UIResponder class provides responses to user-interaction events, including the two touch methods above. These are system callbacks: when the user interacts with the app, the system calls them automatically; the programmer never needs to call them directly.
Finally, just instantiate MyButton in a UIViewController, then compile and run to see the dragging behavior.
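As a sketch of that last step (the controller and property names here are illustrative, not from the original post), a view controller might create the button like this:

```objc
// In a UIViewController subclass (illustrative) -- creates the draggable button
#import "MyButton.h"

- (void)viewDidLoad {
    [super viewDidLoad];

    // Create the custom button and add it to the view hierarchy
    MyButton *button = [[MyButton alloc] initWithFrame:CGRectMake(100, 100, 80, 40)];
    button.backgroundColor = [UIColor grayColor];
    [button setTitle:@"Drag me" forState:UIControlStateNormal];
    [self.view addSubview:button];
}
```

Because touchesBegan:/touchesMoved: are implemented in MyButton itself, nothing else is needed: the system delivers the touch events to the button automatically.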
Note: inside a touch-event method you can also obtain the set of touches for a particular view from the event object:
NSSet *myTouches = [event touchesForView:self.view];
Next, a brief introduction to the interactive-event responder chain in iOS:
Responder class hierarchy (each class inherits from the one to its left):

NSObject -- UIResponder --+-- UIApplication
                          +-- UIViewController
                          +-- UIView --+-- UIImageView
                                       +-- UILabel
                                       +-- UIWindow
                                       +-- UIScrollView
                                       +-- UIControl --+-- UIButton
                                                       +-- UISlider
                                                       +-- UISwitch
                                                       +-- UITextField
In iOS, the responder object currently able to receive touches is called the first responder; it is the object interacting with the user, and it is the start of the responder chain.
The responder chain is a series of responder objects. An event is first handled by the first responder; if it does not handle the event, the event is passed up the chain to the next responder. The first responder is usually a view object (or an instance of a view subclass): when it is touched, it gets the event first. If it does not handle it, the event passes to its view controller (if any), then to its parent view (superview, if any), and so on up to the top-level view; from the top view it goes to the window (the UIWindow object) and then to the application (the UIApplication object). If nothing along this chain responds, the event is discarded. Normally, once any object in the chain handles the event, propagation stops; however, a view's response method can decide, based on its own conditions, whether to keep passing the event along.
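As a sketch of that last point (an illustration, not code from the original post), a view can handle a touch and still pass it along by explicitly forwarding to the next responder:

```objc
// In a UIView subclass: handle the touch, then keep it moving up the chain
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"Handled in this view first");
    // Explicitly forward the event so the next responder
    // (superview or view controller) also receives it
    [self.nextResponder touchesBegan:touches withEvent:event];
}
```

Without the forwarding call, the event would stop at this view, which is the default "handled, stop propagating" behavior described above.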
Whether a view responds to touch events is controlled by its userInteractionEnabled property. The default is YES; setting it to NO prevents the view from receiving and distributing touch events.
In addition, a view receives no events while it is hidden (setHidden:YES) or fully transparent (alpha is 0).
However, this property affects only a single view. To stop the entire application from responding to events, call beginIgnoringInteractionEvents to suspend event receiving and distribution altogether, and endIgnoringInteractionEvents to restore it.
If you want a view to receive multi-touch, set its multipleTouchEnabled property to YES. The default is NO, so views do not receive multi-touch by default.
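The last few properties can be put together in one illustrative snippet (someView is a placeholder name, not from the original post):

```objc
// Per-view event flags discussed above
someView.userInteractionEnabled = NO;  // view stops receiving/distributing touches
someView.multipleTouchEnabled = YES;   // opt in to multi-touch (default is NO)

// Suspend, then later resume, event handling for the whole application
[[UIApplication sharedApplication] beginIgnoringInteractionEvents];
// ... work during which all touches should be ignored ...
[[UIApplication sharedApplication] endIgnoringInteractionEvents];
```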