Today I wrote a short piece of test code:

[cpp]
CGRect imageRect = (CGRect){100, 100, 100, 100};

UIImageView *imageView = [[[UIImageView alloc] initWithFrame:imageRect] autorelease];
imageView.backgroundColor = [UIColor yellowColor];
[self.view addSubview:imageView];

UIButton *maskBtn = [UIButton buttonWithType:UIButtonTypeCustom];
maskBtn.frame = imageView.bounds;
maskBtn.backgroundColor = [UIColor redColor];
[maskBtn addTarget:self action:@selector(maskBtnDidClick:) forControlEvents:UIControlEventTouchUpInside];
[imageView addSubview:maskBtn];
[/cpp]

Tapping the resulting button does not trigger the event. Searching Stack Overflow turns up the explanation: UIImageView has userInteractionEnabled set to NO by default, and since the button is added as a subview of the image view, you should set it to YES. So one line is added to make imageView respond to user interaction:

[cpp]
CGRect imageRect = (CGRect){100, 100, 100, 100};

UIImageView *imageView = [[[UIImageView alloc] initWithFrame:imageRect] autorelease];
imageView.backgroundColor = [UIColor yellowColor];
[self.view addSubview:imageView];
imageView.userInteractionEnabled = YES;

UIButton *maskBtn = [UIButton buttonWithType:UIButtonTypeCustom];
maskBtn.frame = imageView.bounds;
maskBtn.backgroundColor = [UIColor redColor];
[maskBtn addTarget:self action:@selector(maskBtnDidClick:) forControlEvents:UIControlEventTouchUpInside];
[imageView addSubview:maskBtn];
[/cpp]

This is purely a pitfall caused by a gap in knowledge: in the past I always attached a tap gesture recognizer to a UIImageView to handle user interaction, so this pitfall left no impression on me; I am not even sure whether I had ever run into it before.

The view hierarchy built by the code above looks roughly like this: the red box is the UIButton sitting on top of the UIImageView. For more information, see the View Programming Guide for iOS.

After you tap the red button:

1. The hardware notifies UIKit of a touch event.
2. UIKit packages the touch information into a UIEvent object and dispatches it to a suitable view:
   1) UIKit places the event object in the current app's event queue.
   2) As described in the Event Handling Guide for iOS, the current app takes an event object from the event queue and sends it to the key window object.
   3) The key window object finds the view under the touch through hit-testing.
   4) The view found through hit-testing becomes the first object given a chance to handle the event, the first responder. If it does not handle the event, the event object is passed along the responder chain. The responder chain is a chain of objects running from the first responder up to the current application object, all of which inherit from UIResponder; see that document for details.
3. The appropriate object handles the event. In the code above that object is self (the view controller), which responds to the event and does something. If no object handles it, the event is discarded.
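
To make the hit-testing step concrete, here is a rough conceptual sketch of what UIView's default hitTest:withEvent: does. This is an illustration of the documented behavior, not Apple's actual implementation: a view that is hidden, nearly transparent, or has userInteractionEnabled set to NO is skipped together with its whole subtree, which is exactly why the button inside the image view never received the touch in the first version of the code.

[cpp]
// Conceptual sketch of UIView's default hitTest:withEvent: (illustration only,
// not Apple's actual source). It shows why a subview of a UIImageView with
// userInteractionEnabled == NO can never be returned as the hit-test view.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // A view that cannot receive touches prunes its entire subtree.
    if (!self.userInteractionEnabled || self.hidden || self.alpha < 0.01) {
        return nil;
    }
    // The touch must fall inside this view's bounds.
    if (![self pointInside:point withEvent:event]) {
        return nil;
    }
    // Search subviews front to back (the last subview is the frontmost).
    for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
        CGPoint convertedPoint = [subview convertPoint:point fromView:self];
        UIView *hitView = [subview hitTest:convertedPoint withEvent:event];
        if (hitView) {
            return hitView;
        }
    }
    // No subview claimed the touch, so this view itself is the hit-test view.
    return self;
}
[/cpp]

With userInteractionEnabled left at NO, the image view returns nil at the first check, so the window falls back to a view behind it and maskBtn never gets the TouchUpInside event; setting the flag to YES lets the search continue down into the button.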