This article introduces the touch-screen events available to JavaScript in mobile browsers, analyzing the event types, how they behave, and how to use them, with examples. The specific analysis is as follows:
1. Touch events
ontouchstart
ontouchmove
ontouchend
ontouchcancel

Mobile browsers currently support these four touch events, including mobile IE. Because touch screens also synthesize MouseEvents, note the firing order: touchstart → touchmove → touchend → mouseover → mousemove → mousedown → mouseup → click.
Example:
/** OnTouchEvent */
var p = document.getElementById("p");

// touchstart is similar to mousedown
p.ontouchstart = function (e) {
  // The event's touches property is an array-like list; each element
  // represents one concurrent touch point, so multi-touch points can
  // be read from it. Since we only have one touch point here, we read
  // touches[0] directly.
  var touch = e.touches[0];
  // Coordinates of the current touch point, equivalent to
  // clientX/clientY on a MouseEvent.
  var x = touch.clientX;
  var y = touch.clientY;
};

// touchmove is similar to mousemove
p.ontouchmove = function (e) {
  // Calling preventDefault in the touchstart or touchmove handler
  // stops the browser from scrolling or zooming.
  e.preventDefault();
};

// touchend is similar to mouseup
p.ontouchend = function (e) {
  // nothing to do
};
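Building on the handlers above, a common use of the start and end coordinates is detecting a swipe direction. The sketch below is illustrative: the element id "p" comes from the example above, while the helper name and the 30-pixel threshold are my own assumptions.

```javascript
// Classify a swipe from its horizontal/vertical deltas.
// The 30px threshold (an illustrative choice) filters out taps.
function swipeDirection(dx, dy, threshold) {
  threshold = threshold || 30;
  if (Math.abs(dx) < threshold && Math.abs(dy) < threshold) return "tap";
  if (Math.abs(dx) > Math.abs(dy)) return dx > 0 ? "right" : "left";
  return dy > 0 ? "down" : "up";
}

// Wiring it into the touch events (runs only in a browser):
if (typeof document !== "undefined") {
  var el = document.getElementById("p"); // element id assumed from the example
  var startX = 0, startY = 0;
  el.ontouchstart = function (e) {
    var t = e.touches[0];
    startX = t.clientX;
    startY = t.clientY;
  };
  el.ontouchend = function (e) {
    // In touchend, lifted fingers are no longer in touches;
    // read them from changedTouches instead.
    var t = e.changedTouches[0];
    console.log(swipeDirection(t.clientX - startX, t.clientY - startY));
  };
}
```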
2. Gesture events

Gestures are multi-touch operations such as rotating and pinching, used for example to zoom or rotate images and web pages. A gesture event fires only when two or more fingers touch the screen at the same time.

When dealing with zooming, pay attention to element coordinates. We usually obtain them with offsetX, getBoundingClientRect(), and similar methods, but on mobile browsers the page is often zoomed while in use. Does zooming change an element's reported coordinates? A scenario illustrates the problem: after a page loads, JavaScript reports an element's document coordinates as (100, 100); the user then zooms in; JavaScript still reports (100, 100), but the element's response area on the screen has shifted according to the zoom ratio. If you open a Breakout-style game demo and zoom in after the page has fully loaded, you will find that you can control the paddle even when your finger touches outside the "touch here" area, because the touch region has shifted. The offset persists until the page is refreshed or zoomed back.
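One way to cope with the offset described above is to avoid caching element coordinates: re-read the bounding rect inside the handler, at event time, so the current (possibly zoomed) layout is used. This is a minimal sketch; the helper name and the element id "p" are my own assumptions.

```javascript
// Convert a touch's client coordinates into coordinates local to an
// element, given that element's bounding rect measured at event time.
function toLocalCoords(clientX, clientY, rect) {
  return { x: clientX - rect.left, y: clientY - rect.top };
}

if (typeof document !== "undefined") {
  var el = document.getElementById("p"); // assumed element id
  el.ontouchstart = function (e) {
    var t = e.touches[0];
    // getBoundingClientRect() is re-read here, inside the handler,
    // so it reflects the layout after any zooming the user has done.
    var pos = toLocalCoords(t.clientX, t.clientY, el.getBoundingClientRect());
    console.log(pos.x, pos.y);
  };
}
```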
/** OnGestureEvent */
var p = document.getElementById("p");
p.ongesturechange = function (e) {
  // scale is the zoom factor produced by the gesture: less than 1
  // means pinching in, greater than 1 means spreading out.
  // Its initial value is 1.
  var scale = e.scale;
  // rotation is the angle of the rotation gesture in degrees:
  // positive values rotate clockwise, negative values counterclockwise.
  var angle = e.rotation;
};
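The scale and rotation values map directly onto a CSS transform, which is a natural way to let the gesture zoom and rotate an element. A sketch under the assumptions that the element id is "p" and that the WebKit-prefixed transform property is the one available (gesture events are WebKit-only); the helper name is mine.

```javascript
// Build a CSS transform string from the gesture's scale and rotation.
function gestureTransform(scale, rotation) {
  return "scale(" + scale + ") rotate(" + rotation + "deg)";
}

if (typeof document !== "undefined") {
  var el = document.getElementById("p"); // assumed element id
  el.ongesturechange = function (e) {
    e.preventDefault(); // keep the browser from zooming the whole page
    el.style.webkitTransform = gestureTransform(e.scale, e.rotation);
  };
}
```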
3. Orientation (gravity sensing)

Orientation sensing is relatively simple: add an onorientationchange event to the body node. Inside the handler, the window.orientation property gives a value representing the current device orientation. Its possible values are:
0: same orientation as when the page first loaded
-90: rotated 90° clockwise from the original orientation
180: rotated 180° (upside down)
90: rotated 90° counterclockwise

These touch-screen events have not yet been standardized, but they are already widely used. In my tests, Android 2.1 does not support orientation sensing; I tested only on Android 2.1 and have not tried other environments.
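The value list above can be turned into a small lookup and wired to the event. The label strings are my own choice, not part of any API.

```javascript
// Map window.orientation values to human-readable labels,
// following the value list above.
function orientationLabel(deg) {
  switch (deg) {
    case 0:   return "portrait (as first loaded)";
    case -90: return "landscape (rotated 90\u00b0 clockwise)";
    case 90:  return "landscape (rotated 90\u00b0 counterclockwise)";
    case 180: return "portrait (upside down)";
    default:  return "unknown";
  }
}

// In a browser that supports it, log the orientation on each change:
if (typeof window !== "undefined" && "onorientationchange" in window) {
  window.onorientationchange = function () {
    console.log(orientationLabel(window.orientation));
  };
}
```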
I hope this article helps you with your JavaScript programming.