Details about mobile JS touch events
In mobile development it is often easier to build a prototype on the desktop first and then handle the device-specific parts on the devices you intend to support. Multi-touch is one of the features that is hard to test on a PC, because most PCs have no touch input.
Having to test only on mobile devices can lengthen your development cycle, because every change you make has to be pushed to a server and then loaded onto the device. And once the application is running, there is little you can do to debug it, because tablets and smartphones lack the developer tools web developers rely on.
A solution to this problem is to simulate touch events on your development machine. For single-touch, touch events can be simulated based on mouse events. Multi-touch events can also be simulated if you have a device with touch input, such as a modern Apple MacBook.
Single-touch events
If you want to simulate single-touch events on your desktop, try Phantom Limb, which simulates touch events on web pages and provides a giant hand to guide you.
In addition, the jQuery extension Touchable unifies touch and mouse events across platforms.
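If you only need a quick approximation without a library, the idea behind these tools can also be sketched by hand: route mouse events into the same handlers your touch code uses. This is a rough sketch under the assumption that the application only reads pageX/pageY from the first touch; the onTouchStart/onTouchMove/onTouchEnd stubs below are illustrative and are not part of Phantom Limb or Touchable.

// Illustrative stubs standing in for the application's real touch handlers
function onTouchStart(e) { console.log("start at", e.touches[0].pageX, e.touches[0].pageY); }
function onTouchMove(e) { console.log("move to", e.touches[0].pageX, e.touches[0].pageY); }
function onTouchEnd(e) { console.log("end"); }

// Wrap a mouse event in a minimal touch-like object (only the fields used above)
function fakeTouchEvent(mouseEvent) {
    return {
        touches: [{ pageX: mouseEvent.pageX, pageY: mouseEvent.pageY, identifier: 0 }],
        preventDefault: function () { mouseEvent.preventDefault(); }
    };
}

document.addEventListener("mousedown", function (e) { onTouchStart(fakeTouchEvent(e)); }, false);
document.addEventListener("mousemove", function (e) { onTouchMove(fakeTouchEvent(e)); }, false);
document.addEventListener("mouseup", function (e) { onTouchEnd(fakeTouchEvent(e)); }, false);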
Multi-touch events
To enable your multi-touch web application to work in a desktop browser with a multi-touch trackpad (such as an Apple MacBook or Magic Trackpad), I created MagicTouch.js, a polyfill that captures touch events from the trackpad and converts them into standards-compatible touch events.
1. Download the npTuioClient NPAPI plugin and install it into the ~/Library/Internet Plug-Ins/ directory.
2. Download TongSeng, a TUIO app for the Mac Magic Trackpad, and start the server.
3. Download MagicTouch.js, a JavaScript library that simulates standards-compatible touch events from the npTuioClient callbacks.
4. Add the magictouch.js script and the npTuioClient plugin to your application as follows:
<head>
  ...
  <script src="/path/to/magictouch.js"></script>
</head>
<body>
  ...
  <object id="tuio" type="application/x-tuio" style="width: 0px; height: 0px;">
    Touch input plugin failed to load!
  </object>
</body>
I have only tested this approach in Chrome 10, but it should work in other modern browsers with minor tweaking.
If your computer has no multi-touch input, you can simulate touch events using another TUIO tracker, such as reacTIVision. For more information, see the TUIO project page.
Note that your gestures may be identical to OS-level multi-touch gestures. On OS X you can configure system-wide gestures in the Trackpad pane of System Preferences.
As multi-touch becomes more widely supported across mobile browsers, I am excited to see new web applications take full advantage of this rich API.
Source: html5rocks.com
Original article title: Developing for Multi-Touch Web Browsers
1. Touch events on mobile phones
Basic events:
touchstart // triggered when a finger touches the screen
touchmove // triggered when the finger moves across the screen
touchend // triggered when the finger leaves the screen
The following one is used less often: touchcancel // triggered when the touch is cancelled by the system
Each event carries the touch lists below (note, for example, that in touchend the targetTouches list is always empty):
touches // list of all fingers currently on the screen
targetTouches // list of all fingers on the current element
changedTouches // list of the fingers involved in the current event
Each entry in these lists is a touch point with the following attributes:
The most commonly used coordinates are pageX and pageY:
pageX // X coordinate relative to the page
pageY // Y coordinate relative to the page
clientX // X coordinate relative to the viewport
clientY // Y coordinate relative to the viewport
screenX // X coordinate relative to the screen
screenY // Y coordinate relative to the screen
identifier // unique ID of the current touch point
target // the DOM element the finger touched
Other commonly used snippets:
event.preventDefault() // prevent the browser's default zooming and scrolling during the touch
var supportTouch = "createTouch" in document; // detect whether touch events are supported
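As a small illustration of how these lists and attributes fit together, the sketch below logs them from a touchstart handler; the element id "box" is an assumption made only for this example.

var box = document.getElementById("box"); // hypothetical element id
box.addEventListener("touchstart", function (event) {
    var touch = event.targetTouches[0]; // first finger on this element
    console.log("fingers on screen:", event.touches.length);
    console.log("finger", touch.identifier, "at page position", touch.pageX, touch.pageY);
    console.log("touched element:", touch.target);
}, false);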
2. Example
The following code shows how to detect the different kinds of swipes. It is a wrapper built on the ideas above and can be used as a reference:
var touchFunc = function (obj, type, func) {
    // Movements within a 5x5 px range count as a click; s = start, e = end
    var init = { x: 5, y: 5, sx: 0, sy: 0, ex: 0, ey: 0 };
    var sTime = 0, eTime = 0;
    type = type.toLowerCase();
    obj.addEventListener("touchstart", function (event) {
        sTime = new Date().getTime();
        init.sx = event.targetTouches[0].pageX;
        init.sy = event.targetTouches[0].pageY;
        init.ex = init.sx;
        init.ey = init.sy;
        if (type.indexOf("start") != -1) func();
    }, false);
    obj.addEventListener("touchmove", function (event) {
        event.preventDefault(); // prevent the browser's default zooming and scrolling during the touch
        init.ex = event.targetTouches[0].pageX;
        init.ey = event.targetTouches[0].pageY;
        if (type.indexOf("move") != -1) func();
    }, false);
    obj.addEventListener("touchend", function (event) {
        var changeX = init.sx - init.ex;
        var changeY = init.sy - init.ey;
        if (Math.abs(changeX) > Math.abs(changeY) && Math.abs(changeY) > init.y) {
            // horizontal swipe (left/right)
            if (changeX > 0) {
                if (type.indexOf("left") != -1) func();
            } else {
                if (type.indexOf("right") != -1) func();
            }
        } else if (Math.abs(changeY) > Math.abs(changeX) && Math.abs(changeX) > init.x) {
            // vertical swipe (top/down)
            if (changeY > 0) {
                if (type.indexOf("top") != -1) func();
            } else {
                if (type.indexOf("down") != -1) func();
            }
        } else if (Math.abs(changeX) < init.x && Math.abs(changeY) < init.y) {
            eTime = new Date().getTime();
            // click, subdivided by press duration
            if ((eTime - sTime) > 300) {
                if (type.indexOf("long") != -1) func(); // long press
            } else {
                if (type.indexOf("click") != -1) func(); // tap
            }
        }
        if (type.indexOf("end") != -1) func();
    }, false);
};
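A minimal usage sketch for the touchFunc wrapper above; the element id "box" and the handler bodies are assumptions made only for illustration.

var box = document.getElementById("box"); // hypothetical element id
touchFunc(box, "left,right", function () {
    console.log("horizontal swipe on #box"); // fires for both left and right swipes
});
touchFunc(box, "click,long", function () {
    console.log("tap or long press on #box"); // fires for both taps and long presses
});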
Reposted article: JS events of Mobile Phone Touch Screen
Handling touch events lets you track every finger the user places on the screen. You can bind the following four touch events:
1. touchstart: // triggered when the finger is placed on the screen
2. touchmove: // triggered when the finger moves on the screen
3. touchend: // triggered when the finger is picked up from the screen
4. touchcancel: // triggered when the touch event is cancelled by the system; exactly when the system cancels it is not clearly documented
Attributes
1. clientX/clientY: // position of the touch point relative to the browser viewport
2. pageX/pageY: // position of the touch point relative to the page
3. screenX/screenY: // position of the touch point relative to the screen
4. identifier: // unique ID of the touch object
// touchstart event
function touchStartFunc(e) {
    // e.preventDefault(); // prevent browser defaults such as zooming and scrollbar scrolling
    var touch = e.touches[0];    // get the first touch point
    var x = Number(touch.pageX); // X coordinate of the touch point on the page
    var y = Number(touch.pageY); // Y coordinate of the touch point on the page
    // record the initial position of the touch point
    startX = x;
    startY = y;
}
// touchmove event
function touchMoveFunc(e) {
    // e.preventDefault(); // prevent browser defaults such as zooming and scrollbar scrolling
    var touch = e.touches[0];    // get the first touch point
    var x = Number(touch.pageX); // X coordinate of the touch point on the page
    var y = Number(touch.pageY); // Y coordinate of the touch point on the page
    var text = 'touchmove fired at (' + x + ', ' + y + ')';
    // determine the direction of movement
    if (x - startX != 0) {
        // horizontal movement
    }
    if (y - startY != 0) {
        // vertical movement
    }
}
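A minimal sketch of how these handlers might be wired up; the startX/startY globals and the element id "slider" are assumptions made only for illustration.

var startX = 0, startY = 0; // globals read and written by the handlers above
var slider = document.getElementById("slider"); // hypothetical element id
slider.addEventListener("touchstart", touchStartFunc, false);
slider.addEventListener("touchmove", touchMoveFunc, false);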
Second reprinted article: Mobile Web Front-end development series: Event Processing (2)
The previous article covered the basic HTML events; this one focuses on touch events, which are triggered by a finger touching the screen, moving across it, or leaving it. A touch interaction is treated as a single set: it begins when the first finger is placed on the screen and ends when the last finger leaves, and every touch operation from start to end is recorded as part of the same interaction.
Touch events
Touch events can be divided into single-touch and multi-touch events. Single-touch is generally supported on higher-end devices, while multi-touch requires Safari 2.0 or Android 3.0 and later. Up to five fingers can touch the screen at the same time on most devices, and the iPad supports up to eleven simultaneous touch points. We can capture these events with the following event model:
ontouchstart, ontouchmove, ontouchend, ontouchcancel
ontouchstart fires when a finger is pressed onto the screen, ontouchmove fires when one or more fingers move, and ontouchend fires when a finger is lifted. When does ontouchcancel fire? When a higher-priority event interrupts the touch, for example an alert, an incoming call, or a push notification, the current touch operation is cancelled and ontouchcancel fires. ontouchcancel is particularly important when developing a web game: you can pause or save the game when it fires, as sketched below.
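A minimal sketch of this on* event model, including pausing when ontouchcancel fires; the element id "game" and the pauseGame/saveGame stubs are assumptions made only for illustration.

function pauseGame() { console.log("game paused"); }      // illustrative stub
function saveGame() { console.log("game state saved"); }  // illustrative stub

var game = document.getElementById("game"); // hypothetical element id
game.ontouchstart = function (e) { /* start tracking e.touches here */ };
game.ontouchmove = function (e) { e.preventDefault(); /* update positions */ };
game.ontouchend = function (e) { /* finish the interaction */ };
game.ontouchcancel = function (e) {
    // the system interrupted the touch (alert, incoming call, push notification)
    pauseGame();
    saveGame();
};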
Gesture events
Gesture events work on the same principle as touch events, but they fire only when at least two fingers are on the screen, so they likewise require Safari 2.0 or Android 3.0 and later. Gesture events are very convenient: the event model helps us measure two-finger pinch (scale) and rotation operations:
ongesturestart, ongesturechange, ongestureend
Event attributes
Whether you use touch or gesture events, you need to break the event down into the individual touches it contains, which means accessing a series of attributes on the event object.
targetTouches // all current touches on the target element
changedTouches // the touches that changed in the latest event
touches // all current touches on the page
changedTouches, targetTouches, and touches contain slightly different touch lists. targetTouches and touches list the fingers currently on the screen, while changedTouches lists only the touches that just changed. This distinction matters most for the touchend and gestureend events: at that point no fingers remain on the screen, so targetTouches and touches are empty, but you can still inspect the changedTouches array to understand what happened.
Because all of these touch properties are array-like, you can access them with array syntax: event.touches[0] returns the first touch, and event.touches.length gives the number of touches currently stored.
For a single touch you would typically use event.targetTouches[0], and the other touch lists can be accessed the same way. Each touch carries some specific information:
clientX and clientY // X/Y position relative to the current viewport
pageX and pageY // X/Y position relative to the overall page
screenX and screenY // X/Y position relative to the user's screen
identifier // unique identifier of the touch
target // the target object (DOM element) on which the touch was generated
The event object of a gesture event has two extra attributes compared with a normal touch event: rotation, the angle through which the fingers have rotated, and scale, the factor by which the distance between the fingers has changed.
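A minimal sketch of how these two attributes could drive a CSS transform during a gesture; the element id "photo" is an assumption made only for illustration.

var photo = document.getElementById("photo"); // hypothetical element id
photo.addEventListener("gesturechange", function (e) {
    e.preventDefault(); // keep the browser from applying its own pinch/rotate behavior
    // rotation is in degrees, scale is the relative change in finger distance
    photo.style.webkitTransform = "rotate(" + e.rotation + "deg) scale(" + e.scale + ")";
}, false);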
Reprinted article: JavaScript touch and gesture events
To convey this kind of information to developers, Safari on iOS adds several proprietary events. Because iOS devices have neither a mouse nor a keyboard, the regular mouse and keyboard events are not enough when developing interactive web pages for mobile Safari. And since Android's browser is also based on WebKit, many of these proprietary events have become de facto standards.
1. Touch events
When the iPhone 3G shipped with iOS 2.0, it included a new version of the Safari browser. This new mobile Safari exposed several new events related to touch interaction, and the Android browser later implemented the same events. Touch events fire when the user places a finger on the screen, slides it across the screen, or removes it from the screen. Specifically, there are the following touch events.
touchstart: triggered when a finger touches the screen, even if another finger is already on the screen.
touchmove: triggered repeatedly as the finger slides across the screen. Calling preventDefault() during this event prevents scrolling.
touchend: triggered when the finger is lifted from the screen.
touchcancel: triggered when the system stops tracking the touch. Exactly when this happens is not clearly specified in the documentation.
These events all bubble and can be cancelled. Although touch events are not defined in the DOM specification, they are implemented in a DOM-compatible way, so the event object of each touch event provides the properties common to mouse events: bubbles, cancelable, view, clientX, clientY, screenX, screenY, detail, altKey, shiftKey, ctrlKey, and metaKey.
In addition to the common DOM properties, each touch event also has the following three properties for tracking touches.
touches: an array of Touch objects representing the currently tracked touches.
targetTouches: an array of Touch objects specific to the event target.
changedTouches: an array of Touch objects that have changed since the last touch.
Each Touch object contains the following attributes.
clientX: X coordinate of the touch point in the viewport.
clientY: Y coordinate of the touch point in the viewport.
identifier: unique ID of the touch.
pageX: X coordinate of the touch point in the page.
pageY: Y coordinate of the touch point in the page.
screenX: X coordinate of the touch point on the screen.
screenY: Y coordinate of the touch point on the screen.
target: the DOM node that was touched.
These attributes can be used to track users' touch operations on the screen. Let's look at the example below.
function handleTouchEvent(event) {
    // only output information while there is a single active touch
    if (event.touches.length == 1) {
        var output = document.getElementById("output");
        switch (event.type) {
            case "touchstart":
                output.innerHTML = "Touch started (" + event.touches[0].clientX + "," + event.touches[0].clientY + ")";
                break;
            case "touchend":
                output.innerHTML += "<br>Touch ended (" + event.changedTouches[0].clientX + "," + event.changedTouches[0].clientY + ")";
                break;
            case "touchmove":
                event.preventDefault(); // prevent scrolling
                output.innerHTML += "<br>Touch moved (" + event.changedTouches[0].clientX + "," + event.changedTouches[0].clientY + ")";
                break;
        }
    }
}
document.addEventListener("touchstart", handleTouchEvent, false);
document.addEventListener("touchend", handleTouchEvent, false);
document.addEventListener("touchmove", handleTouchEvent, false);
The code above tracks a touch operation on the screen. For simplicity, information is output only while there is a single active touch. When touchstart fires, the touch location is written into the output element. When touchmove fires, its default behavior is cancelled to prevent scrolling (the default behavior of touch movement is to scroll the page), and the changed touch position is output. The touchend event outputs the final information about the touch. Note that when touchend fires, the touches collection contains no Touch objects because there is no longer an active touch, so you must use the changedTouches collection instead.
These events fire on every element of the document, so you can handle different parts of the page separately. When you touch an element on the screen, the events occur in the following order (see the sketch after this list):
touchstart
mouseover
mousemove
mousedown
mouseup
click
touchend
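A small sketch that logs each event type so the ordering above can be observed on a touch device; the element id "btn" is an assumption made only for illustration.

var btn = document.getElementById("btn"); // hypothetical element id
["touchstart", "mouseover", "mousemove", "mousedown", "mouseup", "click", "touchend"]
    .forEach(function (type) {
        btn.addEventListener(type, function (e) {
            console.log(e.type); // prints the event names in the order they fire
        }, false);
    });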
Browsers that support touch events include Safari for iOS, WebKit for Android, Dolfin for bada, BlackBerry WebKit in OS 6+, Opera Mobile 10.1, and the Phantom browser on LG's proprietary OS. Currently only Safari for iOS supports multi-touch. Desktop Firefox 6+ and Chrome also support touch events.
2. Gesture events
Safari in iOS 2.0 also introduced a set of gesture events. A gesture is produced when two fingers touch the screen, and it typically scales or rotates the displayed item. There are three gesture events.
gesturestart: triggered when a finger is already on the screen and another finger is placed on it.
gesturechange: triggered when the position of either finger on the screen changes.
gestureend: triggered when either finger is lifted off the screen.
These events fire only when both fingers are touching the element that receives them. Setting an event handler on an element means that both fingers must be within the bounds of that element for a gesture event to fire (that element is the target). Because these events bubble, the handler can also be placed on the document to process all gesture events; in that case, the event's target is the element within whose bounds both fingers are located.
There is a relationship between touch events and gesture events. When a finger is placed on the screen, the touchstart event fires. If another finger is then placed on the screen, the gesturestart event fires first, followed by the touchstart event for that finger. If one or both fingers then slide across the screen, the gesturechange event fires. When one finger is lifted, the gestureend event fires, followed by the touchend event for that finger.
Like touch events, the event object of each gesture event contains the standard mouse event properties: bubbles, cancelable, view, clientX, clientY, screenX, screenY, detail, altKey, shiftKey, ctrlKey, and metaKey. It also contains two additional properties: rotation and scale. The rotation property indicates the degrees of rotation caused by the fingers, where negative values indicate counterclockwise rotation and positive values indicate clockwise rotation (starting at 0). The scale property indicates how much the distance between the two fingers has changed (for example, pinching closer together or spreading apart); it starts at 1 and increases as the distance increases or decreases as the distance decreases.
The following is an example of a gesture event:
function handleGestureEvent(event) {
    var output = document.getElementById("output");
    switch (event.type) {
        case "gesturestart":
            output.innerHTML = "Gesture started (rotation=" + event.rotation + ",scale=" + event.scale + ")";
            break;
        case "gestureend":
            output.innerHTML += "<br>Gesture ended (rotation=" + event.rotation + ",scale=" + event.scale + ")";
            break;
        case "gesturechange":
            output.innerHTML += "<br>Gesture changed (rotation=" + event.rotation + ",scale=" + event.scale + ")";
            break;
    }
}
document.addEventListener("gesturestart", handleGestureEvent, false);
document.addEventListener("gestureend", handleGestureEvent, false);
document.addEventListener("gesturechange", handleGestureEvent, false);
As in the earlier touch-event example, this code simply associates each event with the same handler, which then outputs the relevant information for each event.
That is all for this article. I hope it is helpful for your learning.