In mobile development it is often easier to prototype on the desktop first and then handle the mobile-specific parts on the devices you intend to support. Multi-touch is one of those features that is hard to test on a PC, because most PCs have no touch input.
Having to test on a mobile device can also lengthen your development cycle: every change has to be pushed to the server and then loaded onto the device, and once the application is running there is not much you can do to inspect it, since tablets and smartphones lack the tools web developers are used to.
One solution to this problem is to simulate touch events on the development machine. For single touch, touch events can be simulated based on mouse events; if you have touch-capable hardware, such as a modern Apple MacBook, multi-touch events can be emulated as well.
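As an illustration of the mouse-based approach, a mouse event can be wrapped in a touch-like object so that the same handler serves both inputs while prototyping on the desktop. The sketch below is only a minimal illustration of that idea, not how the tools mentioned next work; the wrapper shape and handler are placeholders.

// Wrap a mouse event in a touch-like object so one handler serves both inputs.
function toTouchLike(mouseEvent) {
    var point = {
        identifier: 0,
        target: mouseEvent.target,
        pageX: mouseEvent.pageX, pageY: mouseEvent.pageY,
        clientX: mouseEvent.clientX, clientY: mouseEvent.clientY,
        screenX: mouseEvent.screenX, screenY: mouseEvent.screenY
    };
    return {
        type: "emulated-touch",
        touches: [point], targetTouches: [point], changedTouches: [point],
        preventDefault: function () { mouseEvent.preventDefault(); }
    };
}

function onTouchStart(e) { /* application logic reading e.touches[0].pageX, etc. */ }

document.addEventListener("touchstart", onTouchStart, false);                                  // real devices
document.addEventListener("mousedown", function (e) { onTouchStart(toTouchLike(e)); }, false); // desktop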
Single-touch events
If you want to simulate single-touch events on your desktop, try Phantom Limb, which simulates touch events on a web page and provides a giant hand to boot.
There is also Touchable, a jQuery plugin that unifies touch and mouse events across platforms.
Multi-touch events
To get your multi-touch web application working in the browser with a multi-touch trackpad (such as an Apple MacBook or MagicPad), I created the MagicTouch.js polyfill, which captures touch events from the trackpad and converts them into standards-compatible touch events.
1. Download the npTuioClient NPAPI plugin and install it into the ~/Library/Internet Plug-Ins/ directory.
2. Download TongSeng, a TUIO application for the Mac MagicPad, and start its server.
3. Download MagicTouch.js, the JavaScript library that simulates spec-compliant touch events based on the npTuioClient callbacks.
4. Include the magictouch.js script and the npTuioClient plugin in your application as follows:
<head>
  ...
  <script src="/path/to/magictouch.js"></script>
</head>
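Once the polyfill dispatches standards-compatible touch events, application code can listen for them in the usual way. Below is a minimal sketch, assuming a hypothetical element with the id "canvas" in your page:

var canvas = document.getElementById("canvas"); // hypothetical element
canvas.addEventListener("touchmove", function (event) {
    event.preventDefault();
    if (event.touches.length >= 2) {
        // distance between the first two fingers, e.g. for a pinch gesture
        var dx = event.touches[0].pageX - event.touches[1].pageX;
        var dy = event.touches[0].pageY - event.touches[1].pageY;
        console.log("finger spread: " + Math.sqrt(dx * dx + dy * dy).toFixed(1) + "px");
    }
}, false);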
I have only tested this approach in Chrome 10, but it should work in other modern browsers with a little tweaking.
If your computer has no multi-touch input, you can use other TUIO trackers, such as reacTIVision, to simulate touch events. For more information, see the TUIO project page.
Note also that your gestures may coincide with OS-level multi-touch gestures. On OS X you can configure the system-wide gestures in the Trackpad pane of System Preferences.
As multi-touch support gradually becomes widely available in mobile browsers, I am excited to see new web applications take full advantage of this rich API.
Original source: html5rocks.com
Original title: Developing for Multi-Touch Web Browsers
1. Touch events on mobile devices
Basic events:
touchstart // fired when a finger first touches the screen
touchmove // fired while a finger moves across the screen
touchend // fired when a finger leaves the screen
Less commonly used: touchcancel // fired when the touch is cancelled by the system
Each event carries the following touch lists (for touchend, targetTouches is of course empty); a short logging sketch follows the list:
touches // all fingers currently on the screen
targetTouches // all fingers currently on the element
changedTouches // the fingers involved in the current event
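A small sketch for inspecting the three lists (the element id is a placeholder):

var box = document.getElementById("box"); // hypothetical element
function logTouchLists(event) {
    console.log(event.type,
        "touches:", event.touches.length,
        "targetTouches:", event.targetTouches.length,   // 0 for touchend, as noted above
        "changedTouches:", event.changedTouches.length);
}
["touchstart", "touchmove", "touchend", "touchcancel"].forEach(function (name) {
    box.addEventListener(name, logTouchLists, false);
});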
Each touch in these lists has the following properties (the page coordinates pageX and pageY are the most commonly used):
pageX // X coordinate relative to the page
pageY // Y coordinate relative to the page
clientX // X coordinate relative to the viewport
clientY // Y coordinate relative to the viewport
screenX // X coordinate relative to the screen
screenY // Y coordinate relative to the screen
identifier // unique number of the current touch point
target // the DOM element the finger touched
Other related points (a small binding sketch follows):
event.preventDefault() // prevents browser zooming and scroll-bar scrolling while touching
var supportTouch = "createTouch" in document; // detects whether touch events are supported
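A minimal sketch of how that feature test might be used to bind touch handlers with a mouse fallback; the element id and handler bodies are placeholders:

var box = document.getElementById("box");  // hypothetical element
function onStart(e) { /* begin tracking */ }
function onMove(e)  { /* update position */ }
function onEnd(e)   { /* finish the gesture */ }
if (supportTouch) {                        // the feature flag defined above
    box.addEventListener("touchstart", onStart, false);
    box.addEventListener("touchmove", onMove, false);
    box.addEventListener("touchend", onEnd, false);
} else {
    box.addEventListener("mousedown", onStart, false);
    box.addEventListener("mousemove", onMove, false);
    box.addEventListener("mouseup", onEnd, false);
}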
2. Example
The following code detects the different kinds of swipes. It builds on earlier work and is packaged so that you can reuse it:
// Swipes within a 5x5 px box are treated as taps; "s" stands for start, "e" for end.
var touchFunc = function (obj, type, func) {
    var init = {x: 5, y: 5, sx: 0, sy: 0, ex: 0, ey: 0};
    var sTime = 0, eTime = 0;
    type = type.toLowerCase();
    obj.addEventListener("touchstart", function (event) {
        sTime = new Date().getTime();
        init.sx = event.targetTouches[0].pageX;
        init.sy = event.targetTouches[0].pageY;
        init.ex = init.sx;
        init.ey = init.sy;
        if (type.indexOf("start") != -1) func();
    }, false);
    obj.addEventListener("touchmove", function (event) {
        event.preventDefault(); // prevent browser zooming and scroll-bar scrolling while touching
        init.ex = event.targetTouches[0].pageX;
        init.ey = event.targetTouches[0].pageY;
        if (type.indexOf("move") != -1) func();
    }, false);
    obj.addEventListener("touchend", function (event) {
        var changeX = init.sx - init.ex;
        var changeY = init.sy - init.ey;
        if (Math.abs(changeX) > Math.abs(changeY) && Math.abs(changeY) > init.y) { // left/right swipe
            if (changeX > 0) {
                if (type.indexOf("left") != -1) func();
            } else {
                if (type.indexOf("right") != -1) func();
            }
        } else if (Math.abs(changeY) > Math.abs(changeX) && Math.abs(changeX) > init.x) { // up/down swipe
            if (changeY > 0) {
                if (type.indexOf("top") != -1) func();
            } else {
                if (type.indexOf("down") != -1) func();
            }
        } else if (Math.abs(changeX) < init.x && Math.abs(changeY) < init.y) {
            eTime = new Date().getTime();
            // tap: distinguish a long press from a click by the elapsed time
            if ((eTime - sTime) > 300) { // long-press threshold in ms (assumed value; the original number was lost)
                if (type.indexOf("long") != -1) func(); // long press
            } else {
                if (type.indexOf("click") != -1) func(); // click
            }
        }
        if (type.indexOf("end") != -1) func();
    }, false);
};
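Here is how the helper above might be called; the element id and callbacks are illustrative:

var panel = document.getElementById("panel"); // hypothetical element
touchFunc(panel, "left",  function () { console.log("swiped left");  });
touchFunc(panel, "click", function () { console.log("tapped");       });
touchFunc(panel, "long",  function () { console.log("long pressed"); });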
Reprinted article: JS touch events on mobile touch screens
Handling touch events lets you track each of the user's fingers. You can bind the following four touch events:
1. touchstart: fired when a finger is placed on the screen
2. touchmove: fired while a finger moves across the screen
3. touchend: fired when a finger is lifted off the screen
4. touchcancel: fired when the system cancels the touch event; exactly when the system does this is unclear
Properties
1. clientX/clientY: the position of the touch point relative to the browser viewport
2. pageX/pageY: the position of the touch point relative to the page
3. screenX/screenY: the position of the touch point relative to the screen
4. identifier: the unique ID of the Touch object
touchstart event
function touchStartFunc(e) {
    // e.preventDefault(); // prevents browser zooming, scroll-bar scrolling, etc. while touching
    var touch = e.touches[0];    // get the first touch point
    var x = Number(touch.pageX); // X coordinate of the touch on the page
    var y = Number(touch.pageY); // Y coordinate of the touch on the page
    // record the initial touch position (startX and startY are globals)
    startX = x;
    startY = y;
}
touchmove event
function touchMoveFunc(e) {
    // e.preventDefault(); // prevents browser zooming, scroll-bar scrolling, etc. while touching
    var touch = e.touches[0];    // get the first touch point
    var x = Number(touch.pageX); // X coordinate of the touch on the page
    var y = Number(touch.pageY); // Y coordinate of the touch on the page
    var text = 'touchmove fired at: (' + x + ', ' + y + ')';
    // determine the swipe direction
    if (x - startX != 0) {
        // horizontal swipe
    }
    if (y - startY != 0) {
        // vertical swipe
    }
}
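The two handlers above only record and read coordinates. Below is a sketch of how they might be wired to an element, with a touchend handler that finishes the direction check; the element id, the 5px threshold, and the startX/startY declarations are assumptions:

var startX = 0, startY = 0;                     // the globals used by touchStartFunc above

function touchEndFunc(e) {
    var touch = e.changedTouches[0];            // on touchend the lifted finger is only in changedTouches
    var deltaX = Number(touch.pageX) - startX;
    var deltaY = Number(touch.pageY) - startY;
    if (Math.abs(deltaX) > Math.abs(deltaY) && Math.abs(deltaX) > 5) {        // 5px threshold is assumed
        console.log(deltaX < 0 ? "swiped left" : "swiped right");
    } else if (Math.abs(deltaY) > Math.abs(deltaX) && Math.abs(deltaY) > 5) {
        console.log(deltaY < 0 ? "swiped up" : "swiped down");
    }
}

var slider = document.getElementById("slider"); // hypothetical element
slider.addEventListener("touchstart", touchStartFunc, false);
slider.addEventListener("touchmove", touchMoveFunc, false);
slider.addEventListener("touchend", touchEndFunc, false);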
Second reprinted article: Mobile Web Front-End Development Series: Event Handling (II)
The previous article covered the basics; this one focuses on touch events, which fire when a finger touches the screen, moves across it, or leaves it. A touch interaction is a collection of touches that begins when the first finger is placed on the screen and ends when the last finger leaves it; every touch from start to finish is recorded on the same event.
Touch Events
Touch events come in two kinds, single touch and multi-touch. Single touch is supported by most higher-end devices, while Safari 2.0+ and Android 3.0+ support multi-touch: up to 5 fingers can touch the screen at the same time, and the iPad supports up to 11 simultaneous fingers. We can capture these events with the following event model:
ontouchstart, ontouchmove, ontouchend, ontouchcancel
ontouchstart fires when the user presses a finger on the screen, ontouchmove fires while one or more fingers move, and ontouchend fires when the user lifts a finger. So when does ontouchcancel fire? When a higher-priority event occurs, such as an alert, an incoming call, or a push notification, the current touch is cancelled and ontouchcancel fires. If you are building a web game, ontouchcancel matters: you can pause the game or save its state when it fires.
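A minimal sketch of that idea; pauseGame() and saveGameState() are hypothetical functions in your own game code:

function pauseGame()     { /* stop the game loop */ }
function saveGameState() { /* persist progress, e.g. to localStorage */ }

document.addEventListener("touchcancel", function () {
    pauseGame();      // an alert, incoming call, or notification has interrupted the touch
    saveGameState();
}, false);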
Gesture Events
Gesture events work much like touch events, except that a gesture event fires only when at least two fingers are on the screen, so again Safari 2.0+ and Android 3.0+ support them. Gestures are useful because they let us measure two-finger scaling and rotation (a small pinch-zoom sketch follows the event list). The event model is as follows:
ongesturestart, ongesturechange, ongestureend
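A small sketch of two-finger scaling and rotation with these events; the element id is a placeholder, and -webkit-transform is used since these are WebKit-proprietary events:

var photo = document.getElementById("photo"); // hypothetical element
photo.addEventListener("gesturechange", function (event) {
    event.preventDefault();
    // scale starts at 1 and rotation at 0 degrees (see the property descriptions below)
    photo.style.webkitTransform = "scale(" + event.scale + ") rotate(" + event.rotation + "deg)";
}, false);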
Event Properties
Whether you use touch events or gesture events, you need to break them down into individual touches to make use of them. To do that, you access a set of properties on the event object:
touches: all touches currently on the page
targetTouches: all touches currently on the target element
changedTouches: the touches involved in the current event
changedTouches, targetTouches, and touches each contain a slightly different list of touches. targetTouches and touches list the fingers currently on the screen, while changedTouches lists only the touches that just changed. That distinction matters when you handle the touchend or gestureend events: by then the finger is no longer on the screen, so targetTouches and touches will be empty, but you can still look at the changedTouches array to find out what happened last.
Because the touch properties are array-like, you can use JavaScript array syntax to access them: event.touches[0] returns the first touch, and event.touches.length gives the number of touches currently tracked.
To look at an individual touch, use for example event.targetTouches[0]; the other touches can be accessed the same way, and each touch carries some specific information.
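For example, a touchmove handler can walk the touches array, while a touchend handler must fall back to changedTouches, as described above:

document.addEventListener("touchmove", function (event) {
    for (var i = 0; i < event.touches.length; i++) {
        var t = event.touches[i];
        console.log("finger " + t.identifier + " at (" + t.pageX + ", " + t.pageY + ")");
    }
}, false);

document.addEventListener("touchend", function (event) {
    var last = event.changedTouches[0]; // touches and targetTouches are empty by now
    console.log("finger " + last.identifier + " lifted at (" + last.pageX + ", " + last.pageY + ")");
}, false);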
clientX, clientY: the X/Y position relative to the viewport
pageX, pageY: the X/Y position relative to the whole page
screenX, screenY: the X/Y position relative to the user's screen
identifier: the unique identifier of the touch
target: the target element on which the touch occurred
The event object of a gesture event has two properties beyond those of an ordinary touch event: rotation, the rotation angle of the fingers, and scale, the pinch/zoom factor.
Third reprinted article: JavaScript Touch and Gesture Events
Safari on iOS added a number of proprietary events in order to convey special information to developers. Because iOS devices have neither a mouse nor a keyboard, regular mouse and keyboard events are not enough when developing interactive web pages for Mobile Safari. With the addition of WebKit on Android, many of these proprietary events have become de facto standards.
1. Touch Events
The iPhone 3G, which shipped with iOS 2.0, included a new version of the Safari browser. This new Mobile Safari exposed several new events related to touch interaction, and Android browsers later implemented the same events. Touch events fire when a finger is placed on the screen, slides across it, or is lifted off it. Specifically, there are the following touch events.
touchstart: fires when a finger touches the screen, even if another finger is already on it.
touchmove: fires continuously while a finger slides across the screen. Calling preventDefault() during this event prevents scrolling.
touchend: fires when a finger is lifted off the screen.
touchcancel: fires when the system stops tracking the touch. Exactly when this happens is not specified in the documentation.
All of these events bubble and can be cancelled. Although touch events are not defined in any DOM specification, they are implemented in a DOM-compatible way, so the event object for each touch event provides the properties common to mouse events: bubbles, cancelable, view, clientX, clientY, screenX, screenY, detail, altKey, shiftKey, ctrlKey, and metaKey.
In addition to the common DOM properties, touch events contain the following three properties for tracking touches.
touches: an array of Touch objects representing the touches currently being tracked.
targetTouches: an array of Touch objects specific to the event target.
changedTouches: an array of Touch objects representing what has changed since the last touch.
Each Touch object contains the following properties.
clientX: the x-coordinate of the touch target in the viewport.
clientY: the y-coordinate of the touch target in the viewport.
identifier: a unique ID identifying the touch.
pageX: the x-coordinate of the touch target on the page.
pageY: the y-coordinate of the touch target on the page.
screenX: the x-coordinate of the touch target on the screen.
screenY: the y-coordinate of the touch target on the screen.
target: the DOM node that is the target of the touch.
Use these properties to track user touch actions on the screen. Look at the example below.
function handleTouchEvent(event) {
    // only track a single touch
    if (event.touches.length == 1) {
        var output = document.getElementById("output");
        switch (event.type) {
            case "touchstart":
                output.innerHTML = "Touch started (" + event.touches[0].clientX + "," + event.touches[0].clientY + ")";
                break;
            case "touchend":
                output.innerHTML += "<br>Touch ended (" + event.changedTouches[0].clientX + "," + event.changedTouches[0].clientY + ")";
                break;
            case "touchmove":
                event.preventDefault(); // prevent scrolling
                output.innerHTML += "<br>Touch moved (" + event.changedTouches[0].clientX + "," + event.changedTouches[0].clientY + ")";
                break;
        }
    }
}
document.addEventListener("touchstart", handleTouchEvent, false);
document.addEventListener("touchend", handleTouchEvent, false);
document.addEventListener("touchmove", handleTouchEvent, false);
The code above tracks a single touch on the screen. For simplicity, information is only output while exactly one touch is active. When the touchstart event fires, the touch location is written into the output element. When the touchmove event fires, its default behavior is cancelled to prevent scrolling (the default behavior of touch moves is to scroll the page), and the change in the touch position is output. The touchend event then outputs the final information about the touch. Note that when touchend fires there are no Touch objects in the touches collection, because there is no longer an active touch; you must use the changedTouches collection instead.
These events fire on every element of the document, so you can manipulate different parts of the page separately. When you touch an element on the screen, the following events occur, in order (a small logging sketch follows the list):
touchstart
mouseover
mousemove
mousedown
mouseup
click
touchend
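A quick sketch for logging that order on a device; the element id is a placeholder:

var button = document.getElementById("button"); // hypothetical element
["touchstart", "mouseover", "mousemove", "mousedown", "mouseup", "click", "touchend"]
    .forEach(function (name) {
        button.addEventListener(name, function (event) {
            console.log(event.type);
        }, false);
    });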
Browsers that support touch events include Safari on iOS, WebKit on Android, Dolfin for bada, BlackBerry WebKit in OS 6+, Opera Mobile 10.1, and the Phantom browser for LG's proprietary OS. Currently only Safari on iOS supports a multi-touch screen. The desktop versions of Firefox 6+ and Chrome also support touch events.
2. Gesture Events
Safari in iOS 2.0 also introduced a set of gesture events. A gesture is produced when two fingers touch the screen; gestures typically change the size of the displayed item or rotate it. There are three gesture events, as follows.
gesturestart: fires when one finger is already on the screen and another finger touches it.
gesturechange: fires when the position of either finger on the screen changes.
gestureend: fires when either finger is lifted off the screen.
These events fire only when both fingers are touching the element that receives the event. Setting an event handler on an element means that both fingers must be within the bounds of that element at the same time for the gesture event to fire (that element is the target). Because these events bubble, you can also place the handlers on the document to handle all gesture events; in that case, the event's target is the element within whose bounds both fingers lie.
There is a relationship between touch events and gesture events. When a finger is placed on the screen, the touchstart event fires. If another finger is then placed on the screen, the gesturestart event fires first, followed by the touchstart event for that finger. If one or both fingers slide across the screen, a gesturechange event fires. As soon as one finger is lifted, the gestureend event fires, followed by the touchend event for that finger.
As with touch events, the event object of each gesture event contains the standard mouse event properties: bubbles, cancelable, view, clientX, clientY, screenX, screenY, detail, altKey, shiftKey, ctrlKey, and metaKey. In addition there are two extra properties: rotation and scale. The rotation property gives the angle of rotation caused by the fingers, where a negative value means counter-clockwise and a positive value means clockwise (it starts at 0). The scale property indicates the change in the distance between the two fingers (for example, pinching inward shortens the distance); it starts at 1 and grows as the distance increases or shrinks as the distance decreases.
Here is an example of using gesture events:
function handleGestureEvent(event) {
    var output = document.getElementById("output");
    switch (event.type) {
        case "gesturestart":
            output.innerHTML = "Gesture started (rotation=" + event.rotation + ", scale=" + event.scale + ")";
            break;
        case "gestureend":
            output.innerHTML += "<br>Gesture ended (rotation=" + event.rotation + ", scale=" + event.scale + ")";
            break;
        case "gesturechange":
            output.innerHTML += "<br>Gesture changed (rotation=" + event.rotation + ", scale=" + event.scale + ")";
            break;
    }
}
document.addEventListener("gesturestart", handleGestureEvent, false);
document.addEventListener("gestureend", handleGestureEvent, false);
document.addEventListener("gesturechange", handleGestureEvent, false);
As in the earlier touch event example, this code simply routes each event to the same function, which then outputs information about that event.
That is all for this article. I hope it helps with your learning, and thank you for supporting the Yunqi community.