Brief introduction to Android touch screen gesture recognition

Source: Internet
Author: User
Tags: event listener

Gesture operations such as fling and scroll can greatly improve the user experience of a touch-screen application: think of scrolling a page with a scroll gesture in the browser, or flinging to turn pages in a reader. On Android, gesture recognition is done through the GestureDetector.OnGestureListener interface, but I (William) rummaged through the official Android documentation without finding a relevant example; the TouchPaint sample in the API demos only covers handling of the onTouch event and does not involve gestures. Many people in the android-developers discussion group have run into the same problem I did, so combining the approaches mentioned there with my own experiments, here is a brief look at how gesture recognition is implemented on Android.

Let's start by defining some concepts. First, Android's event-handling mechanism is based on listeners; the touch-screen events we are discussing today are handled through an OnTouchListener. Second, any View subclass can attach a listener for a class of events via methods such as setOnTouchListener() and setOnKeyListener(). Third, a listener is generally provided as an interface containing one or more abstract methods, and we implement these methods, such as onTouch() and onKey(), to define the response. Thus, once we set an event listener on a View and implement its abstract methods, the program can respond through these callback functions whenever the corresponding event is dispatched to that View.
Let's look at a simple example, using the simplest possible TextView to illustrate (in fact, it is practically identical to the skeleton generated by ADT).

Java code

public class GestureTest extends Activity implements OnTouchListener {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        // init TextView
        TextView tv = (TextView) findViewById(R.id.page);
        // set OnTouchListener on the TextView
        tv.setOnTouchListener(this);
        // show some text
        tv.setText(R.string.text);
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        Toast.makeText(this, "onTouch", Toast.LENGTH_SHORT).show();
        return false;
    }
}


We set an OnTouchListener on the TextView instance tv; since the GestureTest class itself implements the OnTouchListener interface, we simply pass this as the argument. The onTouch() method implements the abstract method of OnTouchListener; we only need to add our logic code there to respond when the user touches the screen, as we do here by popping up a message.

Here, we can get the type of touch event through MotionEvent's getAction() method; the types include ACTION_DOWN, ACTION_MOVE, ACTION_UP, and ACTION_CANCEL. ACTION_DOWN means the finger has touched the screen; ACTION_MOVE means the finger is moving across the screen after touching it; ACTION_UP means the finger has been lifted from the screen; ACTION_CANCEL is not triggered directly by the user (so it is outside the scope of today's discussion; please refer to ViewGroup.onInterceptTouchEvent(MotionEvent)). Combined with the coordinates obtained from methods such as getRawX(), getRawY(), getX(), and getY(), we can distinguish the different user actions and implement features such as dragging a button or dragging a scrollbar. Readers can consult the documentation of the MotionEvent class, and also look at the TouchPaint example.
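Since the real MotionEvent class only exists on a device, here is a plain-Java sketch of the getAction() dispatch described above. The integer values below are the actual ones defined by MotionEvent (ACTION_DOWN = 0, ACTION_UP = 1, ACTION_MOVE = 2, ACTION_CANCEL = 3); the describe() helper and its strings are illustrative assumptions, not part of any Android API.

```java
// Plain-Java stand-in for MotionEvent action handling, so the dispatch
// can be shown without an emulator. The constants match the real values
// defined by android.view.MotionEvent.
public class ActionDemo {
    static final int ACTION_DOWN = 0, ACTION_UP = 1,
                     ACTION_MOVE = 2, ACTION_CANCEL = 3;

    // Mirrors the switch an onTouch() implementation would perform
    // on event.getAction().
    static String describe(int action) {
        switch (action) {
            case ACTION_DOWN:   return "finger touched the screen";
            case ACTION_MOVE:   return "finger moved while touching";
            case ACTION_UP:     return "finger left the screen";
            case ACTION_CANCEL: return "gesture intercepted by a parent view";
            default:            return "other action";
        }
    }

    public static void main(String[] args) {
        System.out.println(describe(ACTION_DOWN));
    }
}
```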
Back to today's topic: once we have captured the touch events, how do we recognize the user's gesture? This is where the GestureDetector.OnGestureListener interface comes in, so our GestureTest class becomes:

Java code

public class GestureTest extends Activity implements OnTouchListener,
        OnGestureListener {
    ....
}


Then, in the onTouch() method, we call the GestureDetector's onTouchEvent() method to hand the captured MotionEvent over to the GestureDetector, which analyzes it and invokes the appropriate callback to handle the user's gesture.

Java code

@Override
public boolean onTouch(View v, MotionEvent event) {
    // the OnGestureListener will analyze the given motion event
    return mGestureDetector.onTouchEvent(event);
}
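The snippet above delegates to a GestureDetector field that the article never shows being created. A minimal sketch of that missing piece, assuming the old GestureDetector(OnGestureListener) constructor of the API level this article targets, with the Activity itself acting as the listener:

```java
// Sketch (assumption): declaring and creating the GestureDetector that
// onTouch() delegates to. The field name mGestureDetector is the one the
// article's code uses; 'this' works because GestureTest implements
// OnGestureListener.
private GestureDetector mGestureDetector;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // ... setContentView() and listener setup as shown earlier ...
    mGestureDetector = new GestureDetector(this);
}
```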


Next, we implement the following six abstract methods; the most useful are of course onFling(), onScroll(), and onLongPress(). I have noted the meaning of the gesture each method represents in the comments, so you can see it at a glance.
The user touches the screen: triggered by one MotionEvent ACTION_DOWN.

Java code

@Override
public boolean onDown(MotionEvent e) {
    Toast.makeText(this, "onDown", Toast.LENGTH_SHORT).show();
    return false;
}

// the user has touched the screen but has not yet released or dragged;
// unlike onDown(), this emphasizes the not-yet-released, not-yet-dragged state
@Override
public void onShowPress(MotionEvent e) {
}


The user releases after touching the screen: triggered by one MotionEvent ACTION_UP.

Java code

@Override
public boolean onSingleTapUp(MotionEvent e) {
    return false;
}


The user presses the screen, moves quickly, and releases: triggered by one MotionEvent ACTION_DOWN, multiple ACTION_MOVEs, and one ACTION_UP.

Java code

@Override
public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX,
        float velocityY) {
    return false;
}

// the user presses and holds the screen: triggered by one MotionEvent
// ACTION_DOWN held beyond the long-press timeout
@Override
public void onLongPress(MotionEvent e) {
}

// the user presses the screen and drags: triggered by one MotionEvent
// ACTION_DOWN followed by multiple ACTION_MOVEs
@Override
public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX,
        float distanceY) {
    return false;
}


Let's try implementing an onFling() handler. The meaning of each parameter of onFling() is noted in the comments. Note that besides the coordinates contained in the ACTION_DOWN that started the fling and the last ACTION_MOVE, the handling code can also use the user's movement speed along the x-axis or y-axis as a condition. For example, in the code below, we only react when the user has moved more than 100 pixels horizontally, at a speed above 200 pixels per second along the x-axis.

Java code

@Override
public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX,
        float velocityY) {
    // parameters:
    //   e1: the first ACTION_DOWN MotionEvent
    //   e2: the last ACTION_MOVE MotionEvent
    //   velocityX: movement speed along the x-axis, in pixels per second
    //   velocityY: movement speed along the y-axis, in pixels per second
    // trigger condition: the x-axis displacement is greater than
    // FLING_MIN_DISTANCE and the speed exceeds FLING_MIN_VELOCITY px/s
    if (e1.getX() - e2.getX() > FLING_MIN_DISTANCE
            && Math.abs(velocityX) > FLING_MIN_VELOCITY) {
        // fling left
        Toast.makeText(this, "Fling left", Toast.LENGTH_SHORT).show();
    } else if (e2.getX() - e1.getX() > FLING_MIN_DISTANCE
            && Math.abs(velocityX) > FLING_MIN_VELOCITY) {
        // fling right
        Toast.makeText(this, "Fling right", Toast.LENGTH_SHORT).show();
    }
    return false;
}
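The direction and threshold logic above is easy to check outside the emulator. Below is a plain-Java sketch that extracts just that logic into a standalone helper; the class and method names are illustrative assumptions, while the 100 px / 200 px/s constants mirror the values used in the article's example.

```java
// Standalone version of the fling threshold logic, so it can be tested
// without the Android framework. FLING_MIN_DISTANCE / FLING_MIN_VELOCITY
// use the 100 px and 200 px/s values discussed in the article.
public class FlingClassifier {
    static final float FLING_MIN_DISTANCE = 100f;  // pixels
    static final float FLING_MIN_VELOCITY = 200f;  // pixels per second

    // Returns "left", "right", or "none" for a horizontal fling,
    // given the down x, the final x, and the x-axis velocity.
    static String classify(float downX, float upX, float velocityX) {
        if (downX - upX > FLING_MIN_DISTANCE
                && Math.abs(velocityX) > FLING_MIN_VELOCITY) {
            return "left";   // finger moved from right to left
        } else if (upX - downX > FLING_MIN_DISTANCE
                && Math.abs(velocityX) > FLING_MIN_VELOCITY) {
            return "right";  // finger moved from left to right
        }
        return "none";       // too short or too slow to count as a fling
    }

    public static void main(String[] args) {
        System.out.println(classify(400f, 150f, -350f));
    }
}
```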


The problem is that if we run the program at this point, we will find that it does not behave as expected at all; tracing the execution shows that the onFling() event is never captured. This was the first question that puzzled me.
I found the answer in a gesture-detection thread in the discussion group: after calling tv.setOnTouchListener() in onCreate(), we also need to add the following line of code:

tv.setLongClickable(true);

Only then can the view handle a hold (i.e. ACTION_MOVE, or multiple ACTION_DOWNs) as distinct from a tap. We can also achieve this through the android:longClickable attribute in the layout definition.
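For reference, a sketch of the layout-XML alternative just mentioned: android:longClickable is the real attribute name, while the id and dimensions below are assumptions chosen to match the code earlier in the article.

```xml
<!-- Layout-XML equivalent of tv.setLongClickable(true) -->
<TextView
    android:id="@+id/page"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:longClickable="true" />
```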
This problem is quite similar to the one I ran into earlier with MapView and setOnKeyListener(): it comes from not fully understanding the SDK, and the best we can do is remember each case as we encounter it. That said, Google really does need to strengthen the documentation; at the very least, OnGestureListener should explain which conditions must be met for a gesture to be recognized correctly.
That concludes this brief introduction to Android touch-screen gesture recognition; I hope it is useful. To see it in action, you can download the demo source code and try it out.
