A Brief Introduction to Android Touch Screen Gesture Recognition


Most of the time, touch screen gestures such as fling and scroll greatly improve an application's user experience: for example, scrolling a page in the browser with the scroll gesture, or flipping pages in a reader with fling. In Android, gesture recognition is implemented through the GestureDetector.OnGestureListener interface, but after reading the official Android documentation I could not find a relevant explanation. In the API Demos, TouchPaint only covers onTouch event handling and does not involve gestures, and many people in the Android developer discussion groups have run into the same issue. Combining the methods they mentioned with my own experiments, here is a brief introduction to implementing gesture recognition in Android.

First, let's clarify a few concepts. First, Android's event handling mechanism is listener-based; touch screen events, for instance, are handled through OnTouchListener. Second, every View subclass can attach a listener for a given class of events via methods such as setOnTouchListener() and setOnKeyListener(). Third, a listener is usually defined as an interface containing one or more abstract methods, such as onTouch() and onKey(), which we implement with our own logic. Once we have set an event listener on a View and implemented its abstract methods, the framework uses these callback functions to deliver each event dispatched to that View, and our code can respond to it.

Let's look at a simple example, using the simplest TextView to illustrate (in fact, it is hardly different from the skeleton generated by the ADT).

Java code

```java
public class GestureTest extends Activity implements OnTouchListener {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        // Init TextView
        TextView tv = (TextView) findViewById(R.id.page);
        // Set OnTouchListener on TextView
        tv.setOnTouchListener(this);
        // Show some text
        tv.setText(R.string.text);
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        Toast.makeText(this, "OnTouch", Toast.LENGTH_SHORT).show();
        return false;
    }
}
```

We set an OnTouchListener on the TextView instance tv. Because the GestureTest class implements the OnTouchListener interface, we can simply pass this as the argument. The onTouch() method implements the abstract method declared in OnTouchListener; we only need to add our logic there to respond when the user touches the screen, as we do here by showing a toast.

Here we can use MotionEvent's getAction() method to obtain the type of the touch event: ACTION_DOWN, ACTION_MOVE, ACTION_UP, or ACTION_CANCEL. ACTION_DOWN means pressing the touch screen, ACTION_MOVE means moving the pointer across the screen while pressed, and ACTION_UP means releasing the touch screen. ACTION_CANCEL is not triggered directly by the user (so it is outside the scope of today's discussion; see ViewGroup.onInterceptTouchEvent(MotionEvent)). With the coordinates obtained from getRawX(), getRawY(), getX(), and getY(), we can implement features such as dragging a button or dragging a scroll bar. For details, see the MotionEvent class documentation; you can also take a look at the TouchPaint example.
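As a rough illustration of working with getAction(), here is a plain-Java sketch. It mirrors the documented integer values of these MotionEvent constants (ACTION_DOWN = 0, ACTION_UP = 1, ACTION_MOVE = 2, ACTION_CANCEL = 3) so that it can run outside the Android SDK; a real handler would of course compare event.getAction() against MotionEvent's own constants rather than redefining them.

```java
// Illustrative only: these constants mirror the documented values in
// android.view.MotionEvent so the sketch runs without the Android SDK.
class ActionNames {
    static final int ACTION_DOWN = 0;
    static final int ACTION_UP = 1;
    static final int ACTION_MOVE = 2;
    static final int ACTION_CANCEL = 3;

    // Map the int returned by getAction() to a readable name.
    static String name(int action) {
        switch (action) {
            case ACTION_DOWN:   return "ACTION_DOWN";
            case ACTION_UP:     return "ACTION_UP";
            case ACTION_MOVE:   return "ACTION_MOVE";
            case ACTION_CANCEL: return "ACTION_CANCEL";
            default:            return "UNKNOWN";
        }
    }

    public static void main(String[] args) {
        // A typical drag produces one DOWN, several MOVEs, then one UP.
        System.out.println(name(ACTION_DOWN)); // prints ACTION_DOWN
    }
}
```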

Back to today's topic: how do we recognize the user's gestures once we have captured the touch events? Here we need the help of the GestureDetector.OnGestureListener interface, so our GestureTest class becomes this.

Java code

```java
public class GestureTest extends Activity implements OnTouchListener,
        OnGestureListener {
    ...
```

Then, in the onTouch() method, we call the GestureDetector's onTouchEvent() method, handing the captured MotionEvent over to the GestureDetector so it can determine whether an appropriate callback should handle the user's gesture.

Java code

```java
@Override
public boolean onTouch(View v, MotionEvent event) {
    // The OnGestureListener will analyze the given motion event
    return mGestureDetector.onTouchEvent(event);
}
```

Next, we implement the six abstract methods, among them onFling(), onScroll(), and onLongPress(). The meaning of the gesture each method represents is written in the comments, so you can see it at a glance.

// The user touches the touch screen; triggered by one MotionEvent ACTION_DOWN

Java code

```java
@Override
public boolean onDown(MotionEvent e) {
    // TODO Auto-generated method stub
    Toast.makeText(this, "onDown", Toast.LENGTH_SHORT).show();
    return false;
}

// The user touches the touch screen and has not yet released or dragged;
// triggered by one MotionEvent ACTION_DOWN.
// Note the difference from onDown(): this one emphasizes that no release
// or drag has happened yet.
@Override
public void onShowPress(MotionEvent e) {
    // TODO Auto-generated method stub
}
```

// The user releases after touching the touch screen; triggered by one MotionEvent ACTION_UP

Java code

```java
@Override
public boolean onSingleTapUp(MotionEvent e) {
    // TODO Auto-generated method stub
    return false;
}
```

// The user presses the touch screen, moves quickly, and releases; triggered by one MotionEvent ACTION_DOWN, multiple ACTION_MOVEs, and one ACTION_UP

Java code

```java
@Override
public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX,
        float velocityY) {
    // TODO Auto-generated method stub
    return false;
}

// The user presses and holds the touch screen; triggered by a MotionEvent
// ACTION_DOWN held past the long-press timeout
@Override
public void onLongPress(MotionEvent e) {
    // TODO Auto-generated method stub
}

// The user presses the touch screen and drags; triggered by one MotionEvent
// ACTION_DOWN and multiple ACTION_MOVEs
@Override
public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX,
        float distanceY) {
    // TODO Auto-generated method stub
    return false;
}
```
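As a side note, here is a self-contained sketch of one idea behind how a detector can tell a tap from a scroll: compare how far the pointer moved between press and release against a small threshold. The 16-pixel TOUCH_SLOP value below is made up for illustration; on a real device the framework's slop comes from ViewConfiguration.getScaledTouchSlop().

```java
// Illustrative sketch only, not the real GestureDetector logic.
class TapOrScroll {
    static final float TOUCH_SLOP = 16f; // hypothetical threshold, in pixels

    // Classify a gesture by the distance moved between the ACTION_DOWN
    // coordinates (downX, downY) and the ACTION_UP coordinates (upX, upY).
    static String classify(float downX, float downY, float upX, float upY) {
        float dx = upX - downX;
        float dy = upY - downY;
        double distance = Math.sqrt(dx * dx + dy * dy);
        return distance < TOUCH_SLOP ? "tap" : "scroll";
    }

    public static void main(String[] args) {
        System.out.println(classify(0f, 0f, 2f, 2f));   // prints tap
        System.out.println(classify(0f, 0f, 100f, 0f)); // prints scroll
    }
}
```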

Let's try handling an onFling() event. The meaning of each onFling() parameter is written in the comments. Note that when handling a fling, besides the coordinates of the ACTION_DOWN that starts the fling and of the last ACTION_MOVE, we can also use the movement velocity as a condition. For example, in the code below we only handle flings where the user moves more than 100 pixels along the X axis at more than 200 pixels per second.

Java code

```java
@Override
public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX,
        float velocityY) {
    // Parameters:
    // e1: the first ACTION_DOWN MotionEvent
    // e2: the last ACTION_MOVE MotionEvent
    // velocityX: movement velocity along the X axis, in pixels per second
    // velocityY: movement velocity along the Y axis, in pixels per second

    // Trigger condition: the displacement along the X axis is greater than
    // FLING_MIN_DISTANCE, and the velocity is greater than
    // FLING_MIN_VELOCITY pixels per second.
    if (e1.getX() - e2.getX() > FLING_MIN_DISTANCE
            && Math.abs(velocityX) > FLING_MIN_VELOCITY) {
        // Fling left
        Toast.makeText(this, "Fling left", Toast.LENGTH_SHORT).show();
    } else if (e2.getX() - e1.getX() > FLING_MIN_DISTANCE
            && Math.abs(velocityX) > FLING_MIN_VELOCITY) {
        // Fling right
        Toast.makeText(this, "Fling right", Toast.LENGTH_SHORT).show();
    }
    return false;
}
```
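To make the threshold test easy to check off-device, here is the same condition factored into a plain function. The FLING_MIN_DISTANCE and FLING_MIN_VELOCITY constants are set to the article's example values (100 pixels and 200 pixels per second); in a real app you would tune them yourself.

```java
// The onFling() threshold logic as a testable, Android-free function.
class FlingClassifier {
    static final float FLING_MIN_DISTANCE = 100f; // pixels
    static final float FLING_MIN_VELOCITY = 200f; // pixels per second

    // e1x: x of the ACTION_DOWN event, e2x: x of the last ACTION_MOVE,
    // velocityX: horizontal velocity reported by the detector.
    static String classify(float e1x, float e2x, float velocityX) {
        if (e1x - e2x > FLING_MIN_DISTANCE
                && Math.abs(velocityX) > FLING_MIN_VELOCITY) {
            return "fling left";
        } else if (e2x - e1x > FLING_MIN_DISTANCE
                && Math.abs(velocityX) > FLING_MIN_VELOCITY) {
            return "fling right";
        }
        return "none";
    }

    public static void main(String[] args) {
        System.out.println(classify(300f, 50f, -500f)); // prints fling left
        System.out.println(classify(50f, 300f, 500f));  // prints fling right
    }
}
```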

The problem is that if we run the program now, we will not get the desired result at all. Tracing the code execution shows that the onFling() event is never captured. This was the problem that puzzled me at first. Why?

I found the answer in the discussion group's gesture detection thread: we need to add the following line in onCreate(), after tv.setOnTouchListener(this);:

tv.setLongClickable(true);

Only in this way can the View handle a hold (an ACTION_MOVE, or multiple ACTION_DOWNs) differently from a tap. We can also achieve this through android:longClickable in the layout definition.
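For reference, the layout-based alternative might look like the fragment below (the TextView id is assumed to match the earlier example):

```xml
<TextView
    android:id="@+id/page"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:longClickable="true" />
```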

The problem this time is similar to the one I hit earlier with setOnKeyListener() on MapView: my understanding of the SDK was simply not thorough enough. That said, Google really does need to improve the documentation; at the very least, the OnGestureListener docs could state the conditions that must be met for gestures to be recognized.

That concludes this brief introduction to Android touch screen gesture recognition; I hope it is useful. To see the running effect, you can download the demo's source code and try it out.
