A Complete Analysis of Android Event Dispatching: Where Do Events Come From?



Please respect the original author; when reprinting, note the source: AigeStudio (http://blog.csdn.net/aigestudio). Powered by Aige; infringement will be investigated!


In the previous installment of this series we briefly analyzed the origins of the event distribution mechanism. Before going further, note that Android (like any driven system) has many different types of events: keys, trackball, mouse, touch, infrared, and so on. To keep the problem simple and practical, we will only analyze touch events here; the miscellaneous others we won't say much about.

So where does a touch event in Android come from? Anyone with even a little knowledge of event distribution knows dispatchTouchEvent, the starting point of a View's touch-event dispatch. But where does the event passed into dispatchTouchEvent come from? Trace the call chain step by step and it seems endless, with no head to be found... Blindly grinding through the source will only confuse you further; a little logical thinking gets you to the answer faster. We all know that generating an event requires user interaction: the system can only respond after you touch the screen or press a key, and every such operation can be regarded as the "source" of an event. So who are the hunters that capture this most primitive interaction information? Is it a View? An Activity? ViewRootImpl or WMS? These framework components all sit far too "high" relative to the underlying mechanism.

Android is a Linux-based operating system, and Linux itself has a mature Input subsystem architecture. Although Android implements several mechanisms of its own, most of the underlying calls are built on the operation interfaces Linux provides; for example, input drivers are written against the character-device interface of the Linux Input subsystem. If that part of Linux interests you, dig into it on your own. Here you only need to know that in Android, the Linux Input subsystem reads and writes hardware input device nodes under /dev/input/, named event[NUMBER]. These nodes are tied to specific hardware, so the node names differ between devices; on the Meizu MX3, for example, /dev/input/event0 is "mxhub-keys" and /dev/input/event1 is "gp2ap". You can inspect the nodes with the getevent tool that Android provides.
With a device connected to a PC (or an emulator running), run adb shell and then getevent to see the nodes and their real-time event traffic; the list differs per device. On the MX3, for example, getevent lists all input nodes:

Aigestudio>adb shell
shell@mx3:/ $ getevent
add device 1: /dev/input/event0
  name:     "mxhub-keys"
add device 2: /dev/input/event4
  name:     "lsm330dlc_gyr"
add device 3: /dev/input/event3
  name:     "lsm330dlc_acc"
add device 4: /dev/input/event1
  name:     "gp2ap"
could not get driver version for /dev/input/mouse0, Not a typewriter
add device 5: /dev/input/event5
  name:     "mx_ts"
add device 6: /dev/input/event6
  name:     "gpio-keys"
add device 7: /dev/input/event7
  name:     "Headset"
add device 8: /dev/input/event2
  name:     "compass"
could not get driver version for /dev/input/mice, Not a typewriter
As you can see, the MX3 exposes eight input device nodes:

  • event0: "mxhub-keys", the subsystem for the Meizu breathing-light button (the glowing home key below the screen)
  • event4: "lsm330dlc_gyr", the gyroscope subsystem
  • event3: "lsm330dlc_acc", the accelerometer subsystem
  • event1: "gp2ap", the infrared sensor subsystem (the MX3 uses infrared to measure ambient light and proximity)
  • event5: "mx_ts", the touchscreen subsystem
  • event6: "gpio-keys", the physical-button subsystem
  • event7: "Headset", the headset-button subsystem (many phones name the in-line remote subsystem "hook"; Meizu's rarer "Headset" naming suggests it may also cover head-mounted devices)
  • event2: "compass", the compass subsystem

It is the kernel (not the Android framework) that reads and writes device event information through these nodes. The listing above was captured right after I pressed the power key to turn the MX3's screen off. If we press power again to light the screen, the kernel drivers keep reporting events continuously; to stop the terminal from scrolling forever, use getevent's -c parameter to cap the number of records printed:


Here I capped the output at 16 records; the moment the screen lights up, those 16 lines are printed. Without the limit the terminal would keep printing forever... In other words, the accelerometer and infrared-sensor subsystems constantly monitor changes in the environment, which, given what those sensors do, should surprise no one. If we tap the screen right after starting getevent, the subsystem at event5 responds immediately:

The raw output from a quick tap is hard to make sense of, right? Add the -l parameter to getevent to label the output with symbolic names:


Note: your output may differ from mine due to hardware, touch area, pressure, duration, and so on; go by what your device prints, but the shape of the information will be similar.

Take the first line, "/dev/input/event5: EV_ABS ABS_MT_TRACKING_ID 000008e0": /dev/input/event5 is the device node, EV_ABS is the event type, ABS_MT_TRACKING_ID is the event code, and 000008e0 is the event value. All of these are declared in the kernel/include/linux/input.h file. For example, the input event types include:

#define EV_SYN                  0x00
#define EV_KEY                  0x01
#define EV_REL                  0x02
#define EV_ABS                  0x03
#define EV_MSC                  0x04
#define EV_LED                  0x11
#define EV_SND                  0x12
#define EV_REP                  0x14
#define EV_FF                   0x15
#define EV_PWR                  0x16
#define EV_FF_STATUS            0x17
#define EV_MAX                  0x1f
These are plain Linux definitions, nothing mysterious. The common ones: EV_REL reports relative coordinates (a mouse, say), EV_ABS reports absolute coordinates (a touchscreen), EV_KEY reports key events, and EV_SYN marks synchronization points. A device can support several event types, and each type has its own set of event codes. The EV_SYN event codes, for example, are:

#define SYN_REPORT      0
#define SYN_CONFIG      1
#define SYN_MT_REPORT   2
The rest you can look up in input.h yourself; I won't list them all. The feedback from the quick tap shown above can be read as follows:

/dev/input/event5: EV_ABS ABS_MT_TRACKING_ID 000008e0    -- a contact gets a tracking ID: multi-touch tracking begins (device-dependent)
/dev/input/event5: EV_ABS ABS_MT_POSITION_X  00000280    -- X coordinate of the contact center
/dev/input/event5: EV_ABS ABS_MT_POSITION_Y  0000064b    -- Y coordinate of the contact center
/dev/input/event5: EV_ABS ABS_MT_PRESSURE    0000005b    -- finger pressure
/dev/input/event5: EV_ABS ABS_MT_TOUCH_MAJOR 00000014    -- major axis of the contact area
/dev/input/event5: EV_SYN SYN_REPORT         00000000    -- synchronize: one frame of data is complete
/dev/input/event5: EV_ABS ABS_MT_PRESSURE    00000057    -- finger pressure
/dev/input/event5: EV_ABS ABS_MT_TOUCH_MAJOR 00000012    -- major axis of the contact area
/dev/input/event5: EV_SYN SYN_REPORT         00000000    -- synchronize
/dev/input/event5: EV_ABS ABS_MT_TRACKING_ID ffffffff    -- tracking ID -1: the contact is lifted, tracking ends (device-dependent)
/dev/input/event5: EV_SYN SYN_REPORT         00000000    -- synchronize
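Conceptually, a reader of this node reassembles the stream into frames: events accumulate until a SYN_REPORT arrives, which says "one complete snapshot of the touch state ends here". Below is a small Python sketch (illustrative only, not Android code; the event tuples are hand-copied from the tap above) that groups the stream this way:

```python
# Sketch: split a raw input-event stream into frames delimited by SYN_REPORT.
# The tuples below are the events from the quick tap shown in the article.

STREAM = [
    ("EV_ABS", "ABS_MT_TRACKING_ID", 0x08E0),
    ("EV_ABS", "ABS_MT_POSITION_X", 0x0280),
    ("EV_ABS", "ABS_MT_POSITION_Y", 0x064B),
    ("EV_ABS", "ABS_MT_PRESSURE", 0x5B),
    ("EV_ABS", "ABS_MT_TOUCH_MAJOR", 0x14),
    ("EV_SYN", "SYN_REPORT", 0),
    ("EV_ABS", "ABS_MT_PRESSURE", 0x57),
    ("EV_ABS", "ABS_MT_TOUCH_MAJOR", 0x12),
    ("EV_SYN", "SYN_REPORT", 0),
    ("EV_ABS", "ABS_MT_TRACKING_ID", 0xFFFFFFFF),
    ("EV_SYN", "SYN_REPORT", 0),
]

def frames(stream):
    """Yield one list of (code, value) pairs per SYN_REPORT-terminated frame."""
    frame = []
    for ev_type, code, value in stream:
        if ev_type == "EV_SYN" and code == "SYN_REPORT":
            yield frame
            frame = []
        else:
            frame.append((code, value))

result = list(frames(STREAM))
print(len(result))  # 3 frames: touch down, pressure update, finger lifted
```

Note how the ffffffff (i.e. -1) tracking ID in the last frame marks the finger leaving the screen.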
All of that is just what one quick tap produces. For more complex gestures, say the multi-finger slicing of a fruit-cutting game, this little bit of information is nowhere near enough! Fortunately, application-layer developers never have to collect this raw information themselves. In application development we care about clicks, double-taps, and long-presses, not this huge, messy raw stream, so Android converts the raw data into something easier to consume. Of course, if you are doing driver-level development that needs the raw data, you can still read it directly.
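As a toy illustration of this kind of translation, here is a Python sketch (illustrative only, not how Android or getevent is actually implemented) that maps the raw numeric type/code fields back to the symbolic names that getevent -l shows; the constant values are copied from input.h, and only the handful of codes seen in this article are included:

```python
# Sketch: relabel raw getevent fields with names from input.h.
# Only the constants appearing in this article are listed here.

EV_TYPES = {0x00: "EV_SYN", 0x01: "EV_KEY", 0x03: "EV_ABS"}

# Event codes are namespaced by event type.
EV_CODES = {
    "EV_SYN": {0x00: "SYN_REPORT", 0x01: "SYN_CONFIG", 0x02: "SYN_MT_REPORT"},
    "EV_ABS": {
        0x30: "ABS_MT_TOUCH_MAJOR",
        0x35: "ABS_MT_POSITION_X",
        0x36: "ABS_MT_POSITION_Y",
        0x39: "ABS_MT_TRACKING_ID",
        0x3A: "ABS_MT_PRESSURE",
    },
}

def label(raw_line: str) -> str:
    """Turn '0003 0039 000008e0' into 'EV_ABS ABS_MT_TRACKING_ID 000008e0'."""
    type_hex, code_hex, value_hex = raw_line.split()
    ev_type = EV_TYPES[int(type_hex, 16)]
    ev_code = EV_CODES[ev_type][int(code_hex, 16)]
    return f"{ev_type} {ev_code} {value_hex}"

print(label("0003 0039 000008e0"))  # EV_ABS ABS_MT_TRACKING_ID 000008e0
print(label("0000 0000 00000000"))  # EV_SYN SYN_REPORT 00000000
```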

So, in Linux it is the Input subsystem that captures input device information, and that is the source of Android's events. Application developers do not need to understand all of this thoroughly. At the beginning of this article we said that an event must originate from user interaction; is that strictly true? Anyone who played games years ago will remember macro tools like 按键精灵 ("Key Wizard"), which simulate a user's key presses. In other words, real user interaction is not strictly required: events can be simulated. Likewise, Android ships another handy tool, sendevent, which simulates events by writing event information into /dev/input/. Its usage is similar to getevent's; try it yourself.
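For the curious, what sendevent ultimately writes into the node is a sequence of struct input_event records (struct timeval time; __u16 type; __u16 code; __s32 value;). The Python sketch below packs one such record; the "llHHi" layout assumes a 64-bit Linux userspace, and writing to a real node would require root, so here we only pack the bytes and unpack them again to show the layout:

```python
# Sketch of the on-the-wire format behind sendevent: one struct input_event.
# Layout assumption: 64-bit Linux, so timeval is two native longs ("ll").
import struct

INPUT_EVENT = struct.Struct("llHHi")  # tv_sec, tv_usec, type, code, value

EV_ABS = 0x03              # absolute-coordinate event type (input.h)
ABS_MT_POSITION_X = 0x35   # multi-touch X coordinate code (input.h)

def pack_event(ev_type: int, code: int, value: int) -> bytes:
    """Pack one input_event record with a zeroed timestamp, as sendevent does."""
    return INPUT_EVENT.pack(0, 0, ev_type, code, value)

record = pack_event(EV_ABS, ABS_MT_POSITION_X, 0x280)
# A real injector would open("/dev/input/event5", "wb") and write `record`;
# here we just unpack it again to verify the layout.
_, _, t, c, v = INPUT_EVENT.unpack(record)
print(t, c, v)  # 3 53 640
```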
