Please respect the original; when reprinting, credit the source: from Aigestudio (http://blog.csdn.net/aigestudio). Powered by Aige; infringement will be investigated!
In the previous article of the "Android Event Distribution Fully Resolved" series we briefly analyzed the origin of the event distribution mechanism. To be clear up front: Android (like any driver-based system) handles a number of different event types, such as key, trackball, mouse, touch, and infrared events. To simplify the problem and stay practical, we will analyze only touch events; the other assorted event types are easy enough to understand by analogy and need no further discussion.
So where does a touch event in Android come from? Anyone with a little knowledge of event distribution knows the dispatchTouchEvent method and knows it is the starting point for a View to distribute a touch event. But where does the touch event passed into dispatchTouchEvent come from? If you try to trace it call by call you will find the chain is endless, and blindly grinding through source code will only confuse you further; reasoning about the logic gets you to the answer much faster. We all know that an event must be produced by user interaction: only when the user touches the screen or presses a key can the system respond with an event, and each such action is the "source" of an event. So who captures this most primitive interaction information? Is it the View? The Activity? ViewRootImpl or WMS? These framework components are still far too "high-level" relative to the underlying mechanism.

Android is a Linux-based operating system, and Linux itself has a very mature input subsystem architecture. Although Android implements several mechanisms of its own, most of the low-level calls are built on the interfaces Linux provides; for example, input drivers are written against the character-device interface of the Linux input subsystem. If you are interested in this side of Linux, consult a dedicated Linux reference; I won't expand on it here. All you need to know is that on Android the Linux input subsystem reads and writes hardware input device nodes named event[number] under the /dev/input/ path. These nodes correspond to specific hardware, so the concrete node names may differ from device to device; on my MX3, for example, /dev/input/event0 is mxhub-keys and /dev/input/event1 is gp2ap. The concrete node information can be viewed with the getevent tool that Android provides: if your device is connected to a PC, or an emulator is running, `adb shell getevent` shows the real-time read/write state of events. Again, every device is different; here is what getevent lists for all the input nodes on my MX3:
    aigestudio>adb shell
    shell@mx3:/ $ getevent
    add device 1: /dev/input/event0
      name:     "mxhub-keys"
    add device 2: /dev/input/event4
      name:     "lsm330dlc_gyr"
    add device 3: /dev/input/event3
      name:     "lsm330dlc_acc"
    add device 4: /dev/input/event1
      name:     "gp2ap"
    could not get driver version for /dev/input/mouse0, Not a typewriter
    add device 5: /dev/input/event5
      name:     "mx_ts"
    add device 6: /dev/input/event6
      name:     "gpio-keys"
    add device 7: /dev/input/event7
      name:     "headset"
    add device 8: /dev/input/event2
      name:     "compass"
    could not get driver version for /dev/input/mice, Not a typewriter
As the listing shows, there are eight input subsystems on the MX3:
- The "Mxhub-keys" subsystem of the illuminated primary key, which is located under the Event0 node, which reads and writes the Meizu Breath Light button, which is the circle below the screen
- "Lsm330dlc_gyr" subsystem of reading and writing gravity sensor under EVENT4 node
- "LSM330DLC_ACC" subsystem of reading and writing accelerometer in Event3 node
- The "GP2AP" subsystem (the Meizu MX3 uses infrared to measure the light sense and distance) in the Event1 node under the reading and writing infrared sensor.
- "Mx_ts" Subsystem for reading and writing screen touch under the EVENT5 node
- "Gpio-keys" Subsystem for reading and writing physical keys under the EVENT6 node
- The "Headset" subsystem located under the EVENT7 node that reads and writes the headset button (some mobile phone monitoring line control equipment is often called the system of hooks, here Meizu use rare Headset to indicate that the class does not know whether there is the meaning of the layout of the head-mounted device)
- "Compass" subsystem of reading and writing compass under Event2 node
It is from these subsystem nodes that the MX3 (note: I say the MX3 here, not Android in general) reads and writes device event information. The listing above was captured right after I pressed the power key to turn the screen off. If we press power again to light the screen, the kernel drivers keep reporting the read/write events they must monitor and the terminal would never stop printing, so to keep the output manageable we can use getevent's -c parameter to cap the number of lines it outputs.
Here I capped the output at 16 lines with `getevent -c 16` and then lit the screen; once the 16 lines printed, getevent exited. Without the limit the terminal would keep printing forever, because the accelerometer and infrared subsystems constantly detect changes in the external environment (think about what an accelerometer and an infrared proximity sensor do and you will see why). If instead we run getevent and quickly touch the screen, the mx_ts subsystem under the event5 node responds immediately with a burst of output.
The raw hex of that burst is hard to make sense of on its own, so let's add getevent's -l parameter, which formats the output with symbolic labels such as EV_ABS and ABS_MT_TRACKING_ID; the labeled sequence produced by my quick touch is analyzed below.
Note: because of differences in hardware, touch area, pressure, duration, and other factors, your output may not be identical to mine; your own output is what counts, but the information will be roughly similar.
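Each line getevent prints corresponds to one struct input_event record read from the device node. The record is declared in linux/input.h roughly as follows (comments mine):

    struct input_event {
        struct timeval time;  /* timestamp, omitted from getevent's default output */
        __u16 type;           /* event type,  e.g. EV_ABS */
        __u16 code;           /* event code,  e.g. ABS_MT_TRACKING_ID */
        __s32 value;          /* event value, e.g. the tracking id */
    };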
Take the first message, `/dev/input/event5: EV_ABS ABS_MT_TRACKING_ID 000008e0`. Here /dev/input/event5 is the device node we discussed above; EV_ABS is the type, the event class; ABS_MT_TRACKING_ID is the code, the event scan code; and 000008e0 is the concrete event value. These constants are declared in the kernel/include/linux/input.h file; for example, the type values, the input device event types, include the following:
    #define EV_SYN        0x00
    #define EV_KEY        0x01
    #define EV_REL        0x02
    #define EV_ABS        0x03
    #define EV_MSC        0x04
    #define EV_LED        0x11
    #define EV_SND        0x12
    #define EV_REP        0x14
    #define EV_FF         0x15
    #define EV_PWR        0x16
    #define EV_FF_STATUS  0x17
    #define EV_MAX        0x1f
What each of them represents exactly is Linux territory and not worth dwelling on; the most commonly used are EV_REL for relative coordinate events, EV_ABS for absolute coordinate events, EV_KEY for physical key events, and EV_SYN for synchronization events. One device can support multiple event types, and each event type can in turn define multiple event codes; for example, the event codes for the EV_SYN synchronization type are as follows:
    #define SYN_REPORT     0
    #define SYN_CONFIG     1
    #define SYN_MT_REPORT  2
The rest are not listed here; you can find the corresponding definitions in the input.h file. With these definitions, the feedback from the quick screen touch above can be annotated as follows:
    /dev/input/event5: EV_ABS ABS_MT_TRACKING_ID 000008e0   multi-touch tracking begins (requires device support)
    /dev/input/event5: EV_ABS ABS_MT_POSITION_X  00000280   report the X coordinate of the contact area's center
    /dev/input/event5: EV_ABS ABS_MT_POSITION_Y  0000064b   report the Y coordinate of the contact area's center
    /dev/input/event5: EV_ABS ABS_MT_PRESSURE    0000005b   report the finger pressure
    /dev/input/event5: EV_ABS ABS_MT_TOUCH_MAJOR 00000014   report the major axis of the main contact area
    /dev/input/event5: EV_SYN SYN_REPORT         00000000   synchronize data
    /dev/input/event5: EV_ABS ABS_MT_PRESSURE    00000057   report the finger pressure
    /dev/input/event5: EV_ABS ABS_MT_TOUCH_MAJOR 00000012   report the major axis of the main contact area
    /dev/input/event5: EV_SYN SYN_REPORT         00000000   synchronize data
    /dev/input/event5: EV_ABS ABS_MT_TRACKING_ID ffffffff   multi-touch tracking ends (requires device support)
    /dev/input/event5: EV_SYN SYN_REPORT         00000000   synchronize data
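To show how such a sequence can be interpreted, here is a toy C decoder for this multi-touch protocol: a non-negative tracking id means finger down, ffffffff (-1) means finger up, and SYN_REPORT commits one packet. This is my illustrative sketch, not Android's actual input pipeline, and it assumes event5 is your touch node:

    #include <stdio.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <linux/input.h>

    int main(void) {
        int fd = open("/dev/input/event5", O_RDONLY);
        if (fd < 0)
            return 1;
        struct input_event ev;
        int x = 0, y = 0, down = 0;
        while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
            if (ev.type == EV_ABS && ev.code == ABS_MT_POSITION_X)
                x = ev.value;
            if (ev.type == EV_ABS && ev.code == ABS_MT_POSITION_Y)
                y = ev.value;
            if (ev.type == EV_ABS && ev.code == ABS_MT_TRACKING_ID)
                down = (ev.value != -1);             /* -1 means the finger lifted */
            if (ev.type == EV_SYN && ev.code == SYN_REPORT)  /* packet complete */
                printf("%s at (%d, %d)\n", down ? "touch" : "lift", x, y);
        }
        return 0;
    }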
And that is the node traffic from just one quick tap; a more complex gesture, say the multi-finger slicing in a Fruit-Ninja-style game, produces an enormous amount of such raw information. Fortunately, collecting it is not the application developer's job: at the application layer we care about whether an event is a click, a double-click, a long press, and so on, rather than facing this large and messy raw data, so Android converts the raw data into higher-level events for ease of use. If you are doing driver development that involves this raw data, you can of course still access it directly.
So the Linux input subsystem, capturing information from input devices, is the ultimate ancestor of Android's events; application developers do not need to master this layer, only to know it exists. At the beginning of the article we said the source of an event must be user interaction, but is that really so? Anyone who played PC games in the early days knows key-macro tools ("button wizards") that simulate the user's key presses; in other words, the interaction does not have to be real, simulation works just as well. Likewise, Android gives us another cool tool, sendevent, which writes event information to the nodes under /dev/input/ to simulate the occurrence of events. Its usage is similar to getevent, so I won't say more; try it yourself.
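For a feel of what sendevent does under the hood, here is a minimal C sketch that injects a hypothetical tap by writing struct input_event records straight into the node. The coordinates are made up, event5 is assumed to be the touch node, and a real driver may require more of the protocol (BTN_TOUCH, slots, pressure) than shown here:

    #include <string.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <linux/input.h>

    static void emit(int fd, int type, int code, int value) {
        struct input_event ev;
        memset(&ev, 0, sizeof(ev));              /* kernel fills in the timestamp */
        ev.type = type;
        ev.code = code;
        ev.value = value;
        write(fd, &ev, sizeof(ev));
    }

    int main(void) {
        int fd = open("/dev/input/event5", O_RDWR);
        if (fd < 0)
            return 1;
        emit(fd, EV_ABS, ABS_MT_TRACKING_ID, 1); /* finger down */
        emit(fd, EV_ABS, ABS_MT_POSITION_X, 0x280);
        emit(fd, EV_ABS, ABS_MT_POSITION_Y, 0x64b);
        emit(fd, EV_SYN, SYN_REPORT, 0);         /* commit the packet */
        emit(fd, EV_ABS, ABS_MT_TRACKING_ID, -1);/* finger up */
        emit(fd, EV_SYN, SYN_REPORT, 0);
        close(fd);
        return 0;
    }

Note that writing to /dev/input/event5 needs appropriate permissions, which is why tools like sendevent are normally run from an adb shell.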