Please respect the original; reprints must credit the source: from AigeStudio (http://blog.csdn.net/aigestudio). Powered by Aige. Infringement will be pursued!
In the previous installment of this Android event-distribution series we briefly traced where the event-distribution mechanism originates. To be clear up front: Android (like any driver-based system) handles many different kinds of events, such as keys, trackball, mouse, touch, and infrared. To keep things simple and practical, we will analyze only touch events; the other assorted kinds follow the same ideas and need no separate treatment.
So where does a touch event in Android come from? Anyone with a little knowledge of event distribution knows the dispatchTouchEvent method, the entry point through which a View dispatches touch events. But where does the touch event passed into dispatchTouchEvent come from? If you step backwards through the code, the call chain seems endless; blindly reading source code will only confuse you, whereas a little reasoning finds the answer quickly. We all know an event must be produced by user interaction: only when the user touches the screen or presses a key does the system have anything to respond to, and each such action is the "source" of an event. So who captures this most primitive interaction information? A View? An Activity? ViewRootImpl or WMS? Relative to the underlying mechanism, these framework components are all still far too "high level". Android is a Linux-based operating system, and Linux itself has a mature input-subsystem architecture; although Android implements several mechanisms of its own, most of its low-level calls go through the interfaces Linux provides. Input drivers, for example, are written against the character-device interface of the Linux input subsystem. If you are interested in that side of Linux, explore it on your own; here you only need to know that on Android, the Linux input subsystem reads and writes hardware input-device nodes named event[number] under the /dev/input/ path. These nodes map to specific hardware, so their names can differ from device to device; on my MX3, for example, /dev/input/event0 is mxhub-keys and /dev/input/event1 is gp2ap.
You can inspect these nodes with the getevent tool that Android provides. With your device connected to a PC (or an emulator running), `adb shell getevent` shows event reads and writes in real time. Again, every device is different; on the MX3, getevent lists all the input nodes like this:
```
aigestudio> adb shell
shell@mx3:/ $ getevent
add device 1: /dev/input/event0
  name:     "mxhub-keys"
add device 2: /dev/input/event4
  name:     "lsm330dlc_gyr"
add device 3: /dev/input/event3
  name:     "lsm330dlc_acc"
add device 4: /dev/input/event1
  name:     "gp2ap"
could not get driver version for /dev/input/mouse0, Not a typewriter
add device 5: /dev/input/event5
  name:     "mx_ts"
add device 6: /dev/input/event6
  name:     "gpio-keys"
add device 7: /dev/input/event7
  name:     "headset"
add device 8: /dev/input/event2
  name:     "compass"
could not get driver version for /dev/input/mice, Not a typewriter
```
So the MX3 has eight input device nodes:
- "mxhub-keys", under the event0 node: reads and writes the illuminated home key, i.e. the Meizu "breathing light" button, the circle below the screen
- "lsm330dlc_gyr", under the event4 node: reads and writes the gyroscope
- "lsm330dlc_acc", under the event3 node: reads and writes the accelerometer
- "gp2ap", under the event1 node: reads and writes the infrared sensor (the Meizu MX3 uses infrared to measure ambient light and proximity)
- "mx_ts", under the event5 node: reads and writes screen touches
- "gpio-keys", under the event6 node: reads and writes the physical keys
- "headset", under the event7 node: reads and writes the headset buttons (on some phones the subsystem that monitors in-line remote controls is called "hook"; Meizu's rarer choice of "headset" may or may not hint at plans for head-mounted devices)
- "compass", under the event2 node: reads and writes the compass
It is from these nodes that the MX3 (I say the MX3 here rather than Android in general) reads and writes device event information. The output above was captured while pressing the power key to turn the screen off. If we press power again to light the screen, the kernel drivers keep reporting a steady stream of read/write events; to stop the terminal from scrolling forever, use getevent's -c parameter to cap the number of lines output:
Here I capped the output at 16 lines while lighting the screen. Without the limit, the terminal would keep printing forever, because the accelerometer and infrared-sensor subsystems constantly track changes in the external environment; think about what an accelerometer and a proximity sensor are for and the reason is obvious. If we run getevent and then quickly tap the screen, the subsystem under the event5 node responds immediately.
The raw information printed after a quick touch can be hard to read. Add getevent's -l parameter to get labeled, formatted output:
Note: because of hardware differences and factors such as touch area, pressure, and duration, your output may not match mine exactly; go by your own output, though the information will be broadly similar.
Take the first line, `/dev/input/event5: EV_ABS ABS_MT_TRACKING_ID 000008e0`. We already know /dev/input/event5 is the touchscreen's device node; EV_ABS is the event type, ABS_MT_TRACKING_ID is the event code, and 000008e0 is the specific event value. These constants are declared in the kernel/include/linux/input.h file; the input event types, for example, include the following:
```c
#define EV_SYN       0x00
#define EV_KEY       0x01
#define EV_REL       0x02
#define EV_ABS       0x03
#define EV_MSC       0x04
#define EV_LED       0x11
#define EV_SND       0x12
#define EV_REP       0x14
#define EV_FF        0x15
#define EV_PWR       0x16
#define EV_FF_STATUS 0x17
#define EV_MAX       0x1f
```
What each one represents is Linux territory and we won't dwell on it. The ones you will meet most often are EV_REL for relative-coordinate events, EV_ABS for absolute-coordinate events, EV_KEY for physical-key events, and EV_SYN for synchronization events. One device can support several event types, and each event type defines its own set of event codes; for the EV_SYN synchronization type, for instance, the event codes are:
```c
#define SYN_REPORT    0
#define SYN_CONFIG    1
#define SYN_MT_REPORT 2
```
The rest are not listed here; you can find the corresponding definitions in input.h. The feedback produced by the quick screen tap in the illustration above can then be read as follows:
```
/dev/input/event5: EV_ABS ABS_MT_TRACKING_ID 000008e0   multi-touch tracking begins (needs device support)
/dev/input/event5: EV_ABS ABS_MT_POSITION_X  00000280   X coordinate of the contact-area center
/dev/input/event5: EV_ABS ABS_MT_POSITION_Y  0000064b   Y coordinate of the contact-area center
/dev/input/event5: EV_ABS ABS_MT_PRESSURE    0000005b   finger pressure
/dev/input/event5: EV_ABS ABS_MT_TOUCH_MAJOR 00000014   major axis of the contact area
/dev/input/event5: EV_SYN SYN_REPORT         00000000   synchronize
/dev/input/event5: EV_ABS ABS_MT_PRESSURE    00000057   finger pressure
/dev/input/event5: EV_ABS ABS_MT_TOUCH_MAJOR 00000012   major axis of the contact area
/dev/input/event5: EV_SYN SYN_REPORT         00000000   synchronize
/dev/input/event5: EV_ABS ABS_MT_TRACKING_ID ffffffff   multi-touch tracking ends (needs device support)
/dev/input/event5: EV_SYN SYN_REPORT         00000000   synchronize
```
And that was just the node traffic from one quick tap. A more complex gesture, say multi-finger fruit-slicing, generates an enormous amount of this raw information. Fortunately, collecting it is not the application developer's job: at the application layer we care about whether an event was a click, a double-click, a long press, and so on, rather than facing this mass of raw data, so Android converts the raw data into a form that is convenient to use. If you are doing driver development that involves this raw data, you can of course work with it directly.
So the Linux input subsystem's capture of input-device information is the ultimate ancestor of every Android event. Application developers don't need to master this layer; knowing it exists is enough. At the start of the article we said that an event must originate from user interaction, but is that strictly true? Anyone who played games in the early days will remember key-simulation tools like Key Wizard: they simulate the user's key presses, which means a real user interaction is not strictly required; simulation works too. Likewise, Android gives us another handy tool, sendevent, which writes event information to /dev/input/ to simulate the occurrence of events. Its usage mirrors getevent, so I won't say more; try it yourself.