1. Interface
The DDI (device driver interface) functions that a Windows CE .NET touch screen driver must export are:
TouchPanelGetDeviceCaps
TouchPanelEnable
TouchPanelDisable
TouchPanelSetMode
TouchPanelReadCalibrationPoint
TouchPanelReadCalibrationAbort
TouchPanelSetCalibration
TouchPanelCalibrateAPoint
TouchPanelPowerHandler
If the image is built with the WCESHELLFE_MODULES_TRANSCRIBER or SHELLW_MODULES_CGRTOUCH module, the following extended DDI functions are also required:
TouchReset
TouchRegisterWindow
TouchUnregisterWindow
TouchSetValue
TouchGetValue
TouchCreateEvent
TouchGetFocusWnd
TouchGetLastTouchFocusWnd
TouchGetQueuePtr
For the standard MDD/PDD layered model, the following DDSI (device driver service-provider interface) functions are also defined for the connection between the MDD and the PDD:
DdsiTouchPanelAttach
DdsiTouchPanelDetach
DdsiTouchPanelDisable
DdsiTouchPanelEnable
DdsiTouchPanelGetDeviceCaps
DdsiTouchPanelGetPoint
DdsiTouchPanelPowerHandler
2. Driver Composition
A typical Windows CE .NET touch driver links against two static libraries, tch_cal.lib and tchmdd.lib. Their sources are as follows:
PUBLIC\COMMON\OAK\DRIVERS\TCH_CAL (tch_cal.lib): this module implements the touch screen calibration algorithm.
The interesting one is tchmdd.lib. This file cannot be found in the OAK lib folder, but it does exist in the OAK directory of the project tree; in other words, it is generated during the cesysgen phase. Judging from the interfaces the library provides, we can guess that it is assembled from several other scattered libraries, whose source folders are under the OAK\COMMON\DRIVERS\TOUCH directory. Indeed, the MAKEFILE under PUBLIC\COMMON\CESYSGEN contains the following:
tchmdd::
    @set TARGETTYPE=LIBRARY
    @set TARGETNAME=$@
    @set RELEASETYPE=OAK
    @set TARGETLIBS=
!ifdef SYSGEN_TRANSCRIBER
    @set SOURCELIBS=$(SG_INPUT_LIB)\tchmain.lib $(SG_INPUT_LIB)\tch_trns.lib
    echo touch includes transcriber hooks
!else
    @set SOURCELIBS=$(SG_INPUT_LIB)\tchmain.lib $(SG_INPUT_LIB)\tchbasic.lib
    echo touch is minimal
!endif
This settles what tchmdd.lib contains: either tchmain.lib + tch_trns.lib, or tchmain.lib + tchbasic.lib. Because building tchmdd.lib requires a dedicated nmake invocation, a .bat file in the touch driver directory generates the library.
Why is tchmdd.lib generated this way instead of being built directly? Presumably because the DDI surface differs under different build conditions (see part 1), so a different DDI implementation is selected here.
The touch drivers are thus divided into two types:
1. Basic drivers that do not use the extended DDI functions.
2. Extended drivers generated under Transcriber build conditions.
3. DDI Function Overview and DDI-DDSI Call Relationships
In a layered software model, the following points need to be clarified:
1. The call interfaces and call relationships between layers
2. Parameter passing outside the interfaces
3. The logical timing relationships
4. The constraints
In the basic call model, the touch driver is loaded by GWES, which calls the DDI functions to query device status and configure the driver; the driver in turn reads the hardware directly to determine the current touch state. The driver consists of an MDD and a PDD. The MDD is usually used as-is without modification; it exposes the DDI to GWES and calls into the PDD through the specified DDSI functions. The PDD is the hardware-specific part that an IHV or OEM must complete, i.e. what we usually implement when we write a touch driver. Besides the DDSI, certain shared variable definitions and initialization actions must also be carried out between the PDD and MDD. In other words, the MDD-PDD relationship is not a strictly layered model: sometimes shared variables are also needed to complete the interaction.
Next, let's look at the interfaces. The DDI and DDSI functions were listed in part 1; what remains is to clarify when each interface is triggered and how the DDI calls map onto the DDSI. Because the DDI has two implementations, this would have to be examined for both cases; here I will only give a brief introduction to the basic touch driver.
Related to the call relationships is the call timing, which determines the order in which shared resources are initialized and which resources are available at each point. We will therefore take time as the main thread and walk through the driver's calls.
TouchPanelDllEntry
This is not a DDI function but the entry point of the driver DLL, i.e. the first function called in the driver.
The DDSI functions it calls are:
DdsiTouchPanelAttach
DdsiTouchPanelDetach
These two are called at the very beginning and end of the DLL load/unload process.
TouchPanelEnable
This function initializes the driver.
The DDSI functions it calls are:
DdsiTouchPanelEnable
DdsiTouchPanelDisable
Its execution steps are:
1. Create the events hTouchPanelEvent and hCalibrationSampleAvailable
2. Create the mutex csMutex
3. Initialize the calibration facilities
4. Check and initialize the required interrupts gIntrTouch and gIntrTouchChanged
5. Hook the event callback function
6. Associate the interrupts gIntrTouch and gIntrTouchChanged with the event hTouchPanelEvent
7. Create the IST TouchPanelpISR (despite its name, it is an interrupt service thread, not an ISR) and set its priority
TouchPanelDisable
This function is the counterpart of TouchPanelEnable described above and provides the ability to stop the device.
The DDSI function it calls is:
DdsiTouchPanelDisable
Its execution steps are:
1. Stop the IST
2. Disable the interrupts
3. Release the events and other synchronization objects
TouchPanelPowerHandler
This function is called when the system enters or exits the power-off state. Since the work involved is entirely PDD-related, it simply forwards the call.
The DDSI function it calls is:
DdsiTouchPanelPowerHandler
TouchPanelpISR
This function is not a DDI interface either. It waits for and handles touch screen interrupt events and is the single event source for the entire driver.
The DDSI function it calls is:
DdsiTouchPanelGetPoint
Its implementation consists of:
1. A wait loop that receives touch interrupt notifications and forms the body of the function
2. Obtaining the current touch position and state through the DDSI
3. When valid data arrives in calibration mode, collecting/submitting the pressed-point information
4. In normal mode, calibrating the data (this step is skipped if the PDD already calibrates) and checking the validity of the calibrated data
5. Finally, calling the callback function passed in by GWES to submit the position and state information
TouchPanelGetDeviceCaps
This is a DDI function used to query the specific capabilities supported by the touch screen device.
The DDSI function it calls is:
DdsiTouchPanelGetDeviceCaps
Its actions are:
1. Query the corresponding information through the DDSI function
2. When screen coordinate information is queried, save it so that later code can map touch coordinates to screen coordinates
TouchPanelSetMode
This is also a DDI function, used to set the working mode of the touch screen.
The PDD function it calls is DdsiTouchPanelSetMode.
Its actions are:
1. Set the IST priority directly through the API
2. Pass all other settings straight through to the PDD
TouchPanelReadCalibrationPoint
This is also a DDI function; it reads the input sample during the interactive calibration process. After the IST captures the corresponding tip event, the function returns the position information through the supplied pointers.
TouchPanelReadCalibrationAbort
This function is called when calibration is cancelled. It simply sets the status bit and the event, then returns.
TouchPanelSetCalibration
This is a DDI function and one of the functions implementing the calibration facility; it generates the calibration parameters. After the calibration UI finishes the calibration sequence, it passes the collected data to this function to compute the reference parameters.
This function has no corresponding DDSI.
Its body is not control-flow logic but a piece of mathematics. Specifically:
Assume the touch screen reading is linear in position, and that the display pixel index is likewise linear in position. Converting both the positions reported by the touch screen and the pixel positions into 2D coordinates, the relationship between a touch coordinate (Tx, Ty) and the matching screen coordinate (Sx, Sy) is the transformation:
Sx = a1*Tx + b1*Ty + c1
Sy = a2*Tx + b2*Ty + c2
The job of TouchPanelSetCalibration is to determine a1, b1, c1, a2, b2, c2 from the (Sx, Sy) and (Tx, Ty) pairs obtained during calibration.
Here Tx, Ty, Sx, Sy are known and the a, b, c coefficients are the unknowns. Since there are two groups of three unknowns each, two systems of three equations are needed to solve them.
The following matrix equation is constructed (the second system, for a2, b2, c2, uses Sy in place of Sx):

| Σ Tx*Tx  Σ Tx*Ty  Σ Tx |   | a1 |   | Σ Tx*Sx |
| Σ Tx*Ty  Σ Ty*Ty  Σ Ty | * | b1 | = | Σ Ty*Sx |
| Σ Tx     Σ Ty     Σ 1  |   | c1 |   | Σ Sx    |
This system is then solved with Cramer's rule, yielding the calibration data, i.e. the parameters of the linear transformation. The delta stored in the calibration data has no meaning of its own; it is simply the determinant left undivided when the system is solved.
What puzzles me here is that the square matrix corresponding to the determinant above looks rank-deficient (the three equations seem to differ only by a common factor in the coefficients, so how can they determine a, b, c?); the symmetric matrix built by the algorithm would then make delta necessarily 0. Why can a non-zero solution, and hence the calibration parameters, still be obtained? My math is weak and I have not figured out the reason.
TouchPanelCalibrateAPoint
This DDI function calibrates the input coordinate information.
Following TouchPanelSetCalibration, it applies the formulas
Sx = a1*Tx + b1*Ty + c1
Sy = a2*Tx + b2*Ty + c2
to calibrate the data. The delta variable used in the function is the factor left over from solving the system; the division is deferred to this point to reduce rounding error.
At this point the outline of the driver is visible; it breaks down into several parts:
1. Initialization and shutdown. This is found in software everywhere, so I will not dwell on it.
2. The IST body. This is in fact the only part of the whole driver that is the real event source and information channel of the system; it corresponds to the PDD code that obtains the tip state and position. Ideally, the driver could work with just this part (indeed, a virtual touch driver can do exactly that).
3. The calibration facility. The calibration code takes up a large share of the touch driver and makes the whole thing look complicated. Since this part serves the user's interaction with GWES, it is not part of the driver's runtime data path; set it aside and you can see a rough picture of the whole touch screen driver.
4. The relationship between the UI and driver calibration
In fact, the calibration UI is completely isolated from the touch driver: the UI only does the displaying and, logically, takes no part in the data collection.
5. Addendum
A mutex is used in the driver as the synchronization primitive. Since mutex operations go through system calls, I never figured out why a mutex is used here instead of a critical section. If you understand this, please advise.
(One answer:) On many platforms the touch controller and the AC'97 controller sit on the same chip and share one set of registers. Because the touch driver and the audio driver live in different process spaces, a critical section cannot be used; a mutex is needed to guarantee the atomicity of the register operations.
The drivers called by GWES differ greatly from stream drivers in their interfaces and operating model, but apart from the display driver, their PDD implementations are quite simple. They can look complicated because some of them are long; in fact, setting aside the complexity brought in by the platform hardware, implementing these PDD parts is fairly easy.