Msg2133 touch screen (TP source code learning)
Note: in what follows, "the device" refers to the touch screen.
ABS: absolute value
1. Introduction to the input subsystem
Linux input devices include buttons, keyboards, touch screens, mice, and joysticks. They are character devices. To simplify driver development, the Linux kernel abstracts what these devices have in common into the input subsystem. The subsystem is divided into three layers, as shown in Figure 1.
Figure 1
The driver layer is hardware-specific: it directly captures data from the hardware (touch screen pressed, press position, mouse movement, key presses, and so on) and reports it to the core layer. The core layer connects the driver layer and the event-handling layer: both device drivers and handlers are registered through it. The core layer receives data from the driver layer, selects the corresponding handler to process it, and that handler finally copies the data to user space.
Every registered input device is added to input_dev_list (the input device linked list), and every registered event handler is added to input_handler_list (the input handler linked list); a list_head member serves as the node that links each structure into its list. The correspondence between input_dev_list and input_handler_list is bridged by the input_handle struct.
input_handle associates an input_dev with an input_handler. Why use input_handle instead of matching input_dev and input_handler directly? Because one device can correspond to multiple handlers, and one handler can also serve multiple devices. For example, a touch screen device can correspond to both the event (evdev) handler and the tsdev handler.
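This many-to-many matching can be sketched as a tiny userspace model (the struct and function names below are invented for illustration; the kernel's real matching lives in input_attach_handler()):

```c
#include <stddef.h>

/* Conceptual model: a handle ties one device to one handler, so a
 * device matched by two handlers yields two handles. */
struct dev     { const char *name; };
struct handler { const char *name; };
struct handle  { struct dev *dev; struct handler *handler; };

static struct handle handles[8];
static size_t n_handles;

/* Match every handler against a newly added device; each match creates
 * a handle (mirroring what input_attach_handler() does). */
static void match_device(struct dev *d, struct handler *hs, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        struct handle h = { d, &hs[i] };
        handles[n_handles++] = h;
    }
}
```

Registering one touch screen against two handlers produces two handles, one per (device, handler) pair.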
The relationships among input_dev, input_handler, and input_handle are shown in figure 2.
Figure 2
2. Touch Screen Driver Introduction
Process
In Linux, an input device is described by the input_dev struct, which is defined in input.h. A device driver can be implemented by following the steps below.
1) In the driver module's loading function, set the events the input device supports;
2) Register the input device with the input subsystem;
3) When an input operation occurs on the device (for example, a key is pressed/released, the touch screen is touched/lifted/moved, or the mouse is moved/clicked/released), submit the corresponding events and key values/coordinates.
2.1 input device registration
Input device registration actually takes only a few lines of code, because most of the work is already encapsulated in input.c. The driver author only needs to call a few key functions to complete the registration.
In xxx_ts.c, first define the global variable struct input_dev tsdev; then enter the initialization function.
A complete input device system requires not only devices, but also the processing program input_handler.
2.2 input handler Registration
input_handler is what faces the user layer. In evdev.c, an input_handler struct is defined and some of its member variables are initialized.
2.3 Data Transfer Process
The data obtained from the hardware device (the touch screen) is processed by input.c, which selects the corresponding handler and reports the data to user space. The code in xxx_ts.c deals directly with the hardware: it configures the touch-screen-related registers and reports the acquired data.
For device initialization, configure the registers according to the datasheet of the ARM chip you are using. Whether polling or interrupts are used (I have never seen polling used), when the touch screen is pressed we obtain the related data (mainly the X and Y coordinates of the press) and must then report it. When the touch screen is pressed, report the following:
input_report_key(tsdev, BTN_TOUCH, 1);    // report the key-press event
input_report_abs(tsdev, ABS_X, x);        // report the X coordinate of the press
input_report_abs(tsdev, ABS_Y, y);        // report the Y coordinate of the press
input_report_abs(tsdev, ABS_PRESSURE, 1); // report the pressure value (0 or 1)
input_sync(tsdev);                        // report a synchronization event: this event is complete
When the pen is lifted from the touch screen, report:
input_report_key(tsdev, BTN_TOUCH, 0);    // report the key-release event
input_report_abs(tsdev, ABS_PRESSURE, 0); // report the pressure value (0 or 1)
input_sync(tsdev);                        // report a synchronization event: this event is complete
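As a sanity check of the reporting sequence, here is a userspace sketch that records the (type, code, value) triples a press would emit; the numeric constants are copied from the kernel's input-event-codes.h, and report() stands in for the kernel's input_report_*() helpers:

```c
#include <stddef.h>

/* Event codes as defined in the kernel's input-event-codes.h. */
enum { EV_SYN = 0x00, EV_KEY = 0x01, EV_ABS = 0x03 };
enum { SYN_REPORT = 0, BTN_TOUCH = 0x14a,
       ABS_X = 0x00, ABS_Y = 0x01, ABS_PRESSURE = 0x18 };

struct ev { int type, code, value; };
static struct ev stream[16];
static size_t n_ev;

/* Stand-in for input_report_key/input_report_abs/input_sync: just
 * records the triple that would be passed to the core layer. */
static void report(int type, int code, int value)
{
    struct ev e = { type, code, value };
    stream[n_ev++] = e;
}

/* Mirror the driver's press sequence: BTN_TOUCH, X, Y, pressure, sync. */
static void report_press(int x, int y)
{
    report(EV_KEY, BTN_TOUCH, 1);
    report(EV_ABS, ABS_X, x);
    report(EV_ABS, ABS_Y, y);
    report(EV_ABS, ABS_PRESSURE, 1);
    report(EV_SYN, SYN_REPORT, 0);
}
```

The sync event at the end is what tells the handler that one complete touch sample has been delivered.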
2.4 Data Reading Process
Reading is easy. Anyone who has programmed on Linux knows that, as long as an input_event struct is defined in the application, the device can be opened with open() and then read with read().
3. msg2133a Driver Code Learning
3.1 touch_driver_probe ()
The involved files and some major function relationships are as follows:
Figure 3
(1) mstar_drv_platform_porting_layer.c: DrvPlatformLyrInputDeviceInitialize ()
// Allocate a new input device; input_allocate_device() returns a
// pre-initialized struct input_dev, and g_InputDevice points to it.
/* Allocate an input device */
g_InputDevice = input_allocate_device();
if (g_InputDevice == NULL) {
    DBG("*** input device allocation failed ***\n");
    return -ENOMEM;
}

// phys: the physical path of the touch screen device in the system
// hierarchy; the device is mounted on the I2C bus. id: the device id,
// a struct input_id; the bus type is I2C.
g_InputDevice->name = pClient->name;
g_InputDevice->phys = "I2C";
g_InputDevice->dev.parent = &pClient->dev;
g_InputDevice->id.bustype = BUS_I2C;

// Set the event types the device supports. evbit marks supported event
// types: EV_ABS (absolute coordinate events, also used by joysticks),
// EV_SYN (synchronization events), EV_KEY (key events). keybit marks
// the supported keys; BTN_TOUCH is the touch button. propbit marks
// device properties and quirks; INPUT_PROP_DIRECT means a direct input
// device.
/* Set the supported event types for the input device */
set_bit(EV_ABS, g_InputDevice->evbit);
set_bit(EV_SYN, g_InputDevice->evbit);
set_bit(EV_KEY, g_InputDevice->evbit);
set_bit(BTN_TOUCH, g_InputDevice->keybit);
set_bit(INPUT_PROP_DIRECT, g_InputDevice->propbit);

// When the touch panel supports virtual keys (e.g. Menu, Home, Back,
// Search) this macro is defined. input_set_capability() marks the
// device as able to emit EV_KEY events with the codes listed in
// g_TpVirtualKey[i]; Menu, Home, Back, and Search are supported.
#ifdef CONFIG_TP_HAVE_KEY // Method 1.
{
    u32 i;
    for (i = 0; i < MAX_KEY_NUM; i++) {
        input_set_capability(g_InputDevice, EV_KEY, g_TpVirtualKey[i]);
    }
}
#endif
// ABS_MT_TOUCH_MAJOR: the major axis of the main contact area.
// ABS_MT_WIDTH_MAJOR: the major axis of the contacting tool.
// ABS_MT_TOUCH_MAJOR/ABS_MT_WIDTH_MAJOR is always less than 1 and is
// related to the pressure: the greater the pressure, the closer to 1.
// ABS_MT_POSITION_X: the X coordinate of the center of the contact.
// ABS_MT_POSITION_Y: the Y coordinate of the center of the contact.
// ABS_MT_TOUCH_MAJOR ranges over 0..255, ABS_MT_WIDTH_MAJOR over 0..15.
input_set_abs_params(g_InputDevice, ABS_MT_TOUCH_MAJOR, 0, 255, 0, 0);
input_set_abs_params(g_InputDevice, ABS_MT_WIDTH_MAJOR, 0, 15, 0, 0);
// The X axis ranges from TOUCH_SCREEN_X_MIN to TOUCH_SCREEN_X_MAX with
// fuzz (data error) 0 and flat (center smoothing) 0; Y likewise.
input_set_abs_params(g_InputDevice, ABS_MT_POSITION_X, TOUCH_SCREEN_X_MIN, TOUCH_SCREEN_X_MAX, 0, 0);
input_set_abs_params(g_InputDevice, ABS_MT_POSITION_Y, TOUCH_SCREEN_Y_MIN, TOUCH_SCREEN_Y_MAX, 0, 0);

// Register the input device with the input subsystem.
/* Register the input device to the input sub-system */
nRetVal = input_register_device(g_InputDevice);
if (nRetVal < 0) {
    DBG("*** Unable to register touch input device ***\n");
}
(2) mstar_drv_platform_porting_layer.c: DrvPlatformLyrTouchDeviceRequestGPIO ()
This function requests the GPIOs for the reset and interrupt pins.
#define MS_TS_MSG_IC_GPIO_RST 12
#define MS_TS_MSG_IC_GPIO_INT 13
#define MSM_TS_MSG_IC_GPIO_RST (MS_TS_MSG_IC_GPIO_RST + 911)
#define MSM_TS_MSG_IC_GPIO_INT (MS_TS_MSG_IC_GPIO_INT + 911)

// Request the reset GPIO; its label is "C_TP_RST".
nRetVal = gpio_request(MSM_TS_MSG_IC_GPIO_RST, "C_TP_RST");
if (nRetVal < 0) {
    DBG("*** Failed to request GPIO %d, error %d ***\n", MSM_TS_MSG_IC_GPIO_RST, nRetVal);
}
nRetVal = gpio_request(MSM_TS_MSG_IC_GPIO_INT, "C_TP_INT");
if (nRetVal < 0) {
    DBG("*** Failed to request GPIO %d, error %d ***\n", MSM_TS_MSG_IC_GPIO_INT, nRetVal);
}
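Assuming the macros define chip-local pin numbers 12 and 13 plus a platform GPIO base of 911 (my reading of the snippet; the macro names are reconstructed), the global GPIO numbers passed to gpio_request() work out to 923 and 924:

```c
/* Reconstructed macros (assumption): chip pin + platform GPIO base.
 * The names follow the gpio_request() calls in the driver snippet. */
#define MS_TS_MSG_IC_GPIO_RST  12
#define MS_TS_MSG_IC_GPIO_INT  13
#define MSM_TS_MSG_IC_GPIO_RST (MS_TS_MSG_IC_GPIO_RST + 911)
#define MSM_TS_MSG_IC_GPIO_INT (MS_TS_MSG_IC_GPIO_INT + 911)
```

This kind of offset is typical when a SoC's GPIO controller registers its pins at a nonzero base in the global GPIO number space.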
(3) mstar_drv_platform_porting_layer.c: DrvPlatformLyrTouchDevicePowerOn ()
// Reset the touch screen IC. gpio_direction_output() sets the GPIO
// port to output mode and writes a value to it.
gpio_direction_output(MSM_TS_MSG_IC_GPIO_RST, 1);
udelay(100);
gpio_set_value(MSM_TS_MSG_IC_GPIO_RST, 0);
udelay(100);
gpio_set_value(MSM_TS_MSG_IC_GPIO_RST, 1);
mdelay(25);
(4) mstar_drv_main.c: DrvMainTouchDeviceInitialize ()
It mainly creates the procfs file system directory entry and the/kernel/kset_example/kobject_example directory.
Figure 4
(5) mstar_drv_ic_fw_porting_layer.c: DrvIcFwLyrGetChipType ()
Calls mstar_drv_self_fw_control.c: DrvFwCtrlGetChipType() to obtain the touch screen IC type. For msg2133a the corresponding value is 2.
(6) mstar_drv_self_fw_control.c: DrvFwCtrlGetChipType ()
Obtains the touch screen IC chip type, for example #define CHIP_TYPE_MSG21XXA (0x02).
(7) mstar_drv_platform_porting_layer.c: DrvPlatformLyrTouchDeviceResetHw ()
Resets the hardware by toggling the reset pin, the same as in DrvPlatformLyrTouchDevicePowerOn().
(8) mstar_drv_platform_porting_layer.c: DrvPlatformLyrTouchDeviceRegisterFingerTouchInterruptHandler ()
// Initialize the finger-touch work queue and bind it to the work
// function _DrvPlatformLyrFingerTouchDoWork.
/* Initialize the finger touch work queue */
INIT_WORK(&_gFingerTouchWork, _DrvPlatformLyrFingerTouchDoWork);

// Map the interrupt GPIO to an IRQ number, stored in _gIrq.
_gIrq = gpio_to_irq(MSM_TS_MSG_IC_GPIO_INT);

// Request the IRQ and register the ISR; when the IRQ fires, the kernel
// calls _DrvPlatformLyrFingerTouchInterruptHandler().
/* request an irq and register the isr */
nRetVal = request_threaded_irq(_gIrq /* MS_TS_MSG_IC_GPIO_INT */, NULL,
        _DrvPlatformLyrFingerTouchInterruptHandler,
        IRQF_TRIGGER_RISING | IRQF_ONESHOT /* | IRQF_NO_SUSPEND */ /* IRQF_TRIGGER_FALLING */,
        "msg2xxx", NULL);
_gInterruptFlag = 1;
(9) mstar_drv_platform_porting_layer.c: DrvPlatformLyrTouchDeviceRegisterEarlySuspend ()
// Register the notifier. When will it be called?
_gFbNotifier.notifier_call = MsDrvInterfaceTouchDeviceFbNotifierCallback;
fb_register_client(&_gFbNotifier);
3.2 Finger Touch Processing
As seen above, when you touch the TP the corresponding interrupt is generated, which calls _DrvPlatformLyrFingerTouchInterruptHandler().
Figure 5
(1) _ DrvPlatformLyrFingerTouchInterruptHandler ()
spin_lock_irqsave(&_gIrqLock, nIrqFlag);
if (_gInterruptFlag == 1) {
    // disable_irq_nosync() disables the IRQ without waiting for any
    // running handler to finish.
    disable_irq_nosync(_gIrq);
    _gInterruptFlag = 0;
    schedule_work(&_gFingerTouchWork);
}
spin_unlock_irqrestore(&_gIrqLock, nIrqFlag);
After initialization _gInterruptFlag is 1, so schedule_work(&_gFingerTouchWork) runs, which ends up calling _DrvPlatformLyrFingerTouchDoWork().
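The interplay between the ISR and the work function can be modeled in plain C: the flag guarantees that between disable_irq_nosync() and enable_irq() no extra work items are scheduled. A single-threaded sketch (the spinlock is omitted; the names are stand-ins for the driver's globals):

```c
static int flag = 1;        /* models _gInterruptFlag */
static int work_scheduled;  /* counts schedule_work() calls */

/* Models _DrvPlatformLyrFingerTouchInterruptHandler(). */
static void isr(void)
{
    if (flag == 1) {        /* only the first IRQ schedules work */
        flag = 0;           /* models disable_irq_nosync() */
        work_scheduled++;   /* models schedule_work() */
    }
}

/* Models the IRQ re-enable at the end of the work function. */
static void work_done(void)
{
    if (flag == 0)
        flag = 1;           /* models enable_irq() */
}
```

Two interrupts arriving before the work function finishes schedule the work only once; after work_done() the next interrupt is serviced again.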
(2) _ DrvPlatformLyrFingerTouchDoWork ()
Call DrvIcFwLyrHandleFingerTouch to process the touch action.
DrvIcFwLyrHandleFingerTouch(NULL, 0);

spin_lock_irqsave(&_gIrqLock, nIrqFlag);
if (_gInterruptFlag == 0) {
    // Re-enable the IRQ so the next touch action can be serviced.
    enable_irq(_gIrq);
    _gInterruptFlag = 1;
}
spin_unlock_irqrestore(&_gIrqLock, nIrqFlag);

nReportPacketLength = DEMO_MODE_PACKET_LENGTH;
pPacket = g_DemoModePacket;
rc = IicReadData(SLAVE_I2C_ID_DWI2C, &pPacket[0], nReportPacketLength);
if (rc < 0) {
    DBG("I2C read packet data failed, rc = %d\n", rc);
    goto TouchHandleEnd;
}
(3) DrvIcFwLyrHandleFingerTouch ()
Call DrvFwCtrlHandleFingerTouch () for processing
(4) DrvFwCtrlHandleFingerTouch ()
Calls _DrvFwCtrlParsePacket() to parse the data and report it.
(5) _ DrvFwCtrlParsePacket ()
Reads the eight-byte packet sent back by msg2133a over I2C; the bytes mean:
/*
 * pPacket[0]: id
 * pPacket[1] ~ pPacket[3]: the first point abs
 * pPacket[4] ~ pPacket[6]: the relative distance between the first
 *     point abs and the second point abs
 * When pPacket[1] ~ pPacket[4] and pPacket[6] are 0xFF: key event;
 *     pPacket[5] indicates which key is pressed.
 * When pPacket[1] ~ pPacket[6] are all 0xFF: release touch.
 */
For msg2133a, pPacket[0] = 0x52, and pPacket[1] ~ pPacket[3] are ADC sample values that must be converted to X and Y coordinates. The conversion formula is x = (X ADC sample value * 480) / 2048, and similarly for y.
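The packet rules can be captured as a small pure function. The classification below follows the comment block verbatim; the coordinate scaling implements x = adc * 480 / 2048. The exact byte layout for extracting the 12-bit ADC value from pPacket[1] ~ pPacket[3] is chip-specific and not shown here:

```c
#include <stdint.h>

enum pkt_kind { PKT_POINT, PKT_KEY, PKT_RELEASE };

/* Classify an 8-byte demo-mode packet per the rules above: bytes 1..6
 * all 0xFF means release; bytes 1..4 and 6 equal to 0xFF means a key
 * event (byte 5 carries the key code); anything else is a point. */
static enum pkt_kind classify(const uint8_t p[8])
{
    int i, ff = 0;
    for (i = 1; i <= 6; i++)
        if (p[i] == 0xFF)
            ff |= 1 << i;
    if (ff == 0x7E)           /* bytes 1..6 all 0xFF */
        return PKT_RELEASE;
    if ((ff & 0x5E) == 0x5E)  /* bytes 1..4 and 6 are 0xFF */
        return PKT_KEY;
    return PKT_POINT;
}

/* Scale a raw ADC sample to a screen coordinate: x = adc * 480 / 2048. */
static int adc_to_coord(int adc)
{
    return adc * 480 / 2048;
}
```

Note that the release check must come first, since an all-0xFF packet also matches the key-event pattern.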
(6) DrvPlatformLyrFingerTouchPressed ()
Reports the events:
input_report_key(g_InputDevice, BTN_TOUCH, 1);
input_report_abs(g_InputDevice, ABS_MT_TOUCH_MAJOR, 1);
input_report_abs(g_InputDevice, ABS_MT_WIDTH_MAJOR, 1);
input_report_abs(g_InputDevice, ABS_MT_POSITION_X, nX);
input_report_abs(g_InputDevice, ABS_MT_POSITION_Y, nY);
input_mt_sync(g_InputDevice);
3.3 virtual key implementation
When pPacket[1] ~ pPacket[4] and pPacket[6] are 0xFF, it is a key event, and pPacket[5] indicates which key is pressed. According to the silk-screen buttons on our touch screen (Menu, Home, Back, Search), the corresponding pPacket[5] values are 4, 8, 1, and 2, respectively. The DrvFwCtrlHandleFingerTouch() function then performs the corresponding processing:
const int g_TpVirtualKey[] = { TOUCH_KEY_MENU, TOUCH_KEY_HOME, TOUCH_KEY_BACK, TOUCH_KEY_SEARCH };

if (tInfo.nTouchKeyCode != 0) {
#ifdef CONFIG_TP_HAVE_KEY
    if (tInfo.nTouchKeyCode == 4)        // TOUCH_KEY_MENU
        nTouchKeyCode = g_TpVirtualKey[0];
    else if (tInfo.nTouchKeyCode == 1)   // TOUCH_KEY_BACK
        nTouchKeyCode = g_TpVirtualKey[2];
    else if (tInfo.nTouchKeyCode == 2)   // TOUCH_KEY_SEARCH
        nTouchKeyCode = g_TpVirtualKey[3];
    else if (tInfo.nTouchKeyCode == 8)   // TOUCH_KEY_HOME
        nTouchKeyCode = g_TpVirtualKey[1];

    if (nLastKeyCode != nTouchKeyCode) {
        DBG("key touch pressed\n");
        DBG("nTouchKeyCode = %d, nLastKeyCode = %d\n", nTouchKeyCode, nLastKeyCode);
        nLastKeyCode = nTouchKeyCode;
        input_report_key(g_InputDevice, BTN_TOUCH, 1);
        input_report_key(g_InputDevice, nTouchKeyCode, 1);
        input_sync(g_InputDevice);
    }
#endif // CONFIG_TP_HAVE_KEY
}
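The if/else chain is just a lookup from the raw code in pPacket[5] (4, 8, 1, 2) to an index into g_TpVirtualKey[]. A table-driven version looks like this (the TOUCH_KEY_* values below are illustrative placeholders, not taken from the driver):

```c
/* Placeholder key codes; the real values would come from the driver's
 * headers or <linux/input.h>. */
enum { TOUCH_KEY_MENU = 139, TOUCH_KEY_HOME = 172,
       TOUCH_KEY_BACK = 158, TOUCH_KEY_SEARCH = 217 };

static const int g_TpVirtualKey[] = {
    TOUCH_KEY_MENU, TOUCH_KEY_HOME, TOUCH_KEY_BACK, TOUCH_KEY_SEARCH
};

/* Map the raw code carried in pPacket[5] (4/8/1/2) to the key code;
 * returns 0 for an unknown raw code. */
static int virtual_key(int raw)
{
    switch (raw) {
    case 4:  return g_TpVirtualKey[0]; /* MENU */
    case 8:  return g_TpVirtualKey[1]; /* HOME */
    case 1:  return g_TpVirtualKey[2]; /* BACK */
    case 2:  return g_TpVirtualKey[3]; /* SEARCH */
    default: return 0;
    }
}
```

A switch (or a small lookup table) keeps the raw-code-to-key mapping in one place, which is easier to audit than a chain of if/else branches.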
4. debugging method
4.1 debug serial port
4.2 Adb
(1) getevent, touch screen with buttons
Figure 6
(2) busybox hexdump/dev/input/event2
Figure 7
This corresponds to the following definitions:

struct timeval {
    __kernel_time_t      tv_sec;  /* seconds */
    __kernel_suseconds_t tv_usec; /* microseconds */
};

/* The event structure itself */
struct input_event {
    struct timeval time;
    __u16 type;
    __u16 code;
    __s32 value;
};
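To connect the hexdump bytes with this struct: on a 32-bit build the timeval takes 8 bytes, so each event record is 16 bytes. A sketch that decodes one little-endian record from a raw buffer (the 32-bit layout and fixed-width types are my assumption; on a 64-bit kernel the timestamp takes 16 bytes):

```c
#include <stdint.h>
#include <string.h>

/* 32-bit layout: 4-byte tv_sec, 4-byte tv_usec, then type/code/value;
 * total 16 bytes with no padding. */
struct raw_event {
    uint32_t tv_sec;
    uint32_t tv_usec;
    uint16_t type;
    uint16_t code;
    int32_t  value;
};

/* Decode one 16-byte record from a hexdump buffer. The memcpy assumes
 * a little-endian host, matching the device's byte order. */
static struct raw_event decode(const uint8_t *buf)
{
    struct raw_event e;
    memcpy(&e, buf, sizeof e);
    return e;
}
```

With this, each 16-byte line of the hexdump output splits into timestamp, type, code, and value fields.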