msg2133 touch screen (TP source code learning)
Note: "device" below refers to the touch screen.
ABS: absolute (as in absolute-coordinate events).
1. Introduction to the input subsystem
Linux supports a wide variety of input devices: keys, keyboards, touch screens, mice, joysticks, and so on. They are all character devices, and the Linux kernel abstracts out what these devices have in common, simplifying driver development, into the input subsystem. The subsystem is divided into three layers, as shown in Figure 1.
The driver layer talks directly to the hardware: it captures the device's data (whether the touch screen is pressed and at what position, mouse movement, key presses, etc.) and reports it to the core layer. The core layer connects the driver layer with the event-handling layer; registering device drivers and handlers is done through the core layer. It receives data from the driver layer, selects the appropriate handler to process it, and the handler finally delivers the data to user space.
After registration, every input device is added to input_dev_list (the input device list), and every event handler is added to input_handler_list (the input handler list); the list_head member in each structure serves as the node linking it into input_dev_list or input_handler_list. The correspondence between input_dev_list and input_handler_list is bridged by the input_handle structure.
input_handle associates an input_dev with an input_handler. Why use input_handle rather than a direct link between input_dev and input_handler? Because one device can correspond to several handlers, and one handler can serve several devices. A touch screen device, for example, can be served by the evdev handler and by the tsdev handler at the same time.
The relationship among input_dev, input_handler, and input_handle:
2. Introduction to touch Screen drive
Process
In Linux, an input device is described by the input_dev structure, defined in input.h. A device driver can be implemented by simply following the steps below.
1). In the driver module's load function, declare which input-subsystem events the device supports;
2). Register the input device with the input subsystem;
3). When an input event occurs (for example: a key is pressed/released, the touch screen is touched/released/moved, the mouse is moved/clicked/released, etc.), report the event and the corresponding key state or coordinates.
Reference:
A brief analysis of the ARM Linux kernel input subsystem
Http://www.51hei.com/bbs/dpj-27652-1.html
2.1 Input Device Registration
Registering an input device actually takes only a few lines of code, because input.c already encapsulates a large amount of it; the driver mainly needs to call a few key functions to complete the registration.
A global variable struct input_dev tsdev is predefined in xxx_ts.c and then registered in the initialization function.
A complete input device system requires not only the device but also a handler, input_handler.
2.2 Input Handler Registration
input_handler deals with the user layer; an input_handler structure is defined directly in evdev.c with some of its member fields initialized.
2.3 Data transfer process
The data obtained from the hardware device (the touch screen) is processed by the handler that input.c selects, and finally reported to user space. How the data is obtained from the hardware depends on the code in xxx_ts.c, which is directly tied to the hardware. In xxx_ts.c we must, on the one hand, complete the register configuration for the touch screen device and, on the other hand, report the data obtained.
As for device initialization, the registers are configured according to the datasheet of the ARM chip being used, so no more needs to be said. Whether polling or interrupt-driven (I have not seen a polled implementation), when the touch screen is pressed we obtain the relevant data (mainly the X and Y coordinates of the press), and that data must then be reported. When the touch screen is pressed, report:
input_report_key(tsdev, BTN_TOUCH, 1);     /* report the key-press event */
input_report_abs(tsdev, ABS_X, x);         /* report the X coordinate of the press */
input_report_abs(tsdev, ABS_Y, y);         /* report the Y coordinate of the press */
input_report_abs(tsdev, ABS_PRESSURE, 1);  /* report the pressure value (0 or 1) */
input_sync(tsdev);                         /* sync event: marks the end of this report */
When the stylus is lifted from the touch screen, the driver reports:
input_report_key(tsdev, BTN_TOUCH, 0);     /* report the key-release event */
input_report_abs(tsdev, ABS_PRESSURE, 0);  /* report the pressure value (0 or 1) */
input_sync(tsdev);                         /* sync event: marks the end of this report */
2.4 Data Read Process
Reading is very simple: anyone who has done Linux programming knows that the application just defines an input_event structure, opens the device with open(), and then read()s from it.
3. MSG2133A Drive Code Learning
3.1 touch_driver_probe()
(1) mstar_drv_platform_porting_layer.c: DrvPlatformLyrInputDeviceInitialize()
//Allocate a new input device: input_allocate_device() returns a pre-initialized struct input_dev, and g_InputDevice points to it.
/* allocate an input device */
g_InputDevice = input_allocate_device();
if (g_InputDevice == NULL)
{
DBG("*** input device allocation failed ***\n");
return -ENOMEM;
}
//phys: the physical path of the touch screen device in the system hierarchy; here the device hangs on the I2C bus. id: the device's ID, corresponding to struct input_id; the bus used is I2C.
g_InputDevice->name= pClient->name;
g_InputDevice->phys = "I2C";
g_InputDevice->dev.parent= &pClient->dev;
g_InputDevice->id.bustype= BUS_I2C;
//Set the event types the device supports. evbit marks the supported event types:
//  EV_ABS: absolute-coordinate events (joysticks, touch screens)
//  EV_SYN: synchronization events
//  EV_KEY: key events
//keybit marks the supported keys; BTN_TOUCH is the touch "key". propbit marks device
//properties and quirks; INPUT_PROP_DIRECT marks a direct input device.
/* set the supported event type for input device */
set_bit(EV_ABS, g_InputDevice->evbit);
set_bit(EV_SYN, g_InputDevice->evbit);
set_bit(EV_KEY, g_InputDevice->evbit);
set_bit(BTN_TOUCH, g_InputDevice->keybit);
set_bit(INPUT_PROP_DIRECT, g_InputDevice->propbit);
//CONFIG_TP_HAVE_KEY is defined when the touch panel supports virtual keys (e.g. menu, home, back, search). input_set_capability() marks the device as able to generate key events (EV_KEY) with the event code given by g_TpVirtualKey[i]; menu, home, back, and search are supported here.
#ifdef CONFIG_TP_HAVE_KEY
/* Method 1. */
{
u32 i;
for (i = 0; i < MAX_KEY_NUM; i ++)
{
input_set_capability(g_InputDevice, EV_KEY, g_TpVirtualKey[i]);
}
}
#endif
//ABS_MT_TOUCH_MAJOR: major axis of the main contact area.
//ABS_MT_WIDTH_MAJOR: major axis of the approaching tool (e.g. the finger itself).
//The ratio ABS_MT_TOUCH_MAJOR/ABS_MT_WIDTH_MAJOR is at most 1 and correlates with
//pressure: the greater the pressure, the closer it gets to 1.
//ABS_MT_POSITION_X: X coordinate of the center of the contact.
//ABS_MT_POSITION_Y: Y coordinate of the center of the contact.
input_set_abs_params(g_InputDevice,ABS_MT_TOUCH_MAJOR, 0, 255, 0, 0);
input_set_abs_params(g_InputDevice,ABS_MT_WIDTH_MAJOR, 0, 15, 0, 0);
//The X axis ranges from TOUCH_SCREEN_X_MIN to TOUCH_SCREEN_X_MAX, with fuzz (noise tolerance) 0 and flat (center dead zone) 0.
input_set_abs_params(g_InputDevice,ABS_MT_POSITION_X, TOUCH_SCREEN_X_MIN, TOUCH_SCREEN_X_MAX, 0, 0);
input_set_abs_params(g_InputDevice,ABS_MT_POSITION_Y, TOUCH_SCREEN_Y_MIN, TOUCH_SCREEN_Y_MAX, 0, 0);
//Register input devices with input subsystem
/* register the input device to input sub-system */
nRetVal = input_register_device(g_InputDevice);
if (nRetVal < 0)
{
DBG("*** Unable to register touch input device ***\n");
}
(2) mstar_drv_platform_porting_layer.c: DrvPlatformLyrTouchDeviceRequestGPIO()
This function requests the GPIOs for the reset and interrupt pins.
#define MS_TS_MSG_IC_GPIO_RST 12
#define MS_TS_MSG_IC_GPIO_INT 13
#define MSM_TS_MSG_IC_GPIO_RST (MS_TS_MSG_IC_GPIO_RST+911)
#define MSM_TS_MSG_IC_GPIO_INT (MS_TS_MSG_IC_GPIO_INT+911)
//Request MSM_TS_MSG_IC_GPIO_RST, labeled "C_TP_RST"
nRetVal =gpio_request(MSM_TS_MSG_IC_GPIO_RST, "C_TP_RST");
if (nRetVal < 0)
{
DBG("*** Failed to request GPIO %d, error %d ***\n",MSM_TS_MSG_IC_GPIO_RST, nRetVal);
}
nRetVal = gpio_request(MSM_TS_MSG_IC_GPIO_INT,"C_TP_INT");
if (nRetVal < 0)
{
DBG("*** Failed to request GPIO %d, error %d ***\n",MSM_TS_MSG_IC_GPIO_INT, nRetVal);
}
(3) mstar_drv_platform_porting_layer.c: DrvPlatformLyrTouchDevicePowerOn()
Resets the touch screen IC. gpio_direction_output() writes a value to the GPIO and also sets the port to output mode.
gpio_direction_output(MSM_TS_MSG_IC_GPIO_RST,1);
udelay(100);
gpio_set_value(MSM_TS_MSG_IC_GPIO_RST, 0);
udelay(100);
gpio_set_value(MSM_TS_MSG_IC_GPIO_RST, 1);
mdelay(25);
(4) mstar_drv_main.c: DrvMainTouchDeviceInitialize()
Mainly creates the procfs file-system directory entries and the /kernel/kset_example/kobject_example directory.
(5) mstar_drv_ic_fw_porting_layer.c: DrvIcFwLyrGetChipType()
Obtains the touch-screen IC type by calling mstar_drv_self_fw_control.c: DrvFwCtrlGetChipType(); the MSG2133A corresponds to 2.
(6) mstar_drv_self_fw_control.c: DrvFwCtrlGetChipType()
Gets the touch-screen IC chip type, e.g. #define CHIP_TYPE_MSG21XXA (0x02).
(7) mstar_drv_platform_porting_layer.c: DrvPlatformLyrTouchDeviceResetHw()
Resets the hardware by toggling the reset pin, implemented the same way as DrvPlatformLyrTouchDevicePowerOn().
(8) mstar_drv_platform_porting_layer.c: DrvPlatformLyrTouchDeviceRegisterFingerTouchInterruptHandler()
//Initialize the finger-touch work queue and bind it to the work function _DrvPlatformLyrFingerTouchDoWork().
/* initialize the finger touch work queue*/
INIT_WORK(&_gFingerTouchWork,_DrvPlatformLyrFingerTouchDoWork);
//gpio_to_irq() maps the GPIO to its IRQ number, stored in _gIrq.
_gIrq =gpio_to_irq(MSM_TS_MSG_IC_GPIO_INT);
//Request an IRQ and register the corresponding ISR. When the IRQ fires, the ISR _DrvPlatformLyrFingerTouchInterruptHandler() is called.
/*request an irq and register the isr */
nRetVal = request_threaded_irq(_gIrq /* MS_TS_MSG_IC_GPIO_INT */, NULL,
                               _DrvPlatformLyrFingerTouchInterruptHandler,
                               IRQF_TRIGGER_RISING | IRQF_ONESHOT /* | IRQF_NO_SUSPEND */ /* IRQF_TRIGGER_FALLING */,
                               "msg2xxx", NULL);
_gInterruptFlag = 1;
(9) mstar_drv_platform_porting_layer.c: DrvPlatformLyrTouchDeviceRegisterEarlySuspend()
Registers the framebuffer notifier; its callback is invoked on framebuffer blank/unblank events, i.e. around suspend/resume.
_gFbNotifier.notifier_call = MsDrvInterfaceTouchDeviceFbNotifierCallback;
fb_register_client(&_gFbNotifier);
3.2 Finger touch touch screen handling process
As shown above, when the TP is touched the corresponding interrupt fires and _DrvPlatformLyrFingerTouchInterruptHandler() is called.
(1) _DrvPlatformLyrFingerTouchInterruptHandler()
spin_lock_irqsave(&_gIrqLock,nIrqFlag);
if(_gInterruptFlag == 1)
{
//disable_irq_nosync(): disable the IRQ without waiting for any running handler to finish
disable_irq_nosync(_gIrq);
_gInterruptFlag = 0;
schedule_work(&_gFingerTouchWork);
}
spin_unlock_irqrestore(&_gIrqLock,nIrqFlag);
After initialization _gInterruptFlag is 1; schedule_work(&_gFingerTouchWork) causes _DrvPlatformLyrFingerTouchDoWork() to be called here.
(2) _DrvPlatformLyrFingerTouchDoWork()
Touch actions are handled mainly by calling DrvIcFwLyrHandleFingerTouch().
DrvIcFwLyrHandleFingerTouch(NULL, 0);
spin_lock_irqsave(&_gIrqLock,nIrqFlag);
if (_gInterruptFlag == 0)
{
//Re-enable the IRQ so the next touch action can be handled.
enable_irq(_gIrq);
_gInterruptFlag = 1;
}
spin_unlock_irqrestore(&_gIrqLock,nIrqFlag);
nReportPacketLength = DEMO_MODE_PACKET_LENGTH;
pPacket = g_DemoModePacket;
rc = IicReadData(SLAVE_I2C_ID_DWI2C, &pPacket[0], nReportPacketLength);
if (rc < 0)
{
    DBG("I2C read packet data failed, rc = %d\n", rc);
    goto TouchHandleEnd;
}
(3) DrvIcFwLyrHandleFingerTouch()
Calls DrvFwCtrlHandleFingerTouch() to do the handling.
(4) DrvFwCtrlHandleFingerTouch()
Parses the data by calling _DrvFwCtrlParsePacket(), then reports it.
(5) _DrvFwCtrlParsePacket()
Interprets the 8-byte packet that the MSG2133A sends back over I2C:
/*
 * pPacket[0]: ID; pPacket[1]~pPacket[3]: absolute coordinates of the first point;
 * pPacket[4]~pPacket[6]: distance of the second point relative to the first.
 * When pPacket[1]~pPacket[4] and pPacket[6] are 0xFF: a key event; pPacket[5]
 * identifies which key was pressed.
 * When pPacket[1]~pPacket[6] are all 0xFF: touch release.
 */
For the MSG2133A, pPacket[0] = 0x52, and pPacket[1]~pPacket[3] hold the ADC sample values that must be converted into X and Y coordinates; the conversion formula is roughly x = (x's ADC sample value * 480) / 2048, and similarly for y.
(6) DrvPlatformLyrFingerTouchPressed()
Reports the events:
input_report_key(g_InputDevice, BTN_TOUCH,1);
input_report_abs(g_InputDevice, ABS_MT_TOUCH_MAJOR,1);
input_report_abs(g_InputDevice,ABS_MT_WIDTH_MAJOR, 1);
input_report_abs(g_InputDevice,ABS_MT_POSITION_X, nX);
input_report_abs(g_InputDevice,ABS_MT_POSITION_Y, nY);
input_mt_sync(g_InputDevice);
(7)
3.3 Virtual Key Implementation
When pPacket[1]~pPacket[4] and pPacket[6] are 0xFF, it is a key event, and pPacket[5] indicates which key was pressed. Our touch screen's silk-screened keys are menu, home, back, and search; pressing them yields pPacket[5] values of 4, 8, 1, and 2 respectively, which are then handled in the DrvFwCtrlHandleFingerTouch() function:
const int g_TpVirtualKey[] = {TOUCH_KEY_MENU,TOUCH_KEY_HOME, TOUCH_KEY_BACK, TOUCH_KEY_SEARCH};
if (tInfo.nTouchKeyCode != 0)
{
#ifdef CONFIG_TP_HAVE_KEY
if (tInfo.nTouchKeyCode == 4)// TOUCH_KEY_MENU
{
nTouchKeyCode=g_TpVirtualKey[0];
}
else if (tInfo.nTouchKeyCode ==1) // TOUCH_KEY_BACK
{
nTouchKeyCode =g_TpVirtualKey[2];
}
else if (tInfo.nTouchKeyCode ==2) //TOUCH_KEY_SEARCH
{
nTouchKeyCode =g_TpVirtualKey[3];
}
else if (tInfo.nTouchKeyCode ==8) //TOUCH_KEY_HOME
{
nTouchKeyCode =g_TpVirtualKey[1];
}
if (nLastKeyCode !=nTouchKeyCode)
{
DBG("key touchpressed\n");
DBG("nTouchKeyCode =%d, nLastKeyCode = %d\n", nTouchKeyCode, nLastKeyCode);
nLastKeyCode =nTouchKeyCode;
input_report_key(g_InputDevice,BTN_TOUCH, 1);
input_report_key(g_InputDevice, nTouchKeyCode, 1);
input_sync(g_InputDevice);
}
#endif //CONFIG_TP_HAVE_KEY
}
4. Debugging Methods
4.1 Debug Serial Port
4.2 Adb
(1) getevent: touch the screen and observe the reported events
(2) busybox hexdump /dev/input/event2
Each record corresponds to the following structures:
struct timeval {
    __kernel_time_t      tv_sec;  /* seconds */
    __kernel_suseconds_t tv_usec; /* microseconds */
};

/*
 * The event structure itself
 */
struct input_event {
    struct timeval time;
    __u16 type;
    __u16 code;
    __s32 value;
};