" written before the series "
This series of Sony articles on smartphone touch screens, four in total, gives a systematic and detailed description of the Android smartphone touch system.
Note 1: Despite this translation, the original text remains more accurate; if possible, please read the original: http://developer.sonymobile.com/tag/touch/
Note 2: If there are any errors, please correct me. Thank you.
Part 1: Understanding the smartphone touch screen ecosystem
Starting today, we are running a series of articles exploring the internal mechanisms and evolution of the smartphone touch system. We don't usually think much about what actually happens between the moment a finger touches the screen and the moment the display responds; touch has simply become a basic design element integrated into the smartphone. To open the series, Sony engineers Magnus Johansson and Alexander Hunt explain how a typical smartphone touch system works.
When you look at a smartphone, no other interface is easier or more intuitive than the touch system. But how does this technology actually work? To uncover the secrets of the touch system, we turned to Sony Mobile's experts in this field: Magnus Johansson, Master Systems Engineer for software, and Alexander Hunt, Senior Systems Engineer for hardware. In this first article of the series covering the touch system, they introduce a generic touch system and explain how each component works.
Master Systems Engineer Magnus Johansson and Senior Systems Engineer Alexander Hunt.
Touch Screen Eco-system
In this article, we look at the touch screen ecosystem from a general point of view. This is only a high-level overview, but it is necessary background for understanding touch performance. In this overview, we describe the hardware components involved and how they work, as well as the software components used and how they are organized.
Software & hardware components of the touch screen system
The ecosystem starts with the hardware touch panel (translator's note: infrared, capacitive, or resistive) -- the screen you actually touch when you interact with your phone. As shown in the illustration below, the touch panel firmware sends data to the host touch driver. Next, the device driver assembles the data and posts touch events to the operating system's event queue. From there, the OS event management framework handles these events and dispatches them to the receiving application. The application responds to these events by updating its UI, for example by scrolling one step in a list. This in turn triggers the graphics framework and, via the display driver, pushes a new frame to the display.
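To make the data flow above concrete, here is a deliberately simplified toy model of the pipeline in Python. The function names and the event format are inventions for illustration only; the real components are panel firmware, a kernel driver, and Android framework code.

```python
# Toy model of the touch pipeline: panel firmware -> host driver ->
# OS event queue -> event framework -> application -> new frame.
from collections import deque

event_queue = deque()  # stands in for the OS event queue

def panel_firmware_report(x, y):
    """Touch panel firmware hands raw coordinates to the host touch driver."""
    host_touch_driver({"raw_x": x, "raw_y": y})

def host_touch_driver(raw):
    """Driver assembles raw data into a touch event and posts it to the queue."""
    event_queue.append(("TOUCH", raw["raw_x"], raw["raw_y"]))

def event_framework_dispatch(app):
    """Event framework delivers queued events to the foreground app."""
    while event_queue:
        app(event_queue.popleft())

frames = []
def app(event):
    """App updates its UI, which produces a new frame for the display."""
    _, x, y = event
    frames.append(f"frame scrolled to ({x}, {y})")

panel_firmware_report(120, 640)
event_framework_dispatch(app)
print(frames)  # one new frame pushed toward the display
```

Each later section of the article describes one of these stages in more detail.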
In the image below, we will describe these components in more detail.
Illustration of the touch ecosystem.
Touch panel
Basically all of today's smartphone touch technologies use what is called projected capacitive touch. With this technique, the electric field lines are "projected" above the touch surface. Put simply, when a finger touches the screen it "steals" charge from the receiving electrode (RX) in the field generated by the transmitting electrode (TX), and this distortion is detected by the touch IC. Charge is measured in a unit called the coulomb (symbol: C).
Illustration of projected capacitive touch, in which the finger steals charge from the receiving electrode (RX) in the electric field generated by the transmitting electrode (TX).
At each node, the capacitance is measured by the touch IC at a rate of several kHz. If the touch panel is scanned at 60 Hz, the touch IC needs to scan all the nodes within a maximum of 16.7 ms; if the refresh rate is faster, the measurements need to be faster still. Smartphone touch panels typically scan at rates between 60 Hz and 120 Hz.
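The relationship between scan rate and the per-scan time budget is simple arithmetic, sketched here with the figures from the text:

```python
# Relationship between panel scan rate and the time budget per full scan
# (60 Hz -> ~16.7 ms, as stated in the text).
def scan_period_ms(scan_rate_hz):
    """Maximum time the touch IC has to scan every node once."""
    return 1000.0 / scan_rate_hz

budget_60 = scan_period_ms(60)    # ~16.67 ms per full panel scan
budget_120 = scan_period_ms(120)  # ~8.33 ms: doubling the rate halves the budget
print(round(budget_60, 2), round(budget_120, 2))
```

A 120 Hz panel therefore has to complete every node measurement, plus filtering, in half the time of a 60 Hz panel.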
How quickly the capacitance can be measured is determined by the basic design of the touch panel. Factors that affect the speed of the touch panel include:
- Number of nodes on the touch panel. Touch ICs generally have a fixed number of channels and can therefore handle only a limited number of nodes.
The size of the touch panel determines how many channels the touch IC requires. This is also connected to the grid pitch of the touch sensor: how close each electrode should be to the next. For finger operation, a pitch of 4-6 mm between electrodes is recommended. If a stylus needs to be supported, a compromise must be found between grid pitch, pen tip size, and available signal.
- Design of the touch panel electrodes. A touch sensor is made of silver line traces in the parts the user cannot see and a transparent electrode in the visible area. The transparent electrode is usually made from a material called indium tin oxide (ITO). The resistance of ITO depends on its thickness and purity: a thicker or purer ITO layer has lower resistance, but the thicker the ITO layer, the lower the transparency and the easier it is to see the sensor pattern. As resistance increases, the RC constant of the panel increases and the scan rate suffers.
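A back-of-the-envelope sketch of why the RC constant matters: each electrode behaves roughly like an RC circuit that must settle before its capacitance can be read. The component values below are made-up illustrative numbers, not data from any real panel.

```python
# Why ITO resistance limits scan rate: a node needs roughly 5 RC time
# constants to settle to ~99% of its final value before it can be measured.
# Resistance/capacitance values here are illustrative assumptions.
def settle_time_s(resistance_ohm, capacitance_f, time_constants=5):
    """Approximate settling time for one electrode node."""
    return time_constants * resistance_ohm * capacitance_f

thin_ito = settle_time_s(50e3, 50e-12)   # higher-resistance (thinner) ITO
thick_ito = settle_time_s(10e3, 50e-12)  # lower-resistance (thicker) ITO
# Higher resistance -> larger RC -> longer settling -> lower achievable scan rate
print(thin_ito > thick_ito)
```

This is the trade-off the text describes: thicker ITO scans faster but is less transparent.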
Once the touch panel is designed and a scan rate is set, all the completed measurements need to be processed and sent to the host. The touch IC takes all the measured values from the touch panel and converts them into X position, Y position, and other parameters. It is also responsible for noise reduction: many components around the touch panel inject noise into the system, such as the display, the charger, and radio transmitters. The more filters that need to be active to reduce noise, the longer it takes to get the parameters to the host.
The number of fingers on the touch panel also affects how much calculation the touch IC must perform, which can likewise affect the scan rate.
Host touch driver
The host touch driver receives data from the firmware of the touch hardware. The driver assembles this data into touch events and exposes an input event queue to the operating system. Since Android, which powers the Xperia series, is a Linux-based operating system, these touch data are available to any user-space program under the path /dev/input/eventX (where X is a number identifying a specific device). These device outputs are exposed as files in the file system, and they can be accessed and read through the ordinary file APIs provided by the system.
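As a sketch of what "reading events through the ordinary file API" looks like, the snippet below decodes Linux `struct input_event` records the way a user-space reader of /dev/input/eventX would. The struct layout assumes a 64-bit Linux system, and the sample record is synthetic; reading the real device node requires appropriate permissions.

```python
# Decoding Linux input events as delivered via /dev/input/eventX.
# Layout of struct input_event on 64-bit Linux: timeval (two 64-bit ints),
# then 16-bit type, 16-bit code, 32-bit value.
import struct

EVENT_FORMAT = "qqHHi"   # tv_sec, tv_usec, type, code, value
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)

EV_ABS, ABS_MT_POSITION_X = 0x03, 0x35  # from <linux/input-event-codes.h>

def decode_event(raw_bytes):
    """Unpack one kernel input_event record into a readable dict."""
    tv_sec, tv_usec, ev_type, code, value = struct.unpack(EVENT_FORMAT, raw_bytes)
    return {"time": tv_sec + tv_usec / 1e6, "type": ev_type,
            "code": code, "value": value}

# A real program would read EVENT_SIZE bytes at a time from the device file.
# Here we decode a synthetic record instead:
sample = struct.pack(EVENT_FORMAT, 1700000000, 250000, EV_ABS,
                     ABS_MT_POSITION_X, 540)
event = decode_event(sample)
print(event["code"] == ABS_MT_POSITION_X, event["value"])
```

Tools like `getevent` on Android perform essentially this decoding for debugging.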
Event management and the Choreographer
For our purposes, the event management framework of the high-level operating system is part of Android. The framework reads touch events from the event queue and can do many things with them. For example, it can try to detect gestures. If the raw data is irregular, it may try to improve the events in terms of position and timing. It can try to synchronize the events with the display's vertical synchronization signal (VSYNC) in order to give the application more time and avoid dropped frames. Its main task, however, is to deliver the events to the right target, which is usually a window or view of the current foreground, focused application.
The Choreographer was introduced in Android 4.1 (Jelly Bean) and is part of the Android event management framework. Its purpose is to improve the smoothness of Android, since smoothness had been identified as a serious problem compared with other operating systems. The Choreographer is the instance that synchronizes incoming events from the touch input device with the VSYNC signal, which acts as the heartbeat of the device's multimedia subsystem.
VSYNC is short for vertical synchronization; it is a signal from the display subsystem that tells the graphics system to prepare a new frame for rendering. Android reuses this signal to dispatch touch events.
When the Choreographer is notified of a VSYNC signal, it invokes the event dispatcher to deliver the events to their targets. The rationale for this mechanism is to give the application as much time as possible to process the events and draw before the next frame.
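A minimal sketch of this VSYNC-aligned dispatch, under the assumption of a 60 Hz display and 120 Hz touch sampling (both are illustrative numbers): events that arrive between two VSYNC ticks are batched and delivered together at the next tick.

```python
# Simplified model of VSYNC-aligned event dispatch: touch events arriving
# between VSYNC ticks are batched and delivered at the next tick, giving
# the app a full frame interval to process and draw.
VSYNC_PERIOD_MS = 1000 / 60  # ~16.67 ms at 60 Hz

def batch_events_by_vsync(event_times_ms):
    """Group event timestamps by the VSYNC tick at which they are dispatched."""
    batches = {}
    for t in event_times_ms:
        tick = int(t // VSYNC_PERIOD_MS) + 1  # delivered at the *next* tick
        batches.setdefault(tick, []).append(t)
    return batches

# Touch samples at ~120 Hz land two per 60 Hz display frame:
samples = [2.0, 10.0, 19.0, 27.0]
print(batch_events_by_vsync(samples))  # {1: [2.0, 10.0], 2: [19.0, 27.0]}
```

In real Android code the equivalent hook is a Choreographer frame callback; this model only shows the batching idea.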
Application
At this point, the touch event is received by the currently active application. The application acts on the touch event, typically doing some processing and drawing something to update the user interface (UI). The most basic way for an application to handle touch events is to act on each event it receives. This approach follows a few common steps:
(1) Receive the event in an event handler, such as onTouchEvent(...) in an Android View or Activity.
(2) Update the internal state and execute application-specific logic.
(3) Invalidate the drawing surface.
(4) Draw the user interface model on the drawing surface.
The application then needs to do all of this in less time than the interval between two events (16.67 ms). If it does not, it will drop frames and appear to stutter.
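The frame budget implied by the steps above can be sketched numerically. The work durations below are illustrative, not measurements:

```python
# Frame-budget check: processing + drawing must finish within one
# 16.67 ms frame interval, or the app misses VSYNC deadlines.
import math

FRAME_BUDGET_MS = 1000 / 60

def dropped_frames(work_ms):
    """Frames missed when handling one event takes `work_ms` of work."""
    return max(0, math.ceil(work_ms / FRAME_BUDGET_MS) - 1)

print(dropped_frames(12.0))  # fits in the budget: 0 dropped
print(dropped_frames(40.0))  # ~2.4 frame intervals: 2 frames dropped
```

This is why heavy work in an event handler is the classic cause of visible stutter.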
Graphics compositing
When an application renders after invalidating its surface, graphics composition in the graphics framework is triggered. An invalidated surface is one that needs to be redrawn. Compositing means that all visible content surfaces are combined into a single image, which is then sent to display memory.
Depending on system settings, the number of layers, and other factors, composition can be performed in software or in hardware. In our case it is done either by SurfaceFlinger, the Android software compositor, or by a hardware composer on the platform. Naturally, the actual components doing this work will change over time as the operating system or hardware platform changes, but the principle will likely remain for a while.
The final step of composition is done at the kernel level by the hardware image compositor (on Qualcomm platforms, the Mobile Display Processor, MDP). This involves compositing additional layers onto the frames provided by SurfaceFlinger, and it also handles rotation and scaling.
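The idea of "combining all visible surfaces into one image" can be illustrated with a one-pixel toy compositor. Real compositors (SurfaceFlinger, hardware composers) do this per pixel across whole buffers; the layers and alpha values below are invented for illustration.

```python
# Toy compositor: back-to-front "over" alpha blending of layer pixels
# into a single output pixel, the essence of surface composition.
def blend_over(dst, src, alpha):
    """Blend one source pixel over a destination pixel (0-255 channels)."""
    return tuple(round(alpha * s + (1 - alpha) * d) for s, d in zip(src, dst))

def composite(layers):
    """layers: back-to-front list of (pixel_rgb, alpha). Returns final pixel."""
    out = (0, 0, 0)  # start from a black background
    for pixel, alpha in layers:
        out = blend_over(out, pixel, alpha)
    return out

wallpaper = ((0, 0, 255), 1.0)       # opaque blue background layer
status_bar = ((255, 255, 255), 0.5)  # translucent white layer on top
print(composite([wallpaper, status_bar]))  # -> (128, 128, 255)
```

Offloading exactly this blending (plus rotation and scaling) to dedicated hardware is what the MDP step above is for.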
Host Display Driver
When the final composited image is complete, the host display driver ensures that the correct composited image data is sent to the display. The data is transmitted to the display over the MIPI DSI interface.
Display
The display is the final component in the touch ecosystem. Many different types of display are used in smartphones and tablets. The most common are liquid crystal displays (LCD) and organic light-emitting diode displays (OLED) [Note 1].
The display typically runs at a 60 Hz frame rate. The refresh starts in one corner of the display, and the information is then written pixel by pixel, line by line, until the whole display has been refreshed.
An illustration of the scanning direction of a display
There are two different types of display driver IC: with internal RAM and without internal RAM. The difference is that with internal RAM, the host only needs to send data when a frame has new content and needs updating. Another benefit is that only the parts that have changed need to be sent, which drastically reduces the amount of data transmitted over the display interface. The downside is that an extra buffer exists in the system, which adds extra latency. In the RAM-less case, the host must update the display every frame for as long as the display is active. This is the fastest, cheapest, and simplest implementation, but it consumes more power. In Xperia devices, we use both kinds of display driver, depending on display size, resolution, and cost.
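A rough comparison of the interface traffic for the two driver-IC types, under illustrative assumptions (1080x1920 panel, 3 bytes per pixel, 60 Hz, and a small clock region updating once per second on the RAM-equipped panel):

```python
# Display-interface traffic: RAM-less panels receive every full frame,
# while panels with internal RAM only receive changed regions.
# All figures below are illustrative assumptions.
WIDTH, HEIGHT, BYTES_PER_PIXEL, FPS = 1080, 1920, 3, 60

full_frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL

# RAM-less: every frame is transmitted while the display is active.
ramless_bytes_per_s = full_frame_bytes * FPS

# With internal RAM: suppose only a 1080x80 clock area changes once per second.
partial_update_bytes_per_s = WIDTH * 80 * BYTES_PER_PIXEL * 1

print(ramless_bytes_per_s // partial_update_bytes_per_s)  # -> 1440x less traffic
```

This traffic difference is the power saving the text attributes to driver ICs with internal RAM for mostly-static content.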
In addition, there are several different types of LCD technology; the most common are in-plane switching (IPS), vertical alignment (VA), and twisted nematic (TN). How the liquid crystals (LC) are arranged differs completely between them, depending on which characteristics you want to enhance in your product. One example of the trade-offs OEMs must consider is the LC switching time, that is, the time it takes to move the liquid crystal from one position to another. TN is the fastest technology, followed by IPS and then VA. Looking at the latest smartphones, they mainly use IPS LCDs or OLEDs, because the optical properties of IPS and OLED represent the best compromise today.
OLED displays, being a self-emissive display technology, do not have the same switching-time problem, since switching a pixel is much like switching a light on and off.
***
We hope this article has given you a general idea of the touch screen system and the components behind it. In our next article, we will look in more detail at where in this same system the time is actually spent, and how much it takes. So stay tuned to Developer World Mobile!
More information:
- Find out about our flagship Xperia devices, the Xperia Z2 and Xperia Z2 Tablet, Sony's latest devices to feature an IPS panel.
- Learn how to develop remote control apps for Sony cameras.
- Discover how to lower the power consumption on your Xperia device with the new sensor co-processor.
- Check out the Evolutionui project within gamification.
Note 1: OLED stands for organic light-emitting diode. OLED panels can be made flexible, and with the development of high-performance and large-area products, their potential has begun to be widely exploited.