" written before the series "
This series of Sony on the Smartphone touch screen article, a total of four, the Android smartphone touch system has done a system and detailed description.
Note 1: Despite this translation, the original text remains more accurate; if possible, please read the original: http://developer.sonymobile.com/tag/touch/
Note 2: If there are any errors, please correct me. Thank you.
(ii) Understanding touch response
Understanding Touch Response
This is the second article in our touch screen technology series. In the previous article, we explained the components of the touch screen system and how these components translate a touch input into graphical user feedback. In this article, we will continue on the topic of touch response and explain the sources of the touch lag you encounter when touching. Read on for the details.
What is touch response?
Touch response refers to the time it takes for a user to get feedback from a device as a result of input. In our specific case, this means touch events from the touch hardware and frame updates on the display. Specifically, the touch response is the time from when the user presses the touch panel until an application has updated its UI so that the change appears on the screen.
Because so many parts of the device contribute to the response time, the other name for touch response, "system latency", is a more appropriate description. What the user actually perceives is system latency, not just the touch-related components. Typically, the touch hardware is blamed as the culprit for system latency; in fact, it accounts for only a small part of it.
Why is touch response important?
Touch response matters directly to end users in applications such as games, scrolling lists, and launcher apps. Even if the user doesn't notice (or doesn't care about) the system latency, the user experience is better when the latency is short, which is why improving device responsiveness is so important. Less system latency means a more fluid, faster-feeling device.
We consider three different latencies important to the touch experience: tap latency, initial move latency, and move latency. Tap latency is the time from a "touch up" or "touch down" event, when the user lifts (or presses) a finger on the touch panel, until something appears on the display in response to that event. Initial move latency is the time from the first "move" event until something is displayed in response. Finally, move latency is the same as initial move latency, but measured during an ongoing swipe gesture.
Timing analysis of the touch ecosystem
All of the steps in touch event processing take time and contribute to the total system latency. Some parts take longer than others, and the latency contributions in the touch ecosystem vary with the currently active application and the different timing mechanisms involved. In the following sections, we will look at the various parts of the system that contribute to latency, using a very simple usage scenario. The ecosystem components were described in the previous article.
A very simple usage scenario: a running Android app
A simple usage scenario is a running Android app. Our sample program responds to touch input: when a "touch down" action is detected, a white rectangle is drawn in the application's view. A View in Android exposes an interface for accessing the graphical layout and drawing surface. The following discussion is based on this usage scenario.
System Delay Overview
Analysis of the touch ecosystem with timing details

The illustration above shows what typically makes up the total system latency in Android. The values were measured on the Jelly Bean version of Android. In the next article in the touch series, we will explain how to measure these different parts.
Touch Panel
To detect a touch point (usually a finger), the touch IC scans the channels connected to the sensor. This allows the IC to generate events at roughly 60 Hz to 120 Hz, which are common report rates for mobile devices today.
Sometimes, when signal noise is high, the touch IC may need to rescan the panel if the position of the touch point cannot be determined. This reduces the report rate, although it should not happen on a well-tuned system. This applies to all mobile devices with capacitive touch panels.
A 60 Hz capacitive touch IC generates one event every 16.67 ms (1/60 s). If a touch lands just after its position has been scanned, detection can take two full scan periods, which means a worst-case delay of about 33 ms.
The worst-case delay from the panel can be described as follows:
1. A scan runs from the top of the panel to the bottom.
2. The user's tap happens just after the top of the panel has been scanned. The touch IC needs to scan the remaining part of the panel, which takes 0 to 16.67 ms, including firmware processing. The touch point has not been detected yet.
3. A new scan begins, and the touch point is detected during this scan. Including firmware processing, this takes 16.67 ms.
4. The scan is complete; the firmware processes the data and sends it to the host system.
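The worst-case arithmetic above can be sketched in a few lines of Python; `panel_latency_bounds` is a hypothetical helper for illustration, not part of any real driver:

```python
def panel_latency_bounds(report_rate_hz):
    """Return (best_ms, worst_ms) touch panel detection latency.

    Best case: the touch lands just before its position is scanned,
    so it is detected within one scan period. Worst case: the touch
    lands just after its position was scanned, so a full extra scan
    passes before the next scan picks it up (steps 1-4 above).
    """
    scan_period_ms = 1000.0 / report_rate_hz
    return scan_period_ms, 2.0 * scan_period_ms

best_60, worst_60 = panel_latency_bounds(60)     # ~16.7 ms, ~33.3 ms
best_120, worst_120 = panel_latency_bounds(120)  # ~8.3 ms, ~16.7 ms
```

Doubling the report rate to 120 Hz halves both figures, which is the improvement described for faster touch ICs below.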
The time a firmware scan takes depends on the number of channels on the touch IC and on the active filter algorithms. The touch IC filters address problems such as noise, linearity, and jitter, and which filters are activated depends on the signal levels of the hardware implementation. Enabling multi-touch also increases firmware processing time because of the increased amount of data. If the touch IC's CPU is not powerful enough, the report rate may drop.
If we use a touch IC that can run at 120 Hz, the scan time is 8.33 ms, which means we can roughly halve the average touch panel latency. The worst case would then be 16.67 ms (2 × 8.33 ms).
Touch Panel sleep mode
When the touch panel has not been used for a while, it often drops into a sleep mode to conserve power. Sleep mode adds extra time before the touch panel can send a response to the Android system. This additional time is caused by the touch panel reducing its scan rate to somewhere between 5 Hz and 20 Hz, depending on decisions the manufacturer made at the design stage. This adds an extra 50 ms to 200 ms to the tap latency and initial move latency. This long wake-up delay is sometimes mistakenly blamed on a faulty touch solution, when it is in fact a deliberate decision to reduce current consumption.
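As a rough sketch, the added wake-up latency is simply one sleep-mode scan period (`wake_latency_ms` is a hypothetical helper):

```python
def wake_latency_ms(sleep_scan_rate_hz):
    """Worst-case extra latency from sleep mode: the touch lands just
    after a sleep-mode scan, so detection waits one full sleep-mode
    scan period before the panel wakes up."""
    return 1000.0 / sleep_scan_rate_hz

# 5 Hz sleep scanning adds up to 200 ms; 20 Hz adds up to 50 ms.
```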
Normal and reduced scan mode
Normal and reduced scan are two different modes used to get data from the touch panel. In normal mode, the touch panel sensor is scanned at the rate decided at design time. So if the touch sensor engineer specifies an optimal scan rate of 60 Hz, the touch panel sensor is scanned at 60 Hz as long as a finger or other conductive object is present. In reduced scan mode, by contrast, the scan rate is lowered while a finger is present but not moving. When finger movement is detected, the report rate is raised to 60 Hz and stays there as long as the user keeps moving the finger. When the finger stops, the report rate is lowered again until movement resumes.
While configuring a touch IC to use reduced scan mode conserves power, the drawback is a longer latency if the user's finger pauses on the touch panel and then starts generating events again, for example by moving.
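The trade-off can be sketched as follows; the 10 Hz reduced-mode rate is an assumed, vendor-specific value, and `first_move_latency_ms` is a hypothetical helper:

```python
NORMAL_HZ = 60    # scan rate while the finger is moving
REDUCED_HZ = 10   # assumed reduced-mode rate; vendor-specific

def first_move_latency_ms(finger_was_stationary):
    """Worst-case time to detect the first 'move' event: one scan
    period of whichever mode the panel is currently in."""
    scan_hz = REDUCED_HZ if finger_was_stationary else NORMAL_HZ
    return 1000.0 / scan_hz

# A finger resuming after a pause can wait 100 ms instead of ~16.7 ms.
```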
Host Touch Drive delay
The host touch driver latency comes from acting on the interrupt, reading the data over the bus, assembling and translating the data into operating-system-specific touch events, and publishing them on the operating system's kernel queue. 50 bytes is a reasonable number to use when calculating the delay: 50 bytes (400 bits) read by the host touch driver over a 400 kHz I²C bus costs 1 ms (400 bits / 400,000 bits/s = 1 ms). The data is then assembled and processed. In addition, operating-system-specific behavior such as context switching comes into play.
In summary, the latency for a single touch point in the host driver is approximately 2 to 3 ms. Having to handle multiple points (multi-touch) increases the delay.
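The bus cost in the figure above can be reproduced with a small sketch; `i2c_transfer_ms` is a hypothetical helper that ignores I²C protocol overhead such as start/stop conditions, addressing, and ACK bits:

```python
def i2c_transfer_ms(num_bytes, bus_hz=400_000):
    """Time to clock a raw payload over an I2C bus running at bus_hz,
    ignoring protocol overhead (addressing, ACK bits, start/stop)."""
    return num_bytes * 8 * 1000.0 / bus_hz

# 50 bytes = 400 bits; 400 bits / 400,000 bit/s = 1 ms.
```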
Event and Window management
The event manager of the high-level operating system (or window framework) reads the touch events published by the host driver. In our scenario, this is done by Android. The event manager in Android is not a big contributor to system latency; it spends only a few milliseconds processing and routing events to the right targets. In the Android scenario, these targets are an Activity, a View, and/or a ViewGroup. Many of the parts of Android involved in event delivery are implemented as observer patterns. This means the observers stay idle until they are notified that an event has arrived. Another latency factor is that events travel through several different threads on the way to their final destination. Each time an event is handled in a thread, that thread needs to be scheduled in by the operating system scheduler before it can execute.
In the Android Jelly Bean release, a mechanism called the Choreographer was introduced as part of the event manager. The Choreographer affects latency as described below.
Choreographer and Delay
When the Choreographer receives a vertical synchronization (VSYNC) signal notification, it dispatches the queued events to the event targets. This mechanism gives applications more time to process and draw their content before the next frame. However, when touch events are created out of phase with VSYNC, this can add more latency. With bad timing, and assuming a 60 Hz report rate on the touch panel, this adds up to one additional frame, or 16.67 ms, of latency.
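The batching behavior can be modeled with a sketch; `dispatch_wait_ms` is a hypothetical helper, and a 60 Hz VSYNC is assumed:

```python
FRAME_MS = 1000.0 / 60  # assumed 60 Hz display VSYNC period

def dispatch_wait_ms(event_time_ms):
    """Milliseconds a touch event waits in the Choreographer queue
    until the next VSYNC tick, given its arrival time (ms after a
    VSYNC)."""
    return (-event_time_ms) % FRAME_MS

# An event arriving just after a VSYNC (t = 1 ms) waits ~15.7 ms;
# one arriving just before the next tick (t = 16 ms) waits ~0.7 ms.
```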
SLOP
Slop is a threshold used to decide whether motion counts as actual motion. In practice, it is the number of pixels the finger must move before the first "move" event is sent. This increases the perceived initial move latency, as any moving object will start out further behind. The reason for the slop threshold is to filter out jitter in the input data. For example, if a move event were sent for every single-pixel change, we would never be able to tap or press anything, because every touch would be treated as a move. On a 5-inch 1080p display, one pixel is very small, about 58 μm.
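The 58 μm figure can be derived from the display geometry; `pixel_pitch_um` is a hypothetical helper for illustration:

```python
import math

def pixel_pitch_um(diagonal_inches, width_px, height_px):
    """Physical size of one pixel, in micrometres, for a display with
    the given diagonal and resolution (assuming square pixels)."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_inches * 25.4 * 1000.0 / diagonal_px

# A 5-inch 1080x1920 panel: one pixel is roughly 58 um across.
```

On Android, the slop threshold actually used by the framework can be read via `ViewConfiguration.getScaledTouchSlop()`.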
Application
The time the application adds varies greatly and comes from two big steps:
1. Executing application logic in response to the received touch events;
2. Drawing and updating the user interface (UI).
The time applications spend in these steps varies from 1 ms up to, in catastrophic real-world cases, hundreds of milliseconds. An important number not to exceed is 1/60 Hz, i.e. 16.67 ms, for two reasons:
1. If an application spends more than 16.67 ms per event on processing and drawing, it misses the frame/VSYNC and has to wait for the next VSYNC, adding another 16.67 ms to the latency. The more frames are missed, the more latency results.
2. Breaking the 16.67 ms boundary means the application may not reach 60 frames per second. The resulting dropped frames cause a very bad user experience, as the graphics are not rendered smoothly. The 60 frames-per-second requirement comes from the fact that today nearly all smartphone displays have an internal refresh rate of about 60 Hz, so this is the maximum frame rate that can be shown to the user. To avoid dropping frames, the application's refresh rate must be equal to or higher than this value.
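The effect of missing the frame budget can be sketched as follows; `frames_of_latency` is a hypothetical helper, and a 60 Hz display is assumed:

```python
import math

FRAME_MS = 1000.0 / 60  # ~16.67 ms budget per frame at 60 Hz

def frames_of_latency(work_ms):
    """Number of VSYNC periods a piece of per-event work occupies:
    exceeding the 16.67 ms budget forces the result to wait for a
    later VSYNC, adding whole frames of latency."""
    return math.ceil(work_ms / FRAME_MS)

# 10 ms of work fits in one frame; 20 ms slips to the next VSYNC and
# costs two frames (~33 ms); 40 ms costs three frames (~50 ms).
```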
Image compositor
The image compositor is the process that assembles multiple content surfaces into a single frame. In Android, image composition and frame buffer management can take a lot of time, even in a simple use case. Many of the steps in the image composition phase are tied directly to VSYNC, which means more steps in the pipeline risk spending time waiting. An example of this can be seen in the system trace below.
The steps in image composition that are bound directly to VSYNC are:
- Invalidate (informs the system that the frame needs to be updated)
- Draw (actually drawing and swapping the buffers)
- Layer composition on the MDP (mobile display processor)
- Kernel frame posted to the display
This means that missing the VSYNC timing contributes greatly to the latency. In our tests, we have seen SurfaceFlinger alone contribute more than 40 ms of delay. Learn more about how image composition works in Android: https://source.android.com/devices/graphics.html.
Resolution
The display resolution is another factor contributing to latency. At a lower resolution there is less for the CPU and GPU to process, and less data to pass around and copy in the system. Our measurements show a difference of at least 20 ms between 720p (720×1280 pixels) and 1080p (1080×1920 pixels). The difference between qHD (540×960 pixels) and 1080p (1080×1920 pixels) is about 30 ms.
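The raw data volumes explain part of that difference; the sketch below assumes a hypothetical 32-bit RGBA frame buffer format:

```python
def frame_bytes(width_px, height_px, bytes_per_px=4):
    """Raw size of one frame buffer, assuming a 4-byte pixel format
    such as RGBA8888 (an assumption for illustration)."""
    return width_px * height_px * bytes_per_px

# 1080p moves 2.25x the data of 720p per frame, and 4x that of qHD.
ratio_vs_720p = frame_bytes(1080, 1920) / frame_bytes(720, 1280)
ratio_vs_qhd = frame_bytes(1080, 1920) / frame_bytes(540, 960)
```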
System trace: from event to image compositor
In order to understand what happens in the system when a touch event is received, we need to do some tracing (see http://en.wikipedia.org/wiki/tracing_%28software%29). Below is a system trace of the very simple usage scenario mentioned earlier in this article.
System Trace Illustration
As shown, some related threads from the system trace session are displayed. The diagram details what happens from the point where the Android framework reads the input event to the point where a frame has been composed by the graphics framework and is ready to be sent to the display driver. We have excluded the parts where the touch IC scans the sensor and where the display driver finally processes the rendered frame and displays it.
1. The Android system reads the event and dispatches it to the event management framework. This costs 2 to 3 ms. Next, the event is received by the Choreographer, and as shown, there is a long wait until the next VSYNC arrives. Note that input event creation is independent of VSYNC; an event can arrive at any time after the previous VSYNC, depending on the report rate. So the latency here can vary by a whole VSYNC period.
2. When the Choreographer receives the VSYNC signal, it dispatches the input events to the target, which in this scenario is our application.
3. CPU0 is currently busy, and it takes a couple of milliseconds until the application starts processing the event. Note that this is actually a quad-core CPU system, but three of the cores are currently offline.
Additional related threads from our system trace session
4. Now the test application does its basic drawing: a white rectangle drawn on a black background with a 2D API. When drawing is complete, the surface is marked dirty (changed), informing the graphics framework that it should be updated in an upcoming display frame and composed together with any other changed surfaces.
5. The graphics framework takes over and composes the changed surface together with all other surfaces into the final frame. This starts when a new VSYNC pulse is detected, so the test application must be done before that VSYNC. Otherwise, we get a dropped frame.
6. Next, the VSYNC arrives and the display driver can further process the completed frame.
Host Display Driver
The host display driver binds together the frame buffers and overlay instances provided by the Android framework. Its role is to prepare the data so that it can be transferred and displayed via the MIPI DSI bus. Our latency measurements show that the time spent processing data here is quite limited relative to the rest of the system, usually a few milliseconds.
Display
As for latency, there are two things to consider here. The first is the response time of the physical display material. There are several display technologies, such as LCD with its sub-categories VA, TN, and IPS, which are based on liquid crystals. Another common display technology is OLED, based on light-emitting compounds.
LCD displays vary greatly in latency, ranging between 2 ms and 100 ms, depending on the color and the LCD type used. OLED displays, on the other hand, are very fast, with latencies in the microsecond range.
The other aspect is the internal refresh time of the display. Today, mobile device displays have an internal refresh rate of about 60 Hz, which translates to the entire display being updated about every 16.67 ms.
The display has a feature that shows when a new frame is rendered and a refresh occurs. The refresh takes place while the display is rendering a new frame: the color gradient is inverted where the display has been touched. In this picture, the right side shows a full inversion while the left side does not, illustrating the display's ongoing refresh, which usually takes 16.67 ms to complete.
In a RAM-less display, there is no additional delay or wait state, since the data must be streamed continuously from the device platform and the display is updated directly from this stream. In a RAM-based display, the latest frame is stored in the display's memory before it can be shown, so one more buffer is involved. In this way, one more frame update, 16.67 ms, is added to the latency.
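To close the walkthrough, the contributions discussed in this article can be tallied into a rough worst-case budget. These figures are illustrative round numbers drawn from the discussion above, not measurements:

```python
# Illustrative worst-case contributions (ms) from the sections above.
worst_case_ms = {
    "touch panel scan (60 Hz, two full scans)": 33.3,
    "host touch driver": 3.0,
    "Choreographer wait for VSYNC": 16.7,
    "application processing and drawing": 16.7,
    "image composition": 16.7,
    "display internal refresh": 16.7,
    "RAM-based display extra buffer": 16.7,
}

total_ms = sum(worst_case_ms.values())  # roughly 120 ms end to end
```

This illustrates the article's main point: the touch hardware itself is only a small slice of the total system latency.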
***
This concludes the article on system latency and touch response. In the next article, we will continue with oversampling, a related touch topic.
More Information
- Read our first touch article on Touch ecosystems
- Learn about the Xperia Z1 Compact, Xperia T2 Ultra, and Xperia T2 Ultra Dual, Sony's first smartphones configured with IPS panels.