Hey ~ I've been working on a bunch of mobile products lately — design, experience, operations — and with my hands practically calloused from it, I've also been thinking about "gesture", this relatively new mode of interaction, and its relationship with the interface. This article walks through four parts: the types of gestures, their application scenarios, their usability problems, and how to design for them, so we can think it through and learn together.
In a world where mobile devices are everywhere, "gesture" has become a buzzword. So what is a gesture? The hand is a natural tool for all kinds of creative activity, and people are born expressing feelings through hand movements: a handshake signals friendliness, and the deaf and mute use sign language in place of spoken words. These are gestures applied in everyday life. Clearly, since ancient times gestures have formed a specific language system and played an important role in human communication. From an interaction perspective, a gesture is simply an input mode. Human-computer interaction as we intuitively understand it — the interaction between people and machines — has gradually evolved through the mouse, physical hardware, screen touch, and remote body sensing.
However, the gestures widely discussed in the field of interaction design are different from traditional keyboard and mouse operations. The following focuses on gesture operation on mobile devices, mainly covering the problems with gestures, their application scenarios, and what to pay attention to in design. First, the forms gestures can take:
1. Gestures simulated by the mouse cursor's trajectory
The website www.kakarod.com uses a large number of on-screen simulated gestures for interaction — clicking, dragging, and so on — which feels lively and refreshing.
2. Gestures on physical hardware
The Apple Magic Mouse and the MacBook trackpad support multi-touch gestures such as single-finger and multi-finger swiping.
3. Gestures on the touch screen
There are roughly eight kinds, including tap, long press, swipe, drag, rotate, pinch-to-zoom, and shake.
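As a minimal sketch (the controller and handler names are my own, not from the article), here is how these common touch-screen gestures map onto UIKit's built-in gesture recognizers; shake arrives as a motion event rather than a recognizer.

```swift
import UIKit

/// Hypothetical demo controller: registers the common touch-screen gestures.
class GestureCatalogViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let recognizers: [UIGestureRecognizer] = [
            UITapGestureRecognizer(target: self, action: #selector(handle)),        // tap
            UILongPressGestureRecognizer(target: self, action: #selector(handle)),  // long press
            UISwipeGestureRecognizer(target: self, action: #selector(handle)),      // swipe
            UIPanGestureRecognizer(target: self, action: #selector(handle)),        // drag
            UIRotationGestureRecognizer(target: self, action: #selector(handle)),   // rotate
            UIPinchGestureRecognizer(target: self, action: #selector(handle))       // pinch-to-zoom
        ]
        recognizers.forEach { view.addGestureRecognizer($0) }
    }

    @objc private func handle(_ gr: UIGestureRecognizer) {
        print("Recognized:", type(of: gr), "state:", gr.state.rawValue)
    }

    // Shake is delivered through motion events, so the controller must be first responder.
    override var canBecomeFirstResponder: Bool { true }
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder()
    }
    override func motionEnded(_ motion: UIEvent.EventSubtype, with event: UIEvent?) {
        if motion == .motionShake { print("Shake detected") }
    }
}
```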
4. Remote motion sensing
Cameras, sensors, and the like capture the hands or even the whole body's posture for control.
5. Future gestures
Using holographic projection and sensors, one operates directly in space or on a projection. This is already used in some fields, and I believe it will be widely present in our lives in the near future. PS: Apple's latest patent application mentions a "projected gesture operation" technology, so let's wait and see what revolutionary products come next.
Of course, there are other gestures in daily life that I won't repeat here. This article mainly examines gesture operation on the touch screens of today's explosively growing mobile devices, chiefly the iOS and Android systems. A gesture on a touch screen is a series of multi-touch events combined into a single event. Analyzing the current state of touch-screen gestures reveals some obvious characteristics of gesture interaction compared with the traditional mouse and keyboard. The figure below summarizes gestures along the two dimensions of time and space, as a reference for gesture design.
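To make "a series of multi-touch events combined into a single event" concrete, here is a minimal sketch (the class name and thresholds are my own): a crude right-swipe recognizer built directly from the low-level touch callbacks, folding many touch events into one gesture event.

```swift
import UIKit
import UIKit.UIGestureRecognizerSubclass   // exposes the state setter for subclasses

/// Hypothetical example: a discrete right-swipe assembled from raw touch events.
class SimpleRightSwipeRecognizer: UIGestureRecognizer {
    private var startPoint: CGPoint = .zero
    private let minimumDistance: CGFloat = 60   // how far the finger must travel

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        guard let touch = touches.first else { return }
        startPoint = touch.location(in: view)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        guard let touch = touches.first else { return }
        let current = touch.location(in: view)
        // Too much vertical drift: this is not the gesture we are looking for.
        if abs(current.y - startPoint.y) > 30 { state = .failed }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
        guard let touch = touches.first else { return }
        let current = touch.location(in: view)
        // Enough horizontal travel to the right: report one combined "swipe" event.
        state = (current.x - startPoint.x > minimumDistance) ? .recognized : .failed
    }
}
```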
Usability expert Don Norman, in a recent issue of interactions magazine, questioned and criticized gestural interfaces, arguing that many aspects of the new gesture interfaces fail to follow established interaction design principles: standards that have been thoroughly tested and well understood in the industry are being overthrown, ignored, and violated.
Having analyzed existing apps and drawing on design experience from many products, I find this criticism is not unreasonable. There are several main problems:
1. Lower accuracy
Take iOS as an example: the accuracy of a gesture is far lower than the 1-pixel precision of a mouse cursor. A comfortable finger tap area should be about 44*44px (on devices before iPhone 4), and the center of a touch can deviate by 0~20px, so touch-screen interfaces need controls with larger response areas. The iPhone 3GS, iPad, and iPhone 4 screens have pixel densities of 163 PPI, 132 PPI, and 326 PPI respectively; on the 3GS and iPad the control response area can stay close to the 44px standard, while on iPhone 4 it needs to be enlarged accordingly.
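One practical consequence: when a control is visually smaller than a comfortable touch target, its hit area can be enlarged independently of its appearance. A minimal sketch (the class name is hypothetical, and 44 points is used as the target size mentioned above):

```swift
import UIKit

/// Hypothetical example: a small icon button whose visual size is under 44x44 pt,
/// but whose tappable area is expanded so fingers can still hit it reliably.
class ExpandedHitAreaButton: UIButton {
    /// Minimum touch target for fingers, in points.
    let minimumHitSize: CGFloat = 44

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        // Grow the hit-test rectangle symmetrically until it is at least 44x44 pt.
        let dx = max(0, minimumHitSize - bounds.width) / 2
        let dy = max(0, minimumHitSize - bounds.height) / 2
        return bounds.insetBy(dx: -dx, dy: -dy).contains(point)
    }
}
```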
2. Lack of visibility and consistency
Take the iPad Pages app as an example. Suppose there are two objects in a document and you want to make them the same size. There are two ways. You can use the edge guides to drag one object until it matches the other; this kind of resizing is common in many apps, so it is easy to think of. Alternatively, you can drag one object with one finger while touching the object you want to match with another finger; when the size tip appears, lift the first finger and then the second, and the two objects end up exactly the same size (there is no help or instruction for this in the app). Obviously, almost no one will discover the second gesture easily, and even if you stumble upon it, you won't necessarily remember how to use it next time. The same goes for Android's long-press operations.
An important reason for this problem is that a gestural interface usually has no visual element representing the action; the gesture itself is the action. Common, generic gestures are fine, but rare gesture combinations are hard for users to discover and can create usability problems.
3. Increased operation cost and misoperation
In terms of displacement
Compared with rigid mouse clicks, gesture operation is indeed much livelier and more interesting, but some operations, such as zooming or pulling down, actually increase the operating cost: what a scroll wheel does in one flick takes a lot of finger dragging on a touch screen.
In terms of force
Gesture operation lacks the physical feedback of a mouse press, and the force required is hard to judge. A poor design can make users mistakenly think the problem is their own operation, so they try again and again.
In terms of sensitivity
The iOS touch screen is very sensitive, and the line between a tap and a long press is blurry. Moreover, apart from fixed buttons, many operations have very large response areas that are not constrained by button size, so an accidental touch often triggers an action — for example, dialing a number from the call history, or the memo app's swipe-to-delete.
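When a tap and a long press live on the same area, the boundary between them can be made explicit rather than left blurry. A minimal sketch (controller and handler names are hypothetical; the durations are illustrative):

```swift
import UIKit

/// Hypothetical example: one touch can never trigger both the tap and the long press.
class GestureDemoViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap))

        let longPress = UILongPressGestureRecognizer(target: self, action: #selector(handleLongPress))
        longPress.minimumPressDuration = 0.6   // require a deliberate hold
        longPress.allowableMovement = 10       // cancel if the finger drifts too far

        // Only treat the touch as a tap once the long press has clearly failed.
        tap.require(toFail: longPress)

        view.addGestureRecognizer(tap)
        view.addGestureRecognizer(longPress)
    }

    @objc private func handleTap() {
        // Quick, reversible action.
    }

    @objc private func handleLongPress(_ gr: UILongPressGestureRecognizer) {
        guard gr.state == .began else { return }
        // Secondary action, e.g. show a contextual menu.
    }
}
```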
4. Limited by physical factors
Physical keys
Physical buttons bring a real sense of touch, but also a certain feeling of interrupting the operation. Later phones have gradually weakened physical buttons, integrating gesture and screen more tightly. Android uses a hardware button to trigger the menu, which means you cannot predict whether, or when, a program will have a menu option, because the hardware button is always there whether the program needs it or not.
Above, from left to right: Palm Pre, Palm Pre 2, Palm Pre 3. The return button and the phone screen are integrated more and more tightly.
Orientation
Orientation is directly constrained by the physical buttons. The position of physical keys varies across Android devices, and after rotating the screen they are not easy to locate quickly, which noticeably affects gesture operation. If an app supports landscape mode, consider showing the back button and commonly used menus directly in the software interface; in other words, the app should consider providing its own "back" button.
Device size
A large-screen pad can support more complex gestures, while a phone mostly relies on simple, single-handed operations.
Control form
Control button sizes (converting across different resolutions), give feedback prompts while dragging, and handle the conversion between sliding selection and clicking.
Given these usability problems, the following points deserve attention when designing gesture operations:
1. Operation Guide
Guidance can be a detailed help screen, or a graphical, metaphorical hint (the metaphor should match the user's mental model): pagination dots, a partially revealed next page when switching, a prompt to long-press a system icon, a rising footer, or even an animation. How explicit the hint should be is up to you. Efficiency-oriented applications should make things clearly visible — what you see is what you tap. Immersive applications can leave room for exploration, letting users discover things themselves for an unexpected surprise, such as the pull-and-shake rope on the QQLive HD home screen. Note, however, that hidden gestures and shortcut gestures must not affect the main operation flow; they should only serve as auxiliary gestures.
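One common lightweight form of guidance is a one-time hint that fades away and never interrupts the main flow again. A minimal sketch (the key, hint text, and class name are hypothetical):

```swift
import UIKit

/// Hypothetical example: show a hidden-gesture hint only on first launch.
final class GestureHintController {
    private let hintShownKey = "hasShownSwipeToDeleteHint"

    func showHintIfNeeded(in view: UIView) {
        guard !UserDefaults.standard.bool(forKey: hintShownKey) else { return }
        UserDefaults.standard.set(true, forKey: hintShownKey)

        let label = UILabel()
        label.text = "Tip: swipe left on an item to delete it"
        label.textAlignment = .center
        label.backgroundColor = UIColor.black.withAlphaComponent(0.7)
        label.textColor = .white
        label.frame = CGRect(x: 20, y: 80, width: view.bounds.width - 40, height: 40)
        view.addSubview(label)

        // Fade the hint out after a few seconds so it never blocks the main flow.
        UIView.animate(withDuration: 0.5, delay: 3.0, options: [], animations: {
            label.alpha = 0
        }, completion: { _ in label.removeFromSuperview() })
    }
}
```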
2. Operation Feedback
Gesture operation is fast and lightweight, but it lacks the reassuring click of a mouse press and is heavily constrained by the sensitivity of the device's screen, so the role of feedback is critical. Take an icon's pressed response: apart from the mouse-over effect, the other three button states should stay consistent with the PC side — none can be skipped. Also consider that a small operation area may be hidden under the finger, so feedback must be obvious and appear within the visible range, as in the name-search operation in the QQ address book. Besides visuals, sound is an effective feedback channel: the iPhone's "sent" sound for SMS, the pull-to-refresh in Sina Weibo feeds, Tweetbot, and so on all use sound feedback cleverly.
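Because the finger can cover the very control it touches, feedback works best in several channels at once. A minimal sketch (the method name and sound ID are illustrative, not from the article): a visual highlight, a haptic tap, and a short system sound acknowledging a recognized gesture.

```swift
import UIKit
import AudioToolbox

/// Hypothetical example: acknowledge a gesture through three feedback channels.
extension UIViewController {
    func acknowledgeGesture(on view: UIView) {
        // 1. Visual: briefly highlight the responding view so the result is
        //    visible outside the area covered by the finger.
        UIView.animate(withDuration: 0.1, animations: {
            view.alpha = 0.5
        }, completion: { _ in
            UIView.animate(withDuration: 0.1) { view.alpha = 1.0 }
        })

        // 2. Haptic: a light physical tap on devices that support it.
        UIImpactFeedbackGenerator(style: .light).impactOccurred()

        // 3. Sound: a short system sound, similar in spirit to the SMS "sent" chime.
        AudioServicesPlaySystemSound(1104) // keyboard click; the ID is just an example
    }
}
```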
3. Misoperation
Gestures are more flexible than the mouse. If your program is complex, dense with information, and most of its area responds to touch, the probability of misoperation rises sharply. So let users undo an operation in time, and always let them know what is happening, rather than only warning them after something has gone wrong. For important or obscure gestures — deleting, one-tap purge, long press, and the like — a second confirmation is critical.
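As a minimal sketch of that second confirmation (the list content and class name are hypothetical): a destructive swipe-to-delete guarded by an alert, so an accidental swipe cannot silently remove data.

```swift
import UIKit

/// Hypothetical example: swipe-to-delete with a confirmation step.
class ItemListViewController: UITableViewController {
    var items = ["Alpha", "Beta", "Gamma"]

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        items.count
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "cell")
            ?? UITableViewCell(style: .default, reuseIdentifier: "cell")
        cell.textLabel?.text = items[indexPath.row]
        return cell
    }

    override func tableView(_ tableView: UITableView,
                            commit editingStyle: UITableViewCell.EditingStyle,
                            forRowAt indexPath: IndexPath) {
        guard editingStyle == .delete else { return }

        // Ask once more before the irreversible action happens.
        let alert = UIAlertController(title: "Delete \"\(items[indexPath.row])\"?",
                                      message: nil,
                                      preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "Cancel", style: .cancel))
        alert.addAction(UIAlertAction(title: "Delete", style: .destructive) { _ in
            self.items.remove(at: indexPath.row)
            tableView.deleteRows(at: [indexPath], with: .automatic)
        })
        present(alert, animated: true)
    }
}
```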
OMG, I've rambled on without noticing ⊙﹏⊙b. The above only scratches the surface of gesture interaction; there is still a great deal in this area worth researching and exploring, and everyone is welcome to dig in and learn together.
Finally, thanks to the members of the "Fingertips" team for their contributions to this research! To close, a wonderful TED talk on gesture interaction by Jeff Han: http://www.ted.com/talks/lang/eng/jeff_han_demos_his_breakthrough_touchscreen.html
(This article is from the Tencent CDC Blog; when reprinting, please indicate the source.)