Designing Test Cases for Android Apps


Erik Nijkamp ([email protected]) is CEO of TestObject, headquartered in Hennigsdorf, a suburb of Berlin. TestObject specializes in mobile QA solutions and offers a cloud-based app testing service whose intuitive test recorder fundamentally simplifies UI test automation and works with any mobile app at any time. As product owner, he focuses on strategic alliances for TestObject's business solutions. He gained invaluable high-tech experience during his time in Silicon Valley (IBM, USA) and in consulting (IBM Germany).

In today's highly competitive marketplace, an app's success hinges on a reliable user interface (UI). Thorough UI testing, with particular attention to functionality and user experience, is therefore essential. The challenge becomes even more complex given the number of Android platform versions and the unique issues each raises. The keyword "fragmentation" stands for the biggest obstacle to comprehensive mobile app testing: the sheer variety of form factors, screen sizes, and configurations of Android devices released to the market. This article explains how the Android emulator, combined with a few techniques and simple practices, can provide broad test coverage across a wide range of device types.

  Introduction: testing amid device fragmentation
One of the biggest challenges Android developers face in their daily work is the sheer range of end devices and operating system versions. A study conducted by OpenSignal counted 11,828 distinct Android devices on the market in July 2013, differing in type, size, screen resolution, and specific configuration. Given that the previous year's survey recorded only 3,997 distinct devices, this hurdle is growing ever more challenging.

Figure 1. Distribution of the 11,828 distinct Android device models (OpenSignal research, July 2013 [1])

From a mobile app development perspective, an end device is defined by four basic characteristics:
1. Operating system: the Android version (1.1 through 4.3), more precisely identified by the "API level" (1 through 18).
2. Display: the screen, primarily defined by its resolution (in pixels), its pixel density (in dpi), and/or its size (in inches).
3. CPU: the Application Binary Interface (ABI) defines the CPU's instruction set; the main distinction here is between ARM- and Intel-based CPUs.
4. Memory: a device has both internal memory (RAM) and a predefined heap limit for the Dalvik virtual machine (VM heap).
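The four characteristics above map naturally onto a test-matrix entry. Below is a minimal, hypothetical Java sketch; the class and field names are my own for illustration and do not come from the Android SDK:

```java
// Hypothetical sketch: one entry of a device test matrix, capturing the four
// characteristics discussed above (OS, display, CPU, memory).
class DeviceProfile {
    final int apiLevel;          // operating system, e.g. 18 for Android 4.3
    final int widthPx, heightPx; // screen resolution in pixels
    final int densityDpi;        // screen pixel density in dpi
    final double screenInches;   // physical diagonal in inches
    final String abi;            // CPU instruction set, e.g. "armeabi-v7a" or "x86"
    final int ramMb;             // internal memory (RAM) in MB
    final int vmHeapMb;          // predefined Dalvik VM heap limit in MB

    DeviceProfile(int apiLevel, int widthPx, int heightPx, int densityDpi,
                  double screenInches, String abi, int ramMb, int vmHeapMb) {
        this.apiLevel = apiLevel;
        this.widthPx = widthPx;
        this.heightPx = heightPx;
        this.densityDpi = densityDpi;
        this.screenInches = screenInches;
        this.abi = abi;
        this.ramMb = ramMb;
        this.vmHeapMb = vmHeapMb;
    }
}
```

A list of such profiles is all the structure the reference-device selection discussed later needs.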
It is the first two characteristics, operating system and display, that demand special attention, since they are directly perceived by the end user and should be covered by constant, rigorous testing. As for Android versions, eight different versions were on the market simultaneously in July 2013, an unavoidable source of fragmentation. Nearly 90% of devices were running one of three version families: Gingerbread (2.3.3-2.3.7), Jelly Bean (4.1.x, 32.3%), or Ice Cream Sandwich (4.0.3-4.0.4, 23.3%).

Figure 2. Android version distribution (OpenSignal research, July 2013 [1])

Considering device displays, a study reported by TechCrunch in April 2013 shows that the vast majority (79.9%) of active devices use "normal"-size screens between 3 and 4.5 inches. The density of these screens varies between "mdpi", "hdpi", and "xhdpi". The exceptions carry little weight: only about 9.5% of devices combine a small screen with low ("ldpi") density.

Figure 3. Distribution of common screen sizes and densities (Google research, April 2013 [2])

If this diversity is overlooked in the quality assurance process, it is entirely predictable that bugs will sneak into the application, followed by a storm of bug reports and, finally, negative user reviews in the Google Play store. So the question is: how do you actually meet this challenge with a reasonable amount of testing? Defining test cases and a companion test process is an effective weapon.

  Test cases: "Where to test", "What to test", "How to test", "When to test"
"Where to test"
To save expensive testing time, we recommend first reducing the vast number of Android version and display combinations to 5-10 reference devices that represent the leading devices on the market. When selecting reference devices, make sure you cover a wide range of versions and screen types. As a guide, you can use the OpenSignal survey or the handset-detection infographic [3] to choose the most widely used devices.
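One simple way to shrink the matrix, sketched below under my own assumptions (the data layout and method names are illustrative, not from any testing framework), is to greedily keep a candidate device only when it covers an (API level, density bucket) combination not yet represented:

```java
import java.util.*;

// Illustrative sketch: greedily pick reference devices until every
// (API level, density bucket) pair present in the candidate pool is covered.
class ReferenceDevicePicker {
    // Each candidate is { name, apiLevel, densityBucket }.
    static List<String[]> pick(List<String[]> candidates) {
        Set<String> uncovered = new HashSet<>();
        for (String[] c : candidates) uncovered.add(c[1] + "/" + c[2]);
        List<String[]> chosen = new ArrayList<>();
        for (String[] c : candidates) {
            // Keep the device only if it covers a still-uncovered pair.
            if (uncovered.remove(c[1] + "/" + c[2])) chosen.add(c);
            if (uncovered.isEmpty()) break;
        }
        return chosen;
    }
}
```

Ordering the candidate list by market share first makes the greedy pass prefer the most widely used device for each combination.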
For the curious, screen size and resolution can be mapped to the density buckets listed above ("ldpi", "mdpi", etc.) and the size buckets ("small", "normal", etc.) defined in the Android documentation [5].
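That mapping can be approximated in a few lines. The thresholds below follow the generalized density buckets from the Android documentation; the class and method names are my own:

```java
// Approximate a display's physical dpi from its resolution and diagonal size,
// then map the dpi to Android's generalized density buckets.
class Density {
    static int dpi(int widthPx, int heightPx, double inches) {
        // Pixels along the diagonal divided by the physical diagonal length.
        return (int) Math.round(Math.sqrt((double) widthPx * widthPx
                + (double) heightPx * heightPx) / inches);
    }

    static String bucket(int dpi) {
        if (dpi <= 120) return "ldpi";
        if (dpi <= 160) return "mdpi";
        if (dpi <= 240) return "hdpi";
        if (dpi <= 320) return "xhdpi";
        return "xxhdpi";
    }
}
```

For example, a 480x800 display with a 4.3-inch diagonal works out to roughly 217 dpi, which lands in the "hdpi" bucket.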

Figure 5. Six examples of widely used Android devices covering high diversity (Handset Detection research, February 2013 [3])

With the help of the February 2013 Handset Detection research, it is easy to assemble a representative set of devices. An interesting piece of trivia: 30% of Indian Android users have devices with a very low resolution of only 240x320 pixels, such as the Samsung Galaxy Y S5360 in the list above. Meanwhile, 480x800 pixels is now the most common resolution (see the Samsung Galaxy S II in the table above).

  "What to test"
Mobile apps must provide the best possible user experience and render correctly (UI testing) on a variety of smartphones and tablets with different sizes and resolutions (keyword: "responsive design"). At the same time, apps must remain functional and compatible (compatibility testing) with as many device specifications (memory, CPU, sensors, etc.) as possible. Besides the "direct" fragmentation problem discussed earlier (Android versions and screen characteristics), "environment-related" fragmentation plays a pivotal role: the many different situations and environments in which users operate their devices. For example, where unstable network connections, call interruptions, or screen locks can occur, you should apply careful stress testing [4] and exploratory testing to ensure error-free behavior.

Figure 6. Testing all aspects of an Android device


It is necessary to prepare in advance all plausible test scenarios covering the app's most commonly used features. Only constant testing enables early bug detection and keeps the required source-code changes simple.

"How to test"
A pragmatic approach to this wide diversity is the Android emulator: a configurable tool that mimics almost every Android end device on a standard PC. In short, the Android emulator is the ideal tool for continuous regression testing (UI, unit, and integration tests) across various device configurations (compatibility testing) in the QA process. In exploratory testing, the emulator can be configured for a wide range of scenarios; for example, it can simulate changes in connection speed or quality. However, QA on real devices remains indispensable. In practice, the virtual devices used as references can still differ in small (but, for some applications, very important) ways, such as vendor-specific adjustments to the Android operating system or support for headsets and Bluetooth. Performance on real hardware also plays a significant role in evaluation, and aspects such as touch behavior and the device's physical form factor (usability testing) should be tested on all targeted end devices.
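As a concrete configuration sketch for the connection-quality scenarios mentioned above, the stock Android emulator accepts network-throttling options at startup (the AVD name `testDevice` is a placeholder of my own):

```shell
# Launch an AVD named "testDevice" (placeholder) with EDGE-like bandwidth
# and GPRS-like latency, to exercise the app under a poor connection.
emulator -avd testDevice -netspeed edge -netdelay gprs

# The same parameters can also be changed at runtime via the emulator console:
#   telnet localhost 5554
#   network speed gsm
#   network delay umts
```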

 "When to test"
Now that we have defined where to test (the reference devices), what to test (the test scenarios), and how to test (Android emulator and real devices), it is important to describe a process that determines when to execute which test scenario. We therefore recommend the following two-phase process:
1. Regression testing with virtual devices.
This means continuous, automated regression tests on the virtual reference devices, used to identify basic errors early on. The idea is to find bugs quickly and cost-effectively.
2. Acceptance testing with real devices.
This means intensive, primarily manual testing on real devices during staged publication to the Google Play store, e.g., with Google Play's alpha and beta test groups [5].
In the first phase, test automation contributes greatly to executing this strategy affordably. This phase should include only test cases that can be easily automated, i.e., run daily.
In the ongoing development of an app, these automated tests form a safety net for developers and testers. Routine test runs ensure that core functions work properly, make the app's overall stability and quality transparent through the test data, and let detected regressions be linked easily to recent changes. Such tests can be designed and recorded directly from the tester's computer with SaaS solutions such as TestObject's cloud-based mobile app UI testing.
Only when this phase has executed successfully does the process continue to the labor-intensive second phase. The idea: once automated tests cover the core functions, testers can focus their resources on advanced scenarios. This phase may include performance, usability, or compatibility tests. Combining the two approaches yields a strong mobile app quality assurance strategy [7].
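Phase 1 boils down to running the same automated checks against every virtual reference configuration and proceeding only when nothing fails. A minimal, framework-free sketch (all names are illustrative; a real project would use an instrumentation framework instead):

```java
import java.util.*;
import java.util.function.Predicate;

// Illustrative sketch of phase 1: run every automated smoke check against
// every virtual reference configuration and collect the failures.
class RegressionRun {
    static List<String> run(List<String> configs,
                            Map<String, Predicate<String>> checks) {
        List<String> failures = new ArrayList<>();
        for (String config : configs) {
            for (Map.Entry<String, Predicate<String>> check : checks.entrySet()) {
                if (!check.getValue().test(config)) {
                    failures.add(check.getKey() + " failed on " + config);
                }
            }
        }
        return failures; // an empty list means: safe to proceed to phase 2
    }
}
```

In a nightly build, a non-empty failure list would block the hand-off to the manual phase-2 testers.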

 Conclusion: putting testing into practice
Used the right way, testing is a powerful weapon in the fight against Android fragmentation. The key to an effective test strategy is defining test cases tailored to the app at hand and a workflow or process that streamlines testing. Testing a mobile app is a major challenge, but it can be met effectively with a structured approach, the right set of tools, and expertise.

Copyright notice: this article comes from the SPASVO software testing network: http://www.spasvo.com/news/html/2014429143529.html

For original works, any reproduction must include a hyperlink to the original source, the author information, and this statement; otherwise liability may be pursued.
