Translator: With the release of Android 4.3, Google offers a new UiAutomation framework to support user-interface automation testing. It uses the existing accessibility APIs to simulate user interaction with the device user interface, for example retrieving window controls and injecting events. Before 4.3, the uiautomator tool injected KeyEvents and the like through InputManager (or, even earlier, WindowManager); from 4.3 on, the new UiAutomation framework uses the accessibility APIs to inject events.
Class Overview / Overview
Class for interacting with the device's UI by simulating user actions and introspecting the screen content. It relies on the platform accessibility APIs to introspect the screen and to perform some actions on the remote view tree. It also allows injecting arbitrary raw input events, simulating user interaction with keyboards and touch devices. One can think of a UiAutomation as a special type of AccessibilityService which does not provide hooks for the service life cycle and exposes other APIs that are useful for UI test automation.
This class simulates user actions to interact with the device user interface and to retrieve screen content. It relies on the platform's accessibility APIs to introspect the screen and to perform actions on the remote view tree. It also allows injecting raw input events (Translator Note: this refers to InputEvent; KeyEvent also inherits from InputEvent, so it too is a raw input event) to simulate the user's keystrokes and touch-screen operations. We can think of UiAutomation as a special type of AccessibilityService that provides no hooks for the service life cycle but does expose other APIs that are useful for UI test automation.
The APIs exposed by this class are low-level, to maximize flexibility when developing UI test automation tools and libraries. Generally, a UiAutomation client should use a higher-level library or implement high-level functions. For example, performing a tap on the screen requires constructing and injecting a touch down and a touch up event, which have to be delivered to the system by calls to injectInputEvent(InputEvent, boolean).
The APIs this class exposes are very low-level and are designed to provide maximum flexibility when developing UI test automation frameworks and libraries. In general, a UiAutomation client should use a higher-level library (built on UiAutomation) or implement higher-level operations itself. For example, simulating a user's tap on the screen requires constructing and injecting a touch-down and a touch-up event, each delivered to the operating system through a call to injectInputEvent(InputEvent, boolean).
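As a rough sketch of the tap example just described (the `tap` helper name is my own, not from the original text), the down/up pair could be built and injected like this:

```java
import android.app.UiAutomation;
import android.os.SystemClock;
import android.view.InputDevice;
import android.view.MotionEvent;

// Hypothetical helper: simulates a tap at screen coordinates (x, y)
// by injecting a touch-down followed by a touch-up event.
static void tap(UiAutomation uiAutomation, float x, float y) {
    long downTime = SystemClock.uptimeMillis();
    MotionEvent down = MotionEvent.obtain(downTime, downTime,
            MotionEvent.ACTION_DOWN, x, y, 0);
    down.setSource(InputDevice.SOURCE_TOUCHSCREEN);
    uiAutomation.injectInputEvent(down, true); // true = wait until delivered

    MotionEvent up = MotionEvent.obtain(downTime, SystemClock.uptimeMillis(),
            MotionEvent.ACTION_UP, x, y, 0);
    up.setSource(InputDevice.SOURCE_TOUCHSCREEN);
    uiAutomation.injectInputEvent(up, true);

    down.recycle();
    up.recycle();
}
```

The second argument to injectInputEvent() controls whether the call blocks until the event has been processed by the system.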
The APIs exposed by this class operate across applications, enabling a client to write tests that cover use cases spanning multiple applications. For example, going to the Settings application to change a setting and then interacting with another application whose behavior depends on that setting.
This class exposes APIs that work across applications, so users can write test scripts that span multiple applications. For example, open the system Settings app to modify some setting, and then interact with another application that relies on that setting (Translator Note: this is not possible in the Instrumentation framework).
Testing and debugging (official API changes in Android 4.3)
Automated UI testing / User interface test automation
The new UiAutomation class provides APIs that allow you to simulate user actions for test automation. By using the platform's AccessibilityService APIs, the UiAutomation APIs allow you to inspect the screen content and inject arbitrary keyboard and touch events.
The new UiAutomation class provides a series of APIs that allow you to simulate user actions for test automation. Through the platform's AccessibilityService APIs, the UiAutomation APIs let you inspect the contents of the screen (its controls) and inject keystroke and touch-screen events.
To get an instance of UiAutomation, call Instrumentation.getUiAutomation(). You must supply the -w option with the instrument command when running your InstrumentationTestCase from adb shell.
You can call Instrumentation.getUiAutomation() to obtain a UiAutomation instance. For this to work, when you run your InstrumentationTestCase from adb shell you also need to provide the -w option to the instrument command.
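A minimal sketch of obtaining the instance inside a test case (the class name, package, and runner in the comment are illustrative, not from the original):

```java
import android.app.UiAutomation;
import android.test.InstrumentationTestCase;

// Run from the host with the -w (wait-for-completion) option, e.g.:
//   adb shell am instrument -w \
//       com.example.tests/android.test.InstrumentationTestRunner
public class SettingsTest extends InstrumentationTestCase {

    public void testGetUiAutomation() {
        // The Instrumentation object hands out the UiAutomation instance.
        UiAutomation uiAutomation = getInstrumentation().getUiAutomation();
        assertNotNull(uiAutomation);
    }
}
```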
With the UiAutomation instance, you can execute arbitrary events to test your app by calling executeAndWaitForEvent(), passing it a Runnable to perform, a timeout period for the operation, and an implementation of the UiAutomation.AccessibilityEventFilter interface. It is within your UiAutomation.AccessibilityEventFilter implementation that you will receive a call that allows you to filter the events that you are interested in and determine the success or failure of a given test case.
Using the UiAutomation instance, you can call executeAndWaitForEvent() to inject arbitrary events into the app under test. The method accepts a Runnable used to perform the event-injection operation, a timeout for the operation, and an instance of a class implementing UiAutomation.AccessibilityEventFilter. It is in this UiAutomation.AccessibilityEventFilter implementation that you receive a callback, which allows you to filter for the events you care about and decide whether your test case passes.
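A sketch of that call, assuming a UiAutomation instance named `uiAutomation` and a hypothetical `performClickOnButton()` helper that injects the tap (neither name comes from the original):

```java
import android.app.UiAutomation;
import android.view.accessibility.AccessibilityEvent;
import java.util.concurrent.TimeoutException;

// Inject a click and wait (up to 5 seconds) for the resulting
// TYPE_VIEW_CLICKED accessibility event; throws TimeoutException
// if no matching event arrives in time.
AccessibilityEvent event = uiAutomation.executeAndWaitForEvent(
        new Runnable() {
            @Override
            public void run() {
                performClickOnButton(); // hypothetical: injects the tap
            }
        },
        new UiAutomation.AccessibilityEventFilter() {
            @Override
            public boolean accept(AccessibilityEvent e) {
                // Filter for the event we expect; returning true here
                // is what makes executeAndWaitForEvent() succeed.
                return e.getEventType()
                        == AccessibilityEvent.TYPE_VIEW_CLICKED;
            }
        },
        5000 /* timeout in milliseconds */);
```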
To observe all the events during a test, create an implementation of UiAutomation.OnAccessibilityEventListener and pass it to setOnAccessibilityEventListener(). Your listener then receives a call to onAccessibilityEvent() each time an event occurs, receiving an AccessibilityEvent object that describes the event.
If you want to monitor all events during testing, create a UiAutomation.OnAccessibilityEventListener implementation and pass an instance of it to setOnAccessibilityEventListener(). Your listener then receives a callback to onAccessibilityEvent() every time an event is triggered, with an AccessibilityEvent object describing the event as the parameter.
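A minimal sketch of such a listener (the log tag is my own placeholder):

```java
import android.app.UiAutomation;
import android.util.Log;
import android.view.accessibility.AccessibilityEvent;

// Log every accessibility event observed during the test run.
uiAutomation.setOnAccessibilityEventListener(
        new UiAutomation.OnAccessibilityEventListener() {
            @Override
            public void onAccessibilityEvent(AccessibilityEvent event) {
                // Called once per event; the AccessibilityEvent
                // describes what happened and in which window.
                Log.d("UiAutomationTest", "Event: " + event);
            }
        });
```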
There are a variety of other operations that the UiAutomation APIs expose at a very low level to encourage the development of UI test tools such as uiautomator. For instance, UiAutomation can also:
The UiAutomation APIs also expose a number of other low-level operations to encourage the development of user-interface testing tools such as uiautomator. For example, UiAutomation can also do the following:
- Inject input events / Inject events
- Change the orientation of the screen / Change the screen orientation
- Take screenshots / Take a screenshot
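The last two items in the list above can be sketched in a few lines (assuming a UiAutomation instance named `uiAutomation`):

```java
import android.app.UiAutomation;
import android.graphics.Bitmap;

// Freeze the screen in a 90-degree rotation, capture a screenshot,
// then let the rotation follow the device sensors again.
uiAutomation.setRotation(UiAutomation.ROTATION_FREEZE_90);
Bitmap screenshot = uiAutomation.takeScreenshot();
uiAutomation.setRotation(UiAutomation.ROTATION_UNFREEZE);
```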
And, most importantly for UI test tools, the UiAutomation APIs work across application boundaries, unlike those in Instrumentation.
Most importantly for user-interface automation test tools, the UiAutomation APIs can work across applications, unlike the APIs that Instrumentation provides.
Official introduction to the new UiAutomation framework introduced in Android 4.3.