1. System test activity process and content
In the system testing process, the main responsibilities of the key personnel are as follows:
Test leader
1. Develop and maintain the system test plan.
2. Prepare the system test report and the system test suspension report.
3. Review the system test cases.
4. Regularly monitor the testers' activities and review test results.
5. Organize and verify the system test environment.
Test personnel
1. Develop and maintain the system test cases.
2. Perform system test activities.
3. Communicate and coordinate with developers.
4. Regularly report test progress and results to the test leader.
Development members (design and coding)
1. Communicate and coordinate with testers to resolve defects found in system testing.
2. Prepare the system deployment manual and the user manual.
3. Verify and assist in setting up the system test environment.
2. Test Plan
This section is the system test plan for the 2014 Management Support System reform project; it mainly defines the test objectives, scope, strategy, schedule, and resources.
2.1. Test scope
| Test Subsystem | Module/Function | Test? | Notes |
| --- | --- | --- | --- |
| Basic platform | Task Message Service | Yes | |
| Basic platform | Unified Organization and Account Management Services | Yes | |
| Basic platform | File Management, Office File Transformation Services | Yes | Deferred to Phase 2 |
| Basic platform | Data Export Service | No | Pending |
| Basic platform | Log Service | Yes | Phase 2 |
| Basic platform | Work Calendar Service | Yes | |
| Basic platform | Unified Permissions, Menu Management Services | Yes | |
| Basic platform | System Ecosystem Management | No | System-level content |
| Basic platform | Multi-tenancy Capabilities | Yes | |
| Process Capability Platform | Fast Process Development Services | Yes | |
| Process Capability Platform | Quick Release Order Service | Yes | Deferred to Phase 2 |
| Process Capability Platform | Information Column Development Services | Yes | Deferred to Phase 2 |
| Process Capability Platform | Simple Statistical Reporting Service | No | Pending |
| Process Capability Platform | Operation and Maintenance Management Service | Yes | Deferred to Phase 2 |
| Sunshine Hall | Login and Home | Yes | |
| Sunshine Hall | My Desktop | Yes | |
| Sunshine Hall | Process Follow-up | Yes | |
| Sunshine Hall | Process Presentation | Yes | |
| Sunshine Hall | Function Management | Yes | Deferred to Phase 2 |
| Sunshine Hall | Guide | Yes | Deferred to Phase 2 |
| Process Management | Process Execution | Yes | |
| Process Management | Process Monitoring | Yes | |
| ...... | ...... | | |
The test objects and the test focus are determined according to the overall status of the project.
2.2. Test criteria
Entry criteria: integration testing activities have been completed and the exit criteria for integration testing have been met.
Exit criteria:
- The test case execution rate reaches 100%.
- The defect repair rate is 100% (excluding "minor" defects).
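The exit criteria above reduce to two simple ratios. The following is a minimal sketch of how a test leader might check them; the counts passed in are hypothetical example figures, not data from this project.

```python
# Sketch of an exit-criteria check for the criteria above.
# All counts below are hypothetical examples.

def execution_rate(executed: int, total: int) -> float:
    """Test case execution rate as a percentage."""
    return 100.0 * executed / total

def repair_rate(fixed: int, total_excl_minor: int) -> float:
    """Defect repair rate over all defects except 'minor' ones."""
    return 100.0 * fixed / total_excl_minor

def exit_criteria_met(executed: int, total_cases: int,
                      fixed: int, defects_excl_minor: int) -> bool:
    """Both rates must reach 100% before system testing can close."""
    return (execution_rate(executed, total_cases) == 100.0
            and repair_rate(fixed, defects_excl_minor) == 100.0)

print(exit_criteria_met(120, 120, 34, 34))  # True: both rates are 100%
print(exit_criteria_met(118, 120, 34, 34))  # False: 2 cases not executed
```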
2.3. Test resources
2.3.1. Human resources
| Role | Personnel | Responsibilities (test content) |
| --- | --- | --- |
| Test leader | Chen X | Responsible for the test plan, test cases, test report, Sunshine Hall testing, etc. |
| Test personnel | Han xx, Wang xx | Test the capability platform, basic platform, process execution, etc. |
| Developer | Xu xx | Provide capability platform test cases, handle reported problems, and set up the test environment |
2.3.2. Test environment
Server side:
- IP: 10.64.60.100
- Operating system: Linux RHEL 6.5
- Platform environment: OpenText Cordys BOP 4.3
- Database: Oracle 10g, MySQL 5.5, MongoDB 3.0.5
- Web server: Apache HTTP Server 2.2.26
Client:
- Operating system: Windows 7
- Browser: Chrome, IE 9.0 and above
2.3.3. Training needs
| Training Content | Training Method | Participants | Scheduled Time |
| --- | --- | --- | --- |
| Process modeling and rapid development | Sample walkthrough and manual self-study | Developers, testers | August 10-15 |
| Business and operations | Explanation of requirements and user manuals | Requirements personnel, testers | August 10-15 |
2.4. Test strategy
An appropriate system test strategy is developed according to the actual situation of the project; the test strategy provides a recommended approach for testing each test object.
2.4.1. Functional Testing
Test objective: ensure that the functions under test work correctly, including navigation, data input, output, and other functions.
Test scope: functions visible on the business-facing operation interface, covering the Sunshine Hall, process applications, and rapid process development.
Technique used: execute tests according to the descriptions in the test cases.
Test focus and priorities:
- Priority: process applications have high priority; basic management has low priority.
- Test focus: rapid process development, and process execution within the process applications.
Entry criterion: integration testing is complete.
Exit criterion: the full business flow runs end to end, including rapid process development, process execution, and presentation through the Sunshine Hall.
Special considerations: some functions are not yet developed, so this must be declared in advance.
Limitations: business constraints make it impossible to fully test the process capability, so representative businesses are selected for testing, supplemented by simulation testing.
2.4.2. Data and database Integrity testing
Test objective: ensure that database access methods and processes run properly and that data integrity is maintained (i.e., data is complete and consistent across MySQL, Oracle, and MongoDB).
Test scope: all services.
Technique used: check the database to ensure the data has been populated as expected and that all database events occurred properly, or check the returned data to ensure the correct results are retrieved.
Test focus and priorities: focus on the integrity of transaction data saved during business initiation and approval, and on data traceability when system anomalies occur.
Limitations: system anomalies are difficult to reproduce and may need to be created artificially.
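The "check the database to ensure the data has been populated as expected" technique can be sketched as follows. This is a minimal illustration using an in-memory SQLite database as a stand-in for the project's Oracle/MySQL stores; the table and column names are hypothetical.

```python
# Hedged sketch of a data-integrity check: run a business operation,
# then query the store and assert the expected row was written.
# SQLite stands in for Oracle/MySQL; names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE approval (order_id TEXT PRIMARY KEY, status TEXT)")

# Simulate the business operation under test: submitting an order.
conn.execute("INSERT INTO approval VALUES ('ORD-001', 'submitted')")
conn.commit()

# Integrity check: the row exists and carries the expected status.
row = conn.execute(
    "SELECT status FROM approval WHERE order_id = 'ORD-001'").fetchone()
assert row is not None and row[0] == "submitted"
print("integrity check passed")
```

In the real environment the same pattern would run against each of the three databases, comparing the records written by one service against those expected by the next.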
2.4.3. User interface Testing
Test target:
- The characteristics of the test object (for example: menu, size, position, status) conform to the standard;
- The use of the interface object's Access Method (Tab key, mouse click, shortcut key, mouse scroll).
Test scope:
- Process Application Interface
- Sunshine Hall
Special things to consider: Browser compatibility.
2.4.4. Interface Testing
Test objective: ensure that interface calls are made correctly.
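"Correct" here usually means the call returns the agreed status and payload shape. The following is a minimal sketch of such a contract check; `get_user` is a hypothetical stub standing in for a real platform interface, not part of the project's API.

```python
# Hedged sketch of an interface-call check: verify the response carries
# the fields the interface contract promises. `get_user` is a stub.

def get_user(user_id: str) -> dict:
    # Stub response in the shape a real service call might return.
    return {"code": 0, "data": {"id": user_id, "name": "Han xx"}}

def check_interface(resp: dict) -> bool:
    """A call is 'correct' if the contract fields are present and valid."""
    return (resp.get("code") == 0
            and isinstance(resp.get("data"), dict)
            and "id" in resp["data"])

assert check_interface(get_user("u001"))
print("interface contract check passed")
```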
2.4.5. Failover and failure recovery testing
Test objectives:
- Failover: ensure that, in the event of a failure, the standby system takes over from the faulty system in a timely manner;
- Failure recovery: ensure that once a failure occurs, the system responds appropriately so that no data or resources are lost.
Test scope: key business, e.g. database failure and network failure during business process flow.
Techniques used:
- Load balancing
- Database HA
- Database master-slave replication
Test focus and priorities:
Limitations:
2.5. Test Schedule
| Module Name | Work Task | Owner | Scheduled Start Date | Scheduled End Date |
| --- | --- | --- | --- | --- |
| Process Capability Platform | Process modeling and rapid development | Han xx | August 17 | August 21 |
| ...... | ...... | | | |
3. Test Cases
4. Test execution
4.1. Test execution record
4.2. System Test Defect Record
| Defect No. | Description | Type | Severity | Priority | Source | Reporter | Report Date | Status | Assignee | Resolution | Resolution Date | Verifier | Verification Date | Notes |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ...... | ...... | | | | | | | | | | | | | |
The conventions for filling in the defect record are as follows:
1. Defect number: unique identifier of the defect; naming convention: module name + sequence number (starting from 001).
2. Defect type:
- F - Function: defects in logic, pointers, loops, recursion, functions, etc.
- G - Syntax: defects such as spelling and punctuation.
- A - Assignment: defects such as declarations, duplicate names, scope.
- I - Interface: defects in interaction with other components, modules, or device drivers, in invocation parameters, control blocks, or parameter lists.
- B - Build/Package: errors caused by the configuration library, change management, or version control.
- D - Documentation: defects in requirements, high-level design, detailed design, and other documents.
- U - User interface: human-computer interaction defects: screen layout, validation of user input, usability of functions.
- P - Performance: measurable system properties not met, such as execution time and transaction rate.
- N - Standard: failure to meet various standards, such as coding standards and design requirements.
- E - Environment: design, compilation, or other supporting-system issues.
3. Severity: fatal, severe, general, minor.
4. Priority: "high" defects must be resolved immediately; "medium" defects are queued for repair; "low" defects can be fixed when convenient.
5. Defect status:
- Submitted: the defect has been submitted.
- Open: the submitted defect has been confirmed and is awaiting processing.
- Reject: the submitted defect is rejected; it does not need fixing or is not a defect.
- Fix: the defect has been fixed.
- Close: the fixed defect has been verified and closed.
- Re-open: verification of the fixed defect shows it was not actually fixed.
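The conventions above can be sketched in code: defect IDs follow "module name + sequence number (from 001)", and the six statuses form a small state machine. Note that the transition table below is an assumption inferred from the status descriptions, not a rule stated in the original document.

```python
# Sketch of the defect-record conventions: ID naming and an ASSUMED
# set of legal status transitions inferred from the definitions above.
import itertools

def defect_id(module: str, counter) -> str:
    """Module name + zero-padded sequence number, starting from 001."""
    return f"{module}{next(counter):03d}"

ids = itertools.count(1)
print(defect_id("ProcessExecution", ids))  # ProcessExecution001
print(defect_id("ProcessExecution", ids))  # ProcessExecution002

# Assumed legal transitions between the six states listed above.
TRANSITIONS = {
    "Submitted": {"Open", "Reject"},
    "Open":      {"Fix", "Reject"},
    "Fix":       {"Close", "Re-open"},
    "Re-open":   {"Fix"},
    "Reject":    set(),   # terminal
    "Close":     set(),   # terminal
}

def can_move(old: str, new: str) -> bool:
    return new in TRANSITIONS.get(old, set())

assert can_move("Fix", "Re-open")       # verification failed, back to fixing
assert not can_move("Close", "Open")    # closed defects stay closed
```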
5. Test Report
After system testing is performed, a test report needs to be produced.
5.1. Test Report Contents
5.2. Test process summary
5.2.1. Test case execution
| Function Module | Executed Cases | Failed Cases | Passed Cases | Unexecuted Cases | Pass Rate (%) |
| --- | --- | --- | --- | --- | --- |
| ...... | | | | | |
| ...... | | | | | |
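The pass-rate column of the execution summary is derived from the other columns. A minimal sketch, using hypothetical module names and counts as example data:

```python
# Sketch computing the "Pass Rate (%)" column from the other columns.
# Module names and counts below are illustrative examples only.

def pass_rate(passed: int, executed: int) -> float:
    """Pass rate over executed cases, as a percentage (0.0 if none ran)."""
    return round(100.0 * passed / executed, 1) if executed else 0.0

rows = [
    # (module, executed, failed, passed, not_executed) -- example data
    ("Process Execution", 40, 2, 38, 0),
    ("Sunshine Hall",     25, 5, 20, 3),
]
for module, executed, failed, passed, not_run in rows:
    print(f"{module}: {pass_rate(passed, executed)}%")
```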
5.2.2. Defect statistics
5.3. Documentation included with the test report
- System Deployment Manual
- User manual
Summary
This article provides test managers and testers with a system test execution process and an operational reference specification, aiming to improve testing quality. Shortcomings remain; feedback and exchange are welcome.
Software project Management (CMMI Maturity) Practice--System testing