Part III: The Agile Testing Quadrants
6. The Purpose of Testing
The agile testing quadrants
Tests that support the team: help the team develop and deliver the product.
Quadrant 1: TDD and unit tests, written in the same language as the product code. They define the product's internal quality; programmers write them and participate in testing, typically in a CI environment.
Quadrant 2: tests the details of each feature, with automated tests running at the business-logic layer. Continuous integration automates the build and test process, giving fast tests and quick feedback on bugs. Runs in a functional test environment.
Tests that critique the product: confirm that the product meets the customer's needs and drive product improvement.
Quadrant 3: evaluate whether the product meets customer needs and is competitive, and improve it. Simulates end-user usage.
Quadrant 4: performance and security should be considered at every step of development, not left until the end.
Knowing when a product is done: build the habit by defining "done" on each story card.
Managing technical debt: with fast iteration, the code base becomes harder and harder to maintain and problems accumulate;
automated, continuously integrated unit tests are a necessity for keeping technical debt to a minimum.
Context-sensitive testing: let the project's context guide the testing work.
7. Technology-Facing Tests that Support the Team
Tests in the first quadrant
Foundation of agile testing: TDD is the supporting infrastructure (it safeguards code quality and frees up time for testing complex scenarios).
Why: it is more efficient, makes testers' work easier, bakes testability into the design (layered architecture and testable interfaces), and gives timely feedback.
Where technology-facing tests stop: they measure only the technical dimension; complex scenarios belong to the product-critiquing tests.
If the team does not write these tests: lead everyone toward agile development; adopt an "ask for help" approach.
Related tools: IDEs; build tools such as Maven; continuous integration with Jenkins; unit testing with xUnit; mock objects or test stubs.
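The mock objects and test stubs mentioned above let quadrant-1 tests run fast in CI without real dependencies. A minimal sketch in Python's standard library (the `PriceService` interface and prices are my own illustration, not from the book):

```python
from unittest.mock import Mock

class PriceService:
    """Hypothetical interface whose real implementation would hit the network."""
    def get_price(self, sku):
        raise NotImplementedError

def order_total(service, skus):
    """Business logic under test: sums the prices the service reports."""
    return sum(service.get_price(s) for s in skus)

# Replace the slow, remote dependency with a mock that returns canned data,
# so the business logic can be tested quickly in a CI environment.
service = Mock(spec=PriceService)
service.get_price.side_effect = lambda sku: {"a": 10, "b": 5}[sku]

assert order_total(service, ["a", "b", "a"]) == 25  # 10 + 5 + 10
assert service.get_price.call_count == 3
```

The same idea applies in Java with a mocking library alongside xUnit: the test isolates one unit of business logic and stubs everything else out.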
8. Business-Facing Tests that Support the Team
Quadrant 2 tests: finish writing the tests before coding begins. They define external quality. Test content includes preconditions and postconditions, impact on other functions, and integration with associated systems.
Driving development with business-facing tests: requirements and ideas live in the PRD and in the tests; the team communicates around the product.
The requirements dilemma: requirement = story (customer team) + test examples (test team) + communication (developers participate in the writing).
Use a common language, elicit requirements, ask questions the right way, use concrete examples, gather multiple viewpoints, talk with customers, improve clarity, define conditions of satisfaction, and watch for ripple effects.
Small increments
How do we know we are done: consider and mitigate risks; ensure testability and automation.
9. A Toolkit for Business-Facing Tests
1. Eliciting examples and stories: "As a {role}, I need {function} so that {business value}."
Describing expected behavior: checklists (templates for reports, associated systems, the database), mind maps, spreadsheets (useful for complex calculation cases in financial domains), model diagrams, flowcharts.
Models: try to simulate the user's real data.
Modeling a change to an existing function: print the original function or screen, mark the change points by hand, then scan and upload it.
Wiki: facilitates discussion, records communication, and documents decisions.
2. Communication tools: telephone, video, WebEx, online whiteboards, email, desktop sharing (VNC).
3. Automation tools
Unit-level BDD tools: easyb and JBehave.
API and functional testing: FitNesse.
Web services testing: soapUI.
GUI test tools (record and replay): Watir, Selenium.
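Tools like easyb and JBehave express business-facing tests in a given/when/then style. A minimal sketch of that style in plain Python, without either tool (the `Account` class and amounts are invented for illustration):

```python
# Given/When/Then style, as popularized by BDD tools such as easyb and JBehave.
# The Account class is a hypothetical example, not taken from the book.

class Account:
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

def test_withdrawal_reduces_balance():
    # Given an account with a balance of 100
    account = Account(balance=100)
    # When the customer withdraws 30
    account.withdraw(30)
    # Then the remaining balance is 70
    assert account.balance == 70

test_withdrawal_reduces_balance()
```

The value is not the syntax but that customers, testers, and developers can all read the scenario and agree it captures the business rule.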
4. Strategies for Writing tests
Build high-level tests first, then move to detailed tests.
1. Build incrementally: write a simple, basic happy-path test first. Each test covers only one business rule or condition.
2. Make sure the build and tests pass before adding more (a passing suite makes subsequent tests more likely to pass).
3. Use suitable test design patterns.
4. Patterns based on time, activity, or events.
5. Keyword-driven and data-driven tests.
5. Testability: if a module is not testable, ask the developers for help.
6. Test management: tests also need version control.
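The data-driven strategy above separates one test routine from many rows of data, so new cases are added as data, not code. A small sketch (the discount rule and threshold are invented for illustration):

```python
# Data-driven testing: one test routine, many rows of data.
# The discount rule below is a made-up example, not from the book.

def discount(order_total):
    """Hypothetical business rule: 10% off orders of 100 or more."""
    return round(order_total * 0.9, 2) if order_total >= 100 else order_total

# Each row is (input, expected). Boundary values are included deliberately,
# matching the advice that each test targets one business rule or condition.
CASES = [
    (99.99, 99.99),    # just below the threshold: no discount
    (100.00, 90.00),   # exactly on the boundary: discount applies
    (250.00, 225.00),  # well above the threshold
]

for order_total, expected in CASES:
    assert discount(order_total) == expected, (order_total, expected)
```

Keyword-driven frameworks take the same idea one step further: the data table also names the action to perform, not just the values.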
10. Business-Facing Tests that Critique the Product
Critiquing the product: try to reproduce the end user's actual experience. For changes within an iteration, seize every opportunity to demo; do not wait for the iteration to end.
1. Scenario Testing:
Real-life domain knowledge is critical.
Soap-opera tests: use real data and processes, with exaggerated scenarios, and have fun (you can also ask the customer to provide data).
Defining scenarios and workflows: use data-flow and workflow diagrams.
2. Exploratory testing:
Sometimes, doing matters far more than just thinking.
1. Where errors are likely to occur
2. How the software is actually operated
3. What to know while testing: the customer's needs, the team's common mistakes, and how product quality is judged.
3. Usability Testing
1. User needs and roles: divide users into role types, derived from different scenarios. (With very few users, usability testing may be skipped.)
2. Navigation tests: links, tab order.
3. Research competitors' software: take the time to use and study it.
4. Behind the GUI
1. API Testing:
Check the number, boundaries, and optionality of input parameters. Disrupt the order of interface calls. Check the boundaries of output results. When there is a return value, verify it; when the return is void, check the logs, the database, and associated systems. (Understand all parameters and methods.)
2. Web services testing: emphasizes interface correctness. Understand the quality the customer expects, and test exploratorily.
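The API-testing ideas above (parameter boundaries, disrupted call order) can be sketched concretely. The `Session` class and its 1–65535 size limit are hypothetical, chosen only to make the boundary checks visible:

```python
# Sketch of the API-testing ideas above: boundary values for inputs,
# and deliberately calling operations out of order.
# The Session API and its limits are invented for illustration.

class Session:
    def __init__(self):
        self.open = False

    def connect(self):
        self.open = True

    def send(self, size):
        if not self.open:
            raise RuntimeError("send before connect")   # out-of-order call
        if not (1 <= size <= 65535):
            raise ValueError("size out of bounds")
        return size

s = Session()

# Disrupt the call order: send before connect must fail cleanly.
try:
    s.send(10)
    assert False, "expected RuntimeError"
except RuntimeError:
    pass

s.connect()
assert s.send(1) == 1            # lower boundary
assert s.send(65535) == 65535    # upper boundary
for bad in (0, 65536):           # just outside each boundary
    try:
        s.send(bad)
        assert False, "expected ValueError"
    except ValueError:
        pass
```

The same checklist applies behind any GUI: legal boundaries succeed, illegal ones fail loudly, and misordered calls never corrupt state silently.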
5. Document Testing
1. User documentation: links work; text is clear, consistent, and concise; pop-up windows (and blocked pop-ups) behave correctly.
2. Test reports: verify they contain the correct data.
6. Exploratory testing Aids
Test setup: reproducing an error can sometimes take a day; session-based testing uses automation to set up test data and scenarios (just modify the parameters). Tools: Watir, Selenium IDE.
Generating test data: PerlClip can exercise a text box with different types of input data, e.g. entering exactly 200 characters when that is the limit.
Monitoring: watch logs and errors; on Linux, use tail -f.
Simulators and emulators: simulate incomplete or complex associated systems, or expensive devices such as mobile phones (downloadable emulators exist).
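Generating length-exact test data, as PerlClip does, is easy to script yourself. A stand-in sketch (the categories chosen here are my own, not a PerlClip feature list):

```python
import random
import string

def test_strings(length=200):
    """Generate typical risky inputs at an exact length (200 characters here,
    matching the 'enter 200 characters if required' note above).
    The categories are illustrative, not exhaustive."""
    return {
        "ascii":  "A" * length,
        "digits": ("1234567890" * (length // 10 + 1))[:length],
        "spaces": " " * length,
        # string.printable[:94] = digits + letters + punctuation (no whitespace)
        "mixed":  "".join(random.choice(string.printable[:94])
                          for _ in range(length)),
    }

data = test_strings(200)
assert all(len(v) == 200 for v in data.values())
```

Feeding each variant into the same text box quickly reveals truncation, trimming, or encoding bugs at the field's stated limit.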
11. Critiquing the Product with Technology-Facing Tests
Performance-related testing includes configuration, compatibility, the "-ilities" (such as interoperability, reliability, security, scalability), memory, recovery, and data conversion. (A checklist works best.)
1. Who does it?
Security: ask the security group. Data conversion: the database group. Recovery and failover tests: the product support team.
2. When?
Write performance-test stories.
Design performance tests as early as possible.
Establish a performance baseline.
Security:
Static code analysis: identifies potential vulnerabilities. (Firebug)
Dynamic analysis: probe for SQL injection or cross-site scripting. (fuzzing)
Maintainability:
Return 0 on success; failures must return a negative value.
Each class or module has a single responsibility.
Every function has a single entry and a single exit.
Two fields on a page cannot have the same name.
Compatibility:
Operating systems and browsers, including different types and versions.
Reliability:
Time to first failure; mean time between failures.
Scalability:
Can the system handle growing user demand? Are there network or database bottlenecks?
Installability and interoperability: research and propose test strategies to assess the quality level.
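The dynamic-analysis item above mentions probing for SQL injection. A toy version of that check in Python, verifying that a parameterized query resists classic payloads (the schema, data, and payloads are my own illustration, using the stdlib sqlite3 module):

```python
import sqlite3

# Toy dynamic check: feed classic SQL-injection payloads to a lookup that
# uses parameterized queries, and verify no extra rows leak.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def find_user(name):
    # Parameterized query: user input is never spliced into the SQL string.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()

payloads = ["' OR '1'='1", "alice'; DROP TABLE users; --"]
for p in payloads:
    assert find_user(p) == []            # injection attempts return nothing

assert find_user("alice") == [("alice",)]  # legitimate lookup still works
```

Real fuzzers generate thousands of such inputs automatically; the point of the sketch is what a single pass/fail check looks like.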
For performance testing, it is important to define expectations up front.
Performance Testing Tools:
Load tools: JUnitPerf, httperf, JMeter.
Bottleneck analysis: JProfiler, for finding bottlenecks and memory leaks.
JConsole: monitor the JVM, e.g. database connection usage.
PerfMon: monitor CPU, memory, swap, disk I/O, and other hardware resources.
Network: NetScout.
Performance Benchmarks:
TPS; maximum transaction processing time; maximum number of busy connections; the relationship between processing time and user count (e.g. the number of users at which maximum processing time reaches 8 seconds).
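The benchmarks above (TPS, worst-case processing time) can be illustrated with a toy harness; real runs would use JMeter or httperf against the actual system. The `handle_request` stand-in and its 1 ms sleep are invented for the sketch:

```python
import time

# Toy harness for the benchmark ideas above: run a "transaction" n times,
# then report throughput (TPS) and the worst-case processing time.

def handle_request():
    """Stand-in for a real transaction against the system under test."""
    time.sleep(0.001)  # simulate ~1 ms of work

def benchmark(fn, n=100):
    durations = []
    start = time.perf_counter()
    for _ in range(n):
        t0 = time.perf_counter()
        fn()
        durations.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start
    return {"tps": n / elapsed, "max_time": max(durations)}

stats = benchmark(handle_request)
assert stats["tps"] > 0
assert stats["max_time"] >= 0.001
```

Comparing `max_time` against a stated expectation (say, the 8-second ceiling mentioned above) is what turns a measurement into a pass/fail performance test.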
12. Summary of the Testing Quadrants
"Agile Software Testing" reading notes (Part III)