There are many definitions of lean, but one of the most encouraging comes from John Shook, chairman of the Lean Enterprise Institute, in his book "Managing to Learn": lean means improving product development by improving people. Building on that definition, this article explains how lean improves people: through the way they solve problems. The definition reveals the elegance of the following management practice: design your work carefully so that problems (which are also learning opportunities) become clearly visible, and solve them in a scientific way as they arise.
While working with a team of software developers using agile methods, I held some misconceptions: at first I confused bugs with problems, and I was convinced that our agile process was lean because it made bugs visible. Over the last few months, as these concepts became clearer in my mind, I came to believe that the bugs produced by my agile team were not the same as the learning opportunities produced by a lean system: those bugs showed that my team had a quality problem, and I have seen the same thing on many other teams.
The purpose of this article is to describe how my thinking about bugs and quality problems has gradually changed. It should help readers better understand the quality problems that bugs reveal and improve their performance accordingly, using a few real stories to show what the real problems are. (To be clear: I am not assuming that all agile teams share these misconceptions.)
What is a bug?
In the software industry, a bug can be any form of system error (a NullPointerException, an HTTP 404 error code, a blue screen...), functional error (when I click on B, the system should do Z, but it does Y), performance issue, configuration error, and so on.
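To make the distinction concrete, here is a minimal Java sketch, with purely illustrative class and method names, contrasting a system error (a NullPointerException) with a functional error (code that runs without crashing but does the wrong thing):

```java
// Hypothetical examples of two common kinds of bugs.
public class BugExamples {

    // System error: dereferencing a null reference throws a NullPointerException.
    static int titleLength(String title) {
        return title.length(); // fails at runtime if title is null
    }

    // Functional error: clicking "B" should trigger action Z, but this handler triggers Y.
    static String onClickB() {
        return "Y"; // expected: "Z"
    }

    public static void main(String[] args) {
        System.out.println(onClickB());        // prints "Y" instead of the expected "Z"
        System.out.println(titleLength(null)); // throws NullPointerException
    }
}
```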
In lean terminology, for a bug to count as a problem it must be expressible in terms of the definition given in the next section. Believe me, more than 95% of the bugs I have seen (and produced) do not appear to be real problems. Performance problems may be a common exception, but then, interestingly, they are about performance, aren't they?
What is a problem?
Let's start from a standard definition. In "The Toyota Way Fieldbook", Jeffrey Liker defines a problem in terms of four elements (a small code sketch of this structure follows the list):
Current, actual performance
Expected performance (the standard or target performance)
The magnitude of the problem, shown by the gap between current and target performance
The scope and characteristics of the problem
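As a rough illustration only, these four elements could be captured in a small data structure; the Problem record below, its field names, and its methods are assumptions made for this sketch, not anything Liker prescribes:

```java
// A minimal sketch (hypothetical type and field names) of the four elements of a problem:
// actual performance, target performance, the size of the gap, and the problem's scope.
public record Problem(String metric, double actual, double target, String unit, String scope) {

    // The gap between actual and target performance indicates how serious the problem is.
    public double gap() {
        return actual - target;
    }

    public String describe() {
        return String.format("%s: actual %.1f %s vs. target %.1f %s (gap %.1f %s) -- scope: %s",
                metric, actual, unit, target, unit, gap(), unit, scope);
    }

    public static void main(String[] args) {
        // Example: the response-time problem discussed later in the article.
        Problem p = new Problem("Page response time", 1500, 500, "ms", "quality");
        System.out.println(p.describe());
    }
}
```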
As Brené Brown said in her TED talk on vulnerability, if you cannot evaluate something, it does not exist. More practically: if you cannot express a problem as a performance gap, it is probably because you have not spent enough time thinking about it.
Before tackling a problem, it is important to state it clearly and to take the time to understand it (to be kind to it, as lean expert Michael Ballé says), restraining the urge to jump straight to a solution. We have all heard Einstein's famous saying: "If I had only one hour to solve a problem, I would spend the first 55 minutes thinking about the problem and the last 5 minutes thinking about the solution." No one said it was easy.
In the context of an agile software development team, the performance indicators might be the burndown chart (workload and lead time), the number of bugs and the system response time (quality), the customers' ratings of delivered user stories (on a scale of 10, indicating customer satisfaction), and the number of user stories (or total story points) delivered per sprint (productivity).
Based on these metrics, problems can be stated as follows (see the code sketch after this list):
Quality: the target response time for this page is under 500 ms, but with 5,000 concurrent users we measured 1,500 ms.
Quality: the number of bugs still unresolved at the end of the sprint (2 instead of 0).
Workload/lead time: we expected this user story to take 3 days, but it actually took 8 days to complete.
Productivity: at the end of the sprint the team delivered 5 completed user stories, whereas 7 had been planned.
Customer satisfaction: we want every user story to score more than 8 points (out of 10); after the last sprint, two user stories came in below that score (6.5 and 7 points).
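Expressed with the hypothetical Problem record sketched earlier (assumed to be on the classpath), these statements become explicit actual-versus-target gaps; the numbers are simply the ones from the list above:

```java
// Hypothetical usage: restating the sprint's problems as actual-vs-target gaps.
public class SprintProblems {
    public static void main(String[] args) {
        Problem[] problems = {
            new Problem("Page response time with 5,000 concurrent users", 1500, 500, "ms", "quality"),
            new Problem("Bugs unresolved at the end of the sprint", 2, 0, "bugs", "quality"),
            new Problem("Time to complete the user story", 8, 3, "days", "workload/lead time"),
            new Problem("User stories delivered this sprint", 5, 7, "stories", "productivity"),
            new Problem("Customer satisfaction for two stories", 6.5, 8, "points out of 10", "customer satisfaction")
        };
        // A gap in either direction (e.g. 5 stories delivered vs. 7 planned) signals a problem.
        for (Problem p : problems) {
            System.out.println(p.describe());
        }
    }
}
```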