Over the years, "communicating with senior management through metric data" has been a recurring topic in the agile community, and "fear of data" is not the real problem. The right approach is to make the costs, difficulties, risks, and returns of XP completely public; that is persuasive and it works. Otherwise, I am afraid we will just keep trading interesting stories, and the agile community will remain a purely technical discussion area. However, those numbers are universally valid only if the cases we know of are random samples. They are not, so the data cannot reflect reality. Furthermore, statistics from "historical surveys" are not a very effective tool either: even when they are credible, they lack a critical property, verifiability, because the publisher inevitably mixes some of his own views into the "results".
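To make the sampling point concrete, here is a minimal Python sketch (not from the original article) of the selection bias being described: if successful projects are more likely to publish their stories than failed ones, the success rate visible in published cases overstates the true rate. The specific rates (40% true success, 70%/15% reporting probabilities) are arbitrary assumptions for illustration.

```python
import random

random.seed(42)

TRUE_SUCCESS_RATE = 0.40   # assumed true rate across all projects
P_REPORT_SUCCESS = 0.70    # assumption: successes are eager to publish
P_REPORT_FAILURE = 0.15    # assumption: failures rarely go public
N_PROJECTS = 100_000

# Simulate which projects succeed, and which ones end up as public cases.
reported = []
for _ in range(N_PROJECTS):
    success = random.random() < TRUE_SUCCESS_RATE
    p_report = P_REPORT_SUCCESS if success else P_REPORT_FAILURE
    if random.random() < p_report:
        reported.append(success)

# A "survey" of published cases sees only the self-selected sample.
naive_rate = sum(reported) / len(reported)
print(f"true success rate:    {TRUE_SUCCESS_RATE:.0%}")
print(f"rate seen in reports: {naive_rate:.0%}")  # roughly 76%, far above 40%
```

The observed rate is about 0.28 / (0.28 + 0.09) ≈ 76%, nearly double the true 40%, which is exactly why non-random case collections cannot be read as success statistics.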
Broad indicators such as success rates may be meaningful to salespeople, or to those promoting a change for their own purposes. Compare them to the Dow Jones index: the index does not tell you why the market rose or fell 200 points, only what happened to the market as a whole that day. Investors still have to find out for themselves why the rise or fall occurred.
Therefore, the value of collecting real cases, both successful and failed, is to give later adopters more useful information, so that they can set their expectations correctly when deciding whether to adopt agile.