This article continues the exploration of requirements-related metrics that I began in Part 1, which looked at measuring product size and requirements quality.
Requirements Status
Track the status of each requirement over time to monitor overall project status, perhaps defining a requirement attribute to store this information. Status tracking can help you avoid the pervasive “ninety percent done” problem of software project tracking. Each requirement will have one of the following statuses at any time.
- Proposed (someone suggested it)
- Approved (it was allocated to a baseline)
- Implemented (the code was designed, written, and unit tested)
- Verified (the requirement passed its tests after integration into the product)
- Deferred (the requirement will be implemented in a future release)
- Deleted (you decided not to implement it at all)
- Rejected (the idea was never approved)
Other status options are possible, of course. Some organizations use a status of “Reviewed” because they want to confirm that a requirement is of high quality before allocating it to a baseline. Other organizations use “Delivered to Customer” to indicate that a requirement has actually been released.
When you ask a developer how he is coming along, he might say, “Of the eighty-seven requirements allocated to this subsystem, sixty-one of them are verified, nine are implemented but not yet verified, and seventeen aren’t yet completely implemented.” There’s a good chance that not all these requirements are the same size, will consume the same amount of implementation effort, or will deliver the same customer value. If I were a project manager, though, I’d feel that we had a good handle on the size of that subsystem and how close we were to completion. This is far more informative than, “I’m about ninety percent done. Lookin’ good!”
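To make that kind of answer routine, the status attribute just needs to be tallied. Here is a minimal sketch in Python of how a team might summarize a status attribute for a subsystem; the names (RequirementStatus, status_summary) are illustrative, and a requirements management tool would normally produce this report for you.

```python
from collections import Counter
from enum import Enum

class RequirementStatus(Enum):
    PROPOSED = "Proposed"
    APPROVED = "Approved"
    IMPLEMENTED = "Implemented"
    VERIFIED = "Verified"
    DEFERRED = "Deferred"
    DELETED = "Deleted"
    REJECTED = "Rejected"

def status_summary(statuses):
    """Tally requirements by status and report how many have been verified."""
    counts = Counter(statuses)
    total = len(statuses)
    for status in RequirementStatus:
        if counts[status]:
            print(f"{status.value}: {counts[status]}")
    verified = counts[RequirementStatus.VERIFIED]
    print(f"{verified} of {total} requirements verified ({verified / total:.0%})")

# The subsystem from the example above: 87 allocated requirements
subsystem = ([RequirementStatus.VERIFIED] * 61
             + [RequirementStatus.IMPLEMENTED] * 9
             + [RequirementStatus.APPROVED] * 17)
status_summary(subsystem)
```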
Requests for Changes
Much of requirements management involves handling requirement additions, modifications, and deletions. Therefore, track the status and impact of your requirements change requests. The data you collect should let your team answer questions such as the following (a small tracking sketch appears after the list):
- How many change requests were submitted in a given time period?
- How many of these requests are open, and how many are closed?
- How many requests were approved, and how many were rejected?
- How much effort did the team spend implementing each approved change?
- How long have the requests been open on average?
- On average, how many individual requirements or other artifacts are affected by each submitted change request?
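As an illustration, the record keeping and arithmetic behind these questions can be quite lightweight. The sketch below is one possible shape for it in Python; the ChangeRequest fields, status values, and function name are assumptions for the example, not a prescribed schema.

```python
from dataclasses import dataclass
from datetime import date
from statistics import mean
from typing import Optional

@dataclass
class ChangeRequest:
    submitted: date
    status: str                    # e.g., "Open", "Approved", "Rejected"
    closed: Optional[date] = None
    effort_hours: float = 0.0      # effort spent implementing an approved change
    affected_items: int = 0        # requirements or other artifacts the change touches

def change_request_metrics(requests: list[ChangeRequest], today: date) -> dict:
    """Answer the questions above for a collection of change requests."""
    open_crs = [cr for cr in requests if cr.closed is None]
    approved = [cr for cr in requests if cr.status == "Approved"]
    rejected = [cr for cr in requests if cr.status == "Rejected"]
    return {
        "submitted": len(requests),
        "open": len(open_crs),
        "closed": len(requests) - len(open_crs),
        "approved": len(approved),
        "rejected": len(rejected),
        "avg effort per approved change (hours)":
            mean(cr.effort_hours for cr in approved) if approved else 0,
        "avg days open":
            mean((today - cr.submitted).days for cr in open_crs) if open_crs else 0,
        "avg artifacts affected":
            mean(cr.affected_items for cr in requests) if requests else 0,
    }
```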
Monitor how many changes are incorporated throughout development after you baseline the requirements for a specific release. Note that a single change request can potentially affect multiple requirements of different levels and types (user requirements, functional requirements, nonfunctional requirements). To calculate requirements volatility over a given time period, divide the number of requirements changed (added, modified, or deleted) by the total number of requirements at the beginning of the period (for example, at the time a baseline was defined):

requirements volatility = (requirements added + modified + deleted) ÷ (requirements at the start of the period)
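For example, if a baseline held 120 requirements and 18 of them were added, modified, or deleted during the following month, volatility for that month would be 18 ÷ 120, or 15 percent. A minimal sketch of the calculation (the function name and the numbers are illustrative):

```python
def requirements_volatility(changes: int, requirements_at_start: int) -> float:
    """Requirements added, modified, or deleted in a period, divided by the
    number of requirements at the start of the period (e.g., at the baseline)."""
    return changes / requirements_at_start

print(f"{requirements_volatility(18, 120):.0%}")  # 15%
```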
The intent is not to try to eliminate requirements volatility. There are often good reasons to change requirements. However, we need to ensure that the project can manage the degree of requirements changes and still meet its commitments. Changes become more expensive as the product nears completion, and a sustained high level of approved change requests makes it difficult to know when you can ship the product. Most projects should become more resistant to making changes as development progresses, meaning the trend of accepted changes should approach zero as you near the planned completion date for a given release. An iterative development approach gives the team multiple opportunities to incorporate changes into subsequent iterations, while still keeping each iteration on schedule.
If you receive many change requests, that suggests that elicitation overlooked many requirements or that new ideas keep coming in as the project drags along month after month. Record where the change requests come from: marketing, users, sales, management, developers, and so on. The change request origins will tell you who to work with to reduce the number of overlooked, modified, and misunderstood requirements.
Change requests that remain unresolved for a long time suggest that your change management process isn’t working well. I once visited a company where a manager wryly admitted that they had enhancement requests that were several years old and still pending. That team should allocate some of its open requests to specific planned maintenance releases and change the status of other long-deferred requests to Rejected. This would help the project manager focus the team’s energy on the most important and most urgent items in the change backlog.
Effort
Finally, I recommend that you record the time your team spends on requirements engineering activities. These activities include both requirements development (getting and writing good requirements) and requirements management (dealing with change, tracking status, recording traceability data, and so on).
I’m frequently asked how much time and effort a project should allocate to these functions. The answer depends enormously on the type and size of the project, the development team and organization, and the application domain. If you track your own team’s investment in these critical project activities, you can better estimate how much effort to plan for future projects.
Suppose that on one previous project, your team expended ten percent of its effort on requirements activities. In retrospect, you conclude that the requirements were too poorly defined and the project would have benefited from additional investment in developing quality requirements. The next time your team tackles a similar project, the project manager would be wise to allocate more than ten percent of the total project effort to the requirements work.
As you accumulate data, correlate the project development effort with some measure of product size. The documented requirements should give you an indication of size. You could correlate effort with the count of individually testable requirements, use case points, function points, or something else that is proportional to product size. As Figure 1 illustrates, such correlations provide a measure of your development team’s productivity, which will help you estimate and scope individual release contents. If you collect some product size data and track the corresponding implementation effort, you’ll be in a better position to create meaningful estimates for similar projects in the future.
Figure 1. Correlating requirements size with project effort gives a measure of team productivity. Each point represents a separate project.
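As a rough sketch of how such a correlation can feed estimation, the following fits a straight line through historical (size, effort) points like those plotted in Figure 1. The project data, the requirement-count size measure, and the planned release size are hypothetical; real data will be noisier.

```python
from statistics import linear_regression  # Python 3.10+

# Hypothetical history of completed projects: (testable requirements, staff-hours)
history = [(120, 950), (200, 1700), (310, 2600), (450, 3900)]
sizes = [size for size, _ in history]
efforts = [effort for _, effort in history]

# The slope of the fitted line is an average productivity figure for this team:
# staff-hours of development effort per requirement.
slope, intercept = linear_regression(sizes, efforts)

planned_size = 260  # requirements scoped for the next release (illustrative)
estimate = slope * planned_size + intercept
print(f"~{slope:.1f} staff-hours per requirement; "
      f"estimated effort for {planned_size} requirements: {estimate:.0f} staff-hours")
```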
Some people are afraid that launching a software measurement effort will consume too much time, time they feel should be spent doing “real work.” My experience, though, is that a sensible and focused metrics program doesn’t take much time or effort at all. It’s mostly a matter of developing a simple infrastructure for collecting and analyzing the data, and getting team members in the habit of recording some key bits of data about their work. Once you’ve developed a measurement culture in your organization, you’ll be surprised how much you can learn from the data.
Also read Measuring Requirements, Part 1
Jama Software has partnered with Karl Wiegers to share licensed content from his books and articles on our web site via a series of blog posts, whitepapers and webinars. Karl Wiegers is an independent consultant and not an employee of Jama. He can be reached at http://www.processimpact.com. Enjoy these free requirements management resources.