Conclusion 1: It is critical that there is early and clear communication and collaboration with users about requirements. In particular, it is extremely beneficial to get users, developers, and testers to collaborate on initial estimates of feasibility and for users to then categorize their requirements into a list of “must-haves” and a “wish list” with some prioritization that can be used to trade off at later stages of system development if necessary.
This conclusion reflects the need for continuous exchange and involvement of users in the development of requirements. User input can assist in assessing cost and mission effectiveness of a design and can aid in the development of the “analysis of alternatives.”1 Although continuous involvement in the development of requirements by users does occur in DOD—for example, the Army designates a capabilities manager for the U.S. Army Training and Doctrine Command to represent the user on a program—it does not appear to be emphasized as much or conducted as extensively as in industry.
The industry practice of asking customers to separate their needs into a list of “must-haves” and a “wish list” is especially appealing. It imposes discipline on customers: it forces them to examine a system’s needs and capabilities closely, identify any discrepancies between them, and thus make decisions early in the development process. Communication and collaboration also ensure that all parties, including the user, the program manager, the developer, and the tester, agree on the required performance levels of a system. Although elements of this concept have been implemented in DOD through the use of threshold and objective levels for requirements and by banding requirements and key system attributes, with approval by appropriately higher authority required for any change, we emphasize that more can be done for more effective requirements setting.
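The trade-off logic behind the “must-haves” versus “wish list” practice can be sketched in code. The following is a minimal, purely illustrative model: must-have requirements are always funded, while wish-list items are added in priority order as the remaining budget allows. All requirement names, costs, and the budget figure are invented for illustration and do not reflect any actual program or DOD costing method.

```python
# Hypothetical sketch of the "must-haves vs. wish list" trade-off.
# Must-have requirements are always funded; wish-list items are added
# in priority order while the remaining budget allows. All names and
# figures below are illustrative placeholders.

def plan_requirements(must_haves, wish_list, budget):
    """Return the funded requirement names and the remaining budget.

    must_haves -- list of (name, cost); always included.
    wish_list  -- list of (name, cost, priority); lower number = higher priority.
    """
    funded = [name for name, _cost in must_haves]
    remaining = budget - sum(cost for _name, cost in must_haves)
    if remaining < 0:
        raise ValueError("budget cannot cover the must-have requirements")
    # Walk the wish list in priority order, funding what still fits.
    for name, cost, _priority in sorted(wish_list, key=lambda item: item[2]):
        if cost <= remaining:
            funded.append(name)
            remaining -= cost
    return funded, remaining

funded, left = plan_requirements(
    must_haves=[("range >= 500 nmi", 40), ("secure datalink", 25)],
    wish_list=[("extended loiter", 30, 1), ("second sensor", 20, 2)],
    budget=100,
)
# With these notional numbers, both must-haves and the highest-priority
# wish-list item are funded; the second sensor is deferred.
```

The point of the sketch is the discipline it encodes: because the wish list carries an explicit prioritization agreed to early, later budget pressure resolves by rule rather than by renegotiating requirements mid-development.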
1An analysis of alternatives (AoA) is part of several steps in the Joint Capabilities Integration and Development System (JCIDS), which assesses cost and mission effectiveness, given levels of performance and suitability. In JCIDS (a formal DOD procedure that defines requirements and evaluation criteria for defense systems in development) a required capability (e.g., defeat an Integrated Air Defense System) is evaluated through a capability-based analysis (CBA) and then by an AoA to develop system attributes as a function of required levels of performance and suitability. However, only system attributes are provided as “requirements” to the development and test community. Currently, there is no quantitative way to assess the impact of not meeting a system requirement on accomplishing the mission. If, on the other hand, the JCIDS/CBA/AoA process provided a quantitative linkage between mission accomplishment and system attributes, the acquisition community would have an effective method for making decisions on threshold levels set by the requirements process and for understanding the cost effectiveness of changing those requirements.
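The quantitative linkage the footnote describes can be illustrated with a toy calculation: if mission effectiveness were available as an explicit function of a system attribute, the cost-effectiveness of alternative threshold levels could be compared directly. The saturating effectiveness curve, the detection-range attribute, and the unit costs below are all invented placeholders, not part of JCIDS or any actual AoA.

```python
# Hypothetical illustration of a quantitative mission-to-attribute linkage.
# The effectiveness curve and cost figures are notional placeholders only.

def mission_effectiveness(detection_range_km):
    # Notional saturating curve: effectiveness approaches 1.0 as the
    # attribute (detection range) grows, with diminishing returns.
    return detection_range_km / (detection_range_km + 50.0)

def cost_per_effectiveness(threshold_km, unit_cost):
    # Cost paid per unit of mission effectiveness delivered at a threshold.
    return unit_cost / mission_effectiveness(threshold_km)

# Compare two candidate threshold levels for the same requirement:
# threshold (km) -> notional unit cost ($M).
options = {100: 8.0, 150: 11.0}
for rng_km, cost in options.items():
    print(rng_km,
          round(mission_effectiveness(rng_km), 3),
          round(cost_per_effectiveness(rng_km, cost), 2))
```

With these notional numbers, the higher threshold buys more effectiveness but at a worse cost per unit of effectiveness, which is exactly the kind of trade the footnote argues the acquisition community currently cannot quantify.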