APPENDIX E
Glossary and Acronyms
ACAT:
Acquisition Category; a designation for each program, based on cost, that determines both the level of review required by law and the level at which milestone (see below) decision authority rests in DoD. There are four acquisition categories, ACAT I through ACAT IV; the most expensive systems are designated ACAT I. ACAT I programs have two sub-categories: ACAT ID (milestone decision authority is the USD[A&T]) and ACAT IC (milestone decision authority is the DoD Component Head).
AFOTEC:
Air Force Operational Test and Evaluation Center
AMSAA:
Army Materiel Systems Analysis Activity
Analysis of Alternatives:
(formerly Cost and Operational Effectiveness Analysis, or COEA); a cost-benefit analysis tool that provides justification for the selection of one procurement option over an alternative.
Acquisition Program Baseline (APB):
a document that can be viewed as a contract between the milestone decision authority and the relevant service, including the program manager and his/her supervisors.
ATACMS:
Army Tactical Missile System
BAT:
Brilliant Anti-Tank
CAIG:
Cost Analysis Improvement Group
DAB:
Defense Acquisition Board
Developmental Testing (DT):
specification-based testing that verifies that the system meets all specifications and certifies that the system is ready to enter operational testing; determines if and how the system works.
DIS:
Distributed Interactive Simulation
DoD:
Department of Defense
DOT&E:
Director, Operational Test and Evaluation
DT&E:
Developmental Testing and Evaluation
EADSIM:
Extended Air Defense Simulation; a force-on-force air defense model
EMD:
Engineering and Manufacturing Development
Evolutionary Procurement:
pertains to continuous changes to a system and the associated stage-wise testing and development, so that what is operationally tested is not necessarily what is deployed; particularly relevant to software and software-intensive systems.
IDA:
Institute for Defense Analyses; a federally funded research and development center established to assist the Office of the Secretary of Defense, the Joint Staff, the Unified Commands and Defense Agencies in addressing important national security issues, particularly those requiring scientific and technical expertise.
IOT&E:
Initial Operational Test and Evaluation
ISO:
International Organization for Standardization; a worldwide federation of 92 member countries established to promote the development of international standards and related activities to facilitate the exchange of goods and services.
JROC:
Joint Requirements Oversight Council; serves to support milestone review, validate the Operational Requirements Document, and validate mission need.
LOSFH:
Line of Sight—Forward Heavy; LOSFH was the generic name given early on to the conceptual ADATS system, while ADATS was the particular piece of hardware/software that was chosen to fill that role.
M&S:
Modeling and Simulation
MCOTEA:
Marine Corps Operational Test and Evaluation Activity
MDAPs:
Major Defense Acquisition Programs; ACAT I programs are MDAPs.
Measure of Effectiveness (MOE):
A measure of how well an operational task or an assigned task is accomplished; can be measured directly or may require the aggregation of MOPs.
Measure of Performance (MOP):
A measure of how well a system performs its function or how well a design characteristic meets an operational requirement.
Milestone:
One of five steps in the procurement of a weapons system; Milestone 0 is the concept studies approval, Milestone I is the concept demonstration approval, Milestone II is the development approval, Milestone III is the production approval, and Milestone IV is the major modification approval.
Mission Needs Statement (MNS):
a conceptual document, prepared by the relevant military service in response to a perceived threat, that is supposed to identify a broadly stated operational need (not a specific solution to counter the perceived threat).
MTBOMF:
Mean Time Between Operational Mission Failure
MTTF:
Mean Time To Failure
Operational Effectiveness:
the capability of a system to perform its mission in the operational environment, and in the face of the expected threat, including countermeasures.
OMS/MP:
Operational Mode Summary and Mission Profiles; defines the environment and stress levels the system is expected to encounter in the field, including the overall length of the scenarios of use, the sequence of missions, and the maintenance opportunities.
Operational Requirements Document (ORD):
a document that describes in some detail the translation from the broadly stated mission need to the system performance parameters that the users and the program manager believe the system must have to justify its eventual procurement.
Operational Suitability:
the degree that a system can be placed satisfactorily in field use with consideration given to availability, compatibility, transportability, interoperability, wartime usage rates, maintainability, safety, human factors, manpower supportability, logistics supportability, natural environmental effects and impacts, documentation, and training requirements.
Operational Testing and Evaluation (OT&E):
pertains to field tests, under realistic conditions, to determine system effectiveness and suitability for use in combat by typical military users; assesses when and where the system will work.
OPTEC:
Army Operational Test and Evaluation Command
OPTEVFOR:
Navy Operational Test and Evaluation Force
OSD:
Office of the Secretary of Defense
PA&E:
Office of Program Analysis and Evaluation
PEO:
Program Executive Officer
PM:
Program Manager; the "champion" in the Department of Defense of a military system in development.
RAM:
Reliability, Availability, and Maintainability
ROC:
Required Operational Capability
Test and Evaluation Master Plan (TEMP):
documents the overall structure and objectives of the test and evaluation program, provides a framework for generating detailed test and evaluation plans, and documents associated schedule and resource implications.
USD(A&T):
Under Secretary of Defense (Acquisition and Technology)
VV&A:
Verification, Validation, and Accreditation