Test and Evaluation in Support of Systems Acquisition
Summary of Change
Department of the Army
Washington, DC
30 May 2003
Department of the Army
Pamphlet 73-1
Test and Evaluation
Test and Evaluation in Support of Systems Acquisition
By Order of the Secretary of the Army:
ERIC K. SHINSEKI
General, United States Army
Chief of Staff
Official:
JOEL B. HUDSON
Administrative Assistant to the
Secretary of the Army
History. This publication is an administrative revision. The portions affected by this administrative revision are listed in the summary of change.
Summary. This pamphlet provides guidance and procedures to implement test and evaluation policy for materiel and information technology systems as promulgated by AR 73-1. It outlines the basic Army test and evaluation philosophy; general test and evaluation guidance in support of materiel systems acquisition and information technology systems acquisition; test and evaluation guidance in support of system modifications and non-developmental items; the test and evaluation working-level integrated product team; preparation, staffing, and approval of the test and evaluation master plan; detailed guidance on preparation, staffing, and approval of critical operational issues and criteria, to include key performance parameters; guidance on the planning, conduct, and reporting of system evaluation; and guidance on the planning, conduct, and reporting of testing (that is, developmental and operational), to include test support packages, test incidents, corrective actions, instrumentation, targets, and threat simulators.
Applicability. The provisions of this pamphlet apply to the Active Army, the Army National Guard of the United States, and the U.S. Army Reserve. This pamphlet is not applicable during mobilization.
Proponent and exception authority. The proponent of this pamphlet is the Assistant Secretary of the Army (Acquisition, Logistics and Technology). The proponent has the authority to approve exceptions to this pamphlet that are consistent with controlling law and regulation. The proponent may delegate this approval authority, in writing, to a division chief within the proponent agency who holds the grade of colonel or the civilian equivalent.
Suggested improvements. Users are invited to send comments and suggested improvements on DA Form 2028 (Recommended Changes to Publications and Blank Forms) via email to usarmy.pentagon.hqda-asa-alt.mbx.asa-alt-publication-updates@army.mil.
Distribution. This publication is available in electronic media only and is intended for command levels A, B, C, D, and E for the Active Army, the Army National Guard of the United States, and the U.S. Army Reserve.
Table of Contents
Chapter 1 Introduction
Chapter 2 Test and Evaluation Working-Level Integrated Product Team
Chapter 3 Test and Evaluation Master Plan
Chapter 4 Critical Operational Issues and Criteria
Chapter 5 System Evaluation
Chapter 6 Testing
Glossary
Appendix B Test and Evaluation Master Plan Checklist
Appendix C Test and Evaluation Master Plan Approval Pages
Appendix D Test and Evaluation Master Plan Format and Content
Appendix E Critical Operational Issues and Criteria Format and Content
Appendix F Critical Operational Issues and Criteria Process Guide
Appendix G Critical Operational Issues and Criteria Checklist
Appendix H Critical Operational Issues and Criteria Development Example
Appendix I Survivability and Vulnerability Issue: System Evaluation Considerations
Appendix J Live Fire Vulnerability/Lethality Issue: System Evaluation Considerations
Appendix K Reliability, Availability, and Maintainability Issues: System Evaluation Considerations
Appendix L Logistics Supportability (Including Transportability) Issue: System Evaluation Considerations
Appendix M Manpower and Personnel Integration Issue: System Evaluation Considerations
Appendix N System Safety Issue: System Evaluation Considerations
Appendix O Interoperability Issue: System Evaluation Considerations
Appendix P Natural Environmental Issue: System Evaluation Considerations
Appendix Q Software Issue: System Evaluation Considerations
Appendix R Department of the Army Test Facilities
Appendix S Live Fire Testing
Appendix T Software Testing
Appendix U Test and Evaluation Documentation Overview
Publication Summary

Document: Detailed test plan (DTP)
Reference: AR 73-1
Proponent: Test organization
Description: The DTP is an event-level document used to supplement the EDP by providing explicit instructions for the day-to-day conduct of a test. It is derived from and implements the SEP, and governs test control, data collection, data analysis, and the necessary administrative aspects of the test program. There may be one or several DTPs, depending on the complexity of the program and the number of test sites or test facilities providing data. The DTP is coordinated with the system evaluator and with other T&E WIPT members, if necessary, to ensure that it accurately and completely reflects the requirements for data, information, and analysis set forth in the EDP (if available). DTPs for FUSL LFT&E are submitted through the DUSA (OR) to the DOT&E for approval. See appendix S for LFT DTP information.

Document: Developmental test readiness statement (DTRS)
Reference: AR 73-1
Proponent: Materiel developer
Description: The DTRS is a written statement prepared by the chair of the DTRR as part of the minutes. The statement documents that the materiel system is ready for the PQT or the IT system is ready for the SQT. See chapter 6.

Document: Doctrine and organization test support package (D&O TSP)
Reference: DA Pam 73-1
Proponent: U.S. Army Training and Doctrine Command (TRADOC) (combat developer)
Description: The D&O TSP is a set of documentation prepared or revised by the CBTDEV or FP for each OT supporting an acquisition milestone decision. Major components of the D&O TSP are means of
Appendix V Test Incident and Corrective Action Reporting
Appendix W Survivability Testing
Appendix X Operational Testing Entrance Criteria Templates
Appendix Y Threat Considerations for Testing
Appendix Z Instrumentation, Targets, and Threat Simulators
Appendix A

Section A1. This section should provide a summary table identifying the significant entities represented in the simulation, the function of each, an indicator of the level of confidence in the representation of that entity and function, and any comments.

Section A2. This section should include a representative sample of the results of tests or comparisons performed as part of the simulation validation effort and as described in the simulation validation plan. Tests or comparisons that illustrate simulation errors, limitations, or differences from the threat should be included as well. In most cases, these results will appear as graphs.

Section A3. This section, when applicable, should contain the standard validation criteria from the appropriate appendix/annex of the DoD Threat Simulator Program Plan with all the threat simulator/target data. In cases where the simulator/target has been made programmable, do not simply state "programmable."
