USAWC STRATEGY RESEARCH PROJECT

Transformation of Army Test and Evaluation

by

Mr. Robert L. Bowen

Department of the Army Civilian

COL Donald R. Yates

Project Advisor

This SRP is submitted in partial fulfillment of the requirements of the Master of Strategic Studies Degree. The views expressed in this student academic research paper are those of the author and do not reflect the official policy or position of the Department of the Army, Department of Defense, or the U.S. Government.

U.S. Army War College

Carlisle Barracks, Pennsylvania 17013


ABSTRACT

AUTHOR: Robert L. Bowen

TITLE: Transformation of Army Test and Evaluation

FORMAT: Strategy Research Project

DATE: 19 March 2004 PAGES: 34 CLASSIFICATION: Unclassified


In the late 1990s, the U.S. Army implemented major organizational and procedural changes in the conduct of “independent evaluations” of Army developmental systems. While these changes significantly improved the planning and conduct of independent evaluations in support of the materiel acquisition process, disconnects and inefficiencies persist within and between the Army’s test and evaluation and analysis communities. This Strategy Research Project describes the recent evolution of the organizations and processes involved in Army evaluation and then proposes additional changes to further streamline Army weapon systems analysis and evaluation.


TABLE OF CONTENTS

ABSTRACT

LIST OF ILLUSTRATIONS

LIST OF TABLES

Transformation of Army Test and Evaluation

Strategic Relevance

T&E and Analysis Organization Interdependencies

Director, Operational Test and Evaluation (DOT&E)

Army Test and Evaluation Command (ATEC)

Army Research Laboratory (ARL)

Survivability/Lethality Analysis Directorate (SLAD)

Human Research and Engineering Directorate (HRED)

Army Materiel Systems Analysis Activity (AMSAA)

Program Manager (PM)

Training and Doctrine Command (TRADOC)

Center for Army Analysis (CAA)

Operational Forces

Recent Organizational Changes

Room for Improvement

Army Structural Suggestions

Joint Operational Test Realism

Analysis

Test Unit Realism

Test Operations Realism

Operational Effectiveness

System Cost

Force Readiness

Schedule

Statutory

Joint Operational Test Realism Recommendation

Conclusions

ENDNOTES

BIBLIOGRAPHY





LIST OF ILLUSTRATIONS

Figure 1. T&E and Analysis Interdependencies

Figure 2. RDECOM Organization



LIST OF TABLES

Table 1. Comparison of COA Risks



Transformation of Army Test and Evaluation

The purpose of Test and Evaluation (T&E) during the development and acquisition of a defense system is to identify and understand the areas of risk that must be accepted, reduced, or eliminated.[1] A desire to control costs and to reduce acquisition cycle time (and more quickly field the latest technological advances) often manifests itself in pressure to reduce the scope of, or forego portions of, the T&E process, including Operational Test and Evaluation (OT&E). However, as a 1987 GAO report on OT&E points out, inadequate T&E can lead to increased operational risk:

If adequate OT&E is not done and the weapon system does not perform satisfactorily in the field, significant changes may be required. Moreover, the changes will not be limited to a few developmental models, but may also be applied to items already produced and deployed. In extreme situations, DoD also risks (1) deploying systems, which cannot adequately perform significant portions of their missions, thus degrading our deterrent/defensive capabilities and (2) endangering the safety of military personnel who operate and maintain the systems.[2]

While the primary purpose of T&E is to support acquisition decisions, a secondary purpose is to support the broader Army and Defense “analysis community.” This community comprises the various analysis organizations that determine force structure and warfighting requirements. Test data are used to validate estimates of system performance and the Models and Simulations (M&S) used by these organizations. Similarly, validation with test data improves the pedigree of the performance estimates used by operational forces in exercises and experiments, as well as in weaponeering. Weaponeering is defined by the Air Force as “the process of estimating the quantity of a specific type weapon required to achieve a specific level of damage to a given target, considering target vulnerability, weapon effects, munition delivery errors, damage criteria, probability of kill, weapon reliability, etc.”[3]
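To make the weaponeering definition concrete, the sketch below works the simplest case of that estimate: independent weapon deliveries against a single target, each with the same single-shot kill probability. It is a minimal illustration only, not the JTCG/ME methodology, which also accounts for delivery errors, weapon reliability, and target vulnerability; the function name and example values are invented for this sketch.

```python
import math

def weapons_required(p_kill: float, p_damage_goal: float) -> int:
    """Smallest number of independent deliveries n such that the
    cumulative damage probability 1 - (1 - p_kill)**n meets the goal."""
    if not (0 < p_kill < 1 and 0 < p_damage_goal < 1):
        raise ValueError("probabilities must be strictly between 0 and 1")
    # Solve (1 - p_kill)**n <= 1 - p_damage_goal for the smallest integer n.
    return math.ceil(math.log(1 - p_damage_goal) / math.log(1 - p_kill))

# Illustrative values only: a weapon with a 0.4 single-shot kill
# probability needs 5 deliveries to reach a 0.9 probability of damage.
print(weapons_required(0.4, 0.9))  # -> 5
```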

During the past decade, the Army implemented major changes to the organizational structure, responsibilities, and relationships of its T&E and analysis organizations. These changes were intended to improve efficiency by consolidating similar functions that were previously scattered across various Army organizations. While the changes made to date have led to improvements, room for further improvement remains. For example, T&E functions are still split between the Army Test and Evaluation Command (ATEC) and the Army Materiel Command (AMC). Furthermore, the changes made to date have not eliminated the long-running criticism that the T&E process does not adequately test systems in a realistic joint operational context. This Strategy Research Project describes the strategic relevance of T&E, characterizes the interaction of the Army’s analysis community with T&E, and reviews the organizational evolution of the Army’s T&E infrastructure. It then proposes additional changes to streamline business processes and improve Army evaluation and analysis capabilities.

Strategic Relevance

The Research, Development, Test and Evaluation (RDT&E) of military systems directly supports the Ends, Ways, and Means set forth in various defense policy documents. For example, the Defense Policy Goals (Ends) contained in the 2001 Quadrennial Defense Review are “Assuring Allies and Friends,” “Dissuading Future Military Competition,” “Deterring Threats and Coercion Against U.S. Interests,” and “If Deterrence Fails, Decisively Defeat Any Adversary.” Among the seven Strategic Tenets (Ways) to achieve these goals, RDT&E is a component of four: “Managing Risks,” “A Capabilities-Based Approach,” “Developing a Broad Portfolio of Military Capabilities,” and “Transforming Defense.”[4] The Quadrennial Defense Review also points out that one of the four “pillars” of transformation is experimentation using wargaming, simulations, and field exercises.[A]

The objective (End) of the Defense Acquisition System, of which RDT&E is primary among the Ways, is “to acquire quality products that satisfy user needs with measurable improvements to mission capability and operational support, in a timely manner, and at a fair and reasonable price.”[5] The standard T&E process (Means) currently used to support the acquisition of new materiel is described in the DoD Interim Defense Acquisition Guidebook and Army Regulation 73-1, Test and Evaluation Policy. The “product” of the Army T&E process is an understanding of system capabilities, documented in integrated (developmental and operational) evaluations that are used to inform production and fielding decisions. This process consists of the collection of data from Developmental Tests, Operational Tests, M&S, Demonstrations, and Experiments in order to evaluate the Effectiveness, Suitability, and Survivability of the system under development.[6] Developmental Test and Evaluation is used as an engineering development/design tool and to verify the inherent technical capabilities offered by new technologies and systems.[7] Operational Test and Evaluation is defined as:

“the field test, under realistic combat conditions, of any item of (or key component of) weapons, equipment, or munitions for the purpose of determining the effectiveness and suitability of the weapons, equipment or munitions for use in combat by typical military users; and the evaluation of the results of such test.”[8]

Operational Test and Evaluation is thus used to assess the degree to which soldiers can leverage a system’s technical capabilities in a realistic operational field test context. Title 10, United States Code sets forth statutory requirements for the conduct of Operational Test and Evaluation[9] and realistic survivability/lethality Live Fire Test and Evaluation[10] for major systems and munitions prior to proceeding beyond Low-Rate Initial Production.

T&E and Analysis Organization Interdependencies

Numerous organizations within the Army and the Defense Department rely on weapon-system data in the conduct of their analysis missions. Figure 1 depicts the typical information flow between these organizations. The paragraphs that follow describe the missions and relationships of key organizations in Army T&E and analysis.

Figure 1. T&E and Analysis Interdependencies

Director, Operational Test and Evaluation (DOT&E)

In 1983, Congress created the statutory requirement for the Defense Department to establish a DOT&E to oversee Service Operational Test and Evaluation programs. In 1994, the DOT&E assumed responsibility for oversight of Live Fire Test and Evaluation (LFT&E) programs. The Director, a Presidential appointee requiring Senate confirmation, prepares annual reports to Congress on all major defense acquisition programs. The functions and duties of DOT&E are described in Title 10, United States Code, Sections 139, 2366, 2399, and 2400.[11] The responsibilities of DOT&E are specified in Section 139:

The Director shall –

(1) prescribe, by authority of the Secretary of Defense, policies and procedures for the conduct of operational test and evaluation in the Department of Defense;

(2) provide guidance to and consult with the Secretary of Defense and the Under Secretary of Defense for Acquisition, Technology, and Logistics and the Secretaries of the military departments with respect to operational test and evaluation in the Department of Defense in general and with respect to specific operational test and evaluation to be conducted in connection with a major defense acquisition program;

(3) monitor and review all operational test and evaluation in the Department of Defense;

(4) coordinate operational testing conducted jointly by more than one military department or defense agency;

(5) review and make recommendations to the Secretary of Defense on all budgetary and financial matters relating to operational test and evaluation, including operational test facilities and equipment, in the Department of Defense; and

(6) monitor and review the live fire testing activities of the Department of Defense provided for under section 2366 of this title.[12]

The DOT&E is also responsible for the execution of the Joint Technical Coordinating Group for Munitions Effectiveness (JTCG/ME), which has the mission of publishing operational effectiveness estimates for all non-nuclear weapons and standardizing the effectiveness measures and methodologies used by the services. These estimates are published in Joint Munitions Effectiveness Manuals and include descriptions of weapon system characteristics such as detection ranges, engagement ranges, fly-out times, delivery accuracy, reliability, and kill probabilities. JTCG/ME data and Joint Munitions Effectiveness Manuals are used by a variety of U.S. and Allied combatant commanders, analysis organizations, trainers, and defense planners in weaponeering, training, tactics development, and weapons systems studies.[13]
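To illustrate the kinds of weapon-system characteristics such a manual entry aggregates, the following sketch defines a hypothetical record type. The field names and values are invented for illustration and do not represent the actual JMEM format or any real JMEM data.

```python
from dataclasses import dataclass

# Hypothetical record grouping the categories of characteristics the
# text lists; this is an illustrative structure, not the JMEM schema.
@dataclass
class WeaponEffectivenessRecord:
    weapon: str
    target_class: str
    detection_range_km: float
    engagement_range_km: float
    flyout_time_s: float
    delivery_error_cep_m: float   # circular error probable
    reliability: float            # probability the munition functions
    p_kill_single_shot: float     # kill probability given a functioning hit

# Invented example values, for illustration only.
example = WeaponEffectivenessRecord(
    weapon="notional anti-armor missile",
    target_class="notional armored vehicle",
    detection_range_km=4.0,
    engagement_range_km=3.0,
    flyout_time_s=12.0,
    delivery_error_cep_m=0.5,
    reliability=0.95,
    p_kill_single_shot=0.6,
)
```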

Army Test and Evaluation Command (ATEC)

The Army is unique among the services in having a single organization, ATEC, which is responsible for developmental testing, operational testing, and the continuous (through all phases of a program’s life cycle) integrated (developmental and operational) evaluation of materiel. The primary “products” of ATEC are test data and system evaluations. DOT&E uses these products as the primary input for its “beyond low-rate initial production” reports to Congress, and Program Managers and Acquisition Executives use them in materiel acquisition and fielding decisions. ATEC evaluations determine the degree to which materiel is effective, suitable, and survivable, and typically include recommended system improvements or “operational work-arounds” (changes to Tactics, Techniques, and Procedures) to overcome capability limitations observed in testing.

The ATEC commander is a major general who reports directly to the Vice Chief of Staff of the Army. ATEC comprises three major subordinate commands. The Developmental Test Command, headquartered at Aberdeen Proving Ground, Maryland, manages developmental test centers throughout the U.S. and plans, conducts, and reports on developmental tests. The Operational Test Command, headquartered at Fort Hood, Texas, manages operational test centers throughout the U.S. and plans, conducts, and reports on operational tests. The Army Evaluation Center, headquartered in Alexandria, Virginia, develops evaluation plans; determines data requirements and sources (analysis, developmental testing, operational testing, M&S, exercises); observes testing; and evaluates system effectiveness, suitability, and survivability. Also unique among the services, ATEC, as the Army’s Operational Test Agency, is responsible for defining LFT&E requirements and reporting on LFT&E results (program managers assume this responsibility in the other services). The unique characteristics of ATEC activities were endorsed by a 1999 Defense Science Board recommendation, which implicitly urged the other services to adopt the Army/ATEC model: