Homework 4: Test and Evaluation Master Plan
Wireless Immersive Training Vest Monitoring System

Prepared by SYSENG 368 Group 5:

Chris Blanchard
Gareth Caunt
Michael Donnerstein
Christopher Neuman
Varun Ramachandran

Submitted: November 27, 2012


Contents

1. Introduction

1.1. Test Levels

2. Test Environment and Schedule

2.1. System Component Tests

2.2. Organizational Structure

2.3. Testing Environments

2.4. Test Schedule

3. Resource Requirements

4. Test Material and Training

5. Test Methods and Evaluation

5.1. Test Methodology

5.2. Test Conditions and Pass/Fail Criteria

5.3. Test Deliverables

6. Test Descriptions

6.1. Test Description and Responsibilities

6.2. OEM Testing

6.3. System Testing


1. Introduction

This Test and Evaluation Master Plan (TEMP) provides an integrated plan for testing the project throughout the development and customer acceptance portions of the lifecycle, and aids in establishing design feasibility. Operational utilization, operational training, and support system testing (Type 4 testing, from Systems Engineering and Analysis, Blanchard and Fabrycky, 5th edition, 2011) will be ongoing, will extend beyond initial system delivery, and may involve incorporation of new technologies prior to system retirement.

The TEMP is a “living” document that iterates and evolves through the project lifecycle, primarily during the development phases. At this stage of the development lifecycle, very little detail is available for the Test Descriptions. This version of the document therefore provides details on the test methods and evaluations for each requirement, describing the method we will use to verify each one. A subsequent version of this document will include a set of verifications for the Tier 0 and Tier 1 requirements, as agreed with the customer.

Verification methods for the requirements can be inspection, analysis, demonstration, or test. Each requirement's verification method affects the test environment and resources, which in turn drive the program schedule. Training for the test engineers and any support staff can then be determined based on the skill set required to perform each verification and the tools needed. The test program follows the development phases, and customer involvement at all stages is encouraged. The customer is required to agree to the verification methodology for the Tier 1 requirements and the validation methods for the Tier 0 requirements and the statement of need.

1.1. Test Levels

The following test levels were considered during the development of this document:

  1. Subsystem Integration Tests – Describes how testers will examine the subsystems to ensure that they interface properly, and how problems will be reported;
  2. System Test – Describes the testing to be done to determine overall system compliance with standards and satisfaction of functional and technical requirements;
  3. User Acceptance Test – Describes the testing to be done to evaluate the system and its functionality as it is intended to be deployed; and
  4. Security Test – DoD-related systems require this test to ensure compliance with the final requirements document.

The customer has agreed that the project is unclassified; therefore, no security-level tests are required. The remaining test levels are discussed further in Section 5 of this plan.

2. Test Environment and Schedule

Testing protocols will be developed to sequentially test each component of the system individually. Once the system successfully passes the individual tests, the testing protocol will move to more advanced stages that exercise all of the system components working together. Final-round testing will require the system to meet customer expectations when exposed to real mission environmental elements and duty cycles.

2.1. System Component Tests

  • Wireless Network Testing – Initial testing of the wireless network will attempt to capture and display data from each of the legacy systems mounted on a single soldier. Field personnel will confirm that the data displayed in the control room originated from the test soldier. Testing will begin with a single legacy device being activated and will progress until all devices are in use simultaneously. Network bandwidth usage will be monitored throughout testing (a sketch of this incremental procedure follows this list).
  • Trainer Room Display and Interface – Testing of the display software and hardware that the trainer will utilize in the control room will follow a methodology similar to the above. In addition, alert times will be monitored for comparison with customer requirements. Finally, the trainer’s interface for sending feedback to the test soldiers will be tested.
  • Data recording system – Once field testing is completed, the data recorded from the training session will be reviewed for accuracy and ease of data retrieval.
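As an illustration of the incremental wireless network test described above, the following is a minimal sketch in Python. The device names, the settling delay, and the bandwidth budget are assumptions for illustration only; the real test would sample throughput with the project's actual packet-capture tooling rather than the simulated stub shown here.

```python
import random
import time

# Legacy devices activated one at a time during the network test (names assumed).
LEGACY_DEVICES = ["Missouri Mote", "ITV", "Bioharness", "VCMS", "RT-19 Tactor"]
BANDWIDTH_BUDGET_KBPS = 250  # hypothetical limit; replace with the requirement value

def measure_bandwidth_kbps(active_devices):
    """Stub: returns a simulated throughput sample.

    In the real test this would read from the packet-capture tool
    monitoring the wireless router.
    """
    return sum(random.uniform(20, 45) for _ in active_devices)

def run_incremental_network_test():
    """Activate devices one at a time and log bandwidth against the budget."""
    results = []
    active = []
    for device in LEGACY_DEVICES:
        active.append(device)            # bring one more device online
        time.sleep(5)                    # allow traffic to stabilize (assumed delay)
        kbps = measure_bandwidth_kbps(active)
        results.append({
            "devices": list(active),
            "kbps": round(kbps, 1),
            "pass": kbps <= BANDWIDTH_BUDGET_KBPS,
        })
    return results

if __name__ == "__main__":
    for entry in run_incremental_network_test():
        print(entry)
```

Each log entry records which devices were active, so a bandwidth failure can be traced to the device whose activation pushed usage over the budget.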

2.2. Organizational Structure

  • Project Team 5 – Development team members will be involved in testing the system. Involvement will range from being test subjects, to test trainers and field observers. The Project team will also be responsible for troubleshooting and field repairs that are required during testing.
  • Missouri S&T Personnel – Team mentors and advisors as well as the development personnel for legacy systems will be invited to observe and evaluate testing protocols.
  • COTS Vendor Support Staff – Vendor support is expected for initial testing of COTS products. Vendor expertise will be utilized to streamline interface problems and minimize testing delays.
  • Customer – The customer or his designees will be welcome to observe or participate in testing.

Post-training debriefing will involve all parties, and subsequent testing protocols or system corrections will be discussed.

2.3. Testing Environments

Testing will progress from a controlled indoor field test with a single subject soldier up to a full outdoor environment, across the required temperature, humidity, and weather conditions, with multiple soldiers.

2.4. Test Schedule

The final testing schedule has yet to be determined; however, it is anticipated that all testing will be performed between January 22 and March 22, 2013. These dates correspond to the testing schedule already presented to the customer at both the Conceptual and Preliminary Design Reviews.

3. Resource Requirements

Minimum resource requirements for the testing protocols are listed below. As testing progresses to more complex assignments that will more closely mimic the performance expected in the field, multiple units will be required for testing the system with multiple soldiers.

  • Control Room Laptop – The control room laptop should be preloaded with all software that has been developed. The laptop shall also be connected to the system's wireless network, and all typical computer peripherals shall be available.
  • Wireless Network – Wireless routers, router antennas, and Power over Ethernet (PoE) cabling shall be available and installed for testing.
  • Legacy Subsystems – Legacy subsystems shall be available for system testing and should consist of (but are not necessarily limited to) the following:
      • Missouri Mote
      • Integrated Training Vest (ITV)
      • Bioharness
      • Virtual Cultural Monitoring System
      • RT-19 Tactor Feedback System
  • Personnel Requirements – Minimum personnel requirements for testing shall be one mock trainer, one mock soldier and one field observer for each soldier and trainer. Initial testing can be accomplished with four people. Subsequent tests of multiple test soldiers will require additional personnel.
  • Software Requirements - All software shall be preinstalled on the trainer laptop. The software will consist of developed data monitoring software, trainer GUI software, modified Bioharness display software and any additional software required or requested by the customer.
  • Supply Support Requirements – For initial tests which utilize single components, additional components shall serve as spares for change out during testing if failures occur. Any failures will be replaced and subsequent testing shall be performed to confirm that a fully functional suite of components is delivered to the customer.

4. Test Material and Training

  • Test Material – All Owner/Operation manuals shall be available for all devices. In addition, copies of the test methods and evaluations, and test descriptions (listed below) will be included as a part of the testing documentation for all testing personnel. A formal itemized testing procedure will be included to allow testing to adhere to a strict and comprehensive schedule which will test each component individually and in conjunction with the entire system.
  • Test Training – Prior to actual field testing commencing, test protocols will be agreed to by the customer and the development team. A ‘walk-through’ training session will be conducted utilizing only team members before committing equipment, expense, or personnel to actual testing. This walk-through will be performed to find any gaps in the proposed testing before testing expenditures are incurred.

5. Test Methods and Evaluation

5.1. Test Methodology

The testing of the system will be carried out in three levels. Each level is planned and executed within defined boundaries, and the test approach involves both the development team and the transition team. The results of each level are then documented and submitted for evaluation. The three levels are:

  • Subsystem Integration Test – The subsystem integration tests focus on testing the external application programming interfaces between subsystems. There are several approaches to performing subsystem integration testing, such as modeling, simulation, functional, top-down, bottom-up, inspection, and analysis. All of the subsystems are commercially available. The tests are carried out by the development and transition teams. All existing software has already been tested, and new software must successfully complete a code peer review and unit testing before it enters integration testing (a minimal sketch of such an interface test follows this list of levels).

All defects are identified and examined before the subsystems are subjected to extensive testing. The final subsystem integration results, including the test data sets and outputs from the tests, may be delivered as part of the final test plan.

  • System Test – This level determines system compliance. System documents and training manuals are examined for accuracy, validity, completeness, and usability. During this test period, software performance, response time, and the ability to operate under stress conditions are tested. External system interfaces are also tested. All findings are documented in a system analysis report. Testing at this level requires:
      • Installation of the system; and
      • Testing the system in different phases to check its operation, as follows:

1) The system is first tested in the preferred configuration.

2) The different parts of the system are installed:

      • Missouri Mote
      • Integrated Training Vest
      • Bioharness
      • Virtual Cultural Monitoring System
      • RT-19 Tactor Feedback System

3) All cabling is tested for operation.

  • User Acceptance Test – This test is performed once the whole system is assembled and deemed functional. It is conducted to determine whether the requirements have been met. Every system feature may be tested for correctness and satisfaction of functional requirements. System interoperability, system reliability, documentation, and the degree to which the system meets user requirements are evaluated. The tests are accompanied by test case input data, a formal description of the operational activities, and the expected results. Performance tests may be executed to ensure that receive and send operations, data recording, and data accessibility are addressed. These tests are monitored by the development team, transition team, project manager, and the contractors. Prior to final completion of the acceptance test, all major defects will be corrected and verified by the test teams and the manager.
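To make the subsystem integration level concrete, below is a minimal sketch of an interface test between a legacy data feed and the monitoring software, written in Python. The parser function, frame format, and field names are illustrative assumptions, not the actual Bioharness interface; the real tests would exercise the vendor-documented APIs.

```python
import unittest

# Hypothetical frame format for illustration: "device_id,heart_rate,timestamp"
def parse_biometric_frame(raw: str) -> dict:
    """Parse one comma-separated telemetry frame into a record (assumed format)."""
    device_id, heart_rate, timestamp = raw.strip().split(",")
    return {
        "device_id": device_id,
        "heart_rate": int(heart_rate),
        "timestamp": float(timestamp),
    }

class BioharnessInterfaceTest(unittest.TestCase):
    """Integration-style checks on the subsystem boundary (illustrative only)."""

    def test_well_formed_frame_is_parsed(self):
        record = parse_biometric_frame("BH-01,72,1353974400.0")
        self.assertEqual(record["device_id"], "BH-01")
        self.assertEqual(record["heart_rate"], 72)

    def test_malformed_frame_is_rejected(self):
        # A truncated frame must raise rather than silently produce bad data.
        with self.assertRaises(ValueError):
            parse_biometric_frame("BH-01,72")

if __name__ == "__main__":
    unittest.main()
```

The second test case illustrates the defect-identification emphasis above: integration testing checks not only that good data crosses the interface, but that malformed data is rejected visibly.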

5.2. Test Conditions and Pass/Fail Criteria

  • Each system component specified will have threshold values assigned.
  • The test results are then compared against the threshold values and a pass/fail decision is made (a sketch of this comparison logic follows this list).
  • Previously tested requirements will not be retested, to minimize time and cost.
  • Each system-level unit is tested for proper interfacing.
  • All defects are listed and categorized into severity levels. Each severity level is then examined, and each level is retested to confirm functionality.
  • Test results are recorded to assist in further tests and retests.
  • A limited number of failures will be defined for each component, after which an order is placed for a set of new components.
  • A permissible number of defects will be defined. If the number of defects surpasses the maximum value, testing will be halted and the design of the system will be reviewed.
  • The development team, transition team, and the project manager will receive constant updates on the tests being carried out and the test results being documented. The pass criteria will be refined to achieve better results as we move from one review to the next.
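A minimal sketch of the threshold comparison and defect-limit halting logic described above, in Python. The metric names, threshold values, and defect ceiling are placeholders for illustration; the actual values would come from the requirements baseline.

```python
# Hypothetical thresholds per metric (values illustrative, not from the requirements).
THRESHOLDS = {
    "alert_latency_s": 2.0,      # max time for an alert to reach the control room
    "packet_loss_pct": 1.0,      # max acceptable packet loss on the wireless link
}
MAX_OPEN_DEFECTS = 5             # assumed ceiling before testing halts for review

def evaluate(measurements: dict, open_defects: int) -> dict:
    """Compare measured values against thresholds; flag a halt if defects pile up."""
    failures = [
        name for name, value in measurements.items()
        if value > THRESHOLDS[name]       # lower is better for these metrics
    ]
    return {
        "failures": failures,
        "passed": not failures,
        "halt_for_design_review": open_defects > MAX_OPEN_DEFECTS,
    }

# Example: one passing run with a small defect backlog.
print(evaluate({"alert_latency_s": 1.4, "packet_loss_pct": 0.3}, open_defects=2))
```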

5.3. Test Deliverables

The test deliverables are an important set of documents that describe the testing activities from the start of the testing phase to its end. The important test deliverables are:

a) Test Plan - Describes the test methodology and the type of testing required. It also gives the schedule and represents a framework for the overall testing activities.

b) Test cases - Developed to implement one or more system test conditions.

c) Test design specifications - Gives the detailed test conditions and the expected results as well as test pass criteria.

d) Test procedure specification - Tells us how the test is to be conducted. It also contains any preconditions and steps that need to be followed.

e) Test Log - Records the results of the tests completed, who performed each test, and whether the test passed or failed (a minimal record structure is sketched after this list).

f) Test Incident Report - A detailed report explaining the test performed. This is mostly produced for failed tests and gives a comparison of the actual and expected results.

g) Test Summary Report - A report that assesses the quality of the testing conducted and notes any missed tests or information not covered. All statistics resulting from the tests are recorded.

h) Performance Evaluations - Personnel performance evaluations will be conducted regularly and submitted for presentation in the meetings.
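As a minimal illustration of the Test Log deliverable, the following Python dataclass shows one way a log record could be structured. The field set is an assumption for illustration; the actual log format would be agreed with the customer.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class TestLogEntry:
    """One row of the test log deliverable (fields assumed for illustration)."""
    test_id: str                 # e.g. a hypothetical "SYS-01" from Section 6
    description: str
    performed_by: str
    passed: bool
    notes: str = ""
    timestamp: datetime = field(default_factory=datetime.now)

# Example entry for a completed system test.
entry = TestLogEntry(
    test_id="SYS-01",
    description="Operational Area Test at 50 m with one obstacle",
    performed_by="Field Observer 1",
    passed=True,
)
print(entry)
```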

6. Test Descriptions

6.1. Test Description and Responsibilities

This section provides information about each individual test; the number of tests outlined here is truncated for space. The examples are split into OEM and system testing. OEM testing is defined as testing required at the subsystem level and carried out by the supplier from which the COTS items were purchased. The supplier testing of the subsystems must meet the requirements of the system as a whole. Examples are displayed in the matrices below. These are just a few examples; tens more would be needed to fully test and evaluate the system.


6.2. OEM Testing

Test 1 – Storage Device Life Test
  Description: Data shall be written and rewritten to hard drives to prove drive life.
  Control scheme: A drive shall be selected at random from end-of-line production, running at rate speed, and placed into a laptop.
  Inputs: Sample data.
  Outputs: Drive fidelity.
  Procedure: Data shall be written and rewritten from a sample file mimicking real data inputs from actual in-field testing. After writes equivalent to 80,000 hours of training time, which is the currently specified system life, the drive will be evaluated for successful operation (a sketch of this write-verify loop follows this table).

Test 2 – Bioharness Shock and Vibration Testing
  Description: The Bioharness shall prove robust to the field environment.
  Control scheme: A Bioharness shall be selected at random from end-of-line production, running at rate speed, and placed into a fixture on a vibration table.
  Inputs: PSD profile.
  Outputs: Harness functionality.
  Procedure: The Bioharness shall be placed into a fixture on an electric shaker table. An agreed-upon power spectral density (PSD) profile shall be used to provide vibration and vertical acceleration inputs. Each PSD profile shall represent one life; after each life, the Bioharness shall be checked for successful operation.
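Below is a minimal sketch of the write-verify cycle for the storage device life test, in Python. The file path, block size, and cycle count are stand-ins for illustration; the real test would scale the write volume to the 80,000-hour equivalent.

```python
import hashlib
import os

SAMPLE_BLOCK = os.urandom(1024 * 1024)   # 1 MiB stand-in for recorded training data
TEST_FILE = "drive_life_test.bin"        # hypothetical path on the drive under test
CYCLES = 1000                            # stand-in; real count scales to 80,000 h

def write_verify_cycle() -> bool:
    """Write the sample block, read it back, and compare checksums."""
    with open(TEST_FILE, "wb") as f:
        f.write(SAMPLE_BLOCK)
        f.flush()
        os.fsync(f.fileno())             # force the data onto the physical drive
    with open(TEST_FILE, "rb") as f:
        read_back = f.read()
    return hashlib.sha256(read_back).digest() == hashlib.sha256(SAMPLE_BLOCK).digest()

for cycle in range(CYCLES):
    if not write_verify_cycle():
        print(f"Drive fidelity failure at cycle {cycle}")
        break
else:
    print("All cycles verified successfully")
```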

6.3. System Testing

Test 1 – Operational Area Test
  Description: This test shall verify the ability of the system to function at different distances.
  Control scheme: The system shall be outfitted on a test technician and evaluated at various distances and through obstacles.
  Inputs: Sample data broadcast from the ITV.
  Outputs: Signal reception and accuracy.
  Procedure: A list of faux pas shall be performed at specific distances and behind specific obstacles representing the customer environment. After the list has been completed, the control center shall verify that all signals registered without fault.

Test 2 – Time to Alert
  Description: This test shall verify the ability of the system to alert the control center to a safety emergency.
  Control scheme: The test shall be performed in conjunction with System Test 1, including the same distances and obstacles.
  Inputs: Mote-directed emergency broadcast.
  Outputs: Signal reception and speed.
  Procedure: A Mote shall initiate an emergency signal at distances and behind obstacles equal to System Test 1; the emergency signals will then be evaluated in the control center for reception and the speed with which they were received (a sketch of this latency evaluation follows the table).
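As an illustration of how the Time to Alert results could be evaluated, here is a minimal Python sketch that compares measured alert latencies against a requirement threshold. The threshold value and the sample data are placeholders; the real figures would come from the Tier 1 requirements and the recorded test runs.

```python
# Hypothetical requirement: every emergency alert must arrive within 2.0 seconds.
ALERT_THRESHOLD_S = 2.0

# Placeholder measurements: (distance in meters, obstacle, measured latency in s).
runs = [
    (25, "none", 0.6),
    (50, "concrete wall", 1.1),
    (100, "vehicle", 1.8),
    (150, "building corner", 2.4),
]

failures = [(d, o, t) for d, o, t in runs if t > ALERT_THRESHOLD_S]
worst = max(t for _, _, t in runs)

print(f"Worst-case latency: {worst:.1f} s (threshold {ALERT_THRESHOLD_S:.1f} s)")
for distance, obstacle, latency in failures:
    print(f"FAIL: {latency:.1f} s at {distance} m behind {obstacle}")
```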