{Company Name}
{Department Name}

{Name} System

[Version|Release m.n]

User Acceptance Test Plan

Written by: {Name(s)}
Organization: {Name}
Location: {Location}
Original Publication Date: {mm/dd/yy}
Current Plan Version: {m.n}
Current Publication Date: {mm/dd/yy}

How to Use the Template

The Template provides a “shell” for a User Acceptance Test Plan. It specifies the information to be filled in, as follows:

·  Text in normal type is intended to be used as is.

·  Text in italic type is either

-  a description or placeholder to be replaced with the information described, e.g., replacing describe briefly with an actual description.

-  instructions to be deleted once they have been followed, e.g., select one of the following alternatives.

·  {Text in braces - normal or italic} specifies required information.

·  [Text in brackets - normal or italic] specifies optional information.

·  The vertical bar within braces or brackets specifies a choice: either this|or that.

To prepare the plan, fill in the information specified within the Template, as described above, guided by the UAT methodology. Change the footer to reflect the plan being produced, with its version and date. The Test Plan Sample provides an example of a test plan following the template. The text may also be varied as required, as long as the underlying approach is followed.

Familiarity with Word 6.0 is assumed. The following are some key tips:

·  Use Edit/Find to locate all occurrences of “{” and “[”.

·  If you are not familiar with using styles, copy an existing heading of the required level and insert it where the new heading is needed; the number will update automatically. Then change the text as needed.

·  Update the Table of Contents at any time by placing the cursor in it and pressing F9. If you have added or changed headings at Level 1 or 2, update the entire Table of Contents, not just the page numbers.

·  These instructions are placed within the Template for convenience of reference. Delete this entire page when finished, and do a final update of the Table of Contents.

Table Of Contents

1. Introduction

1.1. Document Structure

1.2. Purpose and Scope of the User Acceptance Test

1.3. Assumptions

1.4. Risk Assessment

2. Strategy

2.1. Test Levels

2.2. Types of Tests

2.3. Requirements Identification

2.4. Requirements Coverage Strategy

2.5. Build and Test Run Strategy

2.6. Parallel Test Approach

2.7. Pilot Test Approach

2.8. Requirements Validation Matrices

3. Work Plan

3.1. Organization and Responsibilities

3.2. Major Tasks, Milestones and Target Dates

3.3. Work Breakdown Structure

3.4. Resources Needed for Test Execution

4. Testing Procedures

4.1. Library Procedures

4.2. Problem Reporting Procedures

4.3. Change Management Procedures

4.4. Test Execution Procedures

4.5. Certification Procedures

5. Test Case Design

5.1. Build {m}: {Name}

6. Test Case Execution

6.1. Components to be Tested

6.2. Test Data Components

Attachment A: Requirements Hierarchy

Attachment B: Requirements Validation Matrices

Attachment C: Work Plan

Attachment D: Test Case Design

1.  Introduction

This Test Plan describes how and when User Acceptance Testing will be performed to ensure that [version|release m.n of] the {name} System performs according to its business requirements and functional specifications.

The business purpose of the {name} System is {brief statement of overall business functionality}.

The business owner of this system is {name, title, location, phone}.

The objectives of the current {version|release|project} are:

·  {list/describe briefly}.

The business justification is {describe briefly}.

1.1.  Document Structure

Section 1 continues below to describe the purpose and scope of the test, the assumptions underlying the plan, and the risks that may impede testing or implementation.

Section 2 details the strategy for the user acceptance test.

Section 3 provides the work plan for the test.

Section 4 includes the procedures used to control the test.

Section 5 lists the test cases to be executed.

Section 6 lists the components involved in the test.

The Glossary in the Appendix defines key testing terms.

[The attachments contain the following parts of the plan that are lengthy and/or prepared with software other than the word processor:] {Select/change the items in the following list as needed. Note whether an item is stored separately rather than as part of this document.}

Attachment A: Requirements Hierarchy

Attachment B: Requirements Validation Matrices

Attachment C: Work Plan

Attachment D: Test Case Design]

1.2.  Purpose and Scope of the User Acceptance Test

User acceptance testing is performed to verify that the total system, both software deliverables and associated non-software deliverables (documentation, forms, procedures, etc.), will function successfully together in the business environment and will fulfill user expectations as defined in the business requirements and functional specifications. User acceptance testing normally comprises the final set of tests to be performed on the system or release.

This acceptance test will be {carried out|coordinated} by the {name} Business Acceptance Testing group located at {location}. The {name(s)} user organizations, located at {location(s)} respectively, will participate as follows:

{Describe involvement in test planning, test case development and/or test execution, including reviews and signoffs}

The acceptance test will begin on {mm/dd/yy} and be completed by {mm/dd/yy}.

1.3.  Assumptions

Prior to acceptance testing, the tests listed below will have been performed, and the results reviewed by the User Acceptance Testing group. These tests are considered to have been satisfactorily completed, with the exceptions noted below.
{To be completed in updating the plan prior to acceptance test execution.}
  1. Unit testing by the {name} group at {location}. Exceptions: {None|describe briefly and reference detail}.
  2. Integration testing by the {name} group at {location}. Exceptions: {None|describe briefly and reference detail}.
  3. System testing by the {name} group at {location}. Exceptions: {None|describe briefly and reference detail}.
  4. [Other, if any. Exceptions: {None|describe briefly and reference detail}.]

This test plan describes all of the remaining tests to be performed on the {system|release} [except for describe any subsequent tests not covered by this plan and reference the appropriate test plan(s)].

An adequate testing environment at {location} will be available beginning {n} days before the start of acceptance testing for setup and shakedown.

{Select one of the following two items.}

The software and non-software deliverables will be made available to the UAT group in builds, as detailed in the Build Strategy Section below.

The entire system will be made available to the UAT group at one time on or before the above start date. Hence the term “build” as used in this plan refers only to the top level of the test hierarchy structure.

Adequate staff resources with appropriate knowledge and/or training will be available as specified in the work plan during the period required to set up and complete the test.

{Insert any other assumptions as appropriate.}

1.4.  Risk Assessment

Significant risks that are inherent in the system design or environment, or that result from the selected approach to development or testing, are listed below. The Severity level (High, Medium, Low) represents the overall significance of the risk, i.e., the combination of its likelihood of occurrence and its degree of impact if it occurs.

Risks that may impact the ability to complete the acceptance test successfully and on time:

Description / Severity / Avoidance/Minimization Approach

Risks that may impact the ability to install the system on time and/or operate the system successfully:

Description / Severity / Avoidance/Minimization Approach
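
The template does not prescribe how the Severity column is derived. The following is a minimal Python sketch of one possible derivation, assuming a three-level rating (1 = Low to 3 = High) for both likelihood and impact; the scale and thresholds are illustrative assumptions, not part of the methodology:

# Illustrative only: one way to combine likelihood of occurrence and degree of
# impact (each rated 1 = Low, 2 = Medium, 3 = High) into an overall Severity.
def severity(likelihood: int, impact: int) -> str:
    score = likelihood * impact          # ranges from 1 to 9
    if score >= 6:
        return "High"
    if score >= 3:
        return "Medium"
    return "Low"

# Example: an unlikely (1) but high-impact (3) risk rates Medium overall.
print(severity(1, 3))   # -> Medium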

2.  Strategy

This key section of the test plan describes the approach used to assure that the system is thoroughly tested. It details the levels and types of tests to be performed and the requirements to be tested, as well as the coverage and build strategies [and the approach to pilot and parallel testing]. Finally, it presents the validation matrices or equivalent that assure coverage of the requirements at all levels.

2.1.  Test Levels

This acceptance test will include the following test level(s):

{Delete any not applicable.}

  1. Normal acceptance testing in the UAT environment at {location}.
  2. Parallel testing against {release m.n of this system|the {name} system|the {name} process} in the {UAT|production} environment at {location}.
  3. Pilot testing in production at {location(s)}.

2.2.  Types of Tests

The following types of tests will be performed {delete any not applicable}:

·  Environment tests: Tests that validate the functioning of the system with the hardware, system software and networks.

·  Positive functional tests: Response to valid user actions and valid data.

·  Negative functional tests: Response to user actions or data that should generate error messages. (An illustrative sketch contrasting positive, negative and invalid-input tests follows this list.)

·  Invalid input tests: Response to inputs or user actions that may be unanticipated by the system design.

·  Usability tests: Verification of the ease of use of the system and its associated documentation and procedures, including recovery procedures.

·  Control tests: Tests of the system’s ability to produce appropriate audit trails.

·  Security tests: Tests of the system’s ability to restrict access to data or functions.

·  Capacity/performance tests: Tests of the system’s ability to handle specified volumes, or produce specified response time or throughput.

·  Regression tests: Repeated tests in any of the above categories that verify that problems were fixed, or other changes were made, correctly and without adverse impact on other functions.

·  [Other tests: Describe, if any.]
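
The following is a minimal, illustrative Python sketch (using the standard unittest module) contrasting positive, negative and invalid-input functional tests; the validate_order function and its rules are hypothetical stand-ins for any function of the system under test:

import unittest

def validate_order(quantity):
    """Hypothetical system rule: an order quantity must be an integer from 1 to 999."""
    if not isinstance(quantity, int):
        raise TypeError("quantity must be an integer")
    if quantity < 1 or quantity > 999:
        raise ValueError("quantity out of range")
    return True

class OrderEntryTests(unittest.TestCase):
    def test_valid_quantity_accepted(self):
        # Positive functional test: valid data is accepted.
        self.assertTrue(validate_order(10))

    def test_out_of_range_quantity_rejected(self):
        # Negative functional test: data that should generate an error does so.
        with self.assertRaises(ValueError):
            validate_order(0)

    def test_unanticipated_input_rejected(self):
        # Invalid input test: input the design may not anticipate (wrong type).
        with self.assertRaises(TypeError):
            validate_order("ten")

if __name__ == "__main__":
    unittest.main()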

2.3.  Requirements Identification

The requirements to be validated by this test plan originate in the following document(s):

Title / Version / Date

{Select one of the following two items.}

The requirements to be validated by this test plan have been identified and decomposed hierarchically, and are shown {below|in Attachment A}.

The requirements to be validated by this test plan have been decomposed hierarchically and are stored in {tool or environment} on {workstation|server identification} as {drive:\directory path\file}. [A hard copy listing is found in Attachment A.] {Hard copy should be attached unless access to the on-line material is available and familiar to all concerned.}

2.4.  Requirements Coverage Strategy

{Select one of the following two paragraphs.}

This acceptance test of a new system is designed to validate {nnn%} {normally 100%} of the requirements relating to the environment, the software deliverables, and the non-software deliverables. {Describe any exceptions.}

This acceptance test of corrections and/or changes is designed to validate {nnn%} {normally 100%} of the new and changed requirements relating to the environment, the software deliverables, and the non-software deliverables, as well as {nn%} of pre-existing requirements via regression tests. {Describe any exceptions.}

The coverage of requirements is verified by the {Select one of the following:}

Requirements Validation Matrices shown in {Section 2.8|Attachment B}.

{Name of report} produced by {tool or environment} on {workstation/server identification} from the input file {drive:\directory path\file}. [A hard copy listing is found in Attachment B.] {Hard copy should be attached unless the on-line material is available to all concerned.}

{Other approach.}
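
Whichever alternative is selected, the underlying check is the same: every requirement in scope must be mapped to at least one test case, and the coverage percentage is the proportion of requirements so mapped. A minimal Python sketch of that cross-check, with hypothetical requirement and test case identifiers, is:

# Illustrative coverage check behind a Requirements Validation Matrix.
# The requirement IDs and the test-case-to-requirement mapping are hypothetical.
requirements = ["REQ-1.1", "REQ-1.2", "REQ-2.1", "REQ-2.2"]

matrix = {
    "TC-001": ["REQ-1.1", "REQ-1.2"],
    "TC-002": ["REQ-2.1"],
}

covered = {req for reqs in matrix.values() for req in reqs}
uncovered = [req for req in requirements if req not in covered]
coverage = 100.0 * (len(requirements) - len(uncovered)) / len(requirements)

print(f"Coverage: {coverage:.0f}%")   # -> Coverage: 75%
print("Uncovered:", uncovered)        # -> Uncovered: ['REQ-2.2']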

2.5.  Build and Test Run Strategy

The builds listed below represent the highest-level logical grouping of tests [and the stages in which the system will be delivered to UAT] [as well as the stages in which the system will be installed in production]. The system components to be validated by these builds will be delivered to UAT {on the dates listed below|at the start of user acceptance test execution}. The test runs within each build are listed below. The system components in each build are listed in Section 6.1. Coverage of high-, intermediate- and detail-level requirements by the builds, test runs and test cases, respectively, is validated as described in Section 2.8.

Build No. / Build Name / Software/Non-Software / Delivery Date

The sequence of builds reflects the following dependencies and other factors relating to the development and/or testing processes and the functionality to be delivered:

·  {List factors leading to choice of build structure and sequence.}

{Include either or both of the following two items, if applicable.}

Builds will be moved to parallel testing {individually|all at one time}. {Detail as required}.

Builds will be moved to pilot testing {individually|all at one time}. {Detail as required}.

{Reproduce the applicable subsection below for each build. Build numbers should be unique across all builds.}

2.5.1.  Software Build {n}: {name}

2.5.1.1.  Description

{List functions included.}

2.5.1.2.  Test Runs

The tests for this build will be divided into the following test runs, each generally representing a single on-line session or batch job unless otherwise described in the text:

Run No. / Name / Description/Objective(s)

2.5.1.3.  Test Files

The following test files must be available to test this build. The Name column provides the reference used in the Test Execution section below.

Description / Content* / Source** / Name

{*e.g., empty file, valid records, records processed by specified subsystem or function}

{**e.g., from developers, create by specified tool}
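
Where the Source column calls for a file to be created by a tool rather than supplied by the developers, a small script is often enough. The following is a minimal Python sketch that produces one file of valid records and one empty (header-only) file; the record layout and file names are hypothetical:

import csv

# Hypothetical record layout: account number, amount, status code.
valid_records = [
    ("1000234", "125.00", "A"),
    ("1000235", "0.00", "A"),
    ("1000236", "99999.99", "C"),
]

with open("uat_valid_records.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["account", "amount", "status"])
    writer.writerows(valid_records)     # file of valid records for positive tests

with open("uat_empty_file.csv", "w", newline="") as f:
    csv.writer(f).writerow(["account", "amount", "status"])   # empty-file case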

2.5.1.4.  Test Tools

The following tools will be used in testing this build:

Tool Name / Tool Type* / Purpose/How Used / Required?**

{*e.g., Test Management, Capture/Replay or Scripting, Test Data Generator, File Comparison, Interactive Debugging, other categories such as Simulation or Performance Monitoring}

{**i.e., use of this tool is required by current standards/methodology}

2.5.2.  Non-Software Build {n}: {name}

2.5.2.1.  Description

This build includes the following component(s):

·  {Name: description}

2.5.2.2.  Acceptance Criteria

·  {List criteria, e.g., conformance to standards or specifications, readability/usability}

2.5.2.3.  Validation Method(s)

·  {List approach(es), e.g., inspection, review, user test. If appropriate, list test runs as above.}

2.6.  Parallel Test Approach

{Describe how the parallel test will be executed and validated. If possible, list test runs as above.}

2.7.  Pilot Test Approach

{Describe how the pilot test will be executed and validated. If possible, list test runs as above.}

2.8.  Requirements Validation Matrices

{Select one of the following two items.}

The Requirements Validation Matrices are shown {below|in Attachment B}.