
About This Document

·  This is an Active Deliverable (AD) document, which is attached to the Active Deliverable Development Kit template (ADDK.DOT).

·  The ADDK contains macros that perform special functions for the AD. These commands can be accessed through either the AD Tools menu or the AD Toolbar. If the commands or toolbar are not available, verify:

·  that the document is associated with ADDK.DOT by selecting Templates and Add-Ins from the Tools menu in Word 97 or Word 2000 (ADDK.DOT should appear in the Document Template box), and

·  that ADDK.DOT is in the Templates folder or in the same folder as this document.

·  Instruction Text throughout the document provides suggestions about how to complete various sections.

Using This Document

·  Where applicable, replace Instruction Text (hidden, red text) with project-specific information.

·  Follow the instructions and suggestions included in Instruction Text.

·  It is recommended that the Instruction Text remain displayed until the document is complete.

How to convert Instruction Text to normal text

1. Select the Instruction Text that needs to be changed.

2. Press Ctrl+Spacebar.

How to update fields

·  Select the field, and press F9.

·  Alternatively, press Ctrl+A to select the entire document, and then press F9.

·  To also lock a field, select Update and Lock Field from the AD Tools menu.

How to complete the title page

The title page contains important information about the Active Deliverable, such as the Project Name.

Enter the following information for the title page:

Organization Name

Project Name

Document Name

Author

Version

Date

Sponsor

Customer Representative

Technical Representative

How to reuse information from the title page

The information that the user supplies on the title page can be reused in other parts of the document.

Each of the items on the title page has an associated style, which is based on the item name and preceded by an asterisk (*).

1. Click where relevant title page information should be inserted in the document.

2. From the AD Tools menu, select Add Document Information.

3. In the dialog, choose a document information item.

4. Press OK.

This method can also be employed for other styles by doing the following:

1. Click where the information is going to be inserted in the document.

2. From the Insert menu, select Field.

3. Click the "Links and References" category, and then click the "StyleRef" field name.

4. To add a style, click Options. Title Page styles appear at the top of the style list.

NOTE: If items on the title page are changed, the corresponding fields within the document must be updated as well.

Enter Organization Name

Enter Project Name

Test Plan

Prepared by:
Version:
Date:
Project Board:
Sponsor:
Customer Representative:
Technical Representative:

Approval Signatures

Name: / Name:
Title: / Title:
Name: / Name:
Title: / Title:
Name: / Name:
Title: / Title:
Name: / Name:
Title: / Title:
Name: / Name:
Title: / Title:

[INSTRUCTIONS - To update these fields, click on the item and press F9.]

Enter Project Name

Test Plan

Enter Organization Name

Document History

Reviewed By

Organization / Person

Copied To

Organization / Person

Revision Record

Number / Date and Sections / Notes


Acronyms

Acronym / Description


References

[INSTRUCTIONS - Enter document title and its file name for each referenced document]

Document Title / File Name
Test Log / AD_TL.DOC
Test Design / AD_NTTD.DOC


Table of Contents

[INSTRUCTIONS - From the AD Tools menu, select Update Table of Contents or double-click the button]


1. INTRODUCTION

1.1 Document Purpose

2. TEST STRATEGY

2.1 Scope

2.2 Levels and Objectives

2.3 Completion Criteria

2.4 Resource Requirements

3. TESTING ENVIRONMENT

3.1 Test Tools

3.2 Libraries/Directories

3.3 Submission of Test Items

3.4 Security

4. TEST APPROACH

4.1 Test Design

4.2 Test Data

4.3 Testing Priorities

4.4 Planning a Test Run

4.5 Test Results/Deliverables

4.6 Release Notes

5. SCHEDULE (FOR TESTING)

5.1 Initial Schedule

5.2 Assignment of Resources and Responsibilities

6. APPENDICES

6.1 Related Work Papers

7. GLOSSARY


1.  Introduction

If a project name has not already been supplied, go to the first page and enter a project name in the appropriate place.

1.1  Document Purpose

The purpose of the Test Plan deliverable is to define a detailed, comprehensive plan for controlling and testing the application by organizing the testing activities during its prototype, build, and production states. Major activities include:

·  establishing the strategy that must be taken to ensure successful testing

·  identifying and establishing the testing environment that needs to be in place

·  describing the test approach by identifying the test design, data and test priorities

·  scheduling the testing activities and resource assignments in accordance with the testing priorities

Start the Test Plan early in the project life cycle (usually during the Project Initiation & Planning stage) to provide the needed time to:

·  analyze the breadth of the required testing activities

·  plan for test design, execution, and completion

The Test Plan product is completed as products from all test planning activities are reviewed and compiled into the final Test Plan.

2.  TEST STRATEGY

2.1  Scope

Test Strategy Scope defines the coverage of testing that will be performed. Coverage can consist of integration testing, system testing, acceptance testing and final testing during hand-over of the system to the users.

Scope can range from testing just the business functions that are critical to the life of the organization to testing a full-blown system and all its functionality. The testing of incremental builds may be appropriate.

Describe the limitations of testing by documenting elements that will not be tested. Explain why these elements will not be tested and the impact of not testing them.

2.2  Levels and Objectives

Define the objectives and participants for each level (unit, integration, system, acceptance) of testing.

2.3  Completion Criteria

Define criteria for progressing to the next level of testing. For example, "the system can progress to acceptance testing when 95% of the system tests have been completed and the 5% remaining are accepted as being non-critical."

Completion of all testing will be determined by the number of defects found and resolved. The Test Log Active Deliverable, which identifies all defects and the status of each, is a good source to use in measuring completion criteria.
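
As an illustration only, the short Python sketch below checks an example completion criterion of the kind described above; the function name and the figures passed to it are hypothetical stand-ins for numbers that would come from the Test Log.

    # Illustrative sketch: checks an example completion criterion such as
    # "95% of system tests complete and no open critical defects".
    # The figures would be taken from the Test Log; the ones below are hypothetical.

    def meets_completion_criteria(total_tests, completed_tests,
                                  open_critical_defects,
                                  required_completion=0.95):
        """Return True if this level of testing can progress to the next."""
        if total_tests == 0:
            return False
        completion_rate = completed_tests / total_tests
        return completion_rate >= required_completion and open_critical_defects == 0

    # Example: 190 of 200 system tests complete, no critical defects outstanding.
    print(meets_completion_criteria(total_tests=200, completed_tests=190,
                                    open_critical_defects=0))  # prints True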

2.4  Resource Requirements

This section identifies the types of resources that will be needed to test the application successfully. Review the test teams to assess their skills and identify any training they need to perform the tests effectively.

Training may be required in the following areas:

·  testing tools

·  systems software

·  support software

·  operating system

·  database structure

Resource / Skills / Training Needs

To resolve these training requirements, “Just-In-Time” (JIT) training should be established for those individuals who need it. If there is a shortage of testers, additional people should be enlisted and trained before testing begins.

3.  Testing Environment

This section describes the test environment that will be used to perform the actual testing activities. Note any dissimilarities between the testing environment and the production environment and review them to determine their impact.

3.1  Test Tools

List all the test tools that will be used, along with a schedule indicating when each will be used. These tools include the software, hardware, and system tools that will be needed to carry out testing.

Ensure that all aspects of the test environment are installed and commissioned ready for testing. Factors to be considered include:

·  Testing tools (e.g., test data generators, test drivers) installed

·  Debugging tools

·  Capture/playback tools

·  Test data generators

·  Database, network, and performance monitoring products

·  Load-testing software

·  Test drivers/stubs

Test Environment Checklist
Test Component / Build #1 (Installed, By Whom) / Build #2 (Installed, By Whom) / Build #3 (Installed, By Whom)
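
As a hedged illustration, the checklist above could also be tracked as simple structured data, for example with a small Python script; the component names, build labels, and installer names below are hypothetical.

    # Illustrative sketch: per-build record of which test components are
    # installed and by whom. All names and labels are hypothetical.

    checklist = {
        "Build #1": {
            "Test data generator": "J. Smith",   # value = installer
            "Capture/playback tool": None,       # None = not yet installed
            "Load-testing software": "A. Jones",
        },
    }

    for build, components in checklist.items():
        missing = [name for name, installer in components.items() if installer is None]
        if missing:
            print(f"{build}: still to be installed -> {', '.join(missing)}")
        else:
            print(f"{build}: test environment complete")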

3.2  Libraries/Directories

This section defines the location of the program libraries, data libraries, and directories that will be needed when testing begins. This information will be used to set up the test environment.

Document the program and data libraries and directories, identifying where the files are located. Make sure that the test team has access to these libraries. Review the testing tools to determine whether they have any limitations when running in the test environment.
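
A minimal sketch of how this access check might be automated is shown below; the directory paths are placeholders, not project values, and the script only confirms that each location exists and is readable and writable.

    # Illustrative sketch: verify that the documented test libraries and
    # directories exist and that the test team has read/write access.
    # The paths below are placeholders only.
    import os

    test_locations = [
        "/test/program_library",
        "/test/data_library",
        "/test/results",
    ]

    for path in test_locations:
        if not os.path.isdir(path):
            print(f"MISSING: {path}")
        elif not os.access(path, os.R_OK | os.W_OK):
            print(f"NO ACCESS: {path}")
        else:
            print(f"OK: {path}")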

3.3  Submission of Test Items

Document procedures for submitting software into the test environment. This may include requirements for evidence of prior testing (walkthrough reviews, unit tests). Determine acceptance procedures and standards for accepting the software into the test environment.

3.4  Security

Identify the security requirements for the test environment and ensure testers will have the proper security access to perform testing.

Some of the security requirements are:

·  Program access

·  Supporting tools access

·  System access

·  Network access

·  Database access

·  Test System Access

4.  TEST APPROACH

4.1  Test Design

This section contains a summary of the test design.

The main purpose of the Test Design is to prepare all the items that will be necessary for performing tests of the defined application. Test Design contains a list of Test Cases, the Test Scripts and the Test Packages that will be required to test the system. Before the Test Packages are considered complete, the deliverables must correspond with the Test Case/Requirements Traceability Matrix that is developed and included in the design.
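
For illustration only, the traceability check described above can be expressed as a simple mapping from requirements to test cases; the requirement and test case identifiers below are hypothetical.

    # Illustrative sketch: a requirements-to-test-case traceability matrix.
    # Requirement and test case identifiers are hypothetical.

    traceability = {
        "REQ-001": ["TC-101", "TC-102"],
        "REQ-002": ["TC-103"],
        "REQ-003": [],            # not yet covered by any test case
    }

    uncovered = [req for req, cases in traceability.items() if not cases]
    if uncovered:
        print("Requirements without test coverage:", ", ".join(uncovered))
    else:
        print("Every requirement is covered by at least one test case.")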

4.2  Test Data

Determine the types and sources of test data that will be needed to test the system effectively. Consider all aspects of data usage, as well as data cleansing.

Examples of types of test data are:

·  interface data received from other applications,

·  data that needs to be sent to and received from interfacing applications,

·  types and amount of data that will be required from the existing system,

·  new data based on new requirements not currently found in the enterprise database.
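
As a hedged example, test data of the kinds listed above is sometimes generated or staged with a small script; the record layout and file name below are hypothetical and do not reflect any project schema.

    # Illustrative sketch: generate a handful of synthetic test records.
    # Field names, values, and the output file name are hypothetical.
    import csv
    import random

    random.seed(1)  # make the generated test data repeatable

    with open("test_customers.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["customer_id", "region", "balance"])
        for customer_id in range(1, 6):
            writer.writerow([customer_id,
                             random.choice(["NORTH", "SOUTH", "EAST", "WEST"]),
                             round(random.uniform(0, 10000), 2)])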

4.3  Testing Priorities

This section lists the testing priorities that must be addressed during software testing. Testing priorities are very important to the test teams since they will dictate how the test schedule will be prepared and executed. Testing priorities should be based on the importance of the functions being tested, system software and operating system preferences, and the user's needs and expectations.

Examples of testing priorities are:

·  Testing under Windows 95 will take priority over testing under Windows 98 or NT.

·  The Fixed Assets module has been given high priority because of merger activities.

·  Performance testing is a high priority, as this has been determined to be an area of high risk for the project and is also a key requirement to users.
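
As a simple illustration of how priorities can drive the order of the schedule, the sketch below ranks test packages by a numeric priority; the package names and rankings echo the examples above but are otherwise hypothetical.

    # Illustrative sketch: order test packages by priority (1 = highest).
    # Package names and priority values are hypothetical.

    test_packages = [
        ("Fixed Assets module", 1),
        ("Performance tests", 1),
        ("Windows 95 platform tests", 2),
        ("Reporting module", 3),
    ]

    for name, priority in sorted(test_packages, key=lambda item: item[1]):
        print(f"Priority {priority}: {name}")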

4.4  Planning a Test Run

This section describes the planning activities that should be performed prior to testing to ensure a successful test run.

Preparation for a test run will involve the following:

·  determining the types of tests to be run and the order in which they will be run. For example:

·  Business Requirements

·  Quality/Standards compliance

·  Performance

·  Operational Considerations.

·  making certain that resources are available (may include QA testers and Users)

·  validating the test data to ensure all data is present for the respective test run

·  performing a quality assurance review of the test schedule, test cases and test scripts

·  ensuring that all designated requirements will be tested by creating a test case/requirements traceability matrix

·  scheduling test time with operations

·  notifying management of the upcoming tests

4.5  Test Results/Deliverables

Document the location of test results (e.g., the Defect Tracking System) and any other test products produced.

Types of deliverables to be stored are:

·  Test Plan

·  Test Design

·  Analysis reports/graphs

·  Test Log

·  Supporting information used in execution of Test Scripts

4.6  Release Notes

Release Notes reflect any outstanding issues, that is, lingering problems in the system that have not been resolved. In these cases, the notes identify workarounds or corrective measures that allow the system to perform appropriately.

5.  Schedule (for Testing)

This section defines how the application is going to be scheduled for testing. Schedules will be different based on the type of application and business functionality to be tested. Included here are integration tests, function tests, and feature tests, which demonstrate that the application works as intended.

5.1  Initial Schedule

Prepare a schedule showing testing activities with estimated start/finish dates and revise as necessary during iteration and stage level planning of development activities. Develop the test schedule based on the incremental builds such as the integration, system and acceptance test activities. Schedule in a manner that parallels the development approach of the application. Ensure that priorities are taken into consideration.

Include the following:

·  Definition of system components and Test Packages to be included in each integration and system test

·  Identification of the test harnesses and stubs required

Test Schedule Summary
(The Build #1 through Build #3 columns represent successive iterations.)
Test Plan Item / Build #1 / Build #2 / Build #3
Integration: Schedule
Integration: Test Team
System Tests: Schedule
System Tests: Test Team
Acceptance Tests: Schedule
Acceptance Tests: Responsibilities/Roles
Acceptance Tests: Test Team
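
A minimal sketch of how the schedule summary above might be held as structured data is shown below; the dates and team names are hypothetical placeholders.

    # Illustrative sketch: the test schedule summary as structured data.
    # Dates and team names are hypothetical placeholders.
    from datetime import date

    schedule = {
        "Build #1": {
            "Integration":      {"start": date(2000, 7, 3),  "finish": date(2000, 7, 14), "team": "Team A"},
            "System Tests":     {"start": date(2000, 7, 17), "finish": date(2000, 7, 28), "team": "Team B"},
            "Acceptance Tests": {"start": date(2000, 7, 31), "finish": date(2000, 8, 4),  "team": "Users and Team B"},
        },
    }

    for build, items in schedule.items():
        for test_item, details in items.items():
            print(f"{build} {test_item}: {details['start']} to {details['finish']} ({details['team']})")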

5.2  Assignment of Resources and Responsibilities

Test teams will be formed and assigned to execute the test packages. Roles and responsibilities will be assigned based on expertise.

Schedule resources for each type of testing activity that requires support. This includes involvement in interactive design sessions and participation in iteration peer reviews of the prototype. Through this involvement, the test team is able to stay up-to-speed on requirements and provide feedback on the prototype as a part of the review team.

Roles and responsibilities are to be assigned to each individual in the test team. For example, determine who will be responsible for: