Test Specification Template

Spell out the Project Name

(Project Acronym)

Testing Specifications and Procedures

Month DD, YYYY

UNITED STATES PATENT AND TRADEMARK OFFICE

INFORMATION TECHNOLOGY TESTING BRANCH


TABLE OF CONTENTS

Qualification Statement
1 SCOPE
1.1 System Overview
1.2 Overview
1.3 Performance Testing
1.4 Security Test and Evaluation
1.5 System Compatibility Testing
2 REFERENCED DOCUMENTS
2.1 Regulatory References
2.2 Department of Commerce Document References
2.3 Project Specific Document References
2.4 Commercial Document References
3 TEST SPECIFICATIONS AND PROCEDURES
3.1 Features to be Tested
3.2 Features Not to be Tested
3.3 Approach Refinements
3.4 Feature Pass/Fail Criteria
3.5 Input Specifications
3.6 Output Specifications
3.7 Environmental Needs
3.7.1 Hardware
3.7.2 Software
3.7.3 Other Needs
3.8 Test Specifications and Procedures
APPENDIX A ACRONYMS AND ABBREVIATIONS
APPENDIX B [REQUIREMENTS or USE CASES] TO TEST ALLOCATION MATRIX

LIST OF TABLES

Table 3-1 Severity Rankings for Discrepancies

GENERAL comments on the header and footer: The footer date on the right should be the same as the cover date, which is also the deliverable date.

A slash and a version number follow the date. Assign the version number based on the iteration of the document (e.g., the first time the document is written = version 1; a revision of a version 1 document that has been delivered to USPTO = version 2, and so on). The version number is NOT the AIS version number, which appears in the left header.

QUALIFICATION STATEMENT

The paragraph below is required.

All product names and services identified throughout this document are trademarks or registered trademarks of their respective companies.


1 SCOPE

1.1 System Overview

The purpose of the United States Patent and Trademark Office (USPTO) [project name and version #] project is to enhance the existing system and further address customer needs. Add anything that provides further insight into the project.

1.2 Overview

The Testing [Specifications or Scenarios] and Procedures document contains the scope, referenced documents, test [specifications or scenarios], and test procedures. It provides a detailed description of each test [specification or scenario] and the [requirement(s) or Use Case(s)] it tests. Each requirement from the [RTMx Report (RTMx) or Use Cases] includes a unique identification (ID) and specified functionality. The test procedures describe the step-by-step actions, expected results, and any special conditions necessary for testing.

During System Integration Testing, the System Development Lead (SDL) edits the Testing [Specifications or Scenarios] and Procedures, which the testing staff then uses as a basis for conducting Functional Qualification Testing (FQT) and determining the system’s ability to meet requirements. The testing staff will update the Testing [Specifications or Scenarios] and Procedures as needed during the FQT process.

1.3 Performance Testing

This document does not address Performance Testing for [project acronym and version].

1.4 Security Test and Evaluation

This document does not address Certification and Accreditation (C&A) Security Test and Evaluation (ST&E) testing for [project acronym and version].

1.5 System Compatibility Testing

This document does not address System Compatibility Testing (SCT) for [project acronym and version].


2 REFERENCED DOCUMENTS

The following documents are either referenced in or were used in preparation of this document:

2.1 Regulatory References

Section 508 of the Rehabilitation Act (29 U.S.C. 794d), as amended by the Military Construction Appropriations Act for Fiscal Year 2001 (P.L. 106-246), July 2000.

2.2 Department of Commerce Document References

Technical Reference Model, Version 7.0, February 2003.

Systems Development Life Cycle, Version 1.00, June 5, 2007.

Quality Assurance for Automated Information Systems Technical Standard and Guideline, IT-212.2-04, January 30, 1997.

Testing Technical Standards and Guidelines, USPTO IT-212.3-01, October 2001.

Accommodation Technology Compliance (ATC) Functional Qualification Testing Specifications and Procedures, June 1, 2001.

2.3 Project Specific Document References

Use the exact title of the AIS document rather than writing the skeleton/shortcut titles shown below.

Keep the month, day, and year together by using non-breaking spaces [CTRL + SHIFT + Space] and non-breaking hyphens [CTRL + SHIFT + hyphen].

Do not write a shortcut name such as “QA Plan for ABC System”; rather, write “Quality Assurance Plan for Act Baseline Cat 2.5 (ABC 2.5), January 23, 2006.”

Requirements Specification for [name of AIS and version], Month XX, YYYY.

Requirements Traceability Matrix for [name of AIS and version], Month XX, YYYY.

Concept of Operations for [name of AIS and version], Month XX, YYYY.

Detailed Design Document for [name of AIS and version], Month XX, YYYY.

QA Plan for [name of AIS and version], Month XX, YYYY.

Test Plan for [name of AIS], Month XX, YYYY.

Configuration Management Plan for [name of AIS and version], Month XX, YYYY.

Office of Procurement Check List for Processing EIT (Electronic Information Technology) Requests.

2.4 Commercial Document References


3 TEST SPECIFICATIONS AND PROCEDURES

3.1 Features to be Tested

Specific functionality and features of [project acronym and version] to be tested are included in Appendix B, [Requirements or Use Cases] To Test Allocation Matrix.

The principal features to be tested are categorized into the following areas:

a) Accessibility (Section 508).

b) External Interfaces.

c) Performance.

d) Important feature.

e) Important feature.

f) Important feature.

3.2 Features Not to be Tested

[Copy the FIRST paragraph from Section 5.5 of the Test Plan.] [Testing will also not include product features that have not been modified from the previous version.]

3.3 Approach Refinements

There are no changes to the testing approach that was defined in the [name of AIS] Test Plan.

[OR]

[Describe changes that have been made to the test approach. In particular, was §508 planned or added? Are automated scripts now expected to be used to test system functionality? Are Performance Testing scripts now planned? Has the expected location of testing changed?]

3.4 Feature Pass/Fail Criteria

Any discrepancies identified are classified into one of the three severity levels defined in Table 3-1:

Table 3-1 Severity Rankings for Discrepancies

Severity / Description
Critical / Discrepancies that halt further program execution.
Example: run-time errors that cause the system to lock up in an unexplained or unexpected manner.
Major / Discrepancies that cause the application not to perform as functionally required.
Example: inability to print a report.
Minor / Discrepancies that are not considered critical or major.
Examples: misspellings on a screen, ambiguous Help messages.
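
If discrepancies are tracked electronically during FQT, the severity ranking above can also be applied in code. The following is a minimal sketch in Python; it assumes a hypothetical discrepancy record and, for illustration only, a rule that a feature passes when no Critical or Major discrepancies are recorded against it. Neither the record layout nor the pass/fail rule is prescribed by this template.

from dataclasses import dataclass
from enum import Enum
from typing import List


class Severity(Enum):
    # Severity rankings from Table 3-1.
    CRITICAL = "Critical"  # halts further program execution
    MAJOR = "Major"        # application does not perform as functionally required
    MINOR = "Minor"        # neither critical nor major (e.g., misspellings, ambiguous Help messages)


@dataclass
class Discrepancy:
    # One discrepancy recorded against a test case (hypothetical structure).
    test_case: str
    description: str
    severity: Severity


def feature_passes(discrepancies: List[Discrepancy]) -> bool:
    # Illustrative rule only: the feature passes if every recorded discrepancy is Minor.
    return all(d.severity is Severity.MINOR for d in discrepancies)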

3.5 Input Specifications

See the Operator Action column for the detailed input specifications in Section 3.8.

3.6 Output Specifications

See the Expected Results column for the expected outputs of each operator action in Section 3.8.

3.7 Environmental Needs

Copy the hardware and software from the Test Plan unless you have more current information.

Include Testing Support software (Bobby, JAWS, etc.) in the Software subsection.

The SDL will supply the test environment and is responsible for building the test database.

3.7.1 Hardware

3.7.2 Software

3.7.3 Other Needs

a) System IDs and passwords.

b) PTOnet.

c) Interface with [System name] using a [full or partial copy /production version] and [similar/production] hardware.

d) Interface with [System name] using a [full or partial copy /production version] and [similar/production] hardware.

e) Interface with [System name] using a [full or partial copy /production version] and [similar/production] hardware.

f) Interface with [System name] using a [full or partial copy /production version] and [similar/production] hardware.

g) Interface with [System name] using a [full or partial copy /production version] and [similar/production] hardware.

h) Interface with [System name] using a [full or partial copy /production version] and [similar/production] hardware.

i) Interface with [System name] using a [full or partial copy /production version] and [similar/production] hardware.

j) Interface with [System name] using a [full or partial copy /production version] and [similar/production] hardware.


3.8 Test Specifications and Procedures

Examples of Testing [Specifications or Scenarios] and Procedures are provided below.

Remember to include Testing Support software (Bobby, JAWS, etc.) in the Prerequisites or Setup.

Test Name: / Test Case 1: Create Workload Navigator Basic Pie Chart Test Case.
Description: / The Pie and Text Chart features should display independently for Assigned and Unassigned cases in the Workload screen.
Requirement(s): / TICRS-2.
Prerequisites: / The user is logged on as an SLIE/BA.
Setup: / The tester must point to the Mock P drive in the test environment. Verify that the most recent file of the current CM build is currently at Trademarks\FAST 2.1\BIN\Fast Application.exe. Create a desktop shortcut to the FAST executable file from Mock P and launch the FAST executable.
Map to the TICRS drive(s):

Steps:

Step / Operator Action / Expected Results / Observed Results / Pass/Fail
1. / Log on to the system as an SLIE user with a valid ID. / The user is able to log in to the system and the Workload screen for the SLIE appears.
2. / Click on the ‘Assigned’ tab. / The Assigned work for the Law Office to which the SLIE is assigned appears on screen by default. The Suspension cases appear in the Workload screen.
3. / Click on ‘Pie Chart.’ / The Pie Chart screen appears.
4. / Click ‘Close.’ / The Suspension cases screen is displayed.
5. / Click on the ‘Unassigned’ tab. / The ‘Unassigned’ work appears.
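
Where automated support is planned (see Section 3.3), the step table above can also be maintained in machine-readable form and rendered back into the Step / Operator Action / Expected Results / Observed Results / Pass-Fail layout used throughout this section. The sketch below is a hypothetical Python example using the Test Case 1 steps; it is illustrative only and is not a delivered FAST or USPTO tool.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class TestStep:
    # One row of the Step / Operator Action / Expected Results table.
    number: int
    operator_action: str
    expected_results: str
    observed_results: str = ""      # recorded by the tester during FQT
    passed: Optional[bool] = None   # None until the step has been executed


# Test Case 1 steps, transcribed from the table above.
TEST_CASE_1: List[TestStep] = [
    TestStep(1, "Log on to the system as an SLIE user with a valid ID.",
             "The Workload screen for the SLIE appears."),
    TestStep(2, "Click on the 'Assigned' tab.",
             "The Assigned work for the SLIE's Law Office appears; Suspension cases are shown."),
    TestStep(3, "Click on 'Pie Chart.'", "The Pie Chart screen appears."),
    TestStep(4, "Click 'Close.'", "The Suspension cases screen is displayed."),
    TestStep(5, "Click on the 'Unassigned' tab.", "The 'Unassigned' work appears."),
]


def render(steps: List[TestStep]) -> None:
    # Print each step in the layout used by Section 3.8.
    for s in steps:
        status = {True: "Pass", False: "Fail", None: "-"}[s.passed]
        print(f"{s.number}. / {s.operator_action} / {s.expected_results} / "
              f"{s.observed_results or '-'} / {status}")


if __name__ == "__main__":
    render(TEST_CASE_1)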



Test Name: / Test Case 2: Prosecution History Webpage.
Description: / The Case Details screen provides a link to the Prosecution History webpage, which is provided by USPTO.
Requirement(s): / TICRS-4.
Prerequisites: / The user is logged on as an LIE/SLIE/BA.
Setup: / The tester must point to the Mock P drive in the test environment. Verify that the most recent file of the current CM build is currently at Trademarks\FAST 2.1\BIN\Fast Application.exe. Create a desktop shortcut to the FAST executable file from Mock P and launch the FAST executable.
Map to the TICRS drive(s):

Steps:

Step / Operator Action / Expected Results / Observed Results / Pass/Fail
1. / Log on to the system as an SLIE user with a valid ID. / The user is able to log in to the system and the Workload screen for the SLIE appears.
2. / Click on the ‘Assigned’ tab. / The Assigned work for the Law Office to which the SLIE is assigned appears on screen by default. The Suspension cases appear in the Workload screen.




Test Name: / Test Case 3: Display a list of all TICRS documents.
Description: / The Case Details screen tab provides users the ability to view or display a list of all TICRS documents.
Requirement(s): / TICRS-6.
Prerequisites: / The user is logged on as an LIE/SLIE/BA.
Setup: / The tester must point to the Mock P drive in the test environment. Verify that the most recent file of the current CM build is currently at Trademarks\FAST 2.1\BIN\Fast Application.exe. Create a desktop shortcut to the FAST executable file from Mock P and launch the FAST executable.
Map to the TICRS drive(s):

Steps:

Step / Operator Action / Expected Results / Observed Results / Pass/Fail
1. / Log on to the system as an SLIE user with a valid user ID. / The user is able to log in to the system and the Workload screen for the SLIE appears.
2. / Click on the ‘Work List’ tab. / The ‘Work List’ screen appears, and the Suspension cases category page is displayed by default.
3.
4.
5.


APPENDIX A ACRONYMS AND ABBREVIATIONS

ATC / Accommodation Technology Compliance
C&A / Certification and Accreditation
CD / Change Document
CM / Configuration Management
COTS / Commercial Off-The-Shelf
EIT / Electronic Information Technology
FQT / Functional Qualification Testing
GUI / Graphical User Interface
HP / Hewlett-Packard
ID / Identification
OCIO / Office of the Chief Information Officer
PTOnet / USPTO’s campus-wide network
QA / Quality Assurance
RTMx / RTMx Report
SCT / System Compatibility Testing
SDL / System Development Lead
ST&E / Security Test and Evaluation
USPTO / United States Patent and Trademark Office


APPENDIX B [REQUIREMENTS or USE CASES] TO TEST ALLOCATION MATRIX

Requirement Number / Requirement Description / Test Case Where Verified
RE-V1.0001 / The system shall provide for the initial receipt and initial scanning of reexamination applications in OIPE when a final filing date is established and will provide in OIPE the ability to create, maintain and print Contents entries for the electronic file corresponding to the separate papers in the paper reexamination application. / 1
RE-V1.0002 / The system shall provide the ability to preclude reexamination applications from automated initial classification. / 1
RE-V1.0003 / The system shall provide the ability to retrieve an “as filed” version of reexamination electronic file wrapper. / 2
RE-V1.0004 / The system shall provide the ability to view and print an electronic Contents listing of all papers for each electronic file folder in the REPS system. / 2
RE-V1.0036 / The system shall provide an interface with the Revenue Accounting Management System (RAM), so that at any time during a REPS session, a user may enter or modify fee information. / 3
RE-V1.0037 / The system shall distinguish between a reexam requester and other types of filers when the bibliographic data is captured. Additionally, the system shall determine the patent number on which the reexamination request is based from the bibliographic data entered on the form. / Deferred
RE-V1.0067 / The PSR printer must print a separator sheet containing this information as the first page of the print job. / 3
RE-V1.0095 / PTCS will have the capability to print an electronic CD-R contents list to be stored on the CD-R as a file. / 3
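
Because Appendix B is the mapping from [requirements or use cases] to test cases, it is worth confirming that every requirement is allocated to at least one test case or explicitly marked as Deferred. The sketch below is a hypothetical Python check that assumes the matrix has been exported to a CSV file (here named allocation_matrix.csv) with the same three columns; it is illustrative only, not a USPTO-provided tool.

import csv
from typing import List


def unallocated_requirements(matrix_csv: str) -> List[str]:
    # Return requirement numbers that have no test case allocated.
    # Assumes a CSV export of Appendix B with the columns:
    # Requirement Number, Requirement Description, Test Case Where Verified.
    missing: List[str] = []
    with open(matrix_csv, newline="") as f:
        for row in csv.DictReader(f):
            allocation = (row.get("Test Case Where Verified") or "").strip()
            if not allocation:
                missing.append(row["Requirement Number"])
            # Entries marked 'Deferred' (e.g., RE-V1.0037) are treated as intentional.
    return missing


if __name__ == "__main__":
    gaps = unallocated_requirements("allocation_matrix.csv")
    if gaps:
        print("Requirements without an allocated test case: " + ", ".join(gaps))
    else:
        print("Every requirement is allocated to a test case or marked Deferred.")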
