OMP Quality Assurance Plan

May 15, 2006

OMPArchitectability Team

Version 1.1

Team Members:

Eunyoung Cho

Minho Jeung

Kyu Hou

Varun Dutt

Monica Page

REVISION LIST

Document Name: OMP Quality Assurance Plan

Document Number: OMP-SQAP-001

No / Revision / Date / Author / Comments
1 / 0.1 / May 2, 2006 / All team members / Created initial draft
2 / 0.2 / May 3, 2006 / Kyu Hou / Added tools for functional requirements
3 / 0.3 / May 3, 2006 / Varun Dutt / Added tools for quality attribute requirements
4 / 0.4 / May 4, 2006 / Minho Jeung / Added quality assurance strategy
5 / 0.5 / May 4, 2006 / Eunyoung Cho / Added testing approach
6 / 0.6 / May 4, 2006 / Monica Page / Added organization
7 / 1.0 / May 4, 2006 / All members / Final review
8 / 1.1 / May 15, 2006 / All members / Revised SQAP according to reviewing

Table of Contents

1 Introduction

2 Quality Assurance Strategy

3 Reference Documents

Software Verification and Validation Plan (SVVP) (See the review and audit section for review process)

Software Verification and Validation Report (SVVR) (See the review and audit section for review process)

User Documentation

Software Project Management Plan (SPMP) (See the review and audit section for review process)

Software Configuration Management Plan (SCMP) (See the review and audit section for review process)

4 Goals

4.1 QA Goals of each phase

5 Reviews and Audits

5.1 Work Product Reviews

5.2 Quality Assurance Progress Reviews

6 Tools and Techniques

6.1 Tools and Techniques for assuring quality of functional requirements

6.2 Tools and Techniques for assuring the quality attribute requirements

7 Testing Strategy

7.1 Unit Testing

7.2 Integration Testing

7.3 Acceptance Testing

7.4 Regression Testing

7.5 Criteria of Completeness of Test

8 Special Requirement (Equivalence Class)

9 Organization

9.1 Available resources that the team intends to devote

9.2 Quality assurance team

9.3 Managing the Quality of Artifacts

9.4 Process for Prioritizing Quality Assurance Techniques

9.5 QA Strategy Breakdown into Tasks

9.6 Quality Assurance Process Measures

10 Bibliography

11 Glossary

11.1 Definitions

11.2 Acronyms

12 Appendix

1 Introduction

Purpose:

This document outlines the actions our team will take to make our target system, the “Overlay Multicast Protocol” (hereafter referred to as OMP), and other related artifacts conform to the requirements of the stakeholders and to the applicable quality standards within the specified project resources and constraints. The present format of this document follows IEEE Std. 730.1-1989.

Scope:

The primary audience of this document is the OMPArchitectability MSE/MSIT team members. Every member of the team is responsible for the actions planned in this document, such as developing the overlay multicast protocol, documenting results throughout the development of the project, reviewing project progress, and testing product quality, all of which are controlled by this plan.

The following are the portions of the software lifecycle that are covered by the SQAP:

Requirements, Design, Implementation, Test, Verification and Acceptance

The software items to be covered in each of the above-mentioned lifecycle phases are given below:

Software Lifecycle Phase / Software Item
Requirements / SRS, SOW
Design / SDD
Implementation / SJCS
Test / STP
Verification and Acceptance / SVVP

The Software Quality Assurance Plan covers the software items listed above. In addition, the SQAP itself is subject to this quality assurance plan.

Project Overview

Background and Context

The major stakeholder in this project is POSDATA Co., Ltd. (POSDATA), an IT service provider. With the advent of the ubiquitous era, POSDATA is going a step further than just providing IT services (which consist of system integration and outsourcing services) by actively providing strategic business solutions for the future. POSDATA has now extended its business area to the DVR (Digital Video Recorder), which allows users to monitor, store, and control video streams in real time from a remote location through a wide area network.

POSDATA wishes to extend the DVR system to the N-DVR (Next-generation Digital Video Recorder) system. A major objective of the N-DVR system is that many users should be able to monitor the traffic status via the N-DVR system at the same time. Currently, if many users attempt to watch the traffic status concurrently, the video is not displayed smoothly because the large volume of data can exceed the available Internet bandwidth. Reducing the network load is therefore necessary, because POSDATA has many branches and factories across Korea.

The client will apply a new protocol to the N-DVR, which will be used to transfer video streams in an in-house broadcasting system or a factory monitoring system. Applying OMP (Overlay Multicast Protocol) to the N-DVR will provide added value to POSDATA as they continue to provide IT solutions and seek to improve their business operations.

Project Objectives

  • Apply OMP to the N-DVR Server in order to provide efficient video streaming to clients
  • Solve the network congestion problem that occurs when many clients attempt to view the stream at the same time via the N-DVR Server

Architectural Objectives

  • Evaluate the architecture for OMP with N-DVR according to how well it addresses the quality attributes that pertain to the studio objectives mentioned above and that are representative of the business drivers for POSDATA.
  • Develop and Prioritize Quality Attribute Scenarios
  • Conduct an ATAM Evaluation on the architecture for POSDATA

Technical Constraints

  • Hardware constraints: Use Linux Server and Windows (Operating System) Clients
  • Development Software: C++

Business Constraints

  • The OMP will run in a Linux environment on the N-DVR server.
  • The client OS is Windows 2000 or XP.

The SQAP covers the project resources and constraints.

Requirements

Major functional requirements of the project

  • Group Configuration: A group consists of dynamic members (logging into and out of the network in real time) that dynamically share data; each group has a unique group id.
  • Member Configuration: Each member of a group (as described above in Group Configuration) can be registered or unregistered at any time.
  • Multicast Routing: The data should be transmitted through an optimized path.
  • Data Replication: Each parent node (which could be a router, for example) in a group should duplicate incoming data according to the number of child nodes it possesses.
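The grouping and replication requirements above can be sketched in C++, the project's development language. This is an illustrative sketch only; the type and function names (Group, Member, replicate) are assumptions for the example and are not taken from the actual OMP code base.

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Hypothetical member of a multicast group.
struct Member {
    std::string id;
};

// Group Configuration: each group has a unique group id and a
// dynamic membership.
struct Group {
    std::string groupId;
    std::vector<Member> members;

    // Member Configuration: members may register at any time...
    void registerMember(const std::string& memberId) {
        members.push_back(Member{memberId});
    }

    // ...and may unregister at any time.
    void unregisterMember(const std::string& memberId) {
        members.erase(std::remove_if(members.begin(), members.end(),
                          [&](const Member& m) { return m.id == memberId; }),
                      members.end());
    }
};

// Data Replication: a parent node duplicates incoming data once per
// child node it possesses.
std::vector<std::string> replicate(const std::string& packet,
                                   const Group& children) {
    return std::vector<std::string>(children.members.size(), packet);
}
```

The sketch only models membership bookkeeping; the actual protocol would additionally carry the optimized-path routing state required by the Multicast Routing requirement.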

Non Functional (quality attribute) requirements of the project

  • Performance: The client should be able to watch the video stream within 3 seconds of requesting the stream.
  • Usability: The configuration of a group or group members should be user friendly. Moreover, the system should provide POSDATA with a workable user interface (UI) to manage the network.
  • Security: Access by unregistered users should not be allowed.
  • Availability: The system tries to re-establish the transmission of the video stream between the appropriate client nodes within 5 seconds and logs the transmission failure message.
  • Portability: The system should be able to work on the diverse hardware and software platforms of connecting client nodes.
  • Modifiability: The system should be able to accommodate enhanced security requirements that may come from the POSDATA client in the future.
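As a sketch of how the two timed requirements above could be checked by a test harness, the fragment below encodes the 3-second performance threshold and the 5-second availability threshold. Only the thresholds come from the requirements; the function names are hypothetical.

```cpp
#include <chrono>
#include <iostream>

// Thresholds taken from the quality attribute requirements above.
constexpr std::chrono::seconds kStartupLimit{3};   // Performance: stream within 3 s
constexpr std::chrono::seconds kRecoveryLimit{5};  // Availability: recover within 5 s

// Performance check: did the stream start within the 3-second limit?
bool meetsStartupGoal(std::chrono::milliseconds observed) {
    return observed <= kStartupLimit;
}

// Availability check: was transmission re-established within 5 seconds?
// Per the requirement, a failure to do so is logged.
bool meetsRecoveryGoal(std::chrono::milliseconds observed) {
    if (observed > kRecoveryLimit) {
        std::cerr << "transmission re-establishment exceeded 5 s\n";
        return false;
    }
    return true;
}
```

In a real harness the observed durations would come from instrumented timestamps around the stream request and the failover event, rather than being passed in directly.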

Potential extensions of the project

N:N conference or network connection services may be provided over the network, extending the current 1:N conference or network connection services.

2 Quality Assurance Strategy

To assure the quality of software deliverables in each software development phase, we will use the ‘test factor/test phase matrix’. The matrix has two dimensions: the test factor and the test phase. A test factor is a risk or issue that needs to be tackled; a test phase is the phase of software development in which the corresponding test is conducted. This strategy addresses the risks arising from software development and the process for reducing them. Because the matrix should be customized for each project, we will adapt the strategy to our studio project in four steps.

In the first step, we select the test factors and rank them. The selected test factors, such as reliability, maintainability, and portability, are placed in the matrix according to their ranks.

In the second step, we identify the phases of the development process and record them in the matrix.

In the third step, we identify the business risks of the software deliverables and rank each risk as high, medium, or low.

In the last step, we decide in which test phase each risk will be addressed; that is, we decide which risks are assigned to each development phase.

For example, the table given below shows a ranked list of test factors for the project and also lists the various lifecycle phases of the project. One risk has been highlighted, and a strategy to mitigate it is also marked. Whenever the team enters a phase, the corresponding risks associated with that phase are identified. The table below serves only as an example.

Test factors \ Test phase / Requirements / Design / Build / Dynamic test / Integrate / Maintain
Correctness / Risk: The SRS may not be correct as per the goals of the SQAP. Strategy: Formal Technical Review of SRS / / / / /
Performance / / / / / /
Availability / / / / / /
Continuity of Processing / / / / / /
Compliance / / / / / /
Ease of Use / / / / / /
Coupling / / / / / /
Ease of Operations / / / / / /
Access Control / / / / / /
File Integrity / / / / / /

Test factors/test phase matrix [Perry 2000]

The matrix forms part of the quality assurance strategy; as mentioned above, it will be used in each project lifecycle phase to identify the risks associated with that phase with respect to the test factors. Each risk is accompanied by a mitigation strategy, and if a risk materializes into a problem, the respective mitigation is applied. For these reasons the matrix is described here in a separate section of the document, rather than being mixed into other sections, to avoid repetition.
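One way to make the matrix machine-checkable is to encode each filled cell as a (test factor, test phase) entry carrying the risk, its rank, and the mitigation strategy. The sketch below is an illustrative encoding and not part of the project's actual tooling; the example entry mirrors the highlighted Correctness/Requirements cell of the table above.

```cpp
#include <map>
#include <string>
#include <utility>

// Business-risk ranking from step three of the strategy.
enum class Rank { High, Medium, Low };

// One filled cell of the test factor/test phase matrix.
struct RiskEntry {
    std::string risk;      // the issue to be tackled
    std::string strategy;  // mitigation applied if the risk materializes
    Rank rank;             // high / medium / low business risk
};

// Cell lookup: (test factor, test phase) -> risk and mitigation strategy.
using Matrix = std::map<std::pair<std::string, std::string>, RiskEntry>;

// Builds the single example cell highlighted in the table.
Matrix buildExampleMatrix() {
    Matrix m;
    m[{"Correctness", "Requirements"}] = RiskEntry{
        "The SRS may not be correct as per the goals of the SQAP",
        "Formal Technical Review of SRS",
        Rank::High};
    return m;
}
```

On entering a phase, the team would query the matrix for all entries whose phase key matches and work through the listed mitigations.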

3 Reference Documents

Purpose

This section identifies the documents or work products that govern our main project activities (Requirements, Design, Implementation, Test, Verification and Acceptance of the software) and lists which documents are to be reviewed or audited for adequacy and completeness. For each document, the review and audit to be conducted and the criteria used to judge adequacy are specified.

Minimum documentation requirements

Software Requirements Document (SRS) (See the review and audit section for review process)

A software specification review is to be used to check the adequacy and completeness of this document. The Software Requirements Document defines all the functional requirements, quality attribute requirements, and constraints on the OMP project.

Software Architecture and Design (ADD) or Software Design Document (SDD) (See the review and audit section for review process)

A software architecture and design review and a detailed design review are to be used to check the adequacy and completeness of this document. This document describes the quality attributes of the project as well as the architectural decisions the team took to meet those quality attributes.

Software Verification and Validation Plan (SVVP) (See the review and audit section for review process)

A software verification and validation plan review is to be used to check the adequacy and completeness of this document. This document does not yet exist, but it will provide the steps for verification and validation of the created work products.

Software Verification and Validation Report (SVVR) (See the review and audit section for review process)

This document, which does not yet exist, should include the following information pertaining to the results of the SVVP tasks:

  • A summary of all life-cycle V&V tasks and the results of these activities
  • A recommendation as to whether the software is, or is not, ready for operational use

User Documentation

This is to be included in the Software Project Management document. The team has not yet produced this document. It provides the end user with all the information needed for successful execution and operation of the software.

Software Project Management Plan (SPMP)(See the review and audit section for review process)

The SPMP should identify the following items, which should be reviewed and assessed by all team members. The items and their corresponding checks include:

Items / Check
Full description of the software development activity as defined in the SPMP
Software development and management organization responsibilities as defined in the SPMP
Process for managing the software development as defined in the SPMP
Technical methods, tools, and techniques to be used in support of the software development as defined in the SPMP
Assignment of responsibilities for each activity as defined in the SPMP
Schedule and interrelationships among activities as defined in the SPMP
Process improvement activities as defined in the SPMP
Goal deployment activities as defined in the SPMP
A list of deliverables as defined in the SPMP
Strategic quality planning efforts triggered by reviews as defined in the SPMP

Figure 1 SPMP Review Checklist

Software Configuration Management Plan (SCMP) (See the review and audit section for review process)

This documentation should describe the methods to be used for:

Maintaining information on all the changes made to the software

Other Miscellaneous Documents

Statement of Work

This document defines the work as negotiated with the client.

IEEE Std. 730-2002

IEEE Standard for Software Quality Assurance Plans. This document defines the standards for making the SQAP document.

4 Goals

4.1 QA Goals of each phase

Phase / Goals
Requirement gathering / SRS should have no more than one defect per page as per the client’s review of the SRS.
Architecture / The ADD should not have more than two defects per architectural representation during its formal technical review (FTR).
Development / Each application program should not have more than 10 defects per 1 KLOC found in FTR.
Testing / All tested work products should be checked with the aim of finding at least one defect per page, or 10 defects per 1 KLOC of code, in FTR.
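The numeric goals above can be expressed as simple threshold checks. The helpers below are a hypothetical sketch: the 10-defects-per-KLOC limit comes from the Development goal in the table, while the function names are illustrative.

```cpp
// Defect density per document page (used by the Requirements and
// Architecture goals).
double defectsPerPage(int defects, int pages) {
    return pages > 0 ? static_cast<double>(defects) / pages : 0.0;
}

// Defect density per thousand lines of code (used by the Development
// and Testing goals).
double defectsPerKloc(int defects, int linesOfCode) {
    return linesOfCode > 0
               ? static_cast<double>(defects) / (linesOfCode / 1000.0)
               : 0.0;
}

// Development goal: no more than 10 defects per 1 KLOC found in FTR.
bool meetsDevelopmentGoal(int defects, int linesOfCode) {
    return defectsPerKloc(defects, linesOfCode) <= 10.0;
}
```

For example, 8 defects found in a 1,000-line program satisfies the Development goal, while 25 defects in 2,000 lines (12.5 defects/KLOC) does not.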

5 Reviews and Audits

5.1 Work Product Reviews

The general strategy for the reviews is given below:

The checklist for the reviews (see Appendix) is the same as that given in section 4 of the quality assurance plan assignment for the final project.

Formal Reviews:

  1. One week prior to the release of document to the client, the SQA will review the document list generated by the Software Product Engineers (team members on a project team).
  2. The SQA will ensure that the necessary revisions to the document have been made and that the document will be released by the stated date. Any shortcomings will be pointed out to the software project management.

Informal Reviews:

  1. Design Walk-throughs

The SQA will convene design walk-throughs to encourage peer and management reviews of the design. The Software Project Management will ensure that all reviews are done in a verifiable way and that the results are recorded for easy reference. The SQA will ensure that all action items are addressed.

  2. Code Walk-throughs

The SQA will convene all code walk-throughs to ensure that a peer review is conducted on the underlying code. The Software Project Management will ensure that the process is verifiable, whereas the SQA will ensure that all action items have been addressed.

C. Baseline Quality Reviews

The SQA would review any document or code that is baselined as per the revision number of the work product. This would ensure:

  1. The testing and inspection of module and code before release
  2. Changes to software module design document have been recorded and made
  3. Validation testing has been performed
  4. The functionality has been documented
  5. The design documentation conforms to the standards for the document as defined in the SPMP.
  6. The tool and techniques to verify and validate the sub system components are in place.

Work Product / When Reviewed by Quality Assurance (Status or Criteria) / How Reviewed by Quality Assurance (Standards or Method)
Requirements
(Software Requirements Specification) / After a new release or modification / The Requirements Specification document is reviewed and approved by the assigned reviewer(s). The Requirements Specification document, if supplied by the customer, is also reviewed by the designated reviewer(s), and any issues or gaps between the requirements stipulated in the contract and those covered in the document are resolved. The reviewed document is presented to the customer for acceptance, as stipulated in the contract. The Requirements Specification document forms the baseline for the subsequent design and construction phases. Changes, if any, to the Requirements Specification document after its release are studied, and their impact is evaluated, documented, reviewed and approved before the changes are agreed upon and incorporated.
Design and Construction (SDD) / After a new release or modification / The Design phase is carried out using an appropriate system design methodology, standards and guidelines, taking into account the design experience from past projects. The design output is documented in a design document and is reviewed by the Reviewer to ensure that:
  • The requirements including the statutory and regulatory requirements as stated in the Requirements Specification document, are satisfied
  • The acceptance criteria are met
  • Appropriate information for service provision (in the form of user manuals, operating manuals, as appropriate) is provided.
Acceptance for the design document is obtained from the customer, if required by the Contract.
The Design Document forms the baseline for the Construction phase. Changes, if any, to the Design Document after its release are studied, and their impact is evaluated, documented, reviewed and approved before the changes are agreed upon and incorporated.
Construction (Code) / After a new release or modification / The Project Team constructs the software product to be delivered to meet the design specifications, using:
  • Suitable techniques, methodology, standards and guidelines
  • Reusable software components, generative tools, etc. as appropriate
  • Appropriate validation and verification techniques as identified in the Project Plan.
Changes, if any, to the software programs after the release, are studied, their impact evaluated, documented, reviewed and approved before the same are agreed upon and incorporated.
Testing and Inspection (Code and other work products such as test plans, SVVP, SVVR, SCMP and SPMP) / After a new release or modification / Before delivery of the product, the PL ensures that all tests, reviews, approvals and acceptances stipulated in the Project Plan have been completed and documented. No product is delivered without these verifications.
Acceptance (Final Software Deliverables: Code, User Manual, SRS, SDD etc as per the SOW) / As per the SOW / The customer generally reviews and tests the final product. The customer may also review or test intermediate deliveries as stipulated in the contract.
The Project Team assists the customer in planning and conducting the Acceptance Test, as stipulated in the contract. The customer may prepare the Acceptance Test Plan covering schedules, evaluation procedures, test environment and resources required, and conduct acceptance tests. Any problems and defects reported during the Acceptance Testing phase are analyzed and rectified by the Project Team as stipulated in the contract.

5.2 Quality Assurance Progress Reviews

In order to remove defects from the work products early and efficiently and to develop a better understanding of causes of defects so that defects might be prevented, a methodical examination of software work products is conducted in projects in the following framework: