Acceptance Test Plan and Cases (ATPC)

Version 3.0

LEMA Pilot School Integrated Scheduling System

Team Number 12

Name / Primary Role / Secondary Role
David Wiggins / Project Manager / Developer
Aakash Shah / Prototyper / Developer
Kushalpreet Kaur / Developer / Developer
Thammanoon Kawinfruangfukul / Tester / Developer
Eunyoung Hwang / Architect / Developer
Louis Demaria / IIV&V / Developer
Mark Villanueva / QFP / Developer
Sangik Park / Developer / Developer

Version History

Date / Author / Version / Changes made / Rationale /
11/29/11 / L.D. / 1.0 / Initial check-in of the document / First draft of the document
12/11/11 / A.S. & H.S.B. / 2.0 / Document reworked: new test cases created and traced back to win conditions / For the Development Commitment Package, documentation of all test cases to proceed into the next phase
02/06/12 / L.D. / 3.0 / Updates to conform to the new group formation and rebaselining / Initial check-in for the Spring ’12 semester

Table of Contents

Version History

Table of Tables

1. Introduction

2. Test Strategy and Preparation

2.1 Hardware preparation

2.2 Software preparation

2.3 Other pre-test preparations

2.4 Requirements Traceability

3. Test Identification

3.1 Test Identifier (TC-01 Automated Schedule Generation)

3.2 Test Identifier (TC-02 Student Course Registration)

3.3 Test Identifier (TC-03 Segmentation of Classes)

3.4 Test Identifier (TC-04 Student Progress Tracking)

3.5 Test Identifier (TC-05 Database Management)

3.6 Test Identifier (TC-06 Integration with the Family Accountability System)

4. Resources and schedule

4.1 Resources

4.2 Staffing and Training Needs

4.3 Schedule


Table of Tables

Table 1: Requirements Traceability Matrix

Table 2: TC-01-01 Automated Schedule Generation – Data Validation Testing

Table 3: TC-02-01 Student Course Registration – Data Validation Testing

Table 4: TC-03-01 Segmentation of Classes Testing

Table 5: TC-04-01 Student Progress Tracking Tests

Table 6: TC-05-01 Database Management – Export of CSV Files Testing

Table 7: Testing Schedule



1.  Introduction

The purpose of the Acceptance Test Plan is to verify that the system delivered to the customer satisfies all of the win conditions identified by the stakeholders.

A successful run of the ATP will validate the system for delivery to the customer.

Win conditions will be matched with test cases, and each test case will be developed so that a passing run verifies that its associated win conditions have been met. Every win condition will be associated with at least one test case.

Unit testing will be performed on the individual pieces of the system, while system-level testing will validate the higher-level win conditions. Integration testing will be done as pieces of the system are added to the project, though it will not be included in the acceptance test plan because it does not validate win conditions.

2.  Test Strategy and Preparation

The project’s test strategy uses boundary-value testing, with entry and exit criteria, for continuous unit and integration testing. Tests will be prioritized using value-based test prioritization. For test preparation, we need an environment similar to the development environment, at least for unit, integration, and functionality testing. For the final go-live testing, machines at the client’s site will be used.

Also, since we are following an agile, incremental development approach with test-driven elements, we will use unit tests, at least for all the critical test cases, to validate our continuous integration. A minimal example of this approach is sketched below.
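As a minimal sketch of what one such boundary-value unit test could look like (Python unittest; validate_section_size and the size bounds are illustrative placeholders, not the system’s actual code):

```python
import unittest

# Illustrative class-size bounds; the real values come from the LAUSD
# norms referenced in OC-3, not from this sketch.
MIN_SIZE, MAX_SIZE = 15, 35

def validate_section_size(size):
    """Hypothetical validator: accept a section size only if it lies
    within the configured minimum/maximum bounds."""
    return MIN_SIZE <= size <= MAX_SIZE

class BoundaryValueTests(unittest.TestCase):
    def test_on_and_inside_boundary_passes(self):
        for size in (MIN_SIZE, MIN_SIZE + 1, MAX_SIZE - 1, MAX_SIZE):
            self.assertTrue(validate_section_size(size))

    def test_outside_boundary_fails(self):
        for size in (0, MIN_SIZE - 1, MAX_SIZE + 1):
            self.assertFalse(validate_section_size(size))

if __name__ == "__main__":
    unittest.main()
```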

2.1  Hardware preparation
There is no special hardware preparation needed for testing.
2.2  Software preparation

The main software requiring testing is FET, the COTS scheduling engine at the core of our scheduling system. The client will also be required to use Firefox 7.1 or a later version as the web browser. We will use MySQL as the database.

2.3  Other pre-test preparations
There are no other pre-test preparations required.
2.4  Requirements Traceability

Table 1: Requirements Traceability Matrix

Requirement ID / Verification Type / Test Case ID (if applicable)
OC-1 Automated Schedule Generation
The system should enable the APSCS to use the student-entered preferences to define classes for the semester, map teachers to these classes, add additional constraints/rules, and have the system generate a final optimized schedule / Testing / TC-01-01
OC-2 Student Course Registration
The system should allow students to enter their course preferences for the semester and view their default sets of courses for the semester / Testing / TC-02-01
OC-3 Segmentation of Classes
The system should allow the segmentation of classes, determined by the number of students registered and the LAUSD norms for the maximum and minimum number of students in a class / Testing / TC-03-01
OC-4 Student Progress Tracking
The system should allow students to track their progress through the A-G curriculum and check their graduation requirements, grades, and progress toward prerequisite college credits / Testing / TC-04-01
OC-5 Database Management
The system should allow the importing and exporting of data to and from the system in .csv format / Testing / TC-05-01
OC-6 User Interface
A simple and intuitive user interface that is easily navigable and informative for teachers, students, counselors, and the APSCS / Demonstration
OC-7 Level of Normalization
The introduction of a school code, which will enable the scheduling system to be used at any school / Demonstration
OC-8 Integration with Family Accountability System
Integration with Team 4’s system to obtain student details and grades, as well as permissions for access / Testing / TC-06-01
LOS-1 The system will provide an intuitive user interface in order to provide ease of use / Demonstration

3.  Test Identification

3.1  Test Identifier

TC-01 Automated Schedule Generation – Data Validation Testing

All data entered into the system by the APSCS (course details), received from the other system, or selected by students and counselors must always be validated. The database must contain only correct entries at all times so that automated schedule generation can proceed.

3.1.1  Test Level

This is a software item level test: a user interface input test.

3.1.2  Test Class

This is an erroneous-input test. Around boundary conditions, input should be rejected when it goes beyond the boundary and accepted when it is within the boundary.

3.1.3  Test Completion Criteria

·  All data entered is valid and does not violate any database or code constraint.

·  The automated schedule generation steps can proceed with class section splitting.

3.1.4  Test Cases

Table 2: TC-01-01 Automated Schedule Generation – Data Validation Testing

Test Case Number / TC-01-01 Automated Schedule Generation – Data Validation Testing
Test Item / Input Validation Functionality
Test Priority / M (Must Have)
Pre-conditions / Individuals performing the testing have a validated session
Post-conditions / System is displaying the course pool
Input Specifications / Course details: course number (must be unique), bubble number, course name, A-G requirement type, summer/fall course type
Expected Output Specifications / The course is entered into the system and students can view it as a part of their drop down list to select a course to register for.
Pass/Fail Criteria / Pass Criteria: The valid data is entered into the database
Fail Criteria: The data validation fails and an erroneous entry goes into the database
Assumptions and Constraints / The APSCS knows that what they are entering is correct
Dependencies / None
Traceability / WC 858
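A minimal sketch of how the TC-01-01 validation rules could be exercised automatically, assuming a hypothetical add_course routine and an in-memory stand-in for the MySQL course table:

```python
# In-memory stand-in for the MySQL course table; illustrative only.
courses = {}

def add_course(course_number, bubble_number, name, ag_type, term):
    """Hypothetical insert enforcing the TC-01-01 input rules:
    no blank fields and a unique course number."""
    if not all((course_number, bubble_number, name, ag_type, term)):
        raise ValueError("blank field")
    if course_number in courses:
        raise ValueError("duplicate course number")
    courses[course_number] = (bubble_number, name, ag_type, term)

# Pass criterion: valid data is entered into the database.
add_course("ENG101", "001", "English 9", "B", "Fall")
assert "ENG101" in courses

# Fail criterion guarded against: an erroneous (duplicate) entry must
# be rejected before it reaches the database.
try:
    add_course("ENG101", "002", "English 9 (duplicate)", "B", "Fall")
except ValueError:
    pass
else:
    raise AssertionError("duplicate course number was accepted")
```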
3.2  Test Identifier

TC-02 Student Course Registration Testing

The students must be able to see a default list of courses (some mandatory, some optional) as soon as they access the course registration page. The page also requires them to select exactly the number of courses they are permitted to take, no fewer and no more.

3.2.1  Test Level

This is a software item level test: a user interface input test and a database integrity test.

3.2.2  Test Class

This is an erroneous-input test. Around boundary conditions, input should be rejected when it goes beyond the boundary and accepted when it is within the boundary.

3.2.3  Test Completion Criteria

·  All data entered is valid and does not violate any database or code constraint.

·  The students are enrolled for registration in the selected courses.

3.2.4  Test Cases

Table 3: TC-02-01 Student Course Registration – Data Validation Testing

Test Case Number / TC-02-01 Student Course Registration – Data Validation Testing
Test Item / Input Validation Functionality
Test Priority / M (Must Have)
Pre-conditions / Individuals performing the testing have a valid session
Post-conditions / System displays the courses that the student has registered for (i.e., requested registration for)
Input Specifications / A course Number for each A-G type – selected via a drop down list.
Expected Output Specifications / The course is entered into the system and students can view it as a part of their drop down list to select a course to register for.
Pass/Fail Criteria / Pass Criteria: The valid data is entered into the database
Fail Criteria: The data validation fails and an erroneous entry goes into the database
Assumptions and Constraints / The students are constrained to take the mandatory courses and there are no exceptions to this rule.
Dependencies / TC-01-01, TC-06-01
Traceability / WC 858
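The exact-count rule of TC-02-01 can be sketched as follows; the required course count, the mandatory set, and validate_registration are all illustrative placeholders:

```python
REQUIRED_COURSE_COUNT = 6                  # placeholder; set by school policy
MANDATORY_COURSES = {"ENG101", "MATH201"}  # placeholder mandatory set

def validate_registration(selected):
    """Hypothetical check: exactly the required number of courses,
    with every mandatory course included."""
    return (len(selected) == REQUIRED_COURSE_COUNT
            and MANDATORY_COURSES <= set(selected))

# Within the boundary: exactly six courses, all mandatory ones included.
assert validate_registration(
    ["ENG101", "MATH201", "SCI110", "HIS120", "ART130", "PE140"])

# Beyond the boundary: a five-course selection must be rejected.
assert not validate_registration(
    ["ENG101", "MATH201", "SCI110", "HIS120", "ART130"])
```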
3.3  Test Identifier

TC-03 Segmentation of Classes

Depending on the number of students enrolled in each course by the counselors, and once the APSCS gives the go-ahead to start the first step of automated scheduling, classes are segmented into sections according to the minimum and maximum class-size constraints.

3.3.1  Test Level

This is a software item level test: a database stored-procedure test.

3.3.2  Test Class

This is an erroneous-input test. Around boundary conditions, input should be rejected when it goes beyond the boundary and accepted when it is within the boundary.

3.3.3  Test Completion Criteria

·  Every course is successfully split into one or more sections, and this information is then available for the APSCS to assign teachers to each course.

3.3.4  Test Cases

Table 4: TC-03-01 Segmentation of Classes Testing

Test Case Number / TC-03-01 Segmentation of classes testing
Test Item / Stored Procedure Testing
Test Priority / M (Must Have)
Pre-conditions / All students have been enrolled in the number of courses they are expected to take. The constraints are also entered into the system.
Post-conditions / The course sections are available for the APSCS to assign teachers to.
Input Specifications / No specific input – just the go-ahead to generate the course sections
Expected Output Specifications / Every course is split into sections, and each section size is within the minimum and maximum size limits
Pass/Fail Criteria / Pass Criteria: Courses are successfully split
Fail Criteria: The Stored Procedure does not split every course into sections.
Assumptions and Constraints / All associated data entry has been completed before running this stored procedure.
Dependencies / TC-02-01
Traceability / WC 858
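One way the splitting logic and TC-03-01’s pass criterion could be sketched, using illustrative Python in place of the actual stored procedure (split_into_sections is hypothetical):

```python
import math

def split_into_sections(enrolled, min_size, max_size):
    """Hypothetical segmentation: the fewest sections such that every
    section stays within [min_size, max_size]."""
    if enrolled < min_size:
        return []                       # too few students for any section
    n = math.ceil(enrolled / max_size)  # fewest sections that fit max_size
    base, extra = divmod(enrolled, n)
    sizes = [base + 1] * extra + [base] * (n - extra)
    # TC-03-01 pass criterion: every section within the size bounds.
    # Infeasible (enrolled, min, max) combinations surface here.
    assert all(min_size <= s <= max_size for s in sizes), sizes
    return sizes

print(split_into_sections(95, 15, 35))  # -> [32, 32, 31]
```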
3.4  Test Identifier

TC-04 Student Progress Tracking Tests

The students are able to view their progress with respect to the various universities: where they stand and how many more units they need to complete the university/college requirements.

3.4.1  Test Level

This is a software item level test: a user interface output test with web service (REST) testing across the two systems.

3.4.2  Test Class

This is a UI view test where the UI displays the progress of the students.

3.4.3  Test Completion Criteria

·  The students are able to view their progress successfully, and the data viewed is correct and reflects each student’s current status.

3.4.4  Test Cases

Table 5: TC-04-01 Student Progress Tracking Tests

Test Case Number / TC-04-01 Student Progress Tracking Tests
Test Item / UI Validation and REST service Integration testing
Test Priority / M (Must Have)
Pre-conditions / The Family Accountability system holds the student grade history.
Post-conditions / System displays the student progress
Input Specifications / None
Expected Output Specifications / UI – a grid view showing the student’s progress
Pass/Fail Criteria / Pass Criteria: The Grid is populated correctly
Fail Criteria: The data populated has errors or is partial and incorrect
Assumptions and Constraints / Team 4’s system is up and running.
Dependencies / TC-06-01
Traceability / WC 858
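A rough sketch of the REST check behind TC-04-01; the endpoint URL, the payload shape, and the student ID are assumptions pending the interface agreement with Team 4:

```python
import json
import urllib.request

# Hypothetical endpoint on the Family Accountability system.
GRADES_URL = "http://team4.example.edu/api/students/{sid}/grades"

def fetch_grade_history(student_id):
    with urllib.request.urlopen(GRADES_URL.format(sid=student_id)) as resp:
        assert resp.status == 200, "REST call failed"
        return json.load(resp)

def test_progress_grid_source(student_id="12345"):
    grades = fetch_grade_history(student_id)
    # TC-04-01 pass criterion: the grid is populated correctly, i.e. every
    # record the service returns appears in the grid, with no partial rows.
    assert grades, "empty history for a student known to have grades"
```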
3.5  Test Identifier

TC-05 Database Management – Export of CSV files testing

The COTS product in use, FET, needs valid data from the database in the form of exported CSV files. Three types of data are expected: teachers, courses, and activities (i.e., the teacher–course mapping). This export must therefore be tested.

3.5.1  Test Level

This is a software item level test: a data coherence test.

3.5.2  Test Class

This is a validation test verifying that all exported data is correct.

3.5.3  Test Completion Criteria

·  The CSV files have all the details that are mapped correctly from the database and there are no missing entries.

3.5.4  Test Cases

Table 6: TC-05-01 Database Management – Export of CSV Files Testing

Test Case Number / TC-05-01 Database Management – Export of CSV files testing
Test Item / Code and Data Validation
Test Priority / M (Must Have)
Pre-conditions / All course allocation, course segmentation and assigning teachers to each course section has been completed.
Post-conditions / The exported CSV files are fed into the FET COTS product.
Input Specifications / No input – just a click on Export CSV files.
Expected Output Specifications / Three CSV files are generated and are available for saving.
Pass/Fail Criteria / Pass Criteria: The CSV files are generated and are consistent with the Database
Fail Criteria: CSV generation fails
Assumptions and Constraints / The APSCS hits the export button only when all data entry is completed.
Dependencies / TC-04-01
Traceability / WC 858
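An automated oracle for TC-05-01 could be sketched as below (check_export is hypothetical; db_rows would come from a SELECT against the MySQL tables, and each of the three files – teachers, courses, and activities – would be checked the same way):

```python
import csv

def check_export(csv_path, db_rows):
    """Illustrative TC-05-01 oracle: every database row appears in the
    exported file, and the file holds no rows absent from the database."""
    with open(csv_path, newline="") as f:
        exported = {tuple(row) for row in csv.reader(f)}
    expected = {tuple(str(v) for v in row) for row in db_rows}
    missing, extra = expected - exported, exported - expected
    return not missing and not extra
```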
3.6  Test Identifier

TC-06 Integration with the Family Accountability System – via REST services

We need to integrate the two systems because we share database entities: student and teacher information (residing in their system), course information (residing in our system), and student grade history (again in their system).
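As an assumed illustration of how the shared entities could map onto REST resources (all paths are placeholders until the interface with Team 4 is finalized):

```python
# Entities residing on the Family Accountability system (Team 4).
REMOTE_RESOURCES = {
    "student":       "/api/students/{sid}",
    "teacher":       "/api/teachers/{tid}",
    "grade_history": "/api/students/{sid}/grades",
}

# Entities residing on our scheduling system, exposed to Team 4.
LOCAL_RESOURCES = {
    "course": "/api/courses/{course_number}",
}
```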