Test Plan and Cases (TPC) Version 1.5

Test Plan and Cases (TPC)

Cash Doctor 3.0

Team 12

Name / Primary Role
Alisha Parvez / Developer
Danny Lee / Quality Focal Point
Ekasit Jarussinvichai / Developer
Kenneth Anguka / IIV&V
Le Zhuang / Developer
Shreya Sharma / Automation Tester
Steven Helferich / Project Manager
Xichao Wang / Tester

April 26, 2015

Version History

Date / Author / Version / Changes made / Rationale /
11/28/14 / KA / 1.0 / ·  Tailored from Original template for CSCI577a. / ·  Initial draft
12/01/14 / Shreya Sharma / 1.1 / ·  All the test cases / ·  Initial draft
12/08/14 / Shreya Sharma / 1.2 / ·  Making edits / ·  Refined version
2/8/15 / Danny Lee / 1.3 / ·  Perform edits / ·  Revised version
2/9/15 / Shreya Sharma / 1.4 / ·  Added test cases, edited the schedule, requirement traceability and scheduling and training needs / ·  Revised version
4/26/15 / Danny Lee / 1.5 / ·  Removed win conditions that are no longer part of scope. Added new win conditions that are part of scope. / ·  Revised version

Table of Contents

Test Plan and Cases (TPC)

Version History

Table of Contents

Table of Tables

1. Introduction

1.1 Purpose

1.2 Scope

1.3 Focus

1.4 Type of Testing

2. Test Strategy and Preparation

2.1 Hardware preparation

2.2 Software preparation

2.3 Requirements Traceability

3. Test Identification

3.1 TC-01

3.2 TC-02

3.3 TC-03

3.4 TC-04

3.5 TC-05

3.6 TC-07

3.7 TC-08

3.8 TC-09

3.9 TC-10

3.10 TC-12

3.11 TC-14

4. Resources and schedule

4.1 Resources

4.2 Staffing and Training Needs

4.3 Schedule


Table of Tables

Table 1 Hardware Preparation

Table 2 Software Preparation

Table 3: Requirements Traceability Matrix

Table 4: TC-01: Capturing the data

Table 5: TC-02: Integrate with the existing database

Table 6: TC-03: Run on iOS, Android, and Windows Phone

Table 7: TC-04-01: Search the data

Table 8: TC-05-01: Consumer access to account and dashboard

Table 10: TC-07-01: Manual price input

Table 11: TC-08-01: User registration – Invalid email address

Table 12: TC-08-02: User registration – Duplicate email address

Table 13: TC-08-03: User registration – Valid email address

Table 14: TC-09-01: Provider review – no content

Table 15: TC-09-02: Provider review – valid

Table 16: TC-10-01: Geo-location search – retrieve GPS

Table 17: TC-10-02: Geo-location search – show nearby providers

Table 18: TC-10-03: Geo-location search – show relevant providers

Table 19: TC-10-04: Geo-location search – set search radius

Table 20 Resources

Table 21 Staffing Needs

Table 22 Training Needs

Table 23: Testing Schedule


1.  Introduction

1.1  Purpose

The purpose of testing the Cash Doctor Mobile Application 3.0 is to show the person who requested the software, hereafter known as the client, that all requirements for the system have been met. Each of the tests described in this document can either demonstrate, or validate through code, that the requirements agreed upon in the SSRD have been met.

A secondary purpose of testing is to ensure uniformity of results and to reduce the number of bugs in the released system. Despite a team’s best efforts, there will always be bugs in the system, but a thorough test plan helps ensure that these issues are minimized.

1.2  Scope

The scope of testing in this document is limited to the Cash Doctor Mobile Application 3.0 itself. It is not the job of the development or testing team to test the capabilities or reliability of external components of the application, such as web services.

1.3  Focus

The focus of the test plan in this document is to show that the product the development team produces meets all of the customer requirements. The customer and our client will initially be requesting a minimum viable product (MVP), so the test plan in this document is targeted at proving the requirements of the MVP. Discovery of errors in the system is a secondary focus.

1.4  Type of Testing

The testing team will perform manual and unit testing. Because many of the tests involve user interaction with the mobile application, the team decided not to use an automated testing framework at this time. Each test involves the tester performing a series of actions and expecting a specific result at the end; if that result is not obtained, the test is considered a failure. These manual tests are divided into the following types (a sketch of how such a check can also be captured as a unit test appears after the list):

1.  Positive testing

2.  Negative testing

3.  Performance and stress testing
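
Where a manual check has a programmatic equivalent, it can also be captured as a repeatable unit test. The following is a minimal sketch only, using a hypothetical InvoiceValidator class that is not part of the delivered system, to show how a positive and a negative check could be expressed with JUnit:

import org.junit.Test;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

public class InvoiceValidatorTest {

    // Hypothetical validator: accepts a price only if it parses as a non-negative number.
    static class InvoiceValidator {
        boolean isValidPrice(String input) {
            try {
                return Double.parseDouble(input) >= 0;
            } catch (NumberFormatException e) {
                return false;
            }
        }
    }

    // Positive test: well-formed input is accepted.
    @Test
    public void acceptsWellFormedPrice() {
        assertTrue(new InvoiceValidator().isValidPrice("125.50"));
    }

    // Negative test: malformed input is rejected.
    @Test
    public void rejectsNonNumericPrice() {
        assertFalse(new InvoiceValidator().isValidPrice("one hundred"));
    }
}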

2.  Test Strategy and Preparation

Our test strategy involves testing the user interface as well as the underlying logic and data-processing code. The system architecture programmatically separates the user interface code from the logic and data-processing code so that the application’s functionality can be tested independently of the user interface. This separation also allows regression tests and test automation to be developed at the component level. The logic and data-processing code will be unit tested primarily by our developers and test engineers, while the user interface will be tested primarily by our independent verification and validation engineer, the customer, and the team as a whole.
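
As an illustration of this separation, the sketch below uses hypothetical names (ProviderSearchService, ProviderRepository) that are assumptions for this example rather than actual project classes; the point is that logic code written against a small repository interface can be unit tested with an in-memory fake, with no user interface involved:

import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class ProviderSearchServiceTest {

    // Data-access boundary; a real implementation would call the Cash Doctor backend.
    interface ProviderRepository {
        List<String> findBySpecialty(String specialty);
    }

    // Logic layer under test: normalizes input and delegates to the repository.
    static class ProviderSearchService {
        private final ProviderRepository repository;

        ProviderSearchService(ProviderRepository repository) {
            this.repository = repository;
        }

        List<String> search(String specialty) {
            return repository.findBySpecialty(specialty.trim().toLowerCase());
        }
    }

    @Test
    public void searchNormalizesInputBeforeQueryingRepository() {
        // In-memory fake stands in for the database; no UI code is exercised.
        ProviderRepository fake = specialty ->
                "dental".equals(specialty) ? Arrays.asList("Dr. Smith")
                                           : Collections.<String>emptyList();

        ProviderSearchService service = new ProviderSearchService(fake);

        assertEquals(Arrays.asList("Dr. Smith"), service.search("  Dental "));
    }
}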

2.1  Hardware preparation

Our application will eventually run on three different hardware platforms: the iPhone 6, the Samsung Galaxy Note 4, and the Nokia Lumia 830. These devices provide a good variation of screen sizes and together exercise the iOS, Android, and Windows Phone operating systems. Prior to obtaining hardware, the application will also be run on the emulators provided with the iOS, Android, and Windows Phone SDKs.

Table 1 Hardware Preparation

Hardware / SOC / Display / Storage / Size/Mass / Camera
iPhone 6 / Apple A8 / 4.7 inch 1334 x 750 LCD / 16GB / 138.1 x 67 x 6.9 mm, 129 grams / 8MP iSight with 1.5µm pixels Rear Facing + True Tone Flash; 1.2MP f/2.2 Front Facing
Samsung Galaxy Note 4 / 2.7 GHz Snapdragon 805 / 5.7” 1440p Super AMOLED / 32GB NAND / 153.5 x 78.6 x 8.5 mm, 176 grams / 16MP Rear Facing w/ OIS, 1/2.6" CMOS size (Sony IMX240), F/2.0, 3.7MP FFC w/ F/1.9 aperture
Nokia Lumia 830 / MSM8926 1.2 GHz Snapdragon 400 / 5.0” 1280x720 IPS ClearBlack LCD Corning Gorilla Glass 3 / 16 GB NAND / 139.4 x 70.7 x 8.5 (mm) / 10MP, 1.1 µm pixels, 1/3.4" 16x9 CMOS, f/2.2, 26 mm focal length, LED Flash
2.2  Software preparation

During initial development, we will need to run each software SDK on a separate development computer. The hardware and software requirements for each SDK are as follows:

Table 2 Software Preparation

SDK / Hardware / Software Requirements
iOS / Intel-Based Mac / Mac OS X Snow Leopard or greater; iOS SDK; Xcode; registration as an Apple Developer
Android / Windows XP (32-bit), Vista (32- or 64-bit), or Windows 7 (32- or 64-bit); Mac OS X 10.8.5 or later; or Linux / JDK 6; Apache Ant 1.8 or later; Android SDK
Windows Phone / Windows 8 64-bit Pro edition or higher, with 4 GB RAM or more / Windows Phone 8 emulator
2.3  Requirements Traceability

Table 3: Requirements Traceability Matrix

Requirement ID / Requirement Description / Verification Type / Test Case ID (if applicable)
WC_3082 / System shall capture an image and user entered invoice details for sharing. / Demo / TC-01
WC_3085 / System shall integrate with the existing database at Cash Doctor. / Testing / TC-02
WC_3079 / System shall run on iOS, Android, and Windows Phone. / Demo / TC-03
WC_3084 / System shall search for healthcare pricing, provider by location, price, code, and specialty. / Demo / TC-04
WC_3087 / System shall allow consumer access to his/her existing account by user ID and password, and can view his/her existing dashboard. / Demo / TC-05
WC_3077 / System will be appealing to the target consumer (80% female). / Analysis / LOS-1
WC_3083 / System shall allow consumers to manually enter price information for sharing. / Demo / TC-07
WC_3086 / System shall allow consumer to register as a user. / Demo / TC-08
WC_3089 / System shall allow consumer to create a review of a provider. / Demo / TC-09
WC_3094 / System shall allow users to find their current location to access relevant providers in and around area (some mile radius). / Demo / TC-10
WC_3076 / System will be easy to use and intuitive by all users. / Analysis / LOS-2, TC-21
WC_3080 / System shall be able to support at least 1000 simultaneous users. / Analysis / LOS-3
WC_3091 / System shall allow consumer to rate a provider. / Demo / TC-12
WC_3078 / System will be accurate within a 5 mile radius at a 90% confidence interval. / Analysis / TC-14
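
To help keep this matrix and the test suite in step, the mapping of requirement IDs to test case IDs can also be held as data and sanity-checked. The following is an illustrative sketch only (not part of the delivered system), covering the requirements from Table 3 that are verified by test cases:

import java.util.LinkedHashMap;
import java.util.Map;

public class TraceabilityCheck {
    public static void main(String[] args) {
        // Requirement ID -> test case ID, copied from Table 3 (Analysis-only items omitted).
        Map<String, String> matrix = new LinkedHashMap<>();
        matrix.put("WC_3082", "TC-01");
        matrix.put("WC_3085", "TC-02");
        matrix.put("WC_3079", "TC-03");
        matrix.put("WC_3084", "TC-04");
        matrix.put("WC_3087", "TC-05");
        matrix.put("WC_3083", "TC-07");
        matrix.put("WC_3086", "TC-08");
        matrix.put("WC_3089", "TC-09");
        matrix.put("WC_3094", "TC-10");
        matrix.put("WC_3091", "TC-12");
        matrix.put("WC_3078", "TC-14");

        // Report any requirement that has lost its test case mapping during edits.
        matrix.forEach((requirement, testCase) -> {
            if (testCase == null || testCase.isEmpty()) {
                System.out.println("Unmapped requirement: " + requirement);
            }
        });
        System.out.println(matrix.size() + " requirements mapped to test cases.");
    }
}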

3.  Test Identification

3.1  TC-01

TC-01 System shall capture an image and user entered invoice details for sharing.

3.1.1  Test Level

Software item level

3.1.2  Test Class

·  Functionality test

·  Erroneous test

3.1.3  Test Completion Criteria

The user is able to capture an image of the invoice, manually enter the final price, select/tag the service(s), and save to the database.

3.1.4  Test Cases

Table 4: TC-01: Capturing the data

Test Case Number / TC-01-01
Test Item / Capturing the data
Test Priority / M
Pre-conditions / CMS database is initialized
Post-conditions / User captures a photo, enters details about the invoice, and is able to submit and save this to the database.
Input Specifications / The image, total price, and service(s) of the medical bill.
Expected Output Specifications / Stores the image and invoice details in the database
Pass/Fail Criteria / Pass: System is able to capture image and details of the invoice correctly.
Fail: System is unable to capture and save the image or details of the invoice correctly for any reason
Assumptions and Constraints / None
Dependencies / None
Traceability / WC_3082
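
At the unit level, the pass criterion above can be approximated by checking that an invoice record built from the captured image, total price, and tagged services round-trips through the data store unchanged. The sketch below uses hypothetical types (Invoice, FakeInvoiceStore) as stand-ins for the real capture screen and CMS database:

import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class InvoiceCaptureTest {

    // Hypothetical invoice record assembled by the capture screen.
    static class Invoice {
        final byte[] image;
        final double totalPrice;
        final List<String> services;

        Invoice(byte[] image, double totalPrice, List<String> services) {
            this.image = image;
            this.totalPrice = totalPrice;
            this.services = services;
        }
    }

    // In-memory stand-in for the CMS database named in the pre-conditions.
    static class FakeInvoiceStore {
        private final Map<Integer, Invoice> rows = new HashMap<>();
        private int nextId = 1;

        int save(Invoice invoice) {
            rows.put(nextId, invoice);
            return nextId++;
        }

        Invoice load(int id) {
            return rows.get(id);
        }
    }

    @Test
    public void capturedInvoiceIsSavedAndRetrievable() {
        FakeInvoiceStore store = new FakeInvoiceStore();
        Invoice captured = new Invoice(new byte[] {1, 2, 3}, 125.50,
                Arrays.asList("X-Ray", "Consultation"));

        int id = store.save(captured);

        // Pass criterion: the saved price and tagged services are read back unchanged.
        assertEquals(125.50, store.load(id).totalPrice, 0.001);
        assertEquals(Arrays.asList("X-Ray", "Consultation"), store.load(id).services);
    }
}
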
3.2  TC-02

TC-02 System shall integrate with the existing database at Cash Doctor.

3.2.1  Test Level

Software item level

3.2.2  Test Class

·  Functionality test

·  Erroneous test

3.2.3  Test Completion Criteria

The backend of the mobile application is integrated with the existing database that is in place for the Cash Doctor website.

3.2.4  Test Cases

Table 5: TC-02: Integrate with the existing database

Test Case Number / TC-02-01
Test Item / Integrate with the existing database
Test Priority / M
Pre-conditions / CMS database is initialized
Post-conditions / System is able to interact with the database successfully
Input Specifications / Data is entered in the database
Expected Output Specifications / Retrieve the previously entered data from the database
Pass/Fail Criteria / Pass: User is able to store data in and retrieve data from the database and confirm its accuracy
Fail: System is unable to save or retrieve the data
Assumptions and Constraints / None
Dependencies / None
Traceability / WC_3085
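
One way to exercise the store/retrieve accuracy check above is a JDBC round-trip. The sketch below assumes an in-memory H2 database as a stand-in for the existing Cash Doctor database; the real test would point equivalent code at the production schema instead:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;
import org.junit.Test;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

public class DatabaseIntegrationTest {

    @Test
    public void storedRowCanBeReadBackUnchanged() throws Exception {
        // In-memory database used only for this sketch; requires the H2 driver on the classpath.
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:cashdoctor")) {
            try (Statement ddl = conn.createStatement()) {
                ddl.execute("CREATE TABLE provider (id INT PRIMARY KEY, name VARCHAR(100))");
            }
            try (PreparedStatement insert =
                    conn.prepareStatement("INSERT INTO provider (id, name) VALUES (?, ?)")) {
                insert.setInt(1, 1);
                insert.setString(2, "Sample Dental Clinic");
                insert.executeUpdate();
            }
            try (PreparedStatement query =
                    conn.prepareStatement("SELECT name FROM provider WHERE id = ?")) {
                query.setInt(1, 1);
                try (ResultSet rs = query.executeQuery()) {
                    assertTrue(rs.next());
                    // Pass criterion: the data read back matches what was stored.
                    assertEquals("Sample Dental Clinic", rs.getString("name"));
                }
            }
        }
    }
}
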
3.3  TC-03

TC-03 System shall run on iOS, Android, and Windows Phone.

3.3.1  Test Level

Software item level

3.3.2  Test Class

·  Functionality test

·  Erroneous test

3.3.3  Test Completion Criteria

The Cash Doctor app runs on iOS, Android, and Windows Phone with the same functionality.

3.3.4  Test Cases

Table 6: TC-03: Run on iOS, Android, and Windows Phone

Test Case Number / TC-03-01
Test Item / Run on iOS
Test Priority / M
Pre-conditions / Cash Doctor mobile app is deployed on iOS
Post-conditions / Cash Doctor mobile app is running on iOS
Input Specifications
Expected Output Specifications
Pass/Fail Criteria / Pass: Functionality is the same on iOS
Fail: Functionality is not the same on iOS
Assumptions and Constraints / None
Dependencies / None
Traceability / WC_3079
Test Case Number / TC-03-02
Test Item / Run on Android
Test Priority / M
Pre-conditions / Cash Doctor mobile app is deployed on Android
Post-conditions / Cash Doctor mobile app is running on Android
Input Specifications
Expected Output Specifications
Pass/Fail Criteria / Pass: Functionality is the same on Android
Fail: Functionality is not the same on Android
Assumptions and Constraints / None
Dependencies / None
Traceability / WC_3079
3.4  TC-04

TC-04 System shall search for healthcare pricing, provider by location, price, code, and specialty.

3.4.1  Test Level

Software item level

3.4.2  Test Class

·  Functionality test

·  Erroneous test

3.4.3  Test Completion Criteria

The user is able to perform a search and retrieve the data accordingly.

3.4.4  Test Cases

Table 7: TC-04-01: Search the data

Test Case Number / TC-04-01
Test Item / Test the search page by location, price, code, and specialty
Test Priority / M
Pre-conditions / CMS database should be initialized, User should be logged in
Post-conditions / Search returns results based on the specified criteria
Input Specifications / Search by location, price, code, and specialty
Expected Output Specifications / Display the results of providers with pricing
Pass/Fail Criteria / Pass: Returned results match the criteria specified from the input
Fail: Returned results do not match the criteria specified from the input
Assumptions and Constraints / None
Dependencies / None
Traceability / WC_3084
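
The pass criterion in Table 7 can also be asserted programmatically: every result returned by a search must actually match the requested criteria. The sketch below uses a hypothetical Listing type and an in-memory filter as a stand-in for the real search backend (price is treated as a maximum here purely for illustration):

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
import org.junit.Test;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

public class ProviderSearchFilterTest {

    // Hypothetical provider listing as it might be returned by the backend.
    static class Listing {
        final String specialty;
        final double price;

        Listing(String specialty, double price) {
            this.specialty = specialty;
            this.price = price;
        }
    }

    // Search logic under test: keep only listings matching the specialty and price ceiling.
    static List<Listing> search(List<Listing> all, String specialty, double maxPrice) {
        return all.stream()
                .filter(l -> l.specialty.equalsIgnoreCase(specialty) && l.price <= maxPrice)
                .collect(Collectors.toList());
    }

    @Test
    public void returnedResultsMatchTheSearchCriteria() {
        List<Listing> all = Arrays.asList(
                new Listing("Dental", 120.0),
                new Listing("Dental", 450.0),
                new Listing("Vision", 90.0));

        List<Listing> results = search(all, "dental", 200.0);

        assertFalse(results.isEmpty());
        // Pass criterion: every returned result matches the criteria specified in the input.
        assertTrue(results.stream()
                .allMatch(l -> l.specialty.equalsIgnoreCase("dental") && l.price <= 200.0));
    }
}
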
3.5  TC-05

TC-05 System shall allow consumer access to his/her existing account by user ID and password, and allow the consumer to view his/her existing dashboard.

3.5.1  Test Level

Software item level

3.5.2  Test Class

·  Functionality test

·  Erroneous test