
Running Head: Lab 3 – Traffic Wizard Prototype Test Plan
Lab 3 – Traffic Wizard Prototype Test Plan/Procedure
Traffic Wizard – Blue Team
Old Dominion University
CS 411 - Brunelle
Author: Andrew Crossman
Last Modified: April 3, 2012
Version: 1.0

Table of Contents

1. Objectives

2. References

3. Test Plan

3.1. Testing Approach

3.2. Identification of Tests

3.3. Test Schedule

3.4. Fault Reporting and Data Recording

3.5. Resource Requirements

3.6. Test Environment

List of Figures

Figure 1 : Prototype Major Functional Components Diagram

Figure 2 : E&CS Building Presentation Room

List of Tables

Table 1 : Test Category Identification

Table 2 : Test Schedule

Table 3 : Fault Reporting and Data Recording

1. Objectives

Heavy traffic in densely populated areas is one of society’s most persistent and costly inconveniences. In regions where population growth exceeds road capacity, traffic flow becomes congested and delays drivers on the way to their destinations. Americans suffer 4.8 billion hours of excess commute time every year, and 1.9 billion gallons of excess fuel are consumed while waiting in traffic (Texas Transportation Institute, 2011). These costs fall most heavily on the most densely populated areas. Because traffic volume changes from minute to minute, it is difficult to accurately predict what conditions will be like as a driver travels a specific route. The methods currently used to avoid heavy traffic each carry their own liabilities.

Current methods for determining whether heavy traffic lies ahead are not reliable enough to be effective. Visual cues depend on environmental conditions and come too late to be practical for avoiding traffic. Information obtained through the media, traffic cameras, or a friend is rarely timely and cannot always be accessed easily. GPS devices sometimes offer a traffic monitoring feature, but they are prone to connectivity issues and to errors when their software becomes outdated. Current mobile apps are a source of distraction and are known to quickly deplete a user’s battery and burden their network data usage. These risks need to be mitigated by an approach that places less strain on the smartphone battery and minimizes data usage while still providing timely traffic information.

Traffic Wizard is a personalized smartphone app that informs drivers of traffic conditions in real time, both before and during a trip. The app will feature travel profiles that let drivers store their most frequent routes and set times at which the server analyzes each route prior to departure. By advising alternate routes during a trip to avoid unfavorable congestion, Traffic Wizard will reduce the delay a driver encounters.

2. References

Lab 1 – Crossman, Andrew – Traffic Wizard Product Description. CS 411. Spring 2012.

Lab 2 – Crossman, Andrew – Traffic Wizard Prototype Product Specification. CS 411. Spring 2012.

Brownlow, M. (September 2011). Smartphone Statistics and Market Share. Email Marketing Reports. Retrieved from

Lomax, T., Schrank, D., & Turner, S. (2011). Annual Urban Mobility Report. Texas Transportation Institute. Retrieved from

Schroeder, S. (May 19, 2011). Smartphone Sales Up 85% Year-Over-Year. Mashable Tech. Retrieved from

3. Test Plan

This section explains the Traffic Wizard test plan. It covers the types of tests to be performed, the testing schedule, fault reporting procedures, resource requirements, and the testing environment. Team member responsibilities are also outlined in this section.

3.1. Testing Approach

Traffic Wizard testing will consist of unit tests, integration tests, and system tests to verify the performance of the system. Unit tests will exercise individual software components of the prototype. Integration tests will exercise the communication between two or more related software components and demonstrate their ability to function together properly. System tests will exercise the prototype as a whole; to fully ensure system functionality, they must be thorough and cover all components.

Figure 1 : Prototype Major Functional Components Diagram

The major functional components of the Traffic Wizard prototype are illustrated in Figure 1. Each component of the prototype must be covered by these types of tests: the Traffic Wizard databases, algorithms, user interfaces, and the Simulation Console must all be tested as part of this plan. Each set of tests has its own methodology for exercising all aspects of its component. The databases will be verified through SQL test queries that confirm the contents of each table, as sketched below. Algorithms will be tested through a specialized test harness designed to demonstrate the output of a particular algorithm for manually entered input. The user interfaces and Simulation Console will be tested by visiting each GUI screen and executing each of its features.
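To illustrate the database verification approach, the following is a minimal sketch of a test query run against the Virtual Checkpoint Database over JDBC. The connection string, credentials, table name, and column names shown are placeholders and may differ from the prototype schema.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

/**
 * Illustrative database structure check (cf. test case 1.1.1).
 * Connection details, table name, and column names are assumptions.
 */
public class CheckpointTableStructureTest {

    public static void main(String[] args) throws Exception {
        // Hypothetical MySQL connection to the prototype server database.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/trafficwizard", "tester", "password");
             Statement stmt = conn.createStatement();
             // Sample query against an assumed virtual_checkpoint table.
             ResultSet rs = stmt.executeQuery(
                 "SELECT checkpoint_id, latitude, longitude, speed_limit "
                 + "FROM virtual_checkpoint LIMIT 5")) {

            // Confirm the expected columns are present in the result.
            ResultSetMetaData meta = rs.getMetaData();
            for (int i = 1; i <= meta.getColumnCount(); i++) {
                System.out.println("Column " + i + ": " + meta.getColumnName(i));
            }

            // Print the sample rows for visual inspection during the demo.
            while (rs.next()) {
                System.out.printf("%d (%.5f, %.5f) limit=%d%n",
                        rs.getInt("checkpoint_id"),
                        rs.getDouble("latitude"),
                        rs.getDouble("longitude"),
                        rs.getInt("speed_limit"));
            }
        }
    }
}

During the demonstration, the printed column names and sample rows can be compared against the expected table structure described in the prototype specification.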

3.2. Identification of Tests

Test cases for the Traffic Wizard prototype are identified in Table 1. The test cases are divided into five categories, each with its own set of test cases, names, and descriptions. A test case may verify one or more of the functional requirements specified for the Traffic Wizard prototype in Lab 2. A more detailed description of each test case and its procedures can be found in Section 5.

Category ID / Category / Subcategory ID / Subcategory / Test Case ID / Name / Description
1 / Databases / 1.1 / Virtual Checkpoint / 1.1.1 / Database Structure Test / Verify the structure of all tables and fields
1.2 / Driver Profile
1.3 / Speed Limit
2 / Algorithms / 2.1 / Speed Aggregator / 2.1.1 / TestAggregateSpeeds / Verify that the aggregate speed function works and produces accurate results.
2.2 / VC Reallocator / 2.2.1 / Source Code / Verify the code is written in Java or C++.
2.2.2 / Open VC Database / To test the ability to open the Virtual Checkpoint Database.
2.2.3 / Open Speed Limit Database / Test the ability to open the speed limit database.
2.2.4 / Add Checkpoint / Test the ability to add a checkpoint.
2.2.5 / Delete Checkpoint / Test the ability to delete a checkpoint.
2.3 / Route Matcher / 2.3.1 / Source Code / Verify code is written in Java or C++
2.3.2 / VC Database Connect / Verify Virtual Checkpoint Database is accessible
2.3.3 / Input Parameter / Verify parameter acceptance for input coordinates
2.3.4 / Proximity / Verify ability to return checkpoint IDs within a specified vicinity
2.3.5 / False Proximity / Verify that no checkpoint ID is returned when none is within the vicinity
2.4 / Route Analyzer / 2.4.1 / Route Analysis Accuracy Test / Verify that the Route Analysis Algorithm properly validates a route against the Virtual Checkpoint Database.
2.4.2 / Route Analysis Data Test / To verify the calculation and communication of congestion data for a user specified route.
2.5 / Blockage Finder / 2.5.1 / Source Code / Verify the code language used for the Blockage Finder algorithm.
2.5.2 / User Interface / Test the server-side user interface used for the Blockage Finder algorithm.
2.5.3 / Accessing Information / Ensure the information received is valid.
2.5.4 / Geographical Area / Check the location through Google Maps.
2.5.5 / Virtual Checkpoints / Verify use of Virtual Checkpoint data by the Blockage Finder
2.5.6 / Route Analysis / Verify interaction with the Route Analysis algorithm
2.5.7 / Result / Verify that the optimal (least congested) traffic route is returned
2.6 / Next Checkpoint Estimator / 2.6.1 / Next Checkpoint Estimator calculations / Ensure the calculations performed by the algorithm are correct
2.6.2 / Next Checkpoint Estimator deviation / Test the conditional branch in the algorithm that checks whether a user has deviated from a route
3 / Simulation Console / 3.1 / Region Selection / 3.1.1 / Region Support / Verify region maps available for simulation
3.1.2 / Arrival and Destination / Verify region maps have entry and exit points
3.2 / Traffic Scenario Selection / 3.2.1 / Scenario Support / Verify scenario options defined and available
3.2.2 / Scenario Scale / Verify scenarios have scalability functions
3.3 / Driver Generator / 3.3.1 / Driver Generator / Ensure that realistic proportions of drivers and users are generated, conforming to variable thresholds which can be changed by the user
3.4 / Simulation Runtime Execution / 3.4.1 / Runtime Defaults and Selections / Verify simulation defaults and selectability of regions and scenarios
3.4.2 / Scenario 1 Execution / Verify Scenario 1 (low congestion) can be executed to show algorithm proof
3.4.3 / Scenario 8 Execution / Verify Scenario 8 (high congestion) can be executed to show algorithm proof
3.4.4 / Virtual Driver Type / Verify generation of two types of virtual drivers
3.5 / Traffic Activity Display / 3.4.1 - 3.4.4 / 3.4 Tests / Verified through Simulation Runtime Execution test cases
4 / Client User Interface / 4.1 / Login / 4.1.1 / Login / Ensure that only authorized users are able to access the main user interface functionality of the application
4.2 / New Trip / 4.2.1 / New Trip / Ensure the process of New Trip Creation runs correctly or fails gracefully
4.3 / Route Tracer / 4.3.1 / Route Tracer / Test the functionality of the Route Tracer screen to ensure that illegal start/stop presses are prevented
4.4 / Edit Trip / 4.4.1 / Edit Trip / Ensure the process of editing a Trip runs correctly or fails gracefully
4.5 / End of Trip / 4.5.1 / End of Trip / Ensure the End of Trip process runs correctly and unobtrusively to the user
4.6 / Delay Notification / 4.6.1 / Delay Notification / Ensure the delay notification process runs correctly and unobtrusively to the driver
5 / Simulation Console Interface / 5.1 / Main Menu / 5.1.1 / Main Menu Test / Verify interface has accessible buttons/tabs for features
5.2 / Driver Profile Demo / 5.2.1 / Driver Profile Database / Verify that features of Driver Profiles have been implemented correctly
5.2.2 / Driver Profile Screenshots / Verify that the Driver Profile Demonstration uses appropriate GUI screenshots
5.2.3 / Driver Profile Main Menu / Verify that the Driver Profile Demonstration allows access to the main menu
5.3 / Route Creation Demo / 5.3.1 / Create / Edit / Verify all fields required for manually creating a new route, as outlined in Requirement 3.1.4.1.3.
5.4 / Route Tracer Demo / 5.4.1 / Route Tracer Demo / Show that the Route Tracer works as expected and returns correct results
5.5 / Traffic Simulation Window / 3.4.1 - 3.4.3 / 3.4 Tests / Verified through Simulation Runtime Execution test cases
5.6 / Dashboard / 5.6.1 / Dashboard Access / Verify accessibility from Traffic Simulation window
5.6.2 / Dashboard Return / Verify that Dashboard cannot return to Main Menu during simulation

Table 1 : Test Category Identification
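
As an example of how the specialized test harness exercises an individual algorithm with manually entered input, the following Java sketch runs a stand-in speed aggregation routine against known values and reports pass or fail (cf. test case 2.1.1, TestAggregateSpeeds). The SpeedAggregator class and its averaging behavior are assumptions for illustration only; the prototype’s actual class and aggregation formula may differ.

import java.util.Arrays;
import java.util.List;

/**
 * Minimal sketch of the algorithm test harness (cf. test case 2.1.1).
 * The SpeedAggregator shown here is a stand-in for the prototype class.
 */
public class AlgorithmTestHarness {

    /** Stand-in aggregator: averages reported speeds for a checkpoint. */
    static class SpeedAggregator {
        double aggregate(List<Double> reportedSpeedsMph) {
            return reportedSpeedsMph.stream()
                    .mapToDouble(Double::doubleValue)
                    .average()
                    .orElse(0.0);
        }
    }

    public static void main(String[] args) {
        SpeedAggregator aggregator = new SpeedAggregator();

        // Manually entered input, as described in the testing approach.
        List<Double> speeds = Arrays.asList(55.0, 60.0, 35.0, 50.0);
        double expected = 50.0;
        double actual = aggregator.aggregate(speeds);

        // Log the result so failures can be recorded on the hardcopy forms.
        System.out.printf("expected=%.1f actual=%.1f -> %s%n",
                expected, actual,
                Math.abs(expected - actual) < 0.001 ? "PASS" : "FAIL");
    }
}

The same harness pattern applies to the other algorithm test cases in category 2: feed a known input to the algorithm under test and compare its output against the expected result.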

3.3. Test Schedule

Team Blue will be allotted a total of 45 minutes to demonstrate the functionality of the Traffic Wizard prototype. The first ten minutes of the presentation will be used to set up and explain the scope of the prototype. Table 2 shows the testing schedule and the time allotted for each segment of the demonstration. After the prototype has been fully demonstrated, the final segment of the presentation will allow time for Team Blue to answer questions from the review board.

Start Time (minutes) / Duration (minutes) / Description / Test Cases Covered
0:10 / 5 / Database Demo (Driver Profile and Virtual Checkpoint) / 1.1
0:15 / 10 / Algorithm Unit Tests
(via Simulation Console) / 2.1- 2.6
0:25 / 10 / Integration Simulation / 3.1 – 3.5, 5.1 – 5.6
0:35 / 10 / Smartphone Application Demo / 4.1 – 4.6
0:45 / 15 / Questions

Table 2 : Test Schedule

3.4. Fault Reporting and Data Recording

Team Blue will record the failures and successes of the Traffic Wizard prototype during the demonstration. The test components are defined as hardware, GUIs, databases, and algorithms. Table 3 describes these test components and the process for reporting failures in each.

Component / Recording Process
Hardware / Report failures through visual inspection of the smartphone device and server stations; document through hardcopy forms
GUI / Report failures through visual inspection of GUI screens in the app and Simulation Console; document through hardcopy forms
Database / Report failures through visual inspection of returned SQL statements; document through hardcopy forms
Algorithms / Report failures through visual inspection of output logs; document through hardcopy forms

Table 3 : Fault Reporting and Data Recording
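
For the algorithm components, one simple way to produce the output logs referenced in Table 3 is to append a line per executed test case to a log file, as sketched below. The file name and entry format are assumptions, not part of the prototype specification.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.time.LocalDateTime;
import java.util.Arrays;

/**
 * Illustrative sketch of recording an algorithm test result to an
 * output log for later visual inspection and hardcopy documentation.
 */
public class TestResultLogger {

    public static void main(String[] args) throws IOException {
        Path log = Paths.get("algorithm-test-results.log");

        // One entry per executed test case: timestamp, case ID, outcome.
        String entry = String.format("%s | 2.1.1 TestAggregateSpeeds | %s",
                LocalDateTime.now(), "PASS");

        // Append so results from the whole demonstration accumulate.
        Files.write(log, Arrays.asList(entry),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }
}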

3.5. Resource Requirements

To demonstrate the functionality of the Traffic Wizard prototype, specific hardware and software resources will be required. Hardware resources will consist of an iPhone to demonstrate the app and a desktop workstation to access the Traffic Wizard server and the Simulation Console. Software resources will consist of the Ubuntu Server operating system running the Traffic Wizard server, phpMyAdmin for managing the databases, the prototype Traffic Wizard app on the iPhone, and the Simulation Console platform used to demonstrate system functionality.

3.6. Test Environment

The demonstration presentation for the Traffic Wizard prototype will take place at Old Dominion University in Norfolk, VA, in the Engineering and Computational Sciences (E&CS) building first floor conference room. Figure 2 is a photograph of the conference room being used for the presentation. Team Blue will utilize the front of the room, where the workstation and large screen projector are present, to demonstrate the prototype. The CS 411 class and review board will be present as the audience for the demonstration.

Figure 2 : E&CS Building Presentation Room