ALSTOM Signaling, Inc.

Team Silver Bullet

Modernization of Relay Equivalent Drawing System

Software Project Plan

Version 1.4

Modernization of Relay Equivalent Drawing Program / Version: 1.4
Software Project Plan / Date: 10/6/2018

Revision History

Date / Version / Description / Author
12/13/2004 / 0.1 / Initial Revision. / Kristy Rozanski
12/14/2004 / 0.2 / Added process and metrics definition. / Tim Lund
Kristy Rozanski
12/15/2004 / 0.3 / Updated from Dr. Vallino’s feedback. / Tim Lund
12/17/2004 / 1.0 / Updated roles/responsibilities.
Updated details for each iteration. / Kristy Rozanski
1/7/2005 / 1.1 / Updated with Dr. Vallino’s ideas for Resources & Budget. / Kristy Rozanski
1/23/2005 / 1.2 / Updated iteration release dates
Release the following SRS documents on the same day as the system release / Kristy Rozanski
2/9/2005 / 1.3 / Changed date for next quarter from Thursdays to Wednesdays / Kristy Rozanski
3/11/2005 / 1.4 / Updated release dates and responsibilities
Updated CAAPE definition / Kristy Rozanski

Table of Contents

1. Introduction

1.1 Purpose

1.2 Scope

1.3 Definitions, Acronyms, and Abbreviations

1.4 References

1.5 Overview

2. Project Organization

2.1 External Interfaces

2.2 Team Member Roles and Responsibilities

3. Deliverables

3.1 Deliverables with each iteration

3.2 One-time Deliverables

4. Project Schedule

4.1 Iteration Break-down

4.2 Major Milestones

5. Process Overview

5.1 Background Information

5.1.1 Iterative Software Process

5.1.2 Builds

5.1.3 Iteration Spans

5.1.4 Examples

5.2 Our Process

6. Tracking & Control

6.1 Metrics Goals

6.1.1 Ease of Collection

6.1.2 Iterative Process Corrections

6.1.3 Defect Concentration

6.1.4 Overall Tracking

6.2 Identified Metrics

6.2.1 Progress Metrics

6.2.2 Defect Metrics

6.2.3 Effort Metrics

6.2.4 Size Metrics

7. Project Resourcing

7.1 CAAPE Integration / Shared Functionality

7.2 DXF Understanding / Symbols Library

7.3 Placement Algorithm / Border Implementation

7.4 Windows Printing

8. Budget


1. Introduction

1.1 Purpose

The purpose of this document is to define the project organization and schedule to be used to create a high-quality project within the given time frame. This project plan is developed according to the chosen development process defined in Section 5. This document is meant to create a mutual understanding and agreement of the project plan between ALSTOM Signaling, Inc. and the senior project development team.

1.2 Scope

The plan defined by this document contains milestones to be met by both the development team and the customer, along with internal development deadlines. The project organization covers both the external interfaces to the project and the individual team member roles and responsibilities.

1.3 Definitions, Acronyms, and Abbreviations

AE – Application Engineer

CAD – Computer-Aided Design

CAAPE – Computer-Aided Application Programming Environment, a comprehensive set of development tools for creating VPI and CenTraCode II-s applications; the system from which the Relay Drawing program receives its input.

DLL – Dynamic-Link Library

DXF – Drawing Exchange Format, the standard format for the drawings to be produced by the system (DXF files are viewable with AutoCAD or Voloviewer, a freeware viewer).

SRS – Software Requirements Specification (also known as the System Requirements Document)
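As background for readers new to the format, a DXF file is a flat, line-oriented sequence of group-code/value pairs: an integer group code on one line and its value on the next. The following Python sketch (illustrative only; the system itself will be written in C/C++) shows how such pairs might be read. The sample fragment is hypothetical and ignores the section structure a real DXF file carries.

```python
# Read a DXF stream as a sequence of (group_code, value) pairs.
# DXF is line-oriented: an integer group code, then its value on the next line.
def read_dxf_pairs(lines):
    pairs = []
    it = iter(lines)
    for code_line in it:
        value = next(it, None)
        if value is None:
            break  # odd trailing line; tolerate a malformed tail
        pairs.append((int(code_line.strip()), value.strip()))
    return pairs

# A tiny hypothetical fragment declaring a LINE entity on layer "0".
sample = ["0", "LINE", "8", "0", "10", "1.0", "20", "2.0"]
print(read_dxf_pairs(sample))
```

A real parser would additionally track the SECTION/ENDSEC structure and interpret group codes per entity type; this sketch only shows the lexical layer.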

1.4 References

Practical Software Engineering: A Case Study Approach, Maciaszek, Liong, 2005.

1.5 Overview

This document provides a basic understanding of the development process being undertaken, with respect to key organization and process areas, and lists the important milestones for the project. The later sections provide details about the project management effort, specifically defining the metrics the team will use to track the project.

2. Project Organization

2.1 External Interfaces

The following are the external contacts within ALSTOM who interface with the project. Each contact's responsibilities are detailed as they relate to deployment and acceptance of the product.

Person / ALSTOM Title / Responsibilities
Pete Hart / Staff Engineer, Systems Engineering / - Project Lead
- ALSTOM Senior Project Sponsor
** Main point of contact for the team.
Jack Perkins / Senior Engineer, Systems Engineering / - Technical Lead
- Requirements & Specifications contact.
Joan Mitchell / CAD Analyst, AE CAD System / - Uses CAD daily.
- End user of this system.
- Understanding of daily nuances.
Dale Wood / CAD Software Analyst, AE CAD System / - Inherit the project.
- C/C++ knowledge.
Joe Missler / Software Engineer, VPI CAAPE / - Knowledgeable on CAAPE system.
- Output from CAAPE is the input file to this system.
- Knowledgeable on the CAAPE interface.
Doug Stephans / Systems Engineer, Railroad Projects / - The relay drawings are used by Doug to present to the railroad customers.
Skip Danesi / Systems Engineer, Transit Projects / - The relay drawings are used by Skip to present to the transit customers.

2.2 Team Member Roles and Responsibilities

Each member of the team is assigned a leadership/management role for some aspect of the system. In addition to the identified leadership role, each team member also has a research role, serving as the expert in a given technical area of the project. This is done to delegate responsibilities and to mitigate risk in the more complex areas of the system.

Person / Role(s) / Responsibilities
Kristy Rozanski / Team Leader
Planning Leader / - Maintains communication with customer
- Schedules meetings with customer
- Maintains development schedule
- Team Coordinator
LLRecon DLL Expert / - Understand DLL interface that parses the input files
Tim Lund / Process/Quality Leader
Requirements Manager / - Process & Quality Assurance
- Compiles/analyzes metrics weekly
- Reports information regarding work load/pace
- Maintains quality and consistency of documentation
- Provides templates for documentation if none exist
- Responsible for the requirements document
Printing Expert / - Understand print preview (if possible - highest priority!)
- Understand printing in Windows/ print drivers
- Identify issues team must be concerned with
Dave Kerstanski / Risk Management Leader
Requirements Analyst / - Manages identified risks
- Reports to team if new plan of action needs to be taken
- Maintains a traceability matrix to ensure requirements are covered in the design, implementation and test
DXF Expert / - Find possible DXF toolkits/libraries
- Find a viewing tool for testing
- Understand format & associated algorithms
- Understand input of DXF symbols
Jessica Linendoll / Development Leader / - System architect
- Responsible for ensuring a clear & consistent design
- Delegation of development assignments
Placement Algorithm Expert
CAAPE System Expert / - Research placement algorithm improvements
- Reduction of whitespace and performance issues
- Communicate with ALSTOM to understand CAAPE
- Address reusability needs
- Address possible integration interfaces
Dan Lovette / Website Manager
Configuration Management
Testing Leader / - Maintains project website, keeping artifacts up-to-date
- Provides solutions for development environment (tools, software, etc.)
- Integration of complete system
- Responsible for test plan & testing tasks
DXF Expert / - Find possible DXF toolkits/libraries
- Find a viewing tool for testing
- Understand format & associated algorithms
- Understand input of DXF symbols

3. Deliverables

The following is the list of artifacts requested by the customer:

  • Project Plan (this document)
  • Software Requirements Specification (SRS)
  • Detailed Design Document
  • Test Plan
  • System Source Code
  • Executable file(s)
  • On-line Help Functions
  • User Manuals

3.1 Deliverables with each iteration

The following items will be considered deliverable with each iteration of the process.

Software Requirements Specification

The requirements for the system shall be developed in conjunction with the iterative process and considered a key deliverable with each iteration. Specifically, the requirements shall be developed in a manner such that higher-level requirements appear in earlier iterations, and more detailed subsections of those requirements are presented in later iterations.

Detailed Design Document

This document will define the design of the system overall and give insights to key detailed design decisions.

Test Plan

A test plan will be evolved and used throughout the process. It should be developed in parallel with the requirements document in order to clarify system behavior. All input file(s) to test cases will be a part of this deliverable.

System Source Code/ Executable file(s)

In order to generate customer feedback, the system shall be delivered at the end of each iteration for review.

Defect Report

With each iteration, a defect report will be generated and delivered, showing which parts of the system contained defects and allowing outside reviewers to objectively evaluate the team's performance.

3.2 One-time Deliverables

The following documents will be considered one-time deliverables, rather than iteratively evolving documents:

  • Project Plan
  • On-line Help functions
  • User Manuals

4. Project Schedule

Refer to the Microsoft Project Gantt chart, which details the schedule for the project. The project commenced on December 2, 2004, and will end on or before May 20, 2005. The schedule is derived from our iterative development process, described in the following section, Process Overview.

4.1 Iteration Break-down

The following specifies what can be expected with each iteration.

Iteration 1:

-Basic functionality to process input and generate output (DXF files)

-High-level architecture design

-Understanding of existing system/ functionality, incorporate into new architecture

-Use of existing algorithms

Iteration 2:

-Enhance more detail into the system design

-Tackle areas identified as RISKY:

  • Printing
  • DXF intricacies
  • Placement algorithm
  • CAAPE interface

-Improve the existing algorithms

Iteration 3:

-Reflection on process & project plan

-Redesign/ Refactor the system

-Implementation of major feature set (specified in the Vision & Scope document)

-Draft of User Manuals

-Help Features

Iteration 4:

-Final touches & improvements on the system

-Clean-up of design and system documentation

4.2 Major Milestones

The following summarizes the dates associated with the milestones identified for the project.

Note: Milestones in Microsoft Project are denoted by a black diamond on the Gantt Chart.

Inception:

Friday, December 10, 2004 – Launch project website

Friday, December 10, 2004 – Submit project synopsis to SE Department

Friday, December 17, 2004 – Final draft of project inception documents:

Vision & Scope, Project Plan (including development process), and Risk List

Iteration 1:

Week of January 3, 2005 – Specification of input, DLL interface to parsing module (Joe Missler)

Week of January 3, 2005 – Specification of DXF format description (Dale Wood)

Week of January 3, 2005 – Test plans (or test cases) from current system (Dale Wood/ Pete Hart)

Friday, January 7, 2005 – Requirements v0.1 for review

Thursday, January 13, 2005 – Validation/ Review of Requirements v0.1 (ALSTOM team)

Friday, January 21, 2005 – Release of beta system v0.1 for acceptance tests

Week of January 24, 2005 – CAAPE development guide/ interface definition (Joe Missler)

Thursday, January 27, 2005 – Review of System v0.1 (ALSTOM team)

Iteration 2:

Monday, January 24, 2005 – Requirements v0.2 for review

Thursday, January 27, 2005 – Validation/ Review of Requirements v0.2 (ALSTOM team)

Monday, February 14, 2005 – Release of beta system v0.2 for acceptance tests

Thursday, February 17, 2005 – Review of System v0.2 (ALSTOM team)

Mid-Project Wrap-Up:

Week of February 21, 2005 – Present to ALSTOM Executive Committee

Iteration 3:

Friday, February 25, 2005 – Requirements v0.3 for review

Wednesday, March 16, 2005 – Validation/ Review of Requirements v0.3 (ALSTOM team)

Monday, April 18, 2005 – Release of beta system v0.3 for acceptance tests

Wednesday, April 20, 2005 – Review of System v0.3 (ALSTOM team)

Iteration 4:

Monday, April 18, 2005 – Requirements v1.0 for review

Wednesday, April 20, 2005 – Validation/ Review of Requirements v1.0 (ALSTOM team)

Monday, May 9, 2005 – Release of system v1.0 for acceptance tests

Wednesday, May 11, 2005 – Review of System v1.0 (ALSTOM team)

Deployment:

Wednesday, April 20, 2005 – Submit Project Poster

Friday, May 6, 2005 – Submit Conference Paper

Friday, May 13, 2005 – Final release of project deliverables

Week of May 16, 2005 – Present to ALSTOM Executive Committee

5. Process Overview

5.1 Background Information

5.1.1 Iterative Software Process

An iterative software process is the repetition of a basic development process with the objective of enriching the software product. The waterfall model contains the basic development steps of requirements specification, design, implementation, testing, and maintenance. Although its phases may overlap slightly, the waterfall model cannot be considered iterative, because iteration means movement from one version of the product to the next. The goal of an iterative process is therefore to take these basic steps and repeat them in a manner that makes software development more effective and thorough.

5.1.2 Builds

An iterative process assumes builds: executable code delivered at the end of each iteration. The scope of a build is the whole system, although earlier builds may lack full functionality, fully developed user interfaces, and other detailed aspects. A build is something that can be demonstrated to the end user as a version of the system on its way to the final product.

5.1.3 Iteration Spans

An iterative process assumes short intervals between builds, in weeks or days, not months. This allows for the best possible use of metrics and project management, as well as a more active plan to mitigate risk. However, this amount of process overhead may be overwhelming for a small project, particularly if the developers are not familiar with the process.

5.1.4 Examples

The classic example is the Spiral Model (Boehm, 1988). Other examples include the IBM Rational Unified Process, Model Driven Architecture (Kleppe, 2003), and agile development processes.

5.2 Our Process

Our process features a full iteration through requirements, design, implementation, and testing in each cycle. This allows the user to begin acceptance testing with the initial release and to provide feedback throughout the following iteration. In addition, the requirements will be specified in greater detail with each iteration, so that each product version has a corresponding requirements version to be tested against. This is ensured by releasing the requirements document to the customer with each iteration, as the customer requested.

In addition to the requirements document, all other iterative documents will be considered evolutionary in nature, with versions specific to a release of the product. For example, the Test Plan would evolve with each iteration, with additional content for each release. These documents are fully outlined in Section 3.

Note that one of the primary advantages of this process is that it allows the customer to generate feedback from very early in development. As of the completion of the first iteration, the customer has a working system to evaluate. However, it is also crucial that the customer understand that this initial system is not intended to be fully functional, nor is it intended to match complete specifications of the final system.

For our process, we have decided to initially plan four iterations. These iterations will be logically divided and documented in the project plan.

6. Tracking & Control

The progress of the project must be monitored. Tracking and control are done by collecting metrics. The best way to determine what metrics to use is to first declare the tracking and control goals, and then determine the best set of metrics to meet those goals.

6.1 Metrics Goals

6.1.1 Ease of Collection

Due to the nature, size, and time constraints of this project, the effort put into metrics collection should be proportional to the project itself. If too much effort is put into tracking a small project, the project becomes bloated and overly bureaucratic.

6.1.2 Iterative Process Corrections

One of the advantages of an iterative process is that it can be slightly adjusted during development if there are apparent problems. Therefore, it is desirable to track the project in a way such that the origin of any defects in the system can be determined. It is also desirable to correlate effort to effectiveness – that is, if the process becomes ineffective in a certain area, the metrics should reflect that so an adjustment can be made before development is completed.

6.1.3 Defect Concentration

If the process consistently injects defects in a particular area, the metrics should reflect that so a correction can be made.

6.1.4 Overall Tracking

As with most project metrics, the goal is to facilitate tracking and control: specifically, to determine a reasonable balance of workload and responsibility, and to provide evidence of good software engineering practices in a post-mortem review.

6.2 Identified Metrics

This section identifies the metrics, along with a table depicting which of the defined goals each metric meets.

Metric Category / Ease of Collection / Iterative Process Corrections / Defect Concentration / Overall Tracking
Progress Metrics (Work Breakdown) / X / X / X
Defect Metrics / X / X / X
Effort Metrics / X / X
Size Metrics / X / X

6.2.1 Progress Metrics

Goal: Provide a quick overview of progress toward project completion.

Units: Points system ranked by team before project and adjusted if needed.

Collection: Earned value based on work breakdown, marked off as completed.

Rationale: Simple progress metrics will enable the team to instantly have a rough idea of where they are in the overall development process. Additionally, the team is setting up a detailed project plan, which is the first key to collecting these metrics.
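As an illustration of how this points-based earned-value computation might work, the following Python sketch uses hypothetical task names and point values; the actual work breakdown and points will come from the project plan.

```python
# Earned value as the share of work-breakdown points completed.
# Task names and point values below are hypothetical placeholders.
tasks = {
    "requirements v0.1": (5, True),    # (points, completed?)
    "high-level design": (8, True),
    "DXF output module": (13, False),
}

earned = sum(pts for pts, done in tasks.values() if done)
total = sum(pts for pts, _ in tasks.values())
print(f"earned value: {earned}/{total} = {earned / total:.0%}")
```

Marking tasks complete against the work breakdown each week keeps the metric cheap to collect, which satisfies the Ease of Collection goal above.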

6.2.2 Defect Metrics

Goal: Show defect density of software and potential flaws in process that may inject defects.

Units: Defects rated by severity, phase injected, phase detected.

Collection: Preferably a web application, but a spreadsheet may also be acceptable.

Rationale: Collection of these metrics should show whether a certain point in the team’s process is flawed, i.e. if many defects are injected during requirements, perhaps the team needs to re-evaluate its requirements phase.
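A sketch of how such defect records might be tallied to locate a flawed phase, using hypothetical data (Python for illustration; the actual collection tool may be a web application or spreadsheet as noted above):

```python
from collections import Counter

# Each defect record: (severity, phase_injected, phase_detected).
# The records below are hypothetical examples.
defects = [
    ("major", "requirements", "design"),
    ("minor", "requirements", "test"),
    ("major", "implementation", "test"),
]

# Tally by the phase in which each defect was injected; a phase that
# dominates this tally is a candidate for process adjustment.
by_injected = Counter(phase for _, phase, _ in defects)
print(by_injected.most_common())
```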

6.2.3 Effort Metrics

Goal: Track each team member’s effort according to time, phase, and activity.

Units: Activity, hours spent on activity, phase activity is part of.

Collection: Individual spreadsheets compiled weekly.

Rationale: These metrics are part of the PSP (Personal Software Process) and give a measure of how the developers are spending their time. They will provide an overview of total team effort for each given phase and combine with the defect metrics to give an idea of any irregularities or defects in the process.
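A sketch of how the weekly individual spreadsheets might be compiled into per-phase totals, with hypothetical rows (Python for illustration):

```python
from collections import defaultdict

# One row per logged activity: (member, phase, activity, hours).
# These rows are hypothetical examples of a weekly compilation.
rows = [
    ("Kristy", "planning", "schedule update", 2.0),
    ("Tim", "requirements", "SRS draft", 4.5),
    ("Tim", "requirements", "review", 1.5),
]

# Total hours by phase; comparing these totals against defect
# injection phases can reveal under-resourced activities.
hours_by_phase = defaultdict(float)
for _, phase, _, hours in rows:
    hours_by_phase[phase] += hours
print(dict(hours_by_phase))
```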

6.2.4 Size Metrics

Goal: Track the size of the project from week to week and between deliverables.

Units: Lines of code.

Collection: Use of an automated tool.

Rationale: These metrics can be easily automated and provide an approximation of the size of the project at any given time. While other size estimations may be considered superior, this metric is by far much easier to collect.
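The kind of automated count intended can be sketched as follows, in Python for illustration; the file extensions shown assume a C/C++ source tree, and the definition of a "line of code" here (any non-blank line) is a simplifying assumption.

```python
import os

# Count non-blank source lines under a directory: a rough LOC metric.
# Extensions assume a C/C++ codebase; adjust as the project dictates.
def count_loc(root, exts=(".c", ".cpp", ".h")):
    total = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(exts):
                with open(os.path.join(dirpath, name)) as f:
                    total += sum(1 for line in f if line.strip())
    return total
```

Running such a script weekly and recording the totals gives the week-to-week size trend with essentially no manual effort.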

7. Project Resourcing

The RIT team comprises the necessary staff of software engineers; however, this project requires special skills and experience in certain areas if it is to be developed with the necessary level of quality.

Acquiring these skills will require a form of training, or self-teaching, by the team members. The following details the skills needed by iteration, the person responsible, and the target dates by which the training should be completed.