Team Silver Bullet
Iteration 2 – Metrics Report
The following is a summary of the metrics captured by the Silver Bullet development team during the second iteration of the project. The metrics are intended to capture both the effort expended and the effectiveness of the development work.
Effort Metrics
Phase Effort Breakdown
Compared to the previous iteration, administrative tasks have clearly taken a back seat to implementation. In the first iteration, much of the implementation time went to .NET configuration and the learning curve of the development environment; here, most of the implementation time can be attributed to new functionality and integration. Note also the increase in testing time, as well as a modest amount of maintenance, both areas that were lacking in the previous iteration. The team opted to try to finish implementation earlier in order to do more thorough testing, and as a result uncovered a few more defects, which cost time to fix in both the testing and maintenance phases.
Research time has been reduced due to a better understanding of the domain. Requirements time was also reduced for similar reasons – the team now has a much clearer vision of what needs to be done for the project. Administrative overhead time has also been reduced since the project was launched.
Cumulative Effort
Looking at cumulative effort, past trends continued, with most team members putting in about the same amount of effort as in the first iteration. The team average now stands at 131 total hours, or roughly 14.5 hours per week. Senior project team members are expected to spend roughly 15 hours per week on their projects, and this statistic seems to bear that out.
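As a quick sanity check, the weekly average follows directly from the total; a minimal sketch, assuming a 9-week span (the weekly breakdown below runs through week 9):

    # Sanity check of the cumulative-effort average reported above.
    # The result matches the reported ~14.5 hours/week to within rounding.
    team_average_total_hours = 131
    weeks_elapsed = 9  # assumption: weeks 1 through 9, per the weekly phase breakdown
    print(team_average_total_hours / weeks_elapsed)  # ~14.56 hours/week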
Weekly Phase Breakdown
There are several distinct things to note here:
1. Implementation – note that once implementation started in the first iteration, it continued steadily from week to week. While there was a significant drop-off, around 30 hours a week were still spent implementing additional features after week 5 (the first iteration’s delivery date). The team noted at the last release that implementation would not cease once delivery was made, and this has proven to be the case.
2. Testing – the team made a much more dedicated effort to have a significant testing phase, and that shows here. Note that testing was almost as significant an effort as implementation in the last week, meaning that many features had been implemented but not necessarily tested entering week 9.
3. Maintenance – notice the gradual increase in maintenance. This trend is expected to continue as the team engages in refactoring. Expect a minor decline in implementation as maintenance increases.
4. Requirements – note the complete drop-off of requirements work in week 9. This is a good sign, indicating a clear understanding of the requirements well before the deliverable was due. The team spent most of its requirements and design effort in week 7, when implementation was at its lowest point.
5. Design – design has occupied a steady portion of the team’s time as features are added. It dropped off slightly near the end as most design decisions were made and implementation and testing were in progress.
Defect Metrics
Phase Breakdown
Defect Removal Effectiveness chart
Rows give the phase in which defects were found; columns give the phase of origin, in the same order as the rows. The rightmost column and bottom row are totals; "-" marks cells above the diagonal, where no defect can occur.

Phase Found          Req  Des  Impl  Unit  Int  Sys  Acc  Reg  Total
Requirements           0    -     -     -    -    -    -    -      0
Design                 0    0     -     -    -    -    -    -      0
Implementation         0    2     4     -    -    -    -    -      6
Unit testing           0    2     1     0    -    -    -    -      3
Integration testing    0    1     7     0    0    -    -    -      8
System testing         1    3    10     0    0    0    -    -     14
Acceptance testing     0    2     1     0    0    0    0    -      3
Regression testing     0    0     0     0    0    0    0    0      0
Total (injected)       1   10    23     0    0    0    0    0     34
Defects are injected most commonly during implementation, and to a significant degree during design. Unit, integration, system, and acceptance testing all appear to be effective at removing them. The majority of defects are injected at implementation and found at system testing.
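For illustration, the row and column totals can be recomputed from the matrix itself; the following is a minimal sketch in Python, with the cell values taken directly from the chart above:

    # Defect counts by (phase found, phase of origin), from the chart above.
    # Cells above the diagonal are structurally impossible (a defect cannot
    # be found before the phase in which it is injected) and are left as 0.
    phases = ["Requirements", "Design", "Implementation", "Unit testing",
              "Integration testing", "System testing", "Acceptance testing",
              "Regression testing"]
    found_by_origin = [
        [0, 0, 0, 0, 0, 0, 0, 0],   # found in Requirements
        [0, 0, 0, 0, 0, 0, 0, 0],   # found in Design
        [0, 2, 4, 0, 0, 0, 0, 0],   # found in Implementation
        [0, 2, 1, 0, 0, 0, 0, 0],   # found in Unit testing
        [0, 1, 7, 0, 0, 0, 0, 0],   # found in Integration testing
        [1, 3, 10, 0, 0, 0, 0, 0],  # found in System testing
        [0, 2, 1, 0, 0, 0, 0, 0],   # found in Acceptance testing
        [0, 0, 0, 0, 0, 0, 0, 0],   # found in Regression testing
    ]
    injected = [sum(col) for col in zip(*found_by_origin)]  # column totals
    found = [sum(row) for row in found_by_origin]           # row totals
    for name, inj, fnd in zip(phases, injected, found):
        print(f"{name:20s} injected={inj:2d} found={fnd:2d}")
    print("total defects:", sum(found))  # 34

Running this reproduces the chart's totals: implementation dominates injection (23 of 34) and system testing dominates removal (14 of 34).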
Defect Types
Most defects are attributed to coding mistakes – perhaps this is caused by poor communication between team members. The other key area is bad design assumptions, which implies that design is not being done thoroughly enough. The team would probably benefit from some sequence diagrams and an earlier draft of the design document, which would make the design clearer and more accessible to all team members.
Also note the 6 defects attributed to requirements. Originally, this number was expected to be higher due to the team’s lack of domain knowledge. The lower-than-expected count is likely attributable to the iterative development process, as well as to an accessible customer.
Size Metrics
Raw Stats
· Total lines of code: 4702.
· Total time spent: 655 hours.
· Open or assigned defects: 12.
· Total defects injected: 34.
· Productivity: 7.17 LOC/hour.
· Defect density: 2.55 defects/KLOC (computed against the 12 open or assigned defects).
· Defect injection rate: 1 defect for every 138 lines of code.
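The derived figures follow directly from the raw counts; a minimal sketch of the arithmetic (productivity comes out to about 7.18 here, so the reported 7.17 presumably reflects rounding in the underlying totals):

    # Derived size metrics, recomputed from the raw counts above.
    loc = 4702
    hours = 655
    open_defects = 12
    injected_defects = 34

    print(loc / hours)                  # productivity: ~7.18 LOC/hour
    print(open_defects / (loc / 1000))  # defect density: ~2.55 defects/KLOC
    print(loc / injected_defects)       # injection rate: ~138 LOC per defect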