NOTE ON AWARD FEE PROVISIONS

(as of 27 October 2009)

This Handout shows how to USE THE 16 BEST PRACTICES as Award Fee Criteria!

TEACHING NOTE: EXAMPLES OF AWARD FEE CRITERIA FOR SOFTWARE-INTENSIVE SYSTEMS

Notes: (1) The award fee plan typically will consist of an Award Fee Provision in the base contract. Award fee should be evaluated on a periodic basis, with contract modifications to reflect award fees earned and to establish incentives for the next evaluation period. The evaluation periods should be tied to the planned life-cycle reviews. This document provides sample award fee criteria that could be used for software-intensive systems in development phases encompassing the System Requirements Review (SRR), Software Specification Review (SSR), Preliminary Design Review (PDR), and Critical Design Review (CDR). The evaluation periods and reviews, along with the associated specific award fee criteria, should be based on the contractor's actual proposed methodology, review schedule, and planned development activities.

(2) A set of suggested software development "Best Practices" was used as the basis for these notional award fee criteria. However, if you use them all, or do not tailor them to the specific needs of your program, you will likely dilute what is most important to the program and fail to really influence contractor efforts. Ideally, three to five key areas should be selected and used as the basis for program-specific award fee criteria. In addition, you should ensure the software quality features you believe are important to the program are emphasized appropriately by aligning award fee criteria, where appropriate, to the specifics of the program's Software Quality Performance Statement and other quality criteria.

(3) Since contracting procedures and requirements vary by command, these samples should be considered only one of many possible ways that award fee criteria could be structured as part of a CPAF effort. The final decision on the award fee criteria, criteria levels, and weighting methodology used on any particular contract must be approved by the Contracting Office and the Fee Determining Official (FDO) for the specific contract under consideration, and must be in accordance with relevant FAR and DFARS provisions. No direct endorsement of these criteria by DAU is implied.

SOME TYPICAL AWARD FEE CONTRACTUAL PROVISIONS

1. This is a Cost-Plus-Award Fee (CPAF) type contract. The Award Fee amount, the Award Fee provisions contained herein and the administration of these provisions by the Government are not subject to the Disputes Clause of this contract.

2. The total fee structure for this contract shall consist of a fixed base fee and award fee pool.

3. The base fee and award fee pool elements of the total fee will be adjusted upward or downward only for those changes to the Statement of Work expressly authorized by the Contracting Officer pursuant to the authority of the Changes Clause of this contract.

4. A government Award Fee Review Board (AFRB) shall be appointed to evaluate performance and determine the amount of Award Fee earned. Such evaluation board shall consist of Program Management Representatives, Technical Engineers and the Contracting Officer. The evaluation performed by the Board and the resulting Award Fee amount shall be reviewed and approved by the Fee Determining Official (FDO).

5. The Evaluation Board shall rate performance in accordance with the criteria and weights set forth in Table H.6 of this clause. Said ratings shall be scored in accordance with definitions included in Table H.6 of this clause. Prior to the beginning of any evaluation period, the Government reserves the right to change the award fee evaluation criteria, period duration, distribution of remaining award fee dollars, and other matters covered in this plan. Every reasonable attempt will be made to coordinate changes to future periods with the contractor prior to the changes taking place. Changes to the plan for the current period will be subject to mutual agreement between the Government and the contractor.

6. At such time as the Chairperson of the Review Board shall determine, the Government's current evaluation of performance against the criteria may be discussed with the contractor.

7. The contractor has the option of submitting a self-assessment paper, not to exceed 20 pages in length, to the AFRB Secretary, not later than 5 working days after the final working day of a period. The contractor also has the option of presenting a self-assessment briefing to the AFRB not later than 5 working days after the final working day of the period.

8. The contractor shall submit proposed criteria, within each area of emphasis, as well as other plan changes, for the next period of performance, as requested by the AFRB.

9. Where it is determined that an award fee is applicable, any such amount shall be incorporated by contract modification.

EXAMPLES OF AWARD FEE CRITERIA STATEMENTS

1. FORMAL RISK MANAGEMENT

RATING / CRITERIA
Acceptable / (50-84)

a. The contractor has established and implemented a project Risk Management Plan. The plan and infrastructure (tools, organizational assignments, and management procedures) have been agreed to by the Government Program Office (GPO) and the contractor has placed the plan under configuration management (CM).

b. Contractor senior management has established reporting mechanisms and employee incentives through which all members of the project staff are encouraged to identify risks and potential problems, and are rewarded when risks and potential problems are identified early.

c. The contractor has designated a senior member of the technical staff as risk officer, who reports directly to the program manager and is chartered with independent identification and management of risks across the program.

d. Risk identification is accomplished in facilitated meetings attended by project personnel most familiar with the area for which risks are being identified. Risk identification includes risks throughout the life cycle in at least these areas: cost, schedule, technical, staffing, external dependencies, supportability, maintainability, and programmatic. Risk identification is updated at least monthly. Risks are characterized in terms of their likelihood of occurrence and impact.

e. The contractor risk officer updates the risk data and database on the schedule defined in the Risk Management Plan. All risks intended for mitigation, and any others on the critical path, are summarized along with their status against the mitigation strategy. Newly identified risks go through the same processes as the originally identified risks.
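As an illustration of the likelihood-and-impact characterization in item d, the sketch below ranks risks by exposure (likelihood times impact) so mitigation attention goes to the highest-exposure items first. All risk names, scales, and values here are invented for illustration, not drawn from any program.

```python
# Sketch: rank hypothetical risks by exposure = likelihood x impact.
# All names, scales (impact on a 1-5 scale), and values are invented.
risks = [
    {"name": "vendor slip",        "likelihood": 0.7, "impact": 5},
    {"name": "staff turnover",     "likelihood": 0.3, "impact": 4},
    {"name": "requirements churn", "likelihood": 0.5, "impact": 3},
]

# Sort highest exposure first; these items get mitigation tasks first.
ranked = sorted(risks, key=lambda r: r["likelihood"] * r["impact"], reverse=True)
for r in ranked:
    print(f'{r["name"]}: exposure={r["likelihood"] * r["impact"]:.1f}')
```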

RATING / CRITERIA
Excellent / (85-100)

All acceptable criteria, plus:

1.  Risk identification is accomplished in facilitated meetings attended by the project personnel most familiar with the area for which risks are being identified, with participation by personnel familiar with problems encountered on similar past projects.

2.  Risk mitigation activities are included in the project's task activity network.

3.  Each medium-impact and high-impact risk is described by a complete Risk Control Profile.

4.  Periodically updated estimates of the cost and schedule at completion include probable costs and schedule impact due to risk items that have not yet been resolved.

5.  The contractor provided for the direct distribution of the Formal Risk Management award fee to all project employees in furtherance of establishing and maintaining a risk culture.


2. EMPIRICAL COST AND SCHEDULE ESTIMATION

RATING / CRITERIA
Acceptable / (50-84)

a. The contractor estimated the cost, effort, and schedule for the project for planning purposes and as a yardstick for measuring performance (tracking). Software size and cost were estimated prior to beginning work on any incremental release.

b. Software cost estimation was a reconciliation between a top-down estimate (based on an empirical model, e.g., a parametric cost model) and a bottom-up engineering estimate.

c. All of the software costs were identified with the appropriate lower-level software tasks in the project activity network.
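The top-down/bottom-up reconciliation in item b can be sketched as below. The power-law coefficients and the per-task person-month figures are hypothetical, not taken from any specific cost model; the point is the comparison, which flags a large divergence for reconciliation before either estimate is used.

```python
# Sketch: reconcile a top-down parametric estimate with a bottom-up
# engineering estimate. Coefficients (a, b) and task figures are invented.

def parametric_effort(ksloc, a=2.8, b=1.1):
    """Top-down effort (person-months) from estimated size via a power law."""
    return a * ksloc ** b

def bottom_up_effort(task_estimates):
    """Bottom-up effort: sum of per-task engineering estimates."""
    return sum(task_estimates.values())

top_down = parametric_effort(50.0)  # assume an estimated size of 50 KSLOC
bottom_up = bottom_up_effort({
    "requirements": 30, "design": 45, "code": 90, "test": 60,  # person-months
})
divergence = abs(top_down - bottom_up) / bottom_up
print(f"top-down={top_down:.0f} PM, bottom-up={bottom_up} PM, "
      f"divergence={divergence:.0%}")
# A large divergence means the two estimates must be reconciled before
# either is used for planning or tracking.
```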

RATING / CRITERIA
Excellent / (85-100)

All acceptable criteria, plus: Software cost estimation was subjected to a "sanity check" by comparing it with industry norms and specifically with the contractor's past performance in areas such as productivity and percentage of total cost in various functions and project phases.

3. METRICS-BASED PROJECT MANAGEMENT

RATING / CRITERIA
Acceptable / (50-84)

a. Every project has a project plan with a detailed activity network that defines the process the team follows, organizes and coordinates the work, and estimates and allocates cost and schedule among tasks. The project plan included adequate measurement in each of these three categories: (1) Early indications of problems; (2) Quality of the products; and (3) Process conformance.

b. The metrics used are sufficiently broad-based. Data was collected for each process/phase to provide insight into the above three categories.

c. Metrics were used effectively by establishing thresholds for each of these metrics. These thresholds were estimated initially using suggested industry norms for various project classes. Local thresholds have evolved, based upon experience in the life cycle. Violation of a threshold value causes further analysis and decision making. [Examples of data, initial thresholds, and analysis of size, defect, schedule, and effort metrics can be found at http://www.qsm.com.]

d. Continuous data on schedule, risks, libraries, effort expenditures, and other measures of progress are available to all project personnel along with the latest revision of project plans.
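The threshold mechanism described in item c reduces to checking actual metric values against their thresholds, with a violation flagging the metric for further analysis and decision making. The sketch below illustrates this; the metric names, threshold values, and actuals are all invented, not industry norms.

```python
# Sketch: flag metrics whose actual values violate their thresholds.
# Metric names, thresholds, and actuals are invented for illustration.
thresholds = {
    "defect_density": 1.00,   # max defects per KSLOC
    "schedule_slip":  0.10,   # max fraction behind plan
    "staff_turnover": 0.15,   # max annual fraction
}
actuals = {"defect_density": 0.80, "schedule_slip": 0.18, "staff_turnover": 0.12}

# A violated threshold triggers further analysis, per the criteria above.
flagged = [m for m in thresholds if actuals[m] > thresholds[m]]
print("metrics needing further analysis:", flagged)
```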

RATING / CRITERIA
Excellent / (85-100)

All acceptable criteria, plus: The project plan includes adequate measurement in each of these two additional categories: Effectiveness of the processes and Provision of a basis for future estimation of cost, quality, and schedule.

4. EARNED VALUE TRACKING

RATING / CRITERIA
Acceptable / (50-84)

a. The contractor uses the designated cost and schedule allocations as measures of progress toward producing the products.

b. The contractor has developed and maintained a hierarchical task activity network based on allocated requirements that includes the tasks for all effort that will be charged to the program. All level of effort (LOE) tasks have measurable milestones. All tasks that are not LOE explicitly identify the products produced by the task and have explicit and measurable exit criteria based on these products.

c. No task has a budget or planned calendar duration that is greater than the cost and schedule uncertainty that is acceptable for the program.

d. No task duration is longer than two calendar weeks of effort.

e. Each task that consumes resources has a cost budget allocated to it along with the corresponding staff and other resources that consume this budget. Staff resources are identified in person hours or in days for each labor category working on the task.

f. Milestones for all external dependencies are included in the activity network.

g. Earned value metrics have been collected for each schedule level and made available to all members of the contractor and government project teams monthly. These metrics include a comparison of Budgeted Cost of Work Scheduled (BCWS), Budgeted Cost of Work Performed (BCWP), and Actual Cost of Work Performed (ACWP); a Cost Performance Index (CPI); a Schedule Performance Index (SPI); and a To Complete Cost Performance Index (TCPI).

h. The lowest-level schedules have been statused weekly.

i. The high-level schedules have been statused at least monthly.
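The earned value comparisons in item g reduce to a few standard ratios. The sketch below computes them from hypothetical monthly figures; the dollar amounts are invented and serve only to show the arithmetic.

```python
# Sketch: standard earned value ratios from hypothetical figures (in $K).
bcws = 1000.0  # Budgeted Cost of Work Scheduled
bcwp = 900.0   # Budgeted Cost of Work Performed (earned value)
acwp = 1100.0  # Actual Cost of Work Performed
bac  = 5000.0  # Budget At Completion

cpi = bcwp / acwp                   # Cost Performance Index (<1: over cost)
spi = bcwp / bcws                   # Schedule Performance Index (<1: behind)
tcpi = (bac - bcwp) / (bac - acwp)  # efficiency needed on remaining work
                                    # to finish within the budget

print(f"CPI={cpi:.2f}  SPI={spi:.2f}  TCPI={tcpi:.2f}")
```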

RATING / CRITERIA
Excellent / (85-100)

All acceptable criteria, plus:

1.  For each identified significant risk item, a specific risk mitigation/resolution task has been defined and inserted into the activity network.

2.  The cost reporting system for the total project has segregated the software effort into software tasks so that the software effort can be tracked separately from the non-software tasks.

3.  Earned value reports have been based on data that is no more than two weeks old.

5. DEFECT TRACKING AGAINST QUALITY TARGETS

RATING / CRITERIA
Acceptable / (50-84)

a. The contractor has established quality targets for subsystem software depending on its requirements for high integrity. A mission-critical/safety-critical system may have different quality targets for each subsystem component.

b. Quality Assurance monitored quality targets and reported defects as per the Quality Plan.

c. Quality targets addressed the number of defects by priority and by their fix rate.

d. Actual quality or defects detected and removed are tracked against the quality targets.

e. Periodic estimates of the cost and schedule at completion have been based on the actual versus targeted quality.
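The fix-rate tracking in items c and d can be sketched as a comparison of actual defect fix rates against targets by priority. The defect counts and target rates below are invented for illustration.

```python
# Sketch: track actual defect fix rates against quality targets by priority.
# Defect counts and target rates are invented.
detected = {"priority_1": 12, "priority_2": 40}
fixed    = {"priority_1": 12, "priority_2": 28}
target_fix_rate = {"priority_1": 1.00, "priority_2": 0.80}  # required fraction

rates = {p: fixed[p] / detected[p] for p in detected}
for p, rate in rates.items():
    status = "meets" if rate >= target_fix_rate[p] else "below"
    print(f"{p}: fix rate {rate:.0%} ({status} target {target_fix_rate[p]:.0%})")
```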

RATING / CRITERIA
Excellent / (85-100)

All acceptable criteria, plus: Quality targets are under change control and established at the design, coding, integration, test, and operational levels.

6. PEOPLE-AWARE PROGRAM MANAGEMENT

RATING / CRITERIA
Acceptable / (50-84)

a. Contractor senior management ensured that all projects maintained a high degree of personnel satisfaction and team cohesion, and identified and implemented practices designed to achieve high levels of staff retention.

b. Contractor senior management provided the project with adequate staff, supported by facilities and tools to develop the software system efficiently.

c. Contractor personnel were trained according to a project training plan in all the processes, development and management tools, and methods specified in the software development plan.

d. The contractor determined the existing skills of all personnel and provided training according to the needs of each role, consistent with the methods specified in the Software Development Plan (SDP).

RATING / CRITERIA
Excellent / (85-100)

All acceptable criteria, plus:

1.  The contractor employed focus groups and surveys to assess employee perceptions and suggestions for change to support the implementation of practices designed to maintain a high degree of personnel satisfaction and team cohesion.

2.  Employee focus groups and surveys were used to assess the adequacy of staff, facilities, and tools.

7. CONFIGURATION MANAGEMENT

RATING / CRITERIA
Acceptable / (50-84)

a. The contractor developed CM plans, approved by the Government Program Office (GPO), to facilitate management control of project information. These plans describe and document how the contractor implements a single CM process. The CM process included contractor-controlled development baselines as well as GPO-controlled baselines. It also included release procedures for all classes of products under control, means of identification, change control procedures, status accounting of products, and reviews and audits of information under CM control. The CM plan was consistent with other plans and procedures used by the project.