COCOMO II, Chapter 2 Tables (Boehm et al.)

Table 2.1 User Function Types
Function Point / Description
External Input (EI) / Count each unique user data or user control input type that enters the external boundary of the software system being measured.
External Output (EO) / Count each unique user data or control output type that leaves the external boundary of the software system being measured.
Internal Logical File (ILF) / Count each major logical group of user data or control information in the software system as a logical internal file type. Include each logical file (e.g., each logical group of data) that is generated, used, or maintained by the software system.
External Interface Files (EIF) / Files passed or shared between software systems should be counted as external interface file types within each system.
External Inquiry (EQ) / Count each unique input-output combination, in which an input causes and generates an immediate output, as an external inquiry type.
Table 2.2 FP Counting Weights
For Internal Logical Files and External Interface Files
Data Elements
Record Elements / 1 - 19 / 20 - 50 / 51+
1 / Low / Low / Avg.
2 - 5 / Low / Avg. / High
6+ / Avg. / High / High
For External Output and External Inquiry
Data Elements
File Types / 1 - 5 / 6 - 19 / 20+
0 or 1 / Low / Low / Avg.
2 - 3 / Low / Avg. / High
4+ / Avg. / High / High
For External Input
Data Elements
File Types / 1 - 4 / 5 - 15 / 16+
0 or 1 / Low / Low / Avg.
2 - 3 / Low / Avg. / High
4+ / Avg. / High / High
Table 2.3 UFP Complexity Weights
Complexity Weight
Function Type / Low / Average / High
Internal Logical Files / 7 / 10 / 15
External Interface Files / 5 / 7 / 10
External Inputs / 3 / 4 / 6
External Outputs / 4 / 5 / 7
External Inquiries / 3 / 4 / 6
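A minimal sketch (Python) of how the Table 2.2 complexity classifications and the Table 2.3 weights combine into an Unadjusted Function Point total; the function counts in the example are invented for illustration.

UFP_WEIGHTS = {           # function type -> (Low, Average, High) weight, per Table 2.3
    "ILF": (7, 10, 15),   # Internal Logical Files
    "EIF": (5, 7, 10),    # External Interface Files
    "EI":  (3, 4, 6),     # External Inputs
    "EO":  (4, 5, 7),     # External Outputs
    "EQ":  (3, 4, 6),     # External Inquiries
}

def unadjusted_function_points(counts):
    """counts maps a function type to (n_low, n_avg, n_high) occurrences."""
    total = 0
    for ftype, (n_low, n_avg, n_high) in counts.items():
        w_low, w_avg, w_high = UFP_WEIGHTS[ftype]
        total += n_low * w_low + n_avg * w_avg + n_high * w_high
    return total

# Example: a small system with a handful of each function type.
counts = {"ILF": (2, 1, 0), "EIF": (1, 0, 0), "EI": (5, 2, 1),
          "EO": (3, 1, 0), "EQ": (2, 0, 0)}
print(unadjusted_function_points(counts))  # 24 + 5 + 29 + 17 + 6 = 81 UFP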
Table 2.4 UFP to SLOC Conversion Ratios
Language / SLOC per UFP / Language / SLOC per UFP
Access / 38 / Jovial / 107
Ada 83 / 71 / Lisp / 64
Ada 95 / 49 / Machine Code / 640
AI Shell / 49 / Modula 2 / 80
APL / 32 / Pascal / 91
Assembly - Basic / 320 / PERL / 27
Assembly - Macro / 213 / PowerBuilder / 16
Basic - ANSI / 64 / Prolog / 64
Basic - Compiled / 91 / Query – Default / 13
Basic - Visual / 32 / Report Generator / 80
C / 128 / Second Generation Language / 107
C++ / 55 / Simulation – Default / 46
Cobol (ANSI 85) / 91 / Spreadsheet / 6
Database – Default / 40 / Third Generation Language / 80
Fifth Generation Language / 4 / Unix Shell Scripts / 107
First Generation Language / 320 / USR_1 / 1
Forth / 64 / USR_2 / 1
Fortran 77 / 107 / USR_3 / 1
Fortran 95 / 71 / USR_4 / 1
Fourth Generation Language / 20 / USR_5 / 1
High Level Language / 64 / Visual Basic 5.0 / 29
HTML 3.0 / 15 / Visual C++ / 34
Java / 53
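The ratios above convert a UFP count into the SLOC sizes used by the effort equations. For example, the 81 UFP counted above correspond to roughly 81 × 53 = 4,293 SLOC in Java but 81 × 128 = 10,368 SLOC in C; the language chosen can change the size estimate, and hence the effort estimate, by a factor of two or more.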
Table 2.5 Rating Scale for Software Understanding Increment SU
Very Low / Low / Nominal / High / Very High
Structure / Very low cohesion, high coupling, spaghetti code. / Moderately low cohesion, high coupling. / Reasonably well-structured; some weak areas. / High cohesion, low coupling. / Strong modularity, information hiding in data and control structures.
Application Clarity / No match between program and application world-views. / Some correlation between program and application. / Moderate correlation between program and application. / Good correlation between program and application. / Clear match between program and application world-views.
Self-Descriptiveness / Obscure code; documentation missing, obscure, or obsolete. / Some code commentary and headers; some useful documentation. / Moderate level of code commentary, headers, documentation. / Good code commentary and headers; useful documentation; some weak areas. / Self-descriptive code; documentation up-to-date, well-organized, with design rationale.
SU Increment to ESLOC (%) / 50 / 40 / 30 / 20 / 10
Table 2.6 Rating Scale for Assessment and Assimilation Increment (AA)
AA Increment / Level of AA Effort
0 / None
2 / Basic module search and documentation
4 / Some module Test and Evaluation (T&E), documentation
6 / Considerable module T&E, documentation
8 / Extensive module T&E, documentation
Table 2.7 Rating Scale for Programmer Unfamiliarity (UNFM)
UNFM Increment / Level of Unfamiliarity
0.0 / Completely familiar
0.2 / Mostly familiar
0.4 / Somewhat familiar
0.6 / Considerably unfamiliar
0.8 / Mostly unfamiliar
1.0 / Completely unfamiliar
Table 2.8 Adapted Software Parameter Constraints and Guidelines
Code Category / DM / CM / IM / AA / SU / UNFM
New - all original software / not applicable
Adapted - changes to pre-existing software / 0%-100%, normally > 0% / 0+%-100%, usually > DM and must be > 0% / 0%-100+%, usually moderate, can be > 100% / 0%-8% / 0%-50% / 0-1
Reused - unchanged existing software / 0% / 0% / 0%-100%, rarely 0% but could be very small / 0%-8% / not applicable / not applicable
COTS - off-the-shelf software (often requires new glue code as a wrapper around the COTS) / 0% / 0% / 0%-100% / 0%-8% / not applicable / not applicable
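Tables 2.5 through 2.8 come together in the COCOMO II equivalent-size computation for adapted code. A minimal sketch (Python) of that formula follows; the parameter values in the example are invented.

def equivalent_sloc(adapted_sloc, dm, cm, im, aa, su, unfm):
    """Equivalent new SLOC for adapted code.

    dm, cm, im, aa, su are percentages (0-100); unfm ranges 0.0-1.0.
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im        # Adaptation Adjustment Factor
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    else:
        aam = (aa + aaf + su * unfm) / 100
    return adapted_sloc * aam

# Example: 10,000 adapted SLOC with 10% design modified, 20% code
# modified, 30% re-integration effort, AA = 4, SU = 30, UNFM = 0.4.
print(equivalent_sloc(10_000, dm=10, cm=20, im=30, aa=4, su=30, unfm=0.4))  # 2756.0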
Table 2.9 Variation in Percentage of Automated Re-engineering
Re-engineering Target / AT (% automated translation)
Batch processing / 96%
Batch with SORT / 90%
Batch with DBMS / 88%
Batch, SORT, DBMS / 82%
Interactive / 50%
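A sketch of how AT enters the estimate: effort for the automatically translated portion is computed outside the main effort equation, divided by an automated-translation productivity constant (ATPROD; 2400 SLOC per person-month is the value given in the COCOMO II model definition). The example values are invented.

def auto_translation_pm(adapted_sloc, at_percent, atprod=2400):
    """Person-months for the automatically translated portion of the code."""
    return adapted_sloc * (at_percent / 100) / atprod

# Example: re-engineering 50,000 SLOC of batch code with DBMS (AT = 88%).
print(auto_translation_pm(50_000, 88))  # 50,000 * 0.88 / 2400 = ~18.3 PM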
Table 2.10 Scale Drivers for COCOMO II Models
Scale Drivers / Very Low / Low / Nominal / High / Very High / Extra High
PREC / thoroughly unprecedented / largely unprecedented / somewhat unprecedented / generally familiar / largely familiar / thoroughly familiar
SFj: / 6.20 / 4.96 / 3.72 / 2.48 / 1.24 / 0.00
FLEX / rigorous / occasional relaxation / some relaxation / general conformity / some conformity / general goals
SFj: / 5.07 / 4.05 / 3.04 / 2.03 / 1.01 / 0.00
RESL / little (20%) / some (40%) / often (60%) / generally (75%) / mostly (90%) / full (100%)
SFj: / 7.07 / 5.65 / 4.24 / 2.83 / 1.41 / 0.00
TEAM / very difficult interactions / some difficult interactions / basically cooperative interactions / largely cooperative / highly cooperative / seamless interactions
SFj: / 5.48 / 4.38 / 3.29 / 2.19 / 1.10 / 0.00
PMAT / SW-CMM Level 1 Lower / SW-CMM Level 1 Upper / SW-CMM Level 2 / SW-CMM Level 3 / SW-CMM Level 4 / SW-CMM Level 5
SFj: / 7.80 / 6.24 / 4.68 / 3.12 / 1.56 / 0.00
PMAT may instead be determined from the Estimated Process Maturity Level (EPML); see Table 2.15.
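The SFj values above enter the effort estimate through the scale exponent. A minimal sketch (Python) follows; A and B are the COCOMO II.2000 calibration constants, and the all-Nominal ratings in the example are illustrative.

A, B = 2.94, 0.91   # COCOMO II.2000 calibrated constants

def effort_pm(ksloc, scale_factors, effort_multipliers=()):
    """PM = A * Size^E * prod(EM), with E = B + 0.01 * sum(SF_j)."""
    e = B + 0.01 * sum(scale_factors)
    pm = A * ksloc ** e
    for em in effort_multipliers:
        pm *= em
    return pm

# Example: 100 KSLOC with all five scale drivers rated Nominal
# (PREC 3.72, FLEX 3.04, RESL 4.24, TEAM 3.29, PMAT 4.68).
print(effort_pm(100, [3.72, 3.04, 4.24, 3.29, 4.68]))  # E = 1.0997 -> ~465 PM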
Table 2.11 Precedentedness Rating Levels
Feature / Very Low / Nominal to High / Extra High
Organizational understanding of product objectives / General / Considerable / Thorough
Experience in working with related software systems / Moderate / Considerable / Extensive
Concurrent development of associated new hardware and operational procedures / Extensive / Moderate / Some
Need for innovative data processing architectures, algorithms / Considerable / Some / Minimal
Table 2.12 Development Flexibility Rating Levels
Feature / Very Low / Nominal to High / Extra High
Need for software conformance with pre-established requirements / Full / Considerable / Basic
Need for software conformance with external interface specifications / Full / Considerable / Basic
Combination of inflexibilities above with premium on early completion / High / Medium / Low
Table 2.13 RESL Rating Levels
Characteristic / Very Low / Low / Nominal / High / Very High / Extra High
Risk Management Plan identifies all critical risk items, establishes milestones for resolving them by PDR or LCA. / None / Little / Some / Generally / Mostly / Fully
Schedule, budget, and internal milestones through PDR or LCA compatible with Risk Management Plan. / None / Little / Some / Generally / Mostly / Fully
Percent of development schedule devoted to establishing architecture, given general product objectives. / 5 / 10 / 17 / 25 / 33 / 40
Percent of required top software architects available to project. / 20 / 40 / 60 / 80 / 100 / 120
Tool support available for resolving risk items, developing and verifying architectural specs. / None / Little / Some / Good / Strong / Full
Level of uncertainty in key architecture drivers: mission, user interface, COTS, hardware, technology, performance. / Extreme / Significant / Considerable / Some / Little / Very Little
Number and criticality of risk items. / > 10 Critical / 5-10 Critical / 2-4 Critical / 1 Critical / > 5 Non-Critical / < 5 Non-Critical
Table 2.14 TEAM Rating Components
Characteristic / Very Low / Low / Nominal / High / Very High / Extra High
Consistency of stakeholder objectives and cultures / Little / Some / Basic / Considerable / Strong / Full
Ability, willingness of stakeholders to accommodate other stakeholders’ objectives / Little / Some / Basic / Considerable / Strong / Full
Experience of stakeholders in operating as a team / None / Little / Little / Basic / Considerable / Extensive
Stakeholder teambuilding to achieve shared vision and commitments / None / Little / Little / Basic / Considerable / Extensive

Table 2.15 PMAT Ratings for Estimated Process Maturity Level (EPML)

PMAT Rating / Maturity Level / EPML
Very Low / CMM Level 1 (lower half) / 0
Low / CMM Level 1 (upper half) / 1
Nominal / CMM Level 2 / 2
High / CMM Level 3 / 3
Very High / CMM Level 4 / 4
Extra High / CMM Level 5 / 5
Table 2.16 KPA Rating Levels
Key Process Areas (KPA) / Almost Always (1) / Frequently (2) / About Half (3) / Occasionally (4) / Rarely if Ever (5) / Does Not Apply (6) / Don’t Know (7)
Requirements Management
  • System requirements allocated to software are controlled to establish a baseline for software engineering and management use.
  • Software plans, products, and activities are kept consistent with the system requirements allocated to software.
Software Project Planning
  • Software estimates are documented for use in planning and tracking the software project.
  • Software project activities and commitments are planned and documented.
  • Affected groups and individuals agree to their commitments related to the software project.
Software Project Tracking and Oversight
  • Actual results and performances are tracked against the software plans.
  • Corrective actions are taken and managed to closure when actual results and performance deviate significantly from the software plans.
  • Changes to software commitments are agreed to by the affected groups and individuals.
Software Subcontract Management
  • The prime contractor selects qualified software subcontractors.
  • The prime contractor and the subcontractor agree to their commitments to each other.
  • The prime contractor and the subcontractor maintain ongoing communications.
  • The prime contractor tracks the subcontractor’s actual results and performance against its commitments.
Software Quality Assurance (SQA)
  • SQA activities are planned.
  • Adherence of software products and activities to the applicable standards, procedures, and requirements is verified objectively.
  • Affected groups and individuals are informed of software quality assurance activities and results.
  • Noncompliance issues that cannot be resolved within the software project are addressed by senior management.
Software Configuration Management (SCM)
  • SCM activities are planned.
  • Selected work products are identified, controlled, and available.
  • Changes to identified work products are controlled.
  • Affected groups and individuals are informed of the status and content of software baselines.
Organization Process Focus
  • Software process development and improvement activities are coordinated across the organization.
  • The strengths and weaknesses of the software processes used are identified relative to a process standard.
  • Organization-level process development and improvement activities are planned.
Organization Process Definition
  • A standard software process for the organization is developed and maintained.
  • Information related to the use of the organization’s standard software process by the software projects is collected, reviewed, and made available.
Training Program
  • Training activities are planned.
  • Training for developing the skills and knowledge needed to perform software management and technical roles is provided.
  • Individuals in the software engineering group and software-related groups receive the training necessary to perform their roles.
Integrated Software Management
  • The project’s defined software process is a tailored version of the organization’s standard software process.
  • The project is planned and managed according to the project’s defined software process.
Software Product Engineering
  • The software engineering tasks are defined, integrated, and consistently performed to produce the software.
  • Software work products are kept consistent with each other.
Intergroup Coordination
  • The customer’s requirements are agreed to by all affected groups.
  • The commitments between the engineering groups are agreed to by the affected groups.
  • The engineering groups identify, track, and resolve intergroup issues.
Peer Reviews
  • Peer review activities are planned.
  • Defects in the software work products are identified and removed.
Quantitative Process Management
  • The quantitative process management activities are planned.
  • The process performance of the project’s defined software process is controlled quantitatively.
  • The process capability of the organization’s standard software process is known in quantitative terms.
Software Quality Management
  • The project’s software quality management activities are planned.
  • Measurable goals of software product quality and their priorities are defined.
  • Actual progress toward achieving the quality goals for the software products is quantified and managed.
Defect Prevention
  • Defect prevention activities are planned.
  • Common causes of defects are sought out and identified.
  • Common causes of defects are prioritized and systematically eliminated.
Technology Change Management
  • Incorporation of technology changes is planned.
  • New technologies are evaluated to determine their effect on quality and productivity.
  • Appropriate new technologies are transferred into normal practice across the organization.
Process Change Management
  • Continuous process improvement is planned.
  • Participation in the organization’s software process improvement activities is organization wide.
  • The organization’s standard software process and the project’s defined software processes are improved continuously.
1. Check Almost Always when the goals are consistently achieved and are well established in standard operating procedures (over 90% of the time).
2. Check Frequently when the goals are achieved relatively often, but sometimes are omitted under difficult circumstances (about 60 to 90% of the time).
3. Check About Half when the goals are achieved about half of the time (about 40 to 60% of the time).
4. Check Occasionally when the goals are sometimes achieved, but less often (about 10 to 40% of the time).
5. Check Rarely If Ever when the goals are rarely if ever achieved (less than 10% of the time).
6. Check Does Not Apply when you have the required knowledge about your project or organization and the KPA, but you feel the KPA does not apply to your circumstances.
7. Check Don’t Know when you are uncertain about how to respond for the KPA.
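A minimal sketch of how the Table 2.16 responses roll up into the EPML of Table 2.15. The percentage assigned to each response level (100, 75, 50, 25, 1) follows the COCOMO II model definition; KPAs marked Does Not Apply or Don't Know are excluded, and the example ratings are invented.

KPA_PERCENT = {"almost always": 100, "frequently": 75, "about half": 50,
               "occasionally": 25, "rarely if ever": 1}

def epml(responses):
    """responses: one rating string per KPA; returns EPML on a 0-5 scale."""
    rated = [KPA_PERCENT[r] for r in responses if r in KPA_PERCENT]
    if not rated:
        raise ValueError("no rated KPAs")
    return 5 * sum(p / 100 for p in rated) / len(rated)

# Example: an organization answering mostly Frequently, sometimes About Half.
print(epml(["frequently"] * 10 + ["about half"] * 8))  # ~3.19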
Table 2.17 RELY Cost Driver
RELY Descriptors: / slight inconvenience / low, easily recoverable losses / moderate, easily recoverable losses / high financial loss / risk to human life
Rating Levels / Very Low / Low / Nominal / High / Very High / Extra High
Effort Multipliers / 0.82 / 0.92 / 1.00 / 1.10 / 1.26 / n/a
Table 2.18 DATA Cost Driver
DATA* Descriptors / DB bytes/Pgm SLOC < 10 / 10 ≤ D/P < 100 / 100 ≤ D/P < 1000 / D/P ≥ 1000
Rating Levels / Very Low / Low / Nominal / High / Very High / Extra High
Effort Multipliers / n/a / 0.90 / 1.00 / 1.14 / 1.28 / n/a
* DATA is rated Low if D/P is less than 10 and Very High if it is greater than 1000. P is measured in equivalent source lines of code (SLOC), which may involve function point or reuse conversions.
Table 2.19 Component Complexity Ratings Levels
Control Operations / Computational Operations / Device-dependent Operations / Data Management Operations / User Interface Management Operations
Very Low / Straight-line code with a few non-nested structured programming operators: DOs, CASEs, IF-THEN-ELSEs. Simple module composition via procedure calls or simple scripts. / Evaluation of simple expressions: e.g., A=B+C*(D-E) / Simple read, write statements with simple formats. / Simple arrays in main memory. Simple COTS-DB queries, updates. / Simple input forms, report generators.
Low / Straightforward nesting of structured programming operators. Mostly simple predicates / Evaluation of moderate-level expressions: e.g., D=SQRT(B**2-4.*A*C) / No cognizance needed of particular processor or I/O device characteristics. I/O done at GET/PUT level. / Single file subsetting with no data structure changes, no edits, no intermediate files. Moderately complex COTS-DB queries, updates. / Use of simple graphic user interface (GUI) builders.
Nominal / Mostly simple nesting. Some intermodule control. Decision tables. Simple callbacks or message passing, including middleware-supported distributed processing / Use of standard math and statistical routines. Basic matrix/vector operations. / I/O processing includes device selection, status checking and error processing. / Multi-file input and single file output. Simple structural changes, simple edits. Complex COTS-DB queries, updates. / Simple use of widget set.
High / Highly nested structured programming operators with many compound predicates. Queue and stack control. Homogeneous, distributed processing. Single processor soft real-time control. / Basic numerical analysis: multivariate interpolation, ordinary differential equations. Basic truncation, round-off concerns. / Operations at physical I/O level (physical storage address translations; seeks, reads, etc.). Optimized I/O overlap. / Simple triggers activated by data stream contents. Complex data restructuring. / Widget set development and extension. Simple voice I/O, multimedia.
Very High / Reentrant and recursive coding. Fixed-priority interrupt handling. Task synchronization, complex callbacks, heterogeneous distributed processing. Single-processor hard real-time control. / Difficult but structured numerical analysis: near-singular matrix equations, partial differential equations. Simple parallelization. / Routines for interrupt diagnosis, servicing, masking. Communication line handling. Performance-intensive embedded systems. / Distributed database coordination. Complex triggers. Search optimization. / Moderately complex 2D/3D, dynamic graphics, multimedia.
Extra High / Multiple resource scheduling with dynamically changing priorities. Microcode-level control. Distributed hard real-time control. / Difficult and unstructured numerical analysis: highly accurate analysis of noisy, stochastic data. Complex parallelization. / Device timing-dependent coding, micro-programmed operations. Performance-critical embedded systems. / Highly coupled, dynamic relational and object structures. Natural language data management. / Complex multimedia, virtual reality, natural language interface.
Table 2.20 CPLX Cost Driver
Rating Levels / Very Low / Low / Nominal / High / Very High / Extra High
Effort Multipliers / 0.73 / 0.87 / 1.00 / 1.17 / 1.34 / 1.74
Table 2.21 RUSE Cost Driver
RUSE Descriptors: / none / across project / across program / across product line / across multiple product lines
Rating Levels / Very Low / Low / Nominal / High / Very High / Extra High
Effort Multipliers / n/a / 0.95 / 1.00 / 1.07 / 1.15 / 1.24
Table 2.22 DOCU Cost Driver
DOCU Descriptors: / Many life cycle needs uncovered / Some life cycle needs uncovered. / Right-sized to life cycle needs / Excessive for life cycle needs / Very excessive for life cycle needs
Rating Levels / Very Low / Low / Nominal / High / Very High / Extra High
Effort Multipliers / 0.81 / 0.91 / 1.00 / 1.11 / 1.23 / n/a
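Each cost driver contributes one multiplier at its rated level, and the multipliers apply to estimated effort as a product. A minimal sketch (Python) using the values of Tables 2.17, 2.18, and 2.20 through 2.22; the ratings chosen in the example are illustrative.

import math

EM = {  # rating level -> effort multiplier, per Tables 2.17-2.22
    "RELY": {"VL": 0.82, "L": 0.92, "N": 1.00, "H": 1.10, "VH": 1.26},
    "DATA": {"L": 0.90, "N": 1.00, "H": 1.14, "VH": 1.28},
    "CPLX": {"VL": 0.73, "L": 0.87, "N": 1.00, "H": 1.17, "VH": 1.34, "XH": 1.74},
    "RUSE": {"L": 0.95, "N": 1.00, "H": 1.07, "VH": 1.15, "XH": 1.24},
    "DOCU": {"VL": 0.81, "L": 0.91, "N": 1.00, "H": 1.11, "VH": 1.23},
}

def em_product(ratings):
    """Product of the effort multipliers for the given driver ratings."""
    return math.prod(EM[driver][level] for driver, level in ratings.items())

# Example: high reliability and complexity, everything else Nominal.
print(em_product({"RELY": "H", "CPLX": "H", "DATA": "N",
                  "RUSE": "N", "DOCU": "N"}))  # 1.10 * 1.17 = 1.287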
Table 2.23 TIME Cost Driver
TIME Descriptors: / ≤ 50% use of available execution time / 70% use of available execution time / 85% use of available execution time / 95% use of available execution time
Rating Levels / Very Low / Low / Nominal / High / Very High / Extra High