COCOMO III Forum Handout
Required Software Security (SECU)*
Reflects the level of required security based on the Common Criteria for Information Technology Security Evaluation. The Common Criteria provides assurance that the process of specification, implementation and evaluation of a computer security product has been conducted in a rigorous, standard and repeatable manner at a level that is commensurate with the target environment.
The Common Criteria has seven Evaluation Assurance Levels (EAL1 through EAL7)[1]. The computer system must meet specific assurance requirements that involve design documentation, design analysis, functional testing, or penetration testing. The higher EALs involve more detailed documentation, analysis, and testing than the lower ones. Achieving a higher EAL certification generally costs more money and takes more time than achieving a lower one.
Rating / Level / Description
Very Low – Low / N/A / Functionally Tested: applicable where some confidence in correct operation is required, but the threats to security are not viewed as serious.
Nominal / EAL1 – EAL2 / Structurally Tested: requires the cooperation of the developer in terms of the delivery of design information and test results, but should not demand more effort on the part of the developer than is consistent with good software engineering practice.
High / EAL3 / Methodically Tested and Checked: permits a conscientious developer to gain maximum assurance from positive security engineering at the design stage without substantial alteration of existing sound software engineering practices.
Very High / EAL5 / Semi-formally Designed and Tested: permits a developer to gain maximum assurance from security engineering based upon rigorous software engineering development practices supported by moderate application of specialist security engineering techniques.
Extra High / EAL7 / Formally Verified Design and Tested: applicable to the development of security targets of evaluation for application in extremely high-risk situations and/or where the high value of the assets justifies the higher costs. Currently limited to tightly focused security functionality that is amenable to extensive formal analysis.
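As a reading aid, the sketch below maps a target EAL to the SECU rating implied by the table above. It is illustrative only: the names are not part of COCOMO III, and the grouping of EAL4 and EAL6 (which the table does not list) with the adjacent rows is an assumption.

```python
# Hypothetical lookup from a target Common Criteria EAL to a SECU rating,
# following the table above. EAL4 and EAL6 groupings are assumptions, since
# the table lists only EAL1-EAL2, EAL3, EAL5, and EAL7.
SECU_RATING_BY_EAL = {
    0: "Very Low / Low",  # no EAL certification required
    1: "Nominal",
    2: "Nominal",
    3: "High",
    4: "High",            # assumption: grouped with the EAL3 row
    5: "Very High",
    6: "Very High",       # assumption: grouped with the EAL5 row
    7: "Extra High",
}

def secu_rating(target_eal: int) -> str:
    """Return the SECU rating for a target EAL (0 = no certification required)."""
    if target_eal not in SECU_RATING_BY_EAL:
        raise ValueError(f"EAL must be between 0 and 7, got {target_eal}")
    return SECU_RATING_BY_EAL[target_eal]

print(secu_rating(3))  # -> "High"
```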
Platform Attributes
“Platform” is used here to mean the complex of hardware and software (OS, DBMS, etc.) that the software product calls on to perform its tasks. If the software to be developed is an operating system, then the platform is the computer hardware. If a database management system is to be developed, then the platform is the hardware and the operating system. If a network web browser is to be developed, then the platform is the network, the computer hardware, the operating system, and the distributed information repositories. A target platform can include graphical user interfaces, databases, networks, distributed middleware, mobile computing, cloud computing, high-security architectures, fault-tolerant computing, and software as a service (or “x” as a service).
In short, the “platform” is what the application is being developed to run on: the target platform.
Platform Constraints*
This driver captures the limitations placed on the platform’s capacity, such as execution time, primary/secondary storage, communications bandwidth, and power. The purpose of these limitations is to reserve capacity for future growth of the software application.
Platform constraints have the following characteristics (an illustrative rating sketch follows the lists below):
- Execution time constraint is the percentage of available execution time expected to be used by the system or subsystem consuming the execution time resource.
Very Low N/A
Low N/A
Nominal <= 50% of available execution time used
High 70% of available execution time used
Very High 85% of available execution time used
Extra High 95% of available execution time used
- Primary/Secondary storage constraint is the percentage of available primary or secondary storage expected to be used by the system or subsystem consuming the storage resource.
Very Low N/A
Low N/A
Nominal <= 50% of available storage used
High 70% of available storage used
Very High 85% of available storage used
Extra High 95% of available storage used
- Communication bandwidth constraint is the percentage of available communication bandwidth expected to be used by the system or subsystem consuming the bandwidth.
Very Low N/A
Low N/A
Nominal <= 50% of available bandwidth used
High 70% of available bandwidth used
Very High 85% of available bandwidth used
Extra High 95% of available bandwidth used
- Power constraint is the percentage of available electrical power expected to be used by the system or subsystem consuming the power resource.
Very Low N/A
Low N/A
Nominal <= 50% of available battery power used
High 70% of available battery power used
Very High 85% of available battery power used
Extra High 95% of available battery power used
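A minimal sketch of the threshold lookup implied by the lists above. The 50/70/85/95% values come from the handout; treating the intervals between them as inclusive upper bounds, and rating the driver on the most heavily used resource, are assumptions.

```python
# Illustrative only: the interval boundaries and the "most constrained
# resource" combining rule are assumptions, not part of the COCOMO III definition.
def constraint_rating(percent_used: float) -> str:
    """Map expected resource utilization (%) to a Platform Constraints rating."""
    if percent_used <= 50:
        return "Nominal"
    elif percent_used <= 70:
        return "High"
    elif percent_used <= 85:
        return "Very High"
    return "Extra High"

def overall_platform_constraint(execution: float = 0, storage: float = 0,
                                bandwidth: float = 0, power: float = 0) -> str:
    """Assumed combining rule: rate on the most heavily used resource."""
    return constraint_rating(max(execution, storage, bandwidth, power))

print(overall_platform_constraint(execution=68, storage=40, bandwidth=30, power=55))
# -> "High" (execution time, at 68%, is the binding constraint)
```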
Platform Volatility (PVOL)
The target platform may still be evolving while the software application is being developed. This driver captures the impact of target-platform instability, which results in unplanned rework. (An illustrative rating sketch follows the lists below.)
- Targeted platform rate of changes
Very Low N/A
Low Major change every 12 mo.; minor change every 1 mo.
Nominal Major change every 6 mo.; minor change every 2 wk.
High Major change every 2 mo.; minor change every 1 wk.
Very High Major change every 2 wk.; minor change every 2 days
Extra High N/A
- Concurrent development for the target platform and software application
Very Low Very little
Low Fairly little
Nominal Average
High Extensive
Very High Fairly extensive
Extra High Very extensive
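The sketch below picks the PVOL rating whose major-change interval is closest to the platform’s observed release cadence. The intervals come from the first list above; the “closest interval” selection rule is an assumption, and the concurrent-development characteristic is not modeled here.

```python
# Major-change intervals per rating, taken from the list above (in months).
PVOL_MAJOR_CHANGE_MONTHS = {
    "Low": 12,         # major change every 12 months
    "Nominal": 6,      # every 6 months
    "High": 2,         # every 2 months
    "Very High": 0.5,  # every 2 weeks
}

def pvol_rating(months_between_major_changes: float) -> str:
    """Assumed rule: return the rating whose interval best matches the cadence."""
    return min(
        PVOL_MAJOR_CHANGE_MONTHS,
        key=lambda r: abs(PVOL_MAJOR_CHANGE_MONTHS[r] - months_between_major_changes),
    )

print(pvol_rating(3))  # -> "High" (closest to a major change every 2 months)
```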
Programmer Capability (PCAP)
Major criteria to consider in the rating are ability, competence, proficiency, aptitude, thoroughness, and the ability to communicate and cooperate. The experience of the programming team should not be considered here; it is captured by the experience drivers (APEX, PLEX, LTEX) below.
Rating / Descriptor / Project Activities Impact
Very Low / 15th percentile / Worse than the Low rating
Low / 35th percentile / More problems in interactions with analysts; less efficient activity; more errors and false starts; more detailed design rework and errors in that rework; more code and documentation rework and errors in that rework
Nominal / 55th percentile / Typical capability; average number of defects
High / 75th percentile / Fewer problems in interactions with analysts; more efficient activity; fewer errors and false starts; less detailed design rework and fewer errors in that rework; less code and documentation rework and fewer errors in that rework
Very High / 90th percentile / Improvement over the above
Extra High / N/A
Use of Software Tools (TOOL)
The TOOL rating ranges from simple edit-and-code tools to integrated life cycle management tools. This driver consists of three characteristics: Tool Coverage, Tool Integration, and Tool Maturity. (An illustrative sketch of combining the three follows the lists below.)
- Tool Coverage (TCOV)[2]
1 Text-based editor, basic 3rd generation language compiler, basic library aids, basic text-based debugger, and basic linker
2 Interactive graphical editor, simple design language, simple programming support library, simple metrics/analysis tool
3 Syntax checking editor, standard templates, simple configuration management, global find/replace, support metrics, simple repository, basic test case analyzer
4 Semantics checking editor, automatic document generator, requirement specification aids, extended design tools, automatic code generator from design, centralized configuration management tool, process management aids, partially associative repository (simple data model support), test case analyzer, verification aids, basic reengineering & reverse engineering tool
5 Global semantics checking, tailorable document generator, requirement specification aids with tracking capability, design tools with model verifier, code generator, extended static analysis tool, repository with complex data model support, distributed configuration management tool, test case analyzer with testing process manager, test oracle support, extended reengineering & reverse engineering tools
- Tool Integration (TINT)[3]
1 Incompatible file formats for each tool, no inter-tool communication/control, unique graphical user interfaces (GUIs), incompatible tool process assumptions and object semantics
2 Different but convertible file formats, general message broadcasting/control to other tools, some common GUI elements, some compatibility with process assumptions and object semantics
3 Standard file formats, message broadcasting/control through message server, common GUI elements, workable compatibilities among process assumptions and object semantics
4 Shared file repository, point-to-point messaging/control, customizable GUI support, largely compatible process assumptions and object semantics
5 Highly associative file repository, point-to-point messaging/control using references, high degree of commonality in GUI elements, consistent process assumptions and object semantics
- Tool Maturity (TMAT)[4]
1 Version in prerelease beta test, simple documentation and help
2 Version on market/available less than 6 months, updated documentation, help available
3 Version on market/available between 6 months and 1 year, on-line help, tutorial available
4 Version on market/available between 1 and 2 years, on-line user support group
5 Version on market/available between 2 and 3 years, on-site technical user support group
6 Version on market/available more than 3 years
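The handout lists TCOV, TINT, and TMAT levels but does not spell out here how they roll up into a single TOOL rating. A minimal sketch, assuming an unweighted average of the three level numbers; the equal weighting is an assumption (Baik’s work derives calibrated weights).

```python
# Illustrative only: equal weights are an assumption, not the calibrated model.
def tool_level(tcov: int, tint: int, tmat: int) -> float:
    """Combine the three tool characteristics (TCOV/TINT: 1-5, TMAT: 1-6)."""
    if not (1 <= tcov <= 5 and 1 <= tint <= 5 and 1 <= tmat <= 6):
        raise ValueError("tool characteristic level out of range")
    return (tcov + tint + tmat) / 3

print(tool_level(tcov=3, tint=2, tmat=4))  # -> 3.0
```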
Process Capability & Usage (PCUS) (Formerly PMAT)*
The consistency and effectiveness of the project team in performing Software Engineering (SWE) processes. It is based on two characteristics (an illustrative combining sketch follows the lists below):
- Project Team Behavioral Characteristics
1 Ad Hoc approach to process performance
2 Performed SWE process, activities driven only by immediate contractual or customer requirements, SWE focus limited
3 Managed SWE process, activities driven by customer and stakeholder needs in a suitable manner, SWE focus is requirements through design, project-centric approach – not driven by organizational processes
4 Defined SWE process, activities driven by benefit to project, SWE focus is through operation, process approach driven by organizational processes tailored for the project
5 Quantitatively Managed SWE process, activities driven by SWE benefit, SWE focus on all phases of the life cycle
6 Optimizing SWE process, continuous improvement, activities driven by system engineering and organizational benefit, SWE focus is product life cycle & strategic applications
- Software Development Plan (SDP) Sophistication
1 Management judgment is used
2 SDP is used in an ad-hoc manner only on portions of the project that require it
3 Project uses a SDP with some customization
4 Highly customized SDP exists and is used throughout the organization. SDP employs organization processes
5 The SDP is thorough and consistently used; organizational rewards are in place for those that improve it. Organizational processes support software development and are included in SDP requiring less specification.
6 The organization develops best practices for the SDP; all aspects of the project are included in the SDP; organizational rewards exist for those that improve it
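Similarly, the handout does not state how the two PCUS characteristics combine. A minimal sketch, assuming the lower (more conservative) of the two 1 to 6 levels governs the overall rating; that rule is an assumption.

```python
# Illustrative only: taking the minimum of the two characteristic levels
# is an assumed combining rule, not part of the COCOMO III definition.
def pcus_level(team_behavior: int, sdp_sophistication: int) -> int:
    """Overall PCUS level from the two 1-6 characteristic levels."""
    for name, level in (("team behavior", team_behavior),
                        ("SDP sophistication", sdp_sophistication)):
        if not 1 <= level <= 6:
            raise ValueError(f"{name} level out of range: {level}")
    return min(team_behavior, sdp_sophistication)

print(pcus_level(team_behavior=4, sdp_sophistication=3))  # -> 3
```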
Platform Experience (PLEX)
PLEX recognizes the importance of understanding the target platform. The definition of a target platform is provided above. This driver captures the knowledge or skill required to use the capabilities and services provided by the target platform.
1 <= 2 months
2 6 months
3 1 year
4 3 years
5 6 years
6 more than 6 years
Applications Experience (APEX)
This driver captures the level of application and domain experience of the project team developing the software system or subsystem. It is different from the Precedentedness driver: this driver focuses on the development team’s experience, while the Precedentedness driver focuses on the need for innovation and on past successes that have demonstrated that innovation.
The amount of experience is based on time.
1 <= 2 months
2 6 months
3 1 year
4 3 years
5 6 years
6 more than 6 years
Language and Tool Experience (LTEX)
This is a measure of the level of programming language and software tool experience of the project team developing the software system or subsystem. When rating this driver, consider the volatility of the development tools. (A sketch mapping months of experience to the levels shared by PLEX, APEX, and LTEX follows the list below.)
1 <= 2 months
2 6 months
3 1 year
4 3 years
5 6 years
6 more than 6 years
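PLEX, APEX, and LTEX all use the same time-based scale. The sketch below maps months of relevant experience to the 1 to 6 level; the listed points (2 mo., 6 mo., 1 yr, 3 yr, 6 yr) come from the handout, while the boundaries between them are assumptions.

```python
# Illustrative mapping from months of experience to the shared 1-6 level.
# Interval boundaries between the listed points are assumptions.
def experience_level(months: float) -> int:
    """Map months of relevant experience (platform, application, or language/tool)."""
    if months <= 2:
        return 1
    elif months <= 6:
        return 2
    elif months <= 12:
        return 3
    elif months <= 36:
        return 4
    elif months <= 72:
        return 5
    return 6

print(experience_level(18))  # -> 4 (between 1 and 3 years)
```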
Copyright © USC-CSSE
[1] Evaluation Assurance Level; definition available at https://en.wikipedia.org/wiki/Evaluation_Assurance_Level
[2] Baik, J. “The Effects of CASE Tools on Software Development Effort”, Dissertation, University of Southern California, Dec 2000.
[3] Baik, J. ibid.
[4] Baik, J. ibid.