PM’s Compendium of Technical Management Roles, Actions and Activities

PM Roles / PM Actions / PM’s Necessary Product or Outcome / Activities / Amplifying Steps and Guidance
TM 1 Engineering Management: Ability to manage a systems engineering process; assess the government / contractor’s systems engineering approach, activities, and products. / 1.1 Technical Planning
1.1.1 Establish, update and critically evaluate a plan for the technical management of an acquisition activity. / Systems Engineering Plan (SEP) update at each major event (milestone) / 1.1.1.1 Establish and/or review the current SEP IAW current system engineering planning guidance. / Steps:
  1. Establish a common configuration management approach for SEP and IMS and related documents.
  2. Identify any impact to cost, schedule and/or performance as a result of these changes.
  3. Evaluate the systems engineering documents for currency, ensuring they reflect changing program conditions at each milestone.

1.1.2 Ensure engineering processes are coordinated and applied properly throughout a system's life cycle consistent with the Systems Engineering Plan / Integrated Master Plan (IMP)
Integrated Master Schedule (IMS) update at each major event (milestone) / 1.1.2.1 Align and resource program technical plans (TEMP, LCSP, etc.) with the SEP via the IMP and IMS / Steps:
1. Ensure the IMP is reviewed periodically and updated at least at each milestone.
2. Ensure the IMS is reviewed monthly and coordinated within the government enterprise and with all involved contractors.
3. Ensure program planning (and resourcing) reflects the RAM approach.
4. Ensure RAM efforts across the program support cost and supportability projections for lifetime support (see the availability sketch following the references below).
References:
  • DoDI 5000.02, Enclosure 3
  • DoD RAM-C Report Manual June 2009
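
Example (illustrative sketch): RAM-driven cost and supportability projections typically rest on availability estimates. A minimal Python sketch follows, assuming the standard relationships A_i = MTBF / (MTBF + MTTR) and A_o = MTBM / (MTBM + MDT); all figures are notional, not program data.

  # Minimal sketch of availability estimates from RAM figures; values illustrative.
  def inherent_availability(mtbf_hours: float, mttr_hours: float) -> float:
      # Inherent availability: reliability and corrective maintenance only.
      return mtbf_hours / (mtbf_hours + mttr_hours)

  def operational_availability(mtbm_hours: float, mdt_hours: float) -> float:
      # Operational availability: adds logistics and administrative downtime.
      return mtbm_hours / (mtbm_hours + mdt_hours)

  # Hypothetical system: 500 h MTBF, 4 h MTTR, 400 h MTBM, 12 h mean downtime.
  print(f"A_i = {inherent_availability(500, 4):.3f}")       # 0.992
  print(f"A_o = {operational_availability(400, 12):.3f}")   # 0.971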

1.1.3 Apply software acquisition management principles (historic or emerging) needed to make sound decisions for planning and executing an acquisition program / Software Development Plan / 1.1.3.1 Assess program strategy/plan options that leverage agile or test-based acquisition approaches
1.1.3.2 Manage Software Development and Sustainment, including the evolution of an incrementally fielded capability [may be software-intensive] using a test-driven, prototype-based, iterative build-test-fix-test-deploy capability development process. / Steps:
  1. Review the interim DoDI 5000.02; four of the six program models in the interim instruction have significant software activities, and the remaining two likely include software.
  2. Plan and resource transition and interfaces of related systems.
  3. Consider the development and sustainment of software at every decision point in the acquisition life cycle since it can be a major portion of the total system life-cycle cost.
  4. Consider a phased software development approach using testable software builds and/or fieldable software increments. This enables developers to deliver capability in a series of manageable, intermediate products, gaining user acceptance and feedback for the next build or increment and reducing the overall level of risk (an illustrative sketch of this cycle follows the references below).
References:
  • DoDI 5000.02 Defense Acquisition Program Models
  • DoDI 5000.02, Enclosure 3
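
Example (illustrative sketch): the build-test-fix-test-deploy cycle described above can be pictured as a simple control flow. Everything below (function names, the defect-marking convention, the feature list) is a hypothetical stand-in, not an actual development process.

  # Purely illustrative increment cycle: build, test, fix, re-test, deploy.
  def test_build(build: list[str]) -> list[str]:
      # Stand-in test step: report features still marked defective.
      return [f for f in build if f.endswith("(defect)")]

  def fix_defects(build: list[str], failures: list[str]) -> list[str]:
      # Stand-in fix step: "fixing" clears the defect marker.
      return [f.replace(" (defect)", "") if f in failures else f for f in build]

  def run_increment(features: list[str], max_cycles: int = 3) -> list[str]:
      build = list(features)                     # build
      for _ in range(max_cycles):
          failures = test_build(build)           # test
          if not failures:
              break
          build = fix_defects(build, failures)   # fix, then re-test
      print(f"Deploying increment: {build}")     # field for user feedback
      return build                               # feedback shapes next increment

  run_increment(["map display", "route planner (defect)"])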

1.1.4 Ensure Cybersecurity processes are coordinated and applied properly throughout a system's life cycle / Cybersecurity Strategy / 1.1.4.1 Develop and document the Risk Management Framework (RMF)
1.1.4.2 Align the RMF with key technical planning documents (program requirements documents, system engineering, test and evaluation, and product support plans). / Steps:
  1. Develop and implement the Risk Management Framework.
  2. Identify and document Cybersecurity risks (a simple risk-register sketch follows the references below).
References:
  • DoDI 5000.02, Enclosure 11
  • CJCSI 6510.01F, "Information Assurance (IA) and Support to Computer Network Defense (CND)," February 9, 2011
  • DoDI 8500.01, "Cybersecurity"
  • DoDI 8500.02, "Information Assurance (IA) Implementation"
  • DoDI 8510.01, "Risk Management Framework for DoD IT"
  • CJCSI 6212.01F, "Net Ready Key Performance Parameter (NR KPP)," March 21, 2012
  • Section 811 of P.L. 106-398
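
Example (illustrative sketch): step 2 above implies some form of risk register. The sketch below is one hypothetical structure for a Cybersecurity risk entry; the fields, rating scales, and sample control identifier are illustrative, not a mandated format.

  # Hypothetical Cybersecurity risk-register entry; fields are illustrative.
  from dataclasses import dataclass

  @dataclass
  class CyberRisk:
      risk_id: str
      description: str
      likelihood: int       # 1 (rare) .. 5 (near certain)
      consequence: int      # 1 (negligible) .. 5 (catastrophic)
      mitigation: str
      related_control: str  # associated security control identifier

      @property
      def exposure(self) -> int:
          # Simple likelihood x consequence rating used to rank risks.
          return self.likelihood * self.consequence

  register = [
      CyberRisk("CR-001", "Unpatched OS on ground station", 4, 4,
                "Apply vendor patches within 30 days", "SI-2"),
  ]
  register.sort(key=lambda r: r.exposure, reverse=True)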

1.1.5 Manage re-use of legacy hardware and/or software / Reuse Plan / 1.1.5.1 Evaluate the identification, assessment and handling of the risks associated with a reusability plan (H/W and/or S/W).
1.1.5.2 Develop a S/W and/or H/W reuse plan while handling inherent risk (e.g., obsolescence and required missionization). / Steps:
  1. Identify resources for a S/W reuse repository.
  2. Ensure integration with the program IMS (an illustrative reuse-candidate sketch follows these steps).
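
Example (illustrative sketch): reuse risk handling can be supported by tracking each candidate's support horizon and missionization burden. The structure below is hypothetical; the component, dates, and fields are invented for illustration.

  # Hypothetical reuse-candidate record with a simple obsolescence check.
  from dataclasses import dataclass, field

  @dataclass
  class ReuseCandidate:
      name: str
      kind: str                    # "HW" or "SW"
      last_vendor_support: int     # year vendor support ends
      missionization_effort: str   # "low" / "medium" / "high"
      risks: list[str] = field(default_factory=list)

  def flag_obsolescence(c: ReuseCandidate, planned_ioc_year: int) -> None:
      # Flag candidates whose vendor support lapses before planned IOC.
      if c.last_vendor_support < planned_ioc_year:
          c.risks.append("obsolescence before IOC")

  candidate = ReuseCandidate("legacy radar processor", "HW", 2026, "high")
  flag_obsolescence(candidate, planned_ioc_year=2028)
  print(candidate.risks)   # ['obsolescence before IOC']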

1.2 Requirements Decomposition
1.2.1 Ensure a requirements management process provides traceability back to user-defined capabilities. / Requirements Traceability Matrix (RTM) / 1.2.1.1 Evaluate a Requirements Traceability Matrix (RTM) in terms of its ability to continually capture all of a program’s approved requirements. / Steps:
  1. Ensure the RTM addresses requirements decomposition, derivation, allocation history, and the rationale for all entries and changes, and provides traceability from the lowest level component all the way back to the user-defined capabilities (ICD, CDD, CPD); a minimal RTM sketch follows the references below.
  2. Develop the RTM to manage the design requirements process and keep it aligned with changes resulting from JCIDS and the CSB, and vice versa.
  3. Ensure cost and schedule parameters are adjusted as requirements change.
References:
  • Systems Engineering DAG Chapter 4
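
Example (illustrative sketch): an RTM is, at bottom, a traceability data structure. The sketch below shows one minimal form that captures decomposition, source, allocation and rationale, and walks a low-level requirement back to its user-defined capability; all identifiers and fields are hypothetical.

  # Minimal RTM sketch; entries and fields are illustrative.
  from dataclasses import dataclass

  @dataclass
  class RTMEntry:
      req_id: str
      text: str
      parent_id: str | None   # next level up; None at the capability level
      source_doc: str         # e.g., "ICD", "CDD", "CPD", or "derived"
      allocated_to: str       # WBS element or component
      rationale: str

  def trace_to_capability(rtm: dict[str, RTMEntry], req_id: str) -> list[str]:
      # Walk parent links from a low-level requirement up to its capability.
      chain, current = [], req_id
      while current is not None:
          chain.append(current)
          current = rtm[current].parent_id
      return chain

  rtm = {
      "CAP-1": RTMEntry("CAP-1", "Detect targets at required range", None,
                        "CDD", "System", "User-defined capability"),
      "SYS-12": RTMEntry("SYS-12", "Sensor sensitivity threshold", "CAP-1",
                         "derived", "WBS 1.2.3", "Decomposed from CAP-1"),
  }
  print(trace_to_capability(rtm, "SYS-12"))   # ['SYS-12', 'CAP-1']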

1.2.2 Describe the need to convert functional and behavioral expectations to technical requirements / Technical Requirements / 1.2.2.1 Evaluate technical requirements for affordability, achievability and traceability to stakeholders’ requirements, expectations, and perceived constraints.
1.2.2.2 Validate and baseline the technical requirements.
1.2.2.3 Define the technical problem scope and the related design and product constraints. / Steps:
  1. Understand and reference the Concept of Operations (CONOPS).
  2. Ensure connections are made to user requirements (e.g., JCIDS) and the mission operation summary/mission profile.

1.2.3 Ensure the design incorporates reliability, availability and maintainability requirements across a system's life cycle / Reliability Growth Curves
RAM-C Report / 1.2.3.1 Develop and document reliability growth expectations.
1.2.3.2 Assess reliability growth progress versus the RAM-C expectations. / Steps:
  1. Ensure program planning (and resourcing) reflects the RAM approach.
  2. Use RAM data across the program to support cost and supportability projections for lifetime support (see the reliability growth sketch following the references below).
References:
  • DoDI 5000.02, Enclosure 3
  • DoD RAM-C Report Manual June 2009
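
Example (illustrative sketch): reliability growth expectations are commonly plotted as a planned growth curve. The sketch below assumes the widely used Crow-AMSAA model, in which expected cumulative failures follow N(T) = lam * T**beta, so the instantaneous MTBF is 1 / (lam * beta * T**(beta - 1)); the parameter values are notional planning numbers, not program data.

  # Crow-AMSAA growth-curve sketch; lam and beta are notional.
  def instantaneous_mtbf(t_hours: float, lam: float, beta: float) -> float:
      return 1.0 / (lam * beta * t_hours ** (beta - 1.0))

  lam, beta = 0.5, 0.7   # beta < 1 indicates reliability growth
  for t in (100, 500, 1000, 2000):
      mtbf = instantaneous_mtbf(t, lam, beta)
      print(f"{t:>5} test hours: planned MTBF ~ {mtbf:.1f} h")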

1.2.4 Ensure the open systems architecture design is compatible with user performance, interoperability, and product support requirements and desired capabilities. / Open Systems Architecture Certification
SEP
LCSP / 1.2.4.1 Define system design to achieve interoperability with existing and planned DoD/Service/Agency systems.
1.2.4.2 Incorporate a modular open systems approach (MOSA) in order to optimize the design for effective and efficient product support. / Steps:
1. Survey existing and planned modular and open systems and lessons learned.
2. Identify candidate elements from those systems for incorporation into the design.
3. Consider the life-cycle support costs and benefits of incorporating the existing systems.
4. Coordinate design decisions with stakeholders.
5. Document decisions in the LCSP in coordination with the SEP.
References:
  • DoDI 5000.02, Enclosure 3
  • "DoD Open System Architecture Contract Guidebook for Program Managers," December 15, 2011

1.2.5 Ensure the information technology design requirement considers Interoperability as well as trusted systems and networks / IT Interoperability Plan / 1.2.5.1 Evaluate (and document) the degree to which information technology (IT) design requirements consider both Interoperability and trusted systems/networks. / Steps:
  1. Establish working group of key IT and NSS stakeholders.
  2. Develop interoperability requirements.
  3. Develop requirements for trusted systems and networks.
  4. Iterate (with stakeholders) development of the IT interoperability plan.
References:
  • DoDI 5000.02, Enclosure 11
  • DoDD 4630.05, "Interoperability and Supportability of Information Technology (IT) and National Security Systems (NSS)," May 5, 2004
  • DoDI 4630.8, "Procedures for Interoperability and Supportability of Information Technology (IT) and National Security Systems (NSS)," June 30, 2004
  • DoDI 8320.02, "Sharing Data, Information, and Information Technology (IT) Services in the Department of Defense," August 5, 2013
  • DoDI 8410.03, "Network Management (NM)," August 29, 2012

Criticality Analysis / 1.2.5.2 Conduct a criticality analysis to identify mission critical functions and critical components related to trusted systems and networks. / Steps:
  1. Establish objectives, scope and approach of analysis effort.
  2. Assess relevant Trusted Systems and Networks plans.
  3. Assess relevant implementation activities in Program Protection Plans, cybersecurity plans, and related documentation (a simple criticality-scoring sketch follows the references below).
References:
  • DoDI 5000.02, Enclosure 11
  • DoDI 5200.44, "Protection of Mission Critical Functions to Achieve Trusted Systems and Networks (TSN)," November 5, 2012
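
Example (illustrative sketch): a criticality analysis maps components to mission functions and rates each by the worst-case mission impact of its failure. The sketch below uses a common four-tier scheme (Level I total mission failure through Level IV negligible impact); the functions, components, and assignments are invented.

  # Hypothetical criticality analysis: worst-case impact per component.
  IMPACT_LEVELS = {"total": "I", "significant": "II",
                   "partial": "III", "negligible": "IV"}
  ORDER = ["total", "significant", "partial", "negligible"]  # worst first

  functions = {                     # mission function -> failure impact
      "navigation": "total",
      "target detection": "significant",
      "maintenance logging": "negligible",
  }
  components = {                    # component -> supported functions
      "GPS receiver": ["navigation"],
      "signal processor": ["navigation", "target detection"],
      "logging service": ["maintenance logging"],
  }

  for comp, funcs in components.items():
      worst = min(funcs, key=lambda f: ORDER.index(functions[f]))
      print(f"{comp}: Level {IMPACT_LEVELS[functions[worst]]}")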

Supplier Threat Analysis / 1.2.5.3 Conduct a threat analysis of suppliers of critical components (Supplier All Source Threat Analysis) / Steps:
  1. Determine all available suppliers of critical components.
  2. Integrate results of criticality analysis with relevant suppliers as appropriate.
  3. Incorporate the above results into the development of the Supplier Threat Analysis.
References:
  • DoDI 5000.02, Enclosure 11
  • DoDI 5200.44

1.3 Technical Assessment
1.3.1 Manage the process to document, coordinate, and substantiate the transition of system elements to the next level in the SE process / IMS / 1.3.1.1 Evaluate the implementation of an event-based review process that transitions a product (HW and/or SW) from concept through initial operating capability (IOC). / Steps:
  1. Establish a program roadmap consisting of key program events (PE), especially technical reviews (e.g., SRR, SFR, PDR, CDR, TRR, PRR).
  2. Ensure alignment of all technical planning documents (e.g., SEP, TEMP, LCSP) with program planning documents (e.g., PMP, risk management plan).
  3. Clarify entry and exit criteria for all program events (PE) in terms of significant accomplishments (SA) and accomplishment criteria (AC).
  4. Ensure all relevant scope for achieving the PEs, SAs and ACs is traced to the program WBS.
  5. Based on the results of steps 1-4 above, create the program Integrated Master Plan (IMP).
  6. Transfer IMP information into the program Integrated Master Schedule (IMS) and align the ACs with the appropriate tasks (a simple IMP-to-IMS mapping sketch follows these steps).
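
Example (illustrative sketch): the IMP hierarchy (program events, significant accomplishments, accomplishment criteria) transfers naturally into IMS tasks. The nested structure and names below are hypothetical.

  # Hypothetical IMP fragment flowing into IMS tasks.
  imp = {
      "PDR": {                                  # program event (PE)
          "Preliminary design complete": [      # significant accomplishment (SA)
              "All subsystem specs baselined",  # accomplishment criteria (AC)
              "TPM margins within thresholds",
          ],
      },
  }

  # One schedule task per criterion, traced to its SA and owning PE.
  ims_tasks = [
      {"event": pe, "accomplishment": sa, "task": ac}
      for pe, sas in imp.items()
      for sa, criteria in sas.items()
      for ac in criteria
  ]
  for t in ims_tasks:
      print(f"{t['event']} / {t['accomplishment']} / {t['task']}")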

1.3.2 Ensure a process for monitoring and selecting a Design Solution that translates the outputs of the Requirements Development and Logical Analysis processes into alternative design solutions and selects a final design solution. / Successful accomplishment of SRR/SFR / 1.3.2.1 Chair, or participate substantially in, program technical reviews through and including the System Functional Review (SFR) / Steps:
  1. When chairing the review, ensure all issues related to the SFR are successfully addressed and recorded with appropriate actions assigned.
  2. When “substantially participating” in the review, conduct actions that directly influence maturation of the design from the system level through and including SFR-level clarity.
  3. Ensure the final design solution is producible.
Guidance:
  • The reviews relevant to this competency include the ASR, SRR, and SFR

1.3.3 Ensure the process for monitoring the implementation effort actually yields the lowest level system elements in the system hierarchy. / Successful accomplishment of CDR (exit criteria) / 1.3.3.1 Chair, or participate substantially in, program technical reviews through and including the Critical Design Review (CDR)
1.3.3.2 Evaluate the integration activities of lower level system elements into a higher-level system element in the physical and logical architecture. / Steps:
  1. When chairing the review, ensure all issues related to the CDR are successfully addressed and recorded with appropriate actions assigned.
  2. When “substantially participating” in the review, conduct actions that directly influence maturation of a design from system-level through and including CDR-level clarity.
  3. Ensure producibility is a critical consideration at the review, since it is part of the CDR exit criteria that set the stage for production.
Guidance:
  • The reviews relevant to this competency include the SRR, SFR, and in particular the PDR and CDR.

1.3.4 Identify, explain and employ measures to assess the technical maturity of a design solution, relative to operational performance requirements / Documented performance assessment and forecasting
Entry and Exit Criteria included in the SEP / 1.3.4.1 Evaluate the alignment of technical performance measures to ensure delivery of the desired operational performance requirements.
1.3.4.2 Develop and track entry and exit criteria to be used to determine readiness for the program to proceed / Steps:
  1. Analyze all KPPs and KSAs in terms of key design drivers/determinants.
  2. Compare KPP/KSA analysis results with relevant information from technical planning documents such as the SEP, TEMP, PPP and LCSP.
  3. Develop technical performance measures (TPM) and align them with the WBS.
  4. Establish key technical decision points (such as design reviews) and determine the associated significant accomplishments and accomplishment criteria (per an IMP).
  5. Integrate decision criteria into the SEP, as appropriate, and publish them formally as entry and exit criteria for each review (a simple readiness-check sketch follows the guidance below).
Guidance:
  • “Documented performance assessment and forecasting” can take place in any number of venues, to include PMRs, TIMs and design reviews
  • The translation of desired performance/operating capabilities (KPP, COI, KSA) into system development terms is typically accomplished via TPMs
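
Example (illustrative sketch): entry and exit criteria reduce to a checklist that gates the decision to proceed. The criteria below are invented placeholders.

  # Hypothetical exit-criteria check for a technical review.
  exit_criteria = {
      "All review action items closed": True,
      "TPMs within allocated margins": True,
      "Baseline documents released": False,
  }

  def ready_to_proceed(criteria: dict[str, bool]) -> bool:
      open_items = [name for name, met in criteria.items() if not met]
      for name in open_items:
          print(f"OPEN: {name}")
      return not open_items

  print("Proceed" if ready_to_proceed(exit_criteria) else "Hold")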

1.3.5 Ensure technical measures are continually assessed (tracked, trended and forecasted) to support program decisions / Technical performance measures (TPMs) and technical measures / 1.3.5.1 Identify and implement hardware and software metrics appropriate for each level of a program
1.3.5.2 Identify and implement technical support measures to be incorporated as part of the QASP
1.3.5.3 Identify and track technical measures associated with the handling of risks and opportunities / Steps:
  1. Establish technical performance measures (TPM) and/or functionality points/indicators as appropriate to ensure insight into KPPs, KSAs or other desired performance.
  2. Align TPMs and/or functionality with associated design/functional requirements and WBS elements.
  3. Establish key indicators of technical process maturity/execution.
  4. Integrate TPM and technical process indicators into risk handling plans.
  5. Input TPM, risk handling tasks/options and technical process indicators into IMS tasks and associated closure criteria (a TPM tracking sketch follows the guidance below).
Guidance:
  • “Level of a program” refers to how the system is decomposed or managed, from the component level up to the system level.
  • “TPMs” refer to technical measures directly linked to attaining the program KPPs and KSAs, e.g. system weight, speed, S/W reliability, etc.
  • Technical measures are linked to the technical tracking of the program, e.g., status of process performance, status of requirements, status of interfaces, status of technical documentation, etc.
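
Example (illustrative sketch): tracking, trending and forecasting a TPM amounts to comparing measured values against a planned profile and threshold over time. The measure (system weight) and all numbers below are notional.

  # Notional TPM trend: actual vs. planned weight, with a naive forecast.
  planned = {1: 1150, 2: 1120, 3: 1100}    # month -> planned weight (kg)
  actual  = {1: 1160, 2: 1155, 3: 1140}    # month -> measured weight (kg)
  threshold = 1125                          # KPP-derived not-to-exceed value

  for month in sorted(actual):
      variance = actual[month] - planned[month]
      print(f"Month {month}: {actual[month]} kg ({variance:+} kg vs plan)")

  # Naive linear forecast from the last two observations.
  m1, m2 = sorted(actual)[-2:]
  forecast = actual[m2] + (actual[m2] - actual[m1])
  print(f"Forecast month {m2 + 1}: {forecast} kg (threshold {threshold} kg)")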

1.3.6 Assess whether technical measures are causing the correct (expected) organizational and contractual behavior / 1.3.6.1 Evaluate technical measures in the context of program cost and schedule impacts
1.3.6.2 Evaluate technical measures in the context of contract incentives and contractor performance assessments / Steps:
  1. Establish technical performance measures (TPM) and/or functionality points/indicators as appropriate to ensure insight into KPPs, KSAs or other desired performance.
  2. Align TPMs and/or functionality with associated design/functional requirements and WBS elements.
  3. Establish key indicators of technical process maturity/execution.
  4. Align TPMs and desired functionality with contractor incentives, as appropriate.
  5. Integrate TPM and technical process indicators into risk handling plans.
  6. Input TPM, risk handling tasks/options and technical process indicators into IMS tasks and associated closure criteria.

1.3.7 Plan for and/or evaluate a system's readiness to operate in the intended environment, e.g., information assurance, air worthiness, sea worthiness, net ready. / Letter of certification / 1.3.7.1 Develop and implement plans within appropriate technical planning documents to ensure system certification.
1.3.7.2 Assess methods, tools, and procedures, including development of artifacts, to ensure the system is certified/approved for use in its intended operating environment. / Steps:
1. Review certification sources and associated requirements together with relevant certification authority to determine unique/tailored expectations.
2. Document tailored (unique to program and conditions) certification requirements for the desired capability/characteristic.
3. Confirm with certification authority that tailored requirements are adequate for certification.
4. Establish working group including all stakeholders that influence, or are influenced by, the certification requirements.
5. Integrate certification requirements into program requirements and planning documentation, to include technical requirements documents, technical planning, IMS, IMP, Test (DT/OT) planning and risk/opportunity register.
References:
  • DAG chapter 4

1.3.8 Conduct Post Implementation Review (PIR). / PIR / 1.3.8.1 Plan and conduct a PIR in coordination with the Functional Sponsor / Steps:
  1. Ensure the PIR reports the degree to which doctrine, organization, training, materiel, leadership and education, personnel, facilities, and policy changes have achieved the established measures of effectiveness for the desired capability.
  2. Evaluate systems to ensure a positive return on investment, and decide whether continuation, modification, or termination of the systems is necessary to meet mission requirements (a simple MOE/ROI sketch follows the reference below).
  3. Document lessons learned from the PIR.
Guidance:
  • The Functional Sponsor, in coordination with the DoD Component CIO and Program Manager, is responsible for developing a plan and conducting a PIR for all fully deployed IT, including NSS.
Reference:
  • DoDI 5000.02, Enclosure 11, para 4
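
Example (illustrative sketch): a PIR compares achieved measures of effectiveness (MOEs) against their targets and weighs return on investment. The MOEs, targets, and dollar figures below are invented.

  # Hypothetical PIR check: MOE attainment plus a simple ROI figure.
  moes = {                                   # MOE -> (target, achieved)
      "orders processed per day": (1000, 1150),
      "user satisfaction (1-5)": (4.0, 3.6),
  }
  for name, (target, achieved) in moes.items():
      met = "met" if achieved >= target else "NOT met"
      print(f"{name}: target {target}, achieved {achieved} -> {met}")

  # Simple ROI = (benefit - cost) / cost, using notional lifetime figures.
  benefit, cost = 12.5e6, 9.0e6
  print(f"ROI = {(benefit - cost) / cost:.0%}")   # 39%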

1.4 Decision Analysis
1.4.1 Apply, evaluate and explain multiple approaches to decision analysis concerning technical challenges / A documented significant program decision / 1.4.1.1 Identify decisions requiring technical information to be traceable and defensible
1.4.1.2 Apply, evaluate and explain multiple approaches to decision analysis to ensure, at a minimum, a satisfactory solution to technical problems. / Steps:
  1. Clearly establish decision goals while framing the technical challenge/problem within the context of program capabilities, cost, schedule and performance constraints/requirements.
  2. Identify the options and alternatives and/or courses of action that are likely to be considered.
  3. Characterize risks of information uncertainties, including information that is likely to be missing, unreliable, conflicting, noisy (irrelevant) and confusing.
  4. Define evaluation criteria for the decision, to include defining the difference between what is optimal versus “good enough”.
  5. Evaluate likely alternative choices and, to the degree practicable, model, simulate or otherwise describe the implementation results from each alternative.
  6. Select, justify, and record the recommended alternative and/or course of action (a weighted decision-matrix sketch follows these steps).
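
Example (illustrative sketch): one common decision-analysis approach is a weighted-sum decision matrix. The criteria, weights, and scores below are invented; in practice the weights and the "good enough" threshold come from the framing in steps 1 and 4.

  # Weighted-sum decision matrix; all inputs are illustrative.
  criteria = {"performance": 0.4, "cost": 0.3, "schedule": 0.2, "risk": 0.1}
  alternatives = {          # alternative -> score (1-10) per criterion
      "Option A": {"performance": 8, "cost": 5, "schedule": 6, "risk": 7},
      "Option B": {"performance": 6, "cost": 8, "schedule": 8, "risk": 6},
  }

  def weighted_score(scores: dict[str, float]) -> float:
      return sum(criteria[c] * scores[c] for c in criteria)

  for name, scores in alternatives.items():
      print(f"{name}: {weighted_score(scores):.2f}")   # A: 6.60, B: 7.00
  # Rank alternatives, then record the rationale and weight sensitivity.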

1.5 Configuration Management