Software T&E Summit/Workshop Issues & Recommendations White Paper
Software Test & Evaluation
Summit/Workshop Results
Issues & Recommendations White Paper
Joint Authorship of the NDIA System Engineering Division’s Software Industry Experts Panel and the Developmental Test & Evaluation Committee
Submitted: December 18, 2009
Table of Contents
White Paper Purpose
Software T&E Summit/Workshop
Background
Summit/Workshop Objective
Way Forward
Workshop Approach and Summary List of Results
Summary list of Issues
Summary List of Recommendations
Related Commercial Event
Conclusions
Appendix A – Agenda
Appendix B – List of Contributors During the Workshops
Appendix C – Workshop Issues & Recommendations
Workshop #1: How Much Testing is Enough
Workshop #2: Lifecycle & End-to-End SW T&E
Workshop #3: Changing Paradigms (SOA, SoS & Security)
Appendix D – Final Report of the STARWEST Leadership Summit
Appendix E – Top Software Issues Report from August 2006
White Paper Purpose
The purpose of this White Paper is to document the results of the National Defense Industrial Association's (NDIA) Software (SW) Test & Evaluation (T&E) Summit/Workshop that was held in Reston, VA, September 15 – 17, 2009.
Software T&E Summit/Workshop
Background
The SW T&E Summit/Workshop was held to fulfill a recommendation from the NDIA System Engineering (SE) Division Software Committee’s August 2006 report titled, “Top Software Engineering Issues within the Department of Defense and Defense Industry”.
The August 2006 report was in response to a request from the Office of the Secretary of Defense (OSD) sponsor of the NDIA Systems Engineering Division (the Director, Systems and Software Engineering, Office of the Under Secretary of Defense for Acquisition and Technology (OUSD(A&T))). The request was to convene a workshop, which was conducted on August 24-25, 2006, to examine the top issues in software engineering that impact the acquisition and successful deployment of software-intensive systems. The complete Top Software Engineering Issues August 2006 report is attached as Appendix E.
The August 2006 report included seven major issues confronting the SW acquisition and implementation community. The seven issues from the August 2006 report are listed below:
- The impact of system requirements upon software is not consistently quantified and managed in development or sustainment.
- Fundamental system engineering decisions are made without full participation of software engineering.
- Software life-cycle planning and management by acquirers and suppliers is ineffective.
- The quantity and quality of software engineering expertise is insufficient to meet the demands of government and the defense industry.
- Traditional software verification techniques are costly and ineffective for dealing with the scale and complexity of modern systems.
- There is a failure to assure correct, predictable, safe, secure execution of complex software in distributed environments.
- Inadequate attention is given to total lifecycle issues for COTS/NDI impacts on lifecycle cost and risk.
Issue five and its associated recommendation, “Study current software verification practices in industry, and develop guidance and training to improve effectiveness in assuring product quality across the life cycle,” were the key reason the Acting Director, Systems and Software Engineering, OUSD(A&T), asked the NDIA SE Division’s Developmental Test & Evaluation (DT&E) Committee and the Software Industry Experts Panel in December 2008 to convene a SW T&E Summit/Workshop.
Summit/Workshop Objective
The objective of the SW T&E Summit/Workshop was to recommend policy and guidance changes to the Defense enterprise to emphasize robust and productive software Testing and Evaluation (T&E) approaches in Defense acquisition.
The agenda used to achieve this objective is contained in Appendix A.
Way Forward
It is the recommendation of the NDIA SE Division’s DT&E Committee and Software Industry Experts Panel that the workshop lists of issues and recommendations be consolidated, prioritized, and turned into a specific set of actionable recommendations for the Government to consider. This activity can be started in 2010 and worked on an ongoing basis according to agreed-upon priorities.
Workshop Approach and Summary List of Results
There were three workshop teams established for this event. They were:
1) How Much Testing is Enough
2) Lifecycle and End-to-End Software T&E
3) Changing Paradigms
The workshops each had about five hours of work time to identify issues and recommendations in four focus areas (one Workshop team added a fifth focus area). The focus areas included:
1) Review, revise, improve Request for Proposal (RFP) language (include T&E activities/deliverables, including those for Competitive Prototyping). Potential Stakeholders/Implementers: OSD/ATL, DT&E Committee
2) Training, Competency Model, Human Capital. Potential Stakeholders/Implementers: Defense Acquisition University (DAU), Industry
3) Policy, Guidance and Standards. Potential Stakeholders/Implementers: OSD/ATL, DAU, etc.
4) Tools, Automation, Methodologies and Processes. Potential Stakeholders/Implementers: Tool Vendors, Academia
5) Other Items of Interest. Potential Stakeholders/Implementers: Industry, T&E Community
The complete set of issues and recommendations for each Workshop team is listed in Appendix C below. In total, over 200 issues were identified, resulting in 82 specific recommendations.
Listed below is a summary of some of the issues and recommendations. This list is for review/informational purposes only. The complete list of issues and recommendations will need to be addressed in detail, with a disposition generated for each recommendation, in future work by the DT&E Committee and the Software Industry Experts Panel.
Summary list of Issues
Here are some of the issues listed in no particular order:
- Lack of an established, formal professional association of software testers and evaluators in the defense enterprise (USG, DoD, Defense Industry), evidenced by:
  - No evolved 'Body of Knowledge':
    - Common Language - codified terms and definitions
    - Common Process - when to start, how to start, deliverables, etc.
    - Lessons Learned - Better Practices
  - No defined formal education/training and professional certification standards establishing essential skill sets, including tools, test automation and usage:
    - In university degree curricula
    - In DAU
    - In on-the-job professional development tracks
  - No professional career track or professional designation:
    - Lacking: defined progressive career path, human capital development/community management, and/or recognized cadre of domain experts
- Lack of a standard software T&E approach creates systemic weaknesses within the defense enterprise:
  - Lack of Policy, Guidance and Standards for:
    - Test processes, methodologies and test design techniques
    - Test documentation – planning, analysis, reporting
    - Evaluation of software qualities – what it is and who does it
    - Test tools, automation, M&S - assessment, certification, lifecycle usage
    - Metrics and measurement
    - Software and system integration
  - Poorly developed acquisition documentation (RFPs, Sections L and M, WBSs, SOWs):
    - Vague, internally inconsistent
    - Software T&E requirements absent or poorly stated across the lifecycle
    - Lack of CDRLs for data reporting
    - Minimal recognition of proper SW T&E resources
    - No exit criteria measurements/metrics (for Tech Reviews, Milestones)
  - Program / Project Execution:
    - Low government participation
    - Weak risk-based prioritization schemes
    - Inadequate requirements validation and testability
    - Lacking in DT&E roles definition and human capital management over the acquisition lifecycle
Summary List of Recommendations
Here are some of the recommendations, listed in no particular order, that relate to the issues in the summary above.
- Common Terms, Processes, Training, Career Path, etc.
  - Develop and provide standard language (template, checklist, lexicon) for all areas of software test & evaluation (Recommendation 1)
  - Define a set of competency models/skills for SW testing and SW-intensive system testing. Include all levels of test (unit, function, integration, system, SoS). Include cross-training in SW & SE (Recommendation 18)
  - Develop a 'Wikipedia-like' software T&E knowledge repository, like DoD Techipedia, or e-learning on the domain (by OSD/ATL DDR&E, NDIA) (Recommendation 22)
  - Establish career path incentives for test engineers (Recommendation 74)
- Common RFP Language, approaches and guidance, and other industry issues
  - Policy/Guidance development to require critical SW deliverable documentation to support SW test & evaluation and to list it in the RFP, to include (several recommendations put together as well as derived statements):
    - SW lifecycle test strategy within the context of the Program Integrated Test Strategy (i.e., coverage, types of testing (unit, functional, integration, system, SoS), and SW test automation and tools);
    - SW T&E supporting SW engineering in SE processes (requirements development/management, design and development, system reliability growth, risk identification, analysis and management).
  - RFP development plan needs to demonstrate early involvement of test engineering. (Recommendations 2 & 3)
  - In RFPs: identify the information required for making decisions that can be satisfied by test & evaluation (e.g., CDRLs) and in what format (e.g., templates) (Recommendations 11 & 44)
  - Develop model RFP language for SOA acquisitions (Recommendation 71)
  - Create and fund a research activity in the area of software testing and test automation (Recommendation 41)
Related Commercial Event
In the commercial software testing arena there are many good conferences held during the year that are attended by thousands of people. The Software Test Analysis and Review (STAR) conferences, convened annually by Software Quality Engineering (SQE), currently include a Leadership Summit on the last day of the weeklong event. This year the Leadership Summit was held on October 9 in Anaheim, CA. During this year’s Leadership Summit, testing issues were captured from the more than 120 participants. A large majority of the participants were from the commercial sector, but there is still a good correlation between that event’s findings and the NDIA SW T&E Summit/Workshop findings. Appendix D includes the STARWEST Leadership Summit final report (with some clean-up editing); it is attached to this report for informational purposes only.
Conclusions
Based on the results of this Workshop, the STARWEST Leadership Summit, the National Institute of Standards and Technology report titled “The Economic Impacts of Inadequate Infrastructure for Software Testing,” and many other white papers and articles, the area of software testing is in need of improvement. The recommendations captured during this Workshop can go a long way toward achieving this improvement.
Appendix A – Agenda
Day 1
8:00 Introduction
8:10 Government Presentations – Framing DoD SW T&E Issues
- Mr. Chris DiPetto, Acting Director, DT&E
- Ms. Kristen Baldwin, Director for Systems Analysis, ODDR&E
- Dr. Ernest A. Seglie, Chief Science Advisor, DOT&E
9:50 Break
10:15 DoD Industry Panel – Framing DoD Industry SW T&E Issues
- Mr. Edgar Doleman, CSC
- Mr. Bruce Casias, Raytheon
- Mr. Tom Wissink, Lockheed Martin
11:45 Lunch & Speaker – SW Security in Defense T&E
- Paco Hope, Cigital
12:45 SW Test Industry Experts
- Dr. Cem Kaner, Florida Institute of Technology, Challenges in the Evolution of Software Testing Practices in Mission-Critical Environments
- Dr. Adam Kolawa, Parasoft, Software Development Management
2:25 Break
2:50 SW Test Industry Experts
- Mr. Rex Black, RBCS, Risk-Based Testing
- Mr. Hung Nguyen, Logigear, Software Testing & Test Automation
4:30 Adjourn
Day 2
8:00 Re-Cap Day 1
8:10 DoD Services Panel
- Dr. James Streilein, US Army Test and Evaluation Command
- Dr. Steve Hutchison, Defense Information Systems Agency (DISA)
- Mr. Mike Nicol, Aeronautical Systems Center, Wright-Patterson AFB
9:45 Introduction of Workshops
10:00 Break
10:30 Workshops
12:00 Lunch & Speaker
1:00 Workshops
2:30 Break
3:00 Workshops
4:30 Adjourn
Day 3
8:00 Re-Cap Day 2
8:10 Introduction of Workshop Leaders
8:15 Presentation of Issues and Recommendation by Workshop Leaders
9:45 Break
10:00 Way Forward Discussion & Final Q&A’s
11:00 Adjourn
Appendix B – List of Contributors During the Workshops
Workshop #1 – How Much Testing is Enough
Larry Baker, DAU
Rex Black, RBCS
Kini Cain, DISA
John Cockey, Raytheon
Pamela Crawford, NGSB
Phil Crescioli, GD-HIS
Ed Donovan, SAIC
Steve Henry, NGC
Beth Hylton, DISA/TEMC
Brian Kim, ATEC
Rick Kuhn, NIST
Joe Lawrence, NGC
LTC Robert Love, Army DIMHRS
James Lyden, NAVSEA
Frank Marotta, Army DT
Tom McGibbon, ITT
Chris Miller, SAIC (Government Lead)
Ajit Narayan, NGC
David Nicholls, RIAC
Mike Nicol, USAF
Martha O’Conner, SIAC
Bob Ostrowski, COMOPTEVFOR
Marin Reyes, NSWC
David Rhodes, Raytheon (Facilitator)
Peggy Rogers, NOSSA
Jim Seeley, NGC
Yvette Solomn, DISA
Randy Southers, Lockheed Martin
Michael Teixeira, Seawatch
Tom Wissink, Lockheed Martin (Industry Lead)
Workshop #2 – Lifecycle and End-to-End SW T&E
Tony Avvenire, Booz Allen
Tim Boving, NGST
Angela Llamas Butler, SEI/CMLL
Kimi Cam, DISA T&E Mgmt Ctr
Bruce Casias, Raytheon (Industry Lead)
Jeremy Elder, Lockheed Martin
David Gulla, Raytheon
Gary Hafen, Lockheed Martin
John Hager, Lockheed Martin
Herb Hines, Lockheed Martin
Joe Hollon, Raytheon
Prem Jain, MITN
Melody A. Johnson, HQUSAF/TEP
Thomas Knott, OUSD(AT&L)
Abishek Krupanand, USAF
Mike Leite, SAIC/DOT&E
Scott Lucero, OSD/ATL (Government Lead)
Mark Moulding, JHU/APL
Hung Nguyen, Logigear
John O’Keefe, Northrop Grumman
Madhav Phadke, Phadke Assoc, Inc
Sarah Rogers, Independent (Facilitator)
Shawn Rahmani, Boeing
Amy Settle, NOSSA
Greg Stearsman, Lockheed Martin
Nat Subramoniam, IDA/DOT&E
Ray Thompson Sr, LM
Alison Tichenor, ATEC
Dave Underwood, BTA
J. Bruce Walkin, SAF/AQRE
Michael Welsh, Army DTC
Workshop #3 – Changing Paradigms
Trevor Colandrea, NSWCPHD
Tina Chow, AEC
Paul Croll, CSC (Industry Lead)
Edgar Doleman, CSC
Gary Downs, Lockheed Martin
Ernie Gonzalez, SAF-AQ
Henry Gruner, Raytheon IDS
Col Charles Harris, USA (DOT&E)
Beth Hylton, DISA TEMC
Sherida Jacob, NUWC
Cem Kaner, Florida Institute of Technology
LT Darain Kawamoto, Coast Guard
Greg Kerchner, Lockheed Martin (Facilitator)
Paul Morrisseau, SRI
Maj Gus Muller, AEC
Chris Osthaus, NG
Gene Rosenbluth, NGIS
Dan Reuben, RTI/Lattice (Government Lead)
Zachary Slayton, GDAIS
Dr. Carol Sledge, SEI
Christinia Strahan, Alion Science and Technology
Ellen Walker, ITT Corp (DACS)
Lance Warden, UAH
Elwin Wong, BAH
Robert Zurn, Northrop Grumman
Appendix C – Workshop Issues & Recommendations
Workshop #1: How Much Testing is Enough
Issues / Recommendations (NOTE: Some ‘ADDED’ during Workshop #1 Reconciliation)

Focus Area #1: Review, revise, improve Request for Proposal (RFP) language (include T&E activities/deliverables, including those for Competitive Prototyping). Potential Stakeholders/Implementers: OSD/ATL, DT&E Committee

Issues:
- Lack of standard process and documentation deliverables requirements in the RFP
- Not using standard terms in the RFP
- Compliance with NR-KPP, and which version
- Lack of T&E strategy in the RFP for V&V of requirements
- TES/TEMP should be included in the RFP so that all activities are considered from the beginning
- Test planning done too late
- Are interfaces to existing systems specified?
- Competitive prototyping contracts have not included sufficient integration & test activities and deliverables
- Issue: standard RFP content for T&E (SW level) activities/deliverables not available
- Issue: source selection limitations/constraints (timeframes, page limits, etc.) limit RFP content
- Who ensures the RFP has adequate SW T&E content?
- RFP must be flexible enough for projects that are moving targets, but ensure that no drastic changes are made
- Do requirements address operational needs?
- What Enterprise Service Profiles (GESP) do you have to comply with?
- User 'mission' input to RFP
- Contractual requirement for deliverables - critical documentation
- What documentation deliverables are needed for T&E and decision-making?
- Cost of deliverables/documentation
- Does the developer employ robust SW management indicators?
- RFPs are internally inconsistent in T&E and SW testing throughout the package
Recommendations:
- Develop and provide standard language (template, checklist, lexicon) for SW test & evaluation to contracts personnel.
- Determine critical SW deliverable documentation to support SW test & evaluation and list it in the RFP
- In the RFP, the development plan needs to demonstrate early involvement of testers.
- For certification & accreditation: provide a TEMP certification & accreditation plan and/or test strategy for obtaining the needed data. For Section L: have the contractor demonstrate how their test strategy investments, approach, processes & procedures will provide the needed information in a cost-effective and efficient manner. For Section M: provide solid justification and planning for execution of the test efforts and reduce program risk.
- Provide test-related(?) requirements
- RFP requirements should state capabilities required as related to missions
- For the Statement of Work: list a standard set of 'always' required technical requirements (e.g., security, infrastructure) in a boilerplate that can be tailored.
- Provide assessment criteria for SW test & evaluation content for RFPs to ensure adequacy
- In RFPs: identify the information required for making decisions that can be satisfied by test & evaluation (e.g., CDRLs) and in what format (e.g., templates)
- Have an independent DT/OT review of the RFP test & evaluation information requirements before release
- Define recommended RFP (template) content for completion/satisfaction of the competitive prototyping phase (refer to the modified 'V' from Tuesday’s presentation)
Focus Area #2: Training, Competency Model, Human Capital. Potential Stakeholders/Implementers: DAU, Industry

Issues:

Training:
- Do training plans address maintenance and sustainment?
- More training is needed on SW testing tools (COTS)
- Universities need required SW test courses in the undergraduate software engineering course list (in computer science programs)
- No DAU SW engineering and SW testing courses - is it time?