Sally Godfrey
NASA Goddard Space Flight Center (GSFC)
Sara.H.Godfrey@nasa.gov
301-286-5706
SE Lecture Series 6/04 Slide 1
Agenda
• CMMI: What is it? Why use it?
• Summary
CMMI - CMM
[Diagram: process areas of the Software CMM and Software Acquisition CMM mapped onto the staged CMMI maturity levels]
Level 1 "Initial"
Level 2 "Managed": Requirements management, Project planning, Project monitoring and control, Supplier agreement management, Measurement and analysis, Product & Process Quality Assurance, Configuration management
Level 3: Organizational process definition, Organizational training, Integrated project management, Risk management, Decision analysis and resolution, Integrated Supplier Management, Integrated Teaming
Capability Maturity Model Integrated - Staged
Characteristics of the maturity levels:
Level 5 "Optimizing": focus on process improvement.
Level 4 "Quantitatively Managed": process measured and controlled.
Higher maturity levels mean lower risk and higher productivity/quality.
Maturity Levels
• CMMI addresses practices that are the framework for process improvement.
• The use of software as a technology is on a much steeper growth curve than other supporting technologies, increasing exponentially over the years.
• NASA programs and projects are likely to be experiencing the same growth curve.
• If the Agency does nothing to improve software engineering and acquisition, we can expect commensurate growth in cost, schedule, and defects.
• Uncontrolled growth of software dependencies without prudent mitigations will result in a real reduction in NASA's capability to fulfill its mission.
[Charts: Boeing Information Systems (BIS) results by CMM level, 1988-1998]
Productivity: average number of hours per service request fell from 68 (Levels 1 and 2) to 44 (Level 3).
Product quality: percent satisfaction with BIS support increased by 80% as error rates decreased.
Cost under control with rising maturity: cost variance ranged from +20% to -145% without historical data, versus +20% to -20% with historical data. Based on 120 projects in Boeing Information Systems.
Reference: Scott Griffin, Chief Information Officer, The Boeing Company, SEPG Conference, 2000.
Diaz, M. & King, J., "How CMM Impacts Quality, Productivity, Rework, and the Bottom Line", CrossTalk: The Journal of Defense Software Engineering, March 2002. General Dynamics Decision Systems, 3 divisions, 1,500 engineers / 360 software engineers; CRUD = Customer Reported Unique Defects; largest ROI found to be from Levels 2 to 3, at 167%, based on cost savings in rework.
Early Success on the NASA Software Initiative
at MSFC: Reduced Cost
[Chart: flight software source lines of code per person-month of effort for the gLIMIT, UPA/MSRR, and SXI projects, showing productivity increases of 52%, 27%, and 23%]
Software development productivity increased at Marshall Space Flight Center, the first Center to pilot the SEI's Capability Maturity Model (CMM) in association with this Initiative.
Strategy 4. Improve software engineers' knowledge and skills, and attract and
retain software engineers.
Focus of Plan - Improve the processes and practices in use at GSFC using the Capability
Maturity Model Integrated (CMMI) as a measure of progress
– GSFC Plan primarily addresses Strategy 1 in NASA Plan.
– FY04 Direction by Al Diaz: Achievement of specific CMMI goals
CMMI goal schedule by domain (FY04-FY08):
– Flight Software Branch: Level 2, then Level 3
– ISD & Code 400 Mission Software: Level 2, then Level 3
– Any Code 600/900 Mission Software not previously included: Level 2, then Level 3
Scope of Plan - All projects defined by NPG 7120.5 (Mission Software) & identified by Center
Director will participate in this initial effort
[Organization chart]
Management Oversight Group (MOG): Linda Wilbanks, lead
EPG: Sally Godfrey, lead
Institutional draft consensus process
Process Area Name                    Abbr   ML
Product Integration                  PI     3
Requirements Development             RD     3
Technical Solution                   TS     3
Validation                           VAL    3
Verification                         VER    3
Organizational Process Focus         OPF    3
Organizational Process Definition    OPD    3
Integrated Project Management        IPM    3
Risk Management                      RSKM   3
Integrated Supplier Management       ISM    3
Organizational Training              OT     3
Key (FY columns): Reviewed / Capability Level 2 / Capability Level 3
Table B-1: Process Area Schedule for Flight Software
Process Area Name                    Abbr   ML   FY04  FY05  FY06  FY07
Requirements Management              REQM   2
Measurement & Analysis               MA     2
Project Monitoring & Control         PMC    2
Project Planning                     PP     2
Process & Product Quality Assurance  PPQA   2
Supplier Agreement Management        SAM    2
Configuration Management             CM     2
Decision Analysis & Resolution       DAR    3
Product Integration                  PI     3
Requirements Development             RD     3
Technical Solution                   TS     3
Validation                           VAL    3
Verification                         VER    3
Organizational Process Focus         OPF    3
Organizational Process Definition    OPD    3
Integrated Project Management        IPM    3
Risk Management                      RSKM   3
Integrated Supplier Management       ISM    3
Organizational Training              OT     3
Key (FY columns): Reviewed / Capability Level 2 / Capability Level 3
Table C-1: Process Area Schedule for ISD/Code 400 Mission Software
GSFC Phase 2: Focus Activities
Beginning FY03
• Code 582: Flight Software:
– Documentation of existing best practices (& suggested improvements)
– Tools, checklists, and templates to support consistent use of practices (e.g., requirements inspection procedures, test plan/procedure templates)
– Training to support use of improved practices
– Identification & support for collection/analysis of measures
• Code 580: Using flight software practices as a basis, best practices will be documented for all of ISD, with associated work products & training
– Consistent approach to planning and tracking (WBS, Earned value, Risk
Management)
• Code 590: Have worked with NASA systems engineering group to pilot use of
CMMI for systems engineering appraisals (JPL was first pilot)
• Code 400: Software Acquisition improvements beginning with developing
improved RFP templates for software - Review at JPL/GSFC QMSW
workshop
• Code 300: Began improvements in Software Assurance
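The earned-value tracking mentioned for Code 580 reduces to a handful of standard EVM formulas. A minimal sketch, with invented project numbers (none of these figures are GSFC data):

```python
# Standard earned-value management (EVM) formulas; the project figures
# passed in below are made up purely for illustration.
def earned_value_metrics(pv, ev, ac):
    """pv: planned value, ev: earned value, ac: actual cost (same units)."""
    return {
        "cost_variance": ev - ac,      # CV > 0 means under budget
        "schedule_variance": ev - pv,  # SV > 0 means ahead of schedule
        "cpi": ev / ac,                # cost performance index (> 1 is good)
        "spi": ev / pv,                # schedule performance index (> 1 is good)
    }

m = earned_value_metrics(pv=100_000, ev=80_000, ac=90_000)
print(m["cpi"], m["spi"])  # both below 1.0: over budget and behind schedule
```

A project tracking to a WBS would compute these per work package and roll them up.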
Status of Tailored FSW Process Assets
[Two pie charts; total number of assets = 118. Slice categories: Not Started, Outline, Draft, Final, CCB Approved. One chart: CCB Approved 4%, Not Started 56%, Draft 17%, with Outline and Final at 15% and 8%. The other: CCB Approved 23%, Not Started 36%, Draft 28%, with Final and Outline at 8% and 5%.]
[Diagram: life-cycle phases: Systems Engineering, Requirements Engineering, Design, Implementation, Testing]
– We are working towards achievement of CMMI Level 2 in a few process areas by early FY05 and CMMI Level 3 by late FY07
– We hope to coordinate with systems engineering improvements
• CMMI provides much more detailed guidance than ISO by including an extensive set of "best practices" developed in collaboration with industry, government, and the SEI
– CMMI provides a much better measure of the quality of processes; ISO focuses more on having processes
– CMMI puts more emphasis on continuous improvement
– CMMI allows you to focus on one or a few process areas for improvement (it's a model, not a standard like ISO); you can rate just one area in CMMI
– CMMI and ISO are not in conflict: ISO helps satisfy CMMI capabilities; CMMI is more rigorous
What is CMMI? What do levels of software
engineering maturity mean?
[Table: Level, Description, Process Areas, Result]
Source: Software Engineering Institute
Time History - Productivity
[Chart: percent reduction in staff needed per system, 1992-1996, as maturity rose from Level 1 to Level 3: -12%, -26%, -38%, -62%. Reduced staff support per system = increased productivity (62% for projects at Maturity Level 3). Based on 120 projects at Boeing Information Systems.]
Reference: Scott Griffin, Chief Information Officer, The Boeing Company, SEPG Conference, 2000.
Customer Satisfaction
[Chart: based on a semi-annual survey of customers, 1992-1996 (Level 1 to Level 3). Average number of hours per service request fell from 68 to 44 (36% faster support); percent satisfaction with BIS support increased with CMM level. Based on 3 major programs in the Boeing Defense and Space Group.]
Reference: Scott Griffin, Chief Information Officer, The Boeing Company, SEPG Conference, 2000.
Time History – Cost
[Chart: % cost overrun vs. project start date (1988 and earlier through 1997), with markers where L2 and L3 processes were initiated; overruns declined after process initiation. Legend: 19 finished programs.]
Source: Software-related engineering projects completed for SAIC Aeronautical Systems Operation during 1984-1996, for all contract types and contract sizes $80K to $3.5M. Task 4, WBS 3.6.5.
Time History – Schedule
[Chart: % schedule overrun vs. project start date (1988 and earlier through 1997), with markers where L2 and L3 processes were initiated; schedule under control after process initiation. Legend: 18 finished programs.]
Source: Software-related engineering projects completed for SAIC Aeronautical Systems Operation during 1984-1996, for all contract types and contract sizes $80K to $3.5M. Task 4, WBS 3.6.5.
Even Successful Missions
experience software problems
“A few days after the July 4th, 1997 landing, the Mars Pathfinder began experiencing
total system resets, each resulting in losses of data. The problem was a logical error
in the real-time scheduling system---a classic priority-inversion problem.
Fortunately, this problem was repairable from earth.
A malfunction in one of the on-board computers on Clementine on May 7, 1994 caused
a thruster to fire until it had used up all of its fuel, leaving the spacecraft spinning at
about 80 RPM with no spin control. This made the planned continuation of the
mission, a flyby of the near-Earth asteroid Geographos, impossible.
The Magellan spacecraft broke Earth lock and lost communications several times in
August 1990 (soon after entering Venus orbit). It took over six months to identify
the source of the problem, which was a timing error in the flight software.”
• Aug 27, 1998 failure of the Boeing Delta 3 launch vehicle (the control system attempted to correct a roll oscillation and depleted the hydraulic fluid used to move the thrust-vector-control (TVC) nozzles on the solid-rocket motors)
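The Pathfinder reset is the textbook argument for priority inheritance in real-time scheduling. As an illustration only (a toy tick-based scheduler, not the actual VxWorks flight code), the sketch below shows a high-priority task stuck behind an unrelated medium-priority task until the lock holder is allowed to inherit the waiter's priority:

```python
# Toy deterministic scheduler demonstrating priority inversion. Each tick,
# the highest-priority runnable task gets the CPU; a task that needs the
# shared lock blocks while another task holds it. With priority inheritance,
# the lock holder temporarily runs at the priority of its highest waiter.
def run(tasks, release, inheritance):
    """tasks: (name, priority, work_ticks, needs_lock). Returns finish ticks."""
    remaining = {n: w for n, p, w, l in tasks}
    prio = {n: p for n, p, w, l in tasks}
    needs = {n: l for n, p, w, l in tasks}
    holder, done, t = None, {}, 0
    while len(done) < len(tasks):
        arrived = [n for n, p, w, l in tasks if release[n] <= t and n not in done]
        runnable = [n for n in arrived
                    if not (needs[n] and holder is not None and holder != n)]
        if runnable:
            eff = dict(prio)
            if inheritance and holder in runnable:
                waiters = [n for n in arrived if needs[n] and n != holder]
                if waiters:  # boost the holder to the best waiter's priority
                    eff[holder] = max(eff[holder], max(prio[w] for w in waiters))
            cur = max(runnable, key=lambda n: eff[n])
            if needs[cur] and holder is None:
                holder = cur                 # acquire the lock
            remaining[cur] -= 1
            if remaining[cur] == 0:
                if holder == cur:
                    holder = None            # release the lock
                done[cur] = t + 1
        t += 1
    return done

# low holds the lock first; high needs it; med needs no lock but preempts low.
tasks = [("low", 1, 2, True), ("med", 2, 3, False), ("high", 3, 1, True)]
release = {"low": 0, "med": 1, "high": 1}
print(run(tasks, release, inheritance=False))  # high finishes at tick 6
print(run(tasks, release, inheritance=True))   # high finishes at tick 3
```

Without inheritance, "med" runs to completion while "low" still holds the lock, so "high" is delayed behind work of lower priority, which is exactly the inversion pattern described above. The JPL fix enabled the mutex's priority-inheritance option.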