
Using CMMI for Improvement at GSFC

Systems Engineering Lecture Series
6/01/04

Sally Godfrey
Sara.H.Godfrey@nasa.gov
301-286-5706

SE Lecture Series 6/04 Slide 1
Agenda

• CMMI: What is it? Why use it?

• NASA Improvement Initiatives
  – Systems Engineering & CMMI
  – Software Engineering & CMMI

• GSFC's Use of CMMI for Software
  – Phase 1: Piloting
    What we learned during piloting (FY02)
  – Phase 2: Implementation
    Approach for implementing improvement (CMMI)
    Progress to date

• Summary

SE Lecture Series 6/04 Slide 2


What is CMMI?

SE Lecture Series 6/04 Slide 3


What is CMMI?

The Capability Maturity Model Integrated (CMMI) is a framework for maturity models and associated products that integrates the two key disciplines that are inseparable in a systems development activity: systems engineering and software engineering.

CMMI is:
• A common-sense application of process management and quality improvement concepts to product development, maintenance, and acquisition
• A set of best practices
• A community-developed guide
• A model for organizational improvement

SE Lecture Series 6/04 Slide 4


Capability Maturity Model Integrated (CMMI) - Staged

CMMI combines its source models: the Software CMM, the Systems Engineering CMM, and the Software Acquisition CMM.

Level 5, Optimizing:
  Organizational innovation and deployment
  Causal analysis and resolution

Level 4, Quantitatively Managed:
  Organizational process performance
  Quantitative project management

Level 3, Defined:
  Requirements development
  Technical solution
  Product integration
  Verification
  Validation
  Organizational process focus
  Organizational process definition
  Organizational training
  Integrated project management
  Risk management
  Decision analysis and resolution
  Integrated supplier management
  Integrated teaming

Level 2, Managed:
  Requirements management
  Project planning
  Project monitoring and control
  Configuration management
  Supplier agreement management
  Measurement and analysis
  Product & process quality assurance

Level 1, Initial:
  (no process areas)
SE Lecture Series 6/04 Slide 5
Capability Maturity Model Integrated - Staged

Characteristics of the maturity levels (lower risk and higher productivity/quality at the top; higher risk and lower productivity/quality at the bottom):

Level 5 "Optimizing": Focus on process improvement.
Level 4 "Quantitatively Managed": Process measured and controlled.
Level 3 "Defined": Process characterized for the organization and is proactive. (Projects tailor their process from the organization's standard.)
Level 2 "Managed": Process characterized for projects and is often reactive.
Level 1 "Initial": Processes unpredictable, poorly controlled, and reactive.

CMM was developed by the Software Engineering Institute (SEI), Carnegie Mellon University (CMU)

SE Lecture Series 6/04 Slide 6


Components of CMMI Model

• Maturity Levels contain Process Areas (Process Area 1, Process Area 2, Process Area 3, …)
• Each Process Area contains Specific Goals and Generic Goals
• Specific Goals are achieved through Specific Practices
• Generic Goals are organized into Common Features: Commitment to Perform, Ability to Perform, Directing Implementation, and Verifying Implementation
• The Common Features contain the Generic Practices

SE Lecture Series 6/04 Slide 7


Example Process Area:
Requirements Management

SG 1: Manage Requirements
  SP 1.1: Obtain an Understanding of the Requirements
  SP 1.2: Obtain Commitment to the Requirements
  SP 1.3: Manage Requirements Changes
  SP 1.4: Maintain Bi-directional Traceability of Requirements (see the traceability sketch after Slide 9)
  SP 1.5: Identify Inconsistencies between Project Work & Requirements

GG 2: Institutionalize a Managed Process
  GP 2.1: Establish an Organizational Policy
  GP 2.2: Plan the Process
  GP 2.3: Provide Resources
  GP 2.4: Assign Responsibility
SE Lecture Series 6/04 Slide 8
Example Process Area:
Requirements Management

GG 2: Institutionalize a Managed Process (continued)
  GP 2.5: Train People
  GP 2.6: Manage Configurations
  GP 2.7: Identify & Involve Relevant Stakeholders
  GP 2.8: Monitor and Control the Process
  GP 2.9: Objectively Evaluate Adherence
  GP 2.10: Review Status with Higher Level Management

GG 3: Institutionalize a Defined Process
  GP 3.1: Establish a Defined Process
  GP 3.2: Collect Improvement Information

SE Lecture Series 6/04 Slide 9
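As a concrete illustration of SP 1.4 above, the sketch below checks bi-directional traceability: every requirement should have at least one verifying test, and every test should trace back to a known requirement. This is a minimal, hypothetical Python example; the record layout, IDs, and data are invented for illustration and are not GSFC tooling:

    # Hypothetical traceability data: each test lists the requirements it verifies.
    requirements = {"REQ-1": "Downlink rate >= 1 Mbps",
                    "REQ-2": "Reboot completes in under 30 seconds"}
    tests = {"TC-10": ["REQ-1"],
             "TC-11": []}   # a test that traces to nothing

    # Forward trace: every requirement needs at least one verifying test.
    covered = {r for linked in tests.values() for r in linked}
    for req_id in requirements:
        if req_id not in covered:
            print(f"Gap: {req_id} has no verifying test")

    # Backward trace: every test must trace to known requirements.
    for tc_id, linked in tests.items():
        if not linked:
            print(f"Orphan: {tc_id} traces to no requirement")
        for r in linked:
            if r not in requirements:
                print(f"Dangling link: {tc_id} -> unknown requirement {r}")

Running it reports REQ-2 as uncovered and TC-11 as an orphan, the two inconsistencies SP 1.5 would then have the project resolve.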


Why are we using CMMI?

SE Lecture Series 6/04 Slide 10


Why Use CMMI?

• In software and systems engineering, it is a benchmarking tool widely used by industry and government, both in the US and abroad.
• CMMI acts as a roadmap for process improvement activities.
• It provides criteria for reviews and appraisals.
• It provides a reference point to establish the present state of processes.
• CMMI addresses practices that are the framework for process improvement.
• CMMI is not prescriptive; it does not tell an organization how to improve.

SE Lecture Series 6/04 Slide 11


Growth Trend Problem: Dependency on Software Technology

Indicator: Industry has reported that the amount of software on passenger aircraft is increasing exponentially.

• NASA programs and projects are likely to be experiencing the same growth curve.
• The use of software as a technology is on a much steeper growth curve than other supporting technologies.
• If the Agency does nothing to improve software engineering and acquisition, we can expect commensurate growth in cost, schedule, and defects.
• Uncontrolled growth of software dependencies without prudent mitigations will result in a real reduction in NASA's capability to fulfill its mission.

[Chart: increasing amount of project software plotted against years]

SE Lecture Series 6/04 Slide 12


Improvements with CMM: Time History - Productivity/Error Rates

[Chart: productivity rate and quality performance for software programs, 1988-1998. As maturity rose from Level 2 through Level 4, the error rate (hours per KLOC) decreased and the productivity rate (SLOC per person-day) increased by 80%.]

Source: Lockheed Martin SEPG Presentation 1999

SE Lecture Series 6/04 Slide 13
Improvements with CMM: Time History - Cost

[Chart: project cost estimates, labor hours over- or under-estimated (+140% to -140%), 1992-1996. At Levels 1 and 2, without historical data, estimate variance ranged from +20% to -145%; at Level 3, with historical data, variance narrowed to +20% to -20%, with cost under control as maturity rose. Based on 120 projects in Boeing Information Systems. A worked variance calculation follows this slide.]

Reference: Scott Griffin, Chief Information Officer, The Boeing Company, SEPG Conference, 2000.

SE Lecture Series 6/04 Slide 14
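A minimal sketch of the metric behind the chart above: labor-hour estimate variance, where a positive value is an overrun and a negative value an under-run. The project numbers are invented; only the formula reflects the slide:

    def estimate_variance(actual_hours: float, estimated_hours: float) -> float:
        """Percent by which actual labor hours exceed (+) or fall below (-) the estimate."""
        return (actual_hours - estimated_hours) / estimated_hours * 100.0

    # Hypothetical projects: without historical data, estimates scatter widely...
    print(estimate_variance(actual_hours=2450, estimated_hours=1000))   # 145.0 (large overrun)
    # ...with historical data, variance tightens toward +/-20%.
    print(estimate_variance(actual_hours=1150, estimated_hours=1000))   # 15.0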
Project Performance vs. CMM Level (General Dynamics)

CMM Level | Percent Rework | Phase Containment Effectiveness | CRUD Density per KSLOC | Productivity (Relative)
    2     |     23.2%      |             25.5%               |          3.20          |     1x
    3     |     14.3%      |             41.5%               |          0.90          |     2x
    4     |      9.5%      |             62.3%               |          0.22          |    1.9x
    5     |      6.8%      |             87.3%               |          0.19          |    2.9x

Diaz, M. & King, J., "How CMM Impacts Quality, Productivity, Rework, and the Bottom Line", CrossTalk: The Journal of Defense Software Engineering, March 2002. General Dynamics Decision Systems, 3 divisions, 1,500 engineers / 360 SW engineers. CRUD = Customer Reported Unique Defects. The largest ROI was found to be from Levels 2 to 3, at 167%, based on cost savings in rework. (A worked rework-savings calculation follows this slide.)

SE Lecture Series 6/04 Slide 15
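As a worked example of what the rework column implies, the sketch below applies the table's rework percentages to a hypothetical pool of effort hours. The project size is an assumption for illustration only; the fractions come from the table above:

    # Rework fractions from the General Dynamics table above.
    rework_fraction = {2: 0.232, 3: 0.143, 4: 0.095, 5: 0.068}
    effort_hours = 100_000   # assumed total project effort (hypothetical)

    for level, frac in sorted(rework_fraction.items()):
        print(f"CMM Level {level}: {frac * effort_hours:,.0f} hours spent on rework")

    # Hours recovered by moving from Level 2 to Level 3:
    saved = (rework_fraction[2] - rework_fraction[3]) * effort_hours
    print(f"Level 2 -> 3 recovers {saved:,.0f} hours of rework")   # 8,900 hours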
Early Success on the NASA Software Initiative at MSFC: Reduced Cost

[Bar chart: flight software source lines of code per person-month of effort for the gLIMIT, UPA, MSRR, and SXI projects, showing productivity increases of 52%, 27%, and 23% as teams moved from CMM Level 1 to CMM Level 2.]

Software development productivity increased at Marshall Space Flight Center, the first Center to pilot SEI's Capability Maturity Model (CMM) in association with this Initiative.

SE Lecture Series 6/04 Slide 16


NASA Improvement Initiatives

SE Lecture Series 6/04 Slide 17


NASA Systems Engineering Initiative

Directed by NASA Chief Engineer:
"…the Software Engineering Working Group is expected to…define and pilot a methodology for assessment of the systems engineering capability, which addresses knowledge and skill of the workforce, processes, and tools and methodology."
Deputy Chief Engineer for Systems Engineering (Nov. 1, 2000)

Studied by NASA Systems Engineering Working Group (SEWG)
– Different assessment methods were evaluated by the SEWG to determine the best methodology for benchmarking/improving Systems Engineering implementation agency-wide.
– Initial "quick-look" at systems engineering at GSFC using CMMI in 2002

CMMI Pilot Appraisal at JPL in April 2004
– Did the CMMI appraisal provide a good benchmark of systems engineering capability?
– Was the level of formality of the CMMI appraisal used suitable for use at all Centers?

SE Lecture Series 6/04 Slide 18


NASA Software Engineering Initiative

Goal: Advance software engineering practices (development, assurance, and management) to effectively meet the scientific and technological objectives of NASA.

Strategy 1. Implement a continuous software process and product improvement program across NASA and its contract community.

Strategy 2. Improve safety, reliability, and quality of software through the integration of sound software engineering principles and standards.

Strategy 3. Improve NASA's software engineering practices through research.

Strategy 4. Improve software engineers' knowledge and skills, and attract and retain software engineers.

SE Lecture Series 6/04 Slide 19


GSFC Software
Process Improvement

SE Lecture Series 6/04 Slide 20


GSFC Software Process Improvement Plan

GSFC has a Software Process Improvement Plan, signed by Al Diaz, 9/01.

Focus of Plan: Improve the processes and practices in use at GSFC, using the Capability Maturity Model Integrated (CMMI) as a measure of progress.
– GSFC Plan primarily addresses Strategy 1 in the NASA Plan.
– FY04 direction by Al Diaz: achievement of specific CMMI goals, phased by domain across FY04-FY08, Flight Software Branch first:

Domain                                                       CMMI Goals
Flight Software Branch                                       Level 2, then Level 3
ISD & Code 400 Mission Software                              Level 2, then Level 3
Any Code 600/900 Mission Software not previously included    Level 2, then Level 3

Scope of Plan: All projects defined by NPG 7120.5 (Mission Software) & identified by the Center Director will participate in this initial effort.

SE Lecture Series 6/04 Slide 21


Infrastructure

• Management Oversight Group (MOG): Linda Wilbanks, Lead
• Engineering Process Group (EPG): Sally Godfrey, Lead; supports the projects
• Asset Management Group (AMG): manages process assets and metrics

Process flow: the EPG drafts a process, builds institutional consensus with feedback, and issues the defined process, under MOG oversight.

SE Lecture Series 6/04 Slide 22


Implementation Phases in GSFC's Improvement Plan

Phase 1: Pilot Phase (FY02)
– Benchmark several representative GSFC areas
– Estimate effort and cost to improve identified gaps
– Evaluate implementation approach

Phase 2: Implementation Phase (FY03-FY08)
– Implementation of process improvement on all critical projects
– Begin by working with new projects to field improvements
– Target CMMI Level 3 for Mission Software

Phase 3: Maintain Level and Continue Improvement
– Include other areas? (e.g., science processing)

Timeline: Phase 1 (FY02) → Phase 2 (FY03-FY08) → Phase 3

SE Lecture Series 6/04 Slide 23
GSFC Phase 1: Piloting (FY02)

• Conducted 3 sets of CMMI pre-appraisals
  – Appraisals were quick-look, Class B and C appraisals
  – Purpose of appraisals:
    • Evaluate use of CMMI; get a better estimate of effort/cost
    • Get a benchmark against the CMMI model; identify gaps
• Sets of projects for pre-appraisals:
  – 2 flight software in-house led teams (included contractors)
  – 3 spacecraft projects (2 largely contracted, 1 in-house)
  – 2 ground support software in-house led teams
• The CMMI appraisals identified a number of gaps that had also been identified independently
  – Actions from the Code S/Y Colloquium produced a similar list
  – Plans for Phase 2 were based on findings from Phase 1

SE Lecture Series 6/04 Slide 24


What is broken (gaps) in the Agency's software engineering capability?

Centers are almost universally weak in:
• Project planning: estimating cost, schedule, and resource requirements for project requirements fulfilled by software
• Monitoring and control of software engineering products, i.e., tracking progress and taking effective corrective actions
• Configuration management, which is not universally applied throughout the software development process
• The interface between software and systems engineering processes, which is not well defined, so agreements, audits, and reviews are not well planned or performed to achieve the most benefit
• Software quality assurance, which is generally not well understood, nor is its value appreciated

Findings by Raymond Kile, Authorized Lead Evaluator, Center for Systems Management, Sept 2002

GSFC's gaps were similar to findings across the Agency.

SE Lecture Series 6/04 Slide 25


GSFC Phase 2: Strategies (FY03-FY08)

• Use the CMMI SE/SW/SS continuous model: early implementation of the process areas that benefit us most
• Initial focus on software improvement (the NASA Systems Engineering Working Group is still determining direction)
• The first software area will be in-house flight software, then ISD/Code 400
• Acquisition improvement activities begin in mid-FY04, with a gradual phase-in
• Assets will be developed "top-down/bottom-up"
  – Top-down: define the high-level structure of documentation and training
  – Bottom-up: develop low-level products for deployment; use FSW best practices to help develop the high-level process
• Phase in improvements on newer projects; products developed as projects need them
• Project Plan updated for new CMMI goals (in signature cycle)

SE Lecture Series 6/04 Slide 26


Table B-1: Process Area Schedule for Flight Software (FY04-FY08)

Process Area Name                      Abbr   ML
Requirements Management                REQM   2
Measurement & Analysis                 MA     2
Project Monitoring & Control           PMC    2
Project Planning                       PP     2
Process & Product Quality Assurance    PPQA   2
Supplier Agreement Management          SAM    2
Configuration Management               CM     2
Decision Analysis & Resolution         DAR    3
Product Integration                    PI     3
Requirements Development               RD     3
Technical Solution                     TS     3
Validation                             VAL    3
Verification                           VER    3
Organizational Process Focus           OPF    3
Organizational Process Definition      OPD    3
Integrated Project Management          IPM    3
Risk Management                        RSKM   3
Integrated Supplier Management         ISM    3
Organizational Training                OT     3

Key to the fiscal-year schedule markings: Reviewed; Capability Level 2; Capability Level 3
Table C-1: Process Area Schedule for ISD/Code 400 Mission Software (FY04-FY07)

Process Area Name                      Abbr   ML
Requirements Management                REQM   2
Measurement & Analysis                 MA     2
Project Monitoring & Control           PMC    2
Project Planning                       PP     2
Process & Product Quality Assurance    PPQA   2
Supplier Agreement Management          SAM    2
Configuration Management               CM     2
Decision Analysis & Resolution         DAR    3
Product Integration                    PI     3
Requirements Development               RD     3
Technical Solution                     TS     3
Validation                             VAL    3
Verification                           VER    3
Organizational Process Focus           OPF    3
Organizational Process Definition      OPD    3
Integrated Project Management          IPM    3
Risk Management                        RSKM   3
Integrated Supplier Management         ISM    3
Organizational Training                OT     3

Key to the fiscal-year schedule markings: Reviewed; Capability Level 2; Capability Level 3
GSFC Phase 2: Focus Activities Beginning FY03

• Code 582, Flight Software:
  – Documentation of existing best practices (& suggested improvements)
  – Tools, checklists, and templates to support consistent use of practices (e.g., requirements inspection procedures, test plan/procedure templates)
  – Training to support use of improved practices
  – Identification of & support for collection/analysis of measures
• Code 580: Using flight software practices as a basis, best practices will be documented for all of ISD, with associated work products & training
  – Consistent approach to planning and tracking (WBS, earned value, risk management; see the earned-value sketch after this slide)
• Code 590: Has worked with the NASA systems engineering group to pilot the use of CMMI for systems engineering appraisals (JPL was the first pilot)
• Code 400: Software acquisition improvements, beginning with developing improved RFP templates for software; review at the JPL/GSFC QMSW workshop
• Code 300: Began improvements in software assurance

SE Lecture Series 6/04 Slide 29
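The earned-value tracking mentioned for Code 580 above boils down to two standard indices. A minimal sketch, assuming the textbook EVM formulas (the dollar figures are hypothetical, not from any GSFC project):

    # Standard earned-value inputs (hypothetical numbers).
    pv = 500_000   # Planned Value: budgeted cost of work scheduled to date
    ev = 450_000   # Earned Value: budgeted cost of work actually performed
    ac = 520_000   # Actual Cost of the work performed

    cpi = ev / ac  # Cost Performance Index: < 1.0 means over cost
    spi = ev / pv  # Schedule Performance Index: < 1.0 means behind schedule
    print(f"CPI = {cpi:.2f}, SPI = {spi:.2f}")   # CPI = 0.87, SPI = 0.90

Here the project is trending over cost and behind schedule, the kind of signal that project monitoring & control (PMC) uses to trigger corrective action.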


Summary: Process Documentation Development Progress (FSW & ISD) as of April 13, 2004

Status of ISD Process Assets (115 assets total):
  CCB Approved 4%, Final 15%, Draft 17%, Outline 8%, Not Started 56%

Status of Tailored FSW Process Assets (118 assets total):
  CCB Approved 23%, Final 8%, Draft 28%, Outline 5%, Not Started 36%

SE Lecture Series 6/04 Slide 30


Overall Concepts: Documentation

• There will be a "generic" set of procedures/processes for ISD/GSFC
• The "generic set" will be tailored for Branches (FSW) or classes of software (e.g., ground systems, science processing, research…), using the Tailoring Guidelines
• Projects can also tailor, based on the tailoring guidelines (a layered-override sketch follows this slide)
• ISD/GSFC documentation will be on the EPG web site
  – Branch-tailored documentation can be on Branch web sites
  – Web sites will include use-aids: checklists, templates
• Training and tools will be available with the processes

Tailoring hierarchy: Organization → Branch/Class → Project

SE Lecture Series 6/04 Slide 31
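A minimal sketch of the tailoring hierarchy above, modeled as layered overrides: the organization supplies defaults, a branch or software class overrides some of them, and a project overrides further. All keys and values here are hypothetical illustrations, not actual GSFC process content:

    # Each layer overrides the one before it, per the tailoring guidelines.
    org_process = {"life_cycle": "GSFC standard",
                   "inspections": "required",
                   "estimation_database": "required"}
    branch_tailoring = {"life_cycle": "FSW standard life cycle"}          # Branch/class layer
    project_tailoring = {"inspections": "required for flight code only"}  # Project layer

    effective_process = {**org_process, **branch_tailoring, **project_tailoring}
    print(effective_process)
    # {'life_cycle': 'FSW standard life cycle',
    #  'inspections': 'required for flight code only',
    #  'estimation_database': 'required'}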


Process Documentation Structure: Top-Down View

Documentation is divided into three process categories: Project Management Processes, Product Development Processes, and Organizational Support Processes.

Each category is structured top-down (examples from Project Management at each level):
• Processes: Project Formulation, Project Planning, Project Start-up, Project Monitoring & Control, Project Closeout
• Sub-Processes: Software Estimation, Risk Management, Cost Tracking
• Procedures, Templates & Tools: guidelines for selecting a life cycle; Software Estimates/Actuals Database; Risk Mgmt. Plan Template
• Tailored Versions: FSW Standard Life Cycle; FSW Risk Management Procedure

SE Lecture Series 6/04 Slide 32


Description of Processes to be Documented

• Project Management: Project Formulation, Project Planning, Project Start-Up, Project Monitoring & Control, Project Closeout

• Product Development: Systems Engineering, Requirements Engineering, Design, Implementation, Testing, Product Release, Sustaining Engineering & Maintenance

• Organizational Support: Configuration Management, Quality Assurance, Measurement & Analysis, Process Engineering, Training
GSFC Process Assets Library (PAL)

The GSFC Software Process Improvement web site includes sections for the Process Assets Library (PAL), Training, Measurement, and Lessons Learned.

Welcome to the GSFC Process Assets Library

The Process Assets Library (PAL) is the repository for all process assets that have been approved for software development at GSFC. Assets include policy, procedures, process descriptions, document templates, guidelines, standards, checklists, and tools.

The initial set of assets has been developed for ISD, but will ultimately be augmented to serve all GSFC projects.

PAL assets may be accessed in multiple ways. The following table shows how these access routes, or "views", can help you find the assets you need (a small filtering sketch follows this slide):

View         What the view provides
Contents     A table of contents for the PAL
Index        An alphabetical index into the PAL
Role         A list of the roles of personnel working on a typical software project, showing the process assets needed by each role and training courses for each role
Tailored     A set of process assets that have been created or "tailored" for use on a specific project or in a specific domain
Description  High-level descriptions of the 3 asset categories & the processes they contain
Asset Type   A set of all assets of the same type; e.g., all "templates" or all "checklists"
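To make the "views" idea concrete, here is a small hypothetical sketch: one asset collection served through two different views. The asset records and field names are invented, not the PAL's actual data model:

    # A tiny stand-in for the PAL: the same assets, filtered different ways.
    assets = [
        {"name": "Risk Mgmt. Plan Template", "type": "template",
         "roles": ["project manager"]},
        {"name": "Code Inspection Checklist", "type": "checklist",
         "roles": ["developer", "inspector"]},
    ]

    def assets_by_role(role):        # the "Role" view
        return [a["name"] for a in assets if role in a["roles"]]

    def assets_by_type(asset_type):  # the "Asset Type" view
        return [a["name"] for a in assets if a["type"] == asset_type]

    print(assets_by_role("developer"))   # ['Code Inspection Checklist']
    print(assets_by_type("template"))    # ['Risk Mgmt. Plan Template']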
Features of Software Training Web Page

Training Page Includes:
• Training Program Information
  – Software Classes Calendar & GSFC Training Calendar
  – Role-Based Training Matrix
  – On-line Training (self-paced, presentations, etc.)
  – Software Certification Information
• Software Conference Information
• "Ask an Expert" Feature
• Training Support Page
  – Help in Developing a Class (can request a new class)
  – Mentoring Information
  – How to Schedule a Class; Feedback on Classes
• Other Training Links

SE Lecture Series 6/04 Slide 37


Other Features of Software Web Site

Lessons Learned web page features:
• "Submit a Lesson"
• Software-specific Lessons Learned Library, with views by roles, categories, and phases
• Subscribe/Unsubscribe features
• Lessons Learned feedback
• Link to "Experts"
• Question and Answer Forum

Measurement Repository web site features:
• On-line submission of measures
• Access to the Measurement Database (for authorized users)
• Measurement analysis and charts
• Guidance in establishing measurement programs (a sample defect-density calculation follows this slide)

SE Lecture Series 6/04 Slide 38
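As an example of the kind of measure such a repository might hold, the sketch below computes defect density per KSLOC from submitted records. The record fields, project names, and numbers are hypothetical:

    # Hypothetical submitted measures: defects found and size in KSLOC.
    measures = [
        {"project": "FSW-A", "defects": 12, "ksloc": 40.0},
        {"project": "FSW-B", "defects": 3,  "ksloc": 25.0},
    ]

    for m in measures:
        density = m["defects"] / m["ksloc"]   # defects per thousand source lines
        print(f"{m['project']}: {density:.2f} defects/KSLOC")
    # FSW-A: 0.30 defects/KSLOC
    # FSW-B: 0.12 defects/KSLOC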


Software Training Associated with Process Improvement

Audience                     Focus                          Approach
Community/others interested  General awareness              Overview info on CMMI and the improvement initiative; lectures, teas, overview classes
Developers/team leads        ISD/GSFC-specific practices    Role-based approach; train on documented procedures, guidelines, templates
Developers/team leads        Discipline expertise           Focus on general skills; university classes, 3rd-party classes, teas, conferences
Software customers           Products, software/customer interface   Emphasis on products delivered & needs for producing products; use of products

SE Lecture Series 6/04 Slide 39


Progress Highlights in FY03/FY04

• Flight Software:
  – FSW "Standards" CCB; 27 products baselined and available
  – Developing products "in time" to meet project needs
  – Products in use on all new FSW projects, and some existing ones
• ISD/Code 400:
  – ISD CCB for processes in place; 7 products baselined and available
  – Developed templates for the software parts of RFPs
  – Developed a class to help project managers manage software
  – Sponsored classes in inspections, software configuration management, software safety, software acquisition, and quantitative project management
• Code 300:
  – Developed processes and checklists
  – Training for better software assurance

SE Lecture Series 6/04 Slide 40


Plans for FY04/FY05

• First pre-appraisal in mid-August on Flight Software; plan to look at (gap analysis):
  – Project Planning
  – Project Monitoring & Control
  – Requirements Management
  – Requirements Development
  – Configuration Management
  – Software Assurance
  – Risk Management
  – Organizational Process Focus
• Target SCAMPI (formal appraisal) in October for a few process areas
• Rest of the Level 2 processes for FSW in FY05, plus some of the Level 3 processes
• Will phase in Level 2 processes for ISD ASAP; target a Capability Level 2 appraisal in FY05

SE Lecture Series 6/04 Slide 41


Summary

• GSFC is moving forward to improve our software processes and products, using CMMI as an improvement model
  – Phase 1 identified many potential areas for improvement
  – Phase 2 has started work in a variety of areas and is beginning to deploy software improvements
  – We are working toward achievement of CMMI Level 2 in a few process areas by early FY05 and CMMI Level 3 by late FY07
  – We hope to coordinate with systems engineering improvements

"Better Software/Systems Engineering to Support Our Projects"

SE Lecture Series 6/04 Slide 42


Back-up Slides

SE Lecture Series 6/04 Slide 43


What Now?

• GSFC Software Improvement Site: http://software.gsfc.nasa.gov

• For CMMI model reference, go to: http://www.sei.cmu.edu/cmmi/products/models.html
  – You can download CMMI-SE/SW(IPPD)/SS V1.1, Staged

• Attend a CMMI Overview class or an Introduction to CMMI class for more details

• What you really need to know is what processes you should be using to do your job well:
  – Define and use a good process
  – Measure against the CMMI model
  – Improve your process

SE Lecture Series 6/04 Slide 44


CMMI and ISO

• ISO is a standard; CMMI is a model
• ISO is broad, focusing on more aspects of the business; it was initially for manufacturing
• CMMI is "deep": it provides more in-depth guidance in more focused areas (software/systems engineering/software acquisition: SW/SE/SA)
• Both tell you "what" to do, but not "how" to do it
• CMMI tells you what the "expected" practices are for a capable, mature organization
• CMMI provides much more detail for guidance than ISO by including an extensive set of "best practices", developed in collaboration with industry, government, and the SEI
  – CMMI provides a much better measure of the quality of processes; ISO focuses more on having processes
  – CMMI puts more emphasis on continuous improvement
  – CMMI allows you to focus on one or a few process areas for improvement (it's a model, not a standard like ISO); you can rate just one area in CMMI
  – CMMI and ISO are not in conflict: ISO helps satisfy CMMI capabilities; CMMI is more rigorous

SE Lecture Series 6/04 Slide 45
What is CMMI? What do levels of software engineering maturity mean?

Level 5, Optimizing
  Description: Improvement institutionalized; results routinely fed back into the process
  Process Areas: Causal Analysis & Resolution; Organizational Innovation & Deployment
  Result: High productivity & quality

Level 4, Quantitatively Managed
  Description: Product and process are quantitatively controlled
  Process Areas: Organizational Process Performance; Quantitative Project Management

Level 3, Defined
  Description: Software engineering and management processes defined and integrated; processes standardized
  Process Areas: Organizational Process Focus; Organizational Process Definition; Organizational Training; Integrated Project Management; Technical Solution; Product Integration; Integrated Supplier Management; Verification; Validation; Risk Management; Decision Analysis & Resolution

Level 2, Managed
  Description: Basic project management in place; performance is repeatable
  Process Areas: Requirements Management; Project Planning; Project Monitoring and Control; Supplier Agreement Management; Process & Product Quality Assurance; Configuration Management; Measurement & Analysis

Level 1, Initial
  Description: Ad hoc; processes are informal and unpredictable
  Result: High risk

Source: Software Engineering Institute

SE Lecture Series 6/04 Slide 46
Time History - Productivity

[Chart: percent reduction in staff needed per system, 1992-1996, as maturity rose from Level 1 to Level 3: -12% (1993), -26% (1994), -38% (1995), -62% (1996). Reduced staff support per system = increased productivity; projects at Maturity Level 3 increased productivity by 62%. Based on 120 projects at Boeing Information Systems.]

Reference: Scott Griffin, Chief Information Officer, The Boeing Company, SEPG Conference, 2000.

SE Lecture Series 6/04 Slide 47
Time History - Satisfaction

[Chart: customer satisfaction, based on a semi-annual survey of customers, 1992-1996, as maturity rose from Level 1 to Level 3. The average number of hours per service request dropped from 68 to 44 (36% faster support), and percent satisfaction with BIS support increased with CMM level. Based on 3 major programs in the Boeing Defense and Space Group.]

Reference: Scott Griffin, Chief Information Officer, The Boeing Company, SEPG Conference, 2000.

SE Lecture Series 6/04 Slide 48
Time History - Cost

[Chart: percent cost overrun vs. project start date, from 1988 and earlier through 1997, for 19 finished programs. Overruns declined after Level 2 processes were initiated and assessed, and costs came under control after Level 3 processes were initiated and assessed.]

Source: Software-related engineering projects completed for SAIC Aeronautical Systems Operation during 1984-1996, for all contract types and contract sizes from $80K to $3.5M.

SE Lecture Series 6/04 Slide 49
Time History - Schedule

[Chart: percent schedule overrun vs. project start date, from 1988 and earlier through 1997, for 18 finished programs. Overruns declined after Level 2 processes were initiated and assessed, and schedules came under control after Level 3 processes were initiated and assessed.]

Source: Software-related engineering projects completed for SAIC Aeronautical Systems Operation during 1984-1996, for all contract types and contract sizes from $80K to $3.5M.

SE Lecture Series 6/04 Slide 50
Even Successful Missions
experience software problems
“A few days after the July 4th, 1997 landing, the Mars Pathfinder began experiencing
total system resets, each resulting in losses of data. The problem was a logical error
in the real-time scheduling system---a classic priority-inversion problem.
Fortunately, this problem was repairable from earth.
A malfunction in one of the on-board computers on Clementine on May 7, 1994 caused
a thruster to fire until it had used up all of its fuel, leaving the spacecraft spinning at
about 80 RPM with no spin control. This made the planned continuation of the
mission, a flyby of the near-Earth asteroid Geographos, impossible.
The Magellan spacecraft broke Earth lock and lost communications several times in
August 1990 (soon after entering Venus orbit). It took over six months to identify
the source of the problem, which was a timing error in the flight software.”

- Ricky Butler, NASA Langley’s Formal Methods Research Program Overview


SE Lecture Series 6/04 Slide 51
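The Pathfinder anecdote above is the classic priority-inversion pattern. The toy Python simulation below reproduces its shape, with sleeps standing in for real-time scheduling. It is purely illustrative (Python threads have no priorities, and this is not the flight code); the priority-inheritance fix JPL applied would temporarily raise the low task's priority so the medium task could no longer preempt it:

    import threading, time

    mutex = threading.Lock()   # stands in for the shared information bus
    log = []

    def low_priority():        # meteorological data task
        with mutex:
            log.append("low: acquired the bus mutex")
            time.sleep(0.3)    # long medium-priority work preempts it here
        log.append("low: released the bus mutex")

    def medium_priority():     # communications task; never needs the mutex
        time.sleep(0.1)
        log.append("medium: running, keeping the low task off the CPU")

    def high_priority():       # bus-management task with a hard deadline
        time.sleep(0.15)
        log.append("high: blocked waiting for the bus mutex")
        with mutex:
            log.append("high: finally got the mutex (deadline missed -> reset)")

    threads = [threading.Thread(target=f)
               for f in (low_priority, medium_priority, high_priority)]
    for t in threads: t.start()
    for t in threads: t.join()
    print("\n".join(log))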
Launch Failures Caused by Design Errors

• The April 30, 1999 loss of a Titan IV, which cost the taxpayers $1.23 billion, was due to incorrect software (an incorrectly entered roll rate filter constant)

• The Aug 27, 1998 failure of the Boeing Delta 3 launch vehicle (the control system attempted to correct a roll oscillation, and the hydraulic fluid used to move the nozzles on the solid-rocket motors with TVCs was depleted)

• On 4 June 1996, the maiden flight of the Ariane 5 launcher exploded (a software exception was raised during a data conversion)

"Three successive Titan IV mission failures, an Athena failure and two straight mission losses of the large new commercial Delta III, including its latest mishap May 4, mark the worst string of major U.S. launch accidents in 13 years."

- Ricky Butler, NASA Langley's Formal Methods Research Program Overview

SE Lecture Series 6/04 Slide 52
