Software Process Guidelines and Improvement

University of Southern California
Center for Systems and Software Engineering
RDCR ARB
TRR ARB Schedule
Days: Tuesday Feb 10, Wednesday Feb 11, Thursday Feb 12
Time slots: 3:30 – 4:50, 5:00 – 6:20, 6:30 – 7:50
Teams: Team 03 (SnapValet), Team 12 (Cash Doctor), Team 10 (Refersy), Team 07 (Mission Science), Team 04 (FlowerSeeker), Team 01 (We Are Trojans)
Location: SAL 322
Printing:
• Bring a copy of the presentation for your client.
• If your client wants a copy of your documents, please print them out.
• Make sure you post all documents on your team website.
All team members and client(s) must be available during your
presentation time
© 2015 USC-CSSE
RDCR ARB – Architected Agile Teams
(x, y): (presentation time, total time)
• (8, 10) Acceptance Test Plan and Cases. Team's strong and weak points + shaping & overall project evaluation; full test plan and cases.
• (2, 3) OCD. System purpose; changes in current system and deficiencies; proposed new system, system boundary, and desired capabilities and goals; top-level scenarios. If there is no change, just present the system boundary diagram.
• (10, 15) Prototype Update. Changes in significant capabilities (especially those with high risk if gotten wrong).
• (8, 10) Architecture. Overall and detailed architecture; design if critical; COTS/reuse selections (NOT JUST CHOICES).
• (6, 8) Life Cycle Plan. Project plan, at least until CCD or as appropriate; team members' roles & responsibilities in 577b; full iteration plan.
• (5, 10) Feasibility Evidence. Focus on risk analysis; traceability matrix; Definition of Done; 2 metrics results; technical debt.
• Plan on 2 minutes per briefing chart, except title
• Focus on changes (particularly new things) since DCR
• You may vary from the above: please notify ARB board members IN ADVANCE
Grading Guidelines
Total = 65 points
• Progress of your work (30)
• Presentation (10)
• Risk Management (5)
• Quality (20)
Quick answers from your IC1
What is the role of the Quality Focal Point?
• To ensure the quality of the product
• Defect prevention and Defect detection
– Check traceability & consistency
– Lead on peer review
– Bug / defect tracking
• For larger-scale or mission-critical projects:
– Mission assurance
– Testing, Exercise, and Rehearsal
– Metrics and measurements analysis
How to write a successful user manual or make proper documents?
• Use a template for consistency
• Ask users what they want, or observe them
• In industry (IBM / Oracle)
– Use a technical writer with a BA in English / Linguistics
– Minimal screenshots (for evolving products, and for translation)
How to count SLOC
Code sample 1 (a for statement and a printf statement written on one line):
• 1 Physical Line of Code (LOC)
• 2 Logical Lines of Code (LLOC) (the for statement and the printf statement)
• 1 comment line
Code sample 2 (the same statements spread over multiple lines):
• 5 Physical Lines of Code (LOC)
• 2 Logical Lines of Code (LLOC)
• 1 comment line
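The distinction can be made concrete with a small counter. This is a minimal Python sketch (`count_sloc` is an illustrative helper, not the course's CodeCount tool): it classifies physical, comment, and blank lines only, since counting logical lines requires real parsing.

```python
def count_sloc(source: str) -> dict:
    """Naive physical-line counter for C-like code. A sketch only:
    logical lines (LLOC) need a parser, e.g. a dedicated SLOC tool."""
    counts = {"physical": 0, "comment": 0, "blank": 0}
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped:
            counts["blank"] += 1
        elif stripped.startswith(("//", "/*", "*")):
            counts["comment"] += 1
        else:
            counts["physical"] += 1  # a code line (may hold several logical lines)
    return counts

one_liner = '// print 0..99\nfor (i = 0; i < 100; i++) { printf("%d\\n", i); }\n'
print(count_sloc(one_liner))  # -> {'physical': 1, 'comment': 1, 'blank': 0}
```

Note that the one-line loop counts as 1 physical line even though it contains 2 logical statements, which is exactly why LOC and LLOC diverge.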
What about ESLOC?
• Equivalent / Effective Source Lines of Code
• Equivalent size is a quantification of the effort required to produce a software product. It may not be correlated to the total product size.
ESLOC = (1.0 * new code) + (0.5 * modified code) + (0.1 * reused code)
For adapted code, ESLOC = SLOC * AAF, where
AAF = 0.4*DM + 0.3*CM + 0.3*IM
DM = % of design modified, CM = % of code modified, IM = % of integration effort required
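The two formulas above are simple weighted sums, sketched here in Python (function names and the sample sizes are illustrative; the 1.0/0.5/0.1 weights are the rule-of-thumb defaults from the slide):

```python
def aaf(dm: float, cm: float, im: float) -> float:
    """Adaptation Adjustment Factor; dm, cm, im are fractions in [0, 1]
    for design modified, code modified, and integration effort."""
    return 0.4 * dm + 0.3 * cm + 0.3 * im

def esloc(new: int, modified: int, reused: int) -> float:
    """Equivalent size using the slide's rule-of-thumb weights."""
    return 1.0 * new + 0.5 * modified + 0.1 * reused

# Hypothetical product: 1,000 new, 400 modified, 2,000 reused lines.
print(esloc(1000, 400, 2000))  # -> 1400.0
```

The 1400 equivalent lines reflect that reused code costs roughly a tenth of the effort of new code, so equivalent size is much smaller than the 3,400 physical lines shipped.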
https://dap.dau.mil/aap/pages/qdetails.aspx?cgiSubjectAreaID=6&cgiQuestionID=116005
How to estimate SLOC?
• Agile Techniques
– Story points
– Planning Poker
• Traditional Techniques
– Expert Judgment
– Function Points
– Application Points
• Uncertainty treatment
– PERT Sizing (uses a distribution: pessimistic / optimistic / most likely)
– Wideband Delphi (discuss between experts)
– COCOMO-U (Bayesian Belief network)
• CodeCount tool
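The PERT sizing technique listed above combines the three-point estimates into a beta-distribution-weighted mean. A minimal sketch (the module sizes are made-up inputs):

```python
def pert_size(optimistic: float, most_likely: float, pessimistic: float) -> float:
    """PERT three-point estimate: the most-likely value gets 4x weight,
    pulling the mean toward it while still reflecting the tails."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

# Hypothetical module sized at 2,000 / 3,000 / 7,000 SLOC:
print(pert_size(2000, 3000, 7000))  # -> 3500.0
```

Note the estimate lands above the most-likely 3,000 because the pessimistic tail is much longer than the optimistic one.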
Productivity
Translate SLOC into hours
• Mythical man month
– 10 lines per developer day
• Mission Critical system
– 0.82 ESLOC per hour
• Re-development - 7.08 SLOC per hour
• New development - 13.65 SLOC per hour
• Enhancement – 5.27 SLOC per hour
Mei He et al., "An Investigation of Software Development Productivity in China", ICSSP 2008
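Rates like these turn a size estimate into an effort estimate by simple division. A back-of-the-envelope sketch (the converter and its input size are hypothetical; the rates are the ones quoted above, in SLOC per staff-hour):

```python
# Productivity rates quoted on the slide (SLOC per staff-hour).
RATES = {
    "new development": 13.65,
    "re-development": 7.08,
    "enhancement": 5.27,
}

def effort_hours(sloc: int, kind: str) -> float:
    """Rough effort estimate: size divided by the historical rate."""
    return sloc / RATES[kind]

# A hypothetical 2,730-SLOC new-development task:
print(round(effort_hours(2730, "new development"), 1))  # -> 200.0
```

The same 2,730 lines as an enhancement task would cost roughly 518 hours, which is why the type of work matters as much as the size.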
Software process guidelines
and improvement
CS 577b Software Engineering II
Supannika Koolmanojwong
CMMI
What is CMMI?
Consultant Money Making Initiative
CMMI Models V1.3
CMMI® (Capability Maturity Model® Integration) models are
collections of best practices that help organizations to
improve their processes
(Figure: PROCESS at the center, linking people with skills, training, and motivation; procedures and methods defining the relationship of tasks; and tools and equipment.)
[Ref: CMMI]
Brief Characterization of “CMMI”
CMMI (Capability Maturity Model Integration) is….
• A framework of management, engineering, and support best
practices organized into topics, called Process Areas
• An approach to improving the capability of any of the
Process Areas by incrementally applying a set of Generic
Practices that enhance the functioning of an individual
process
• Best thought of as a set of “process requirements” that help
guide an organization that is defining and implementing
processes related to the topics
• NOT a pre-defined, implementable “as is” process definition
[Ref: Garcia 2005]
Process Areas
Configuration Management (CM)
Causal Analysis and Resolution (CAR)
Decision Analysis and Resolution (DAR)
Integrated Project Management (IPM)
Measurement and Analysis (MA)
Organizational Performance Management (OPM)
Organizational Process Definition (OPD)
Organizational Process Focus (OPF)
Organizational Process Performance (OPP)
Organizational Training (OT)
Process and Product Quality Assurance (PPQA)
Product Integration (PI)
Project Monitoring and Control (PMC)
Project Planning (PP)
Quantitative Project Management (QPM)
Requirements Development (RD)
Requirements Management (REQM)
Risk Management (RSKM)
Supplier Agreement Management (SAM)
Technical Solution (TS)
Validation (VAL)
Verification (VER)
Requirements Management Process Area
Purpose: to manage requirements of the project's products and product components and to ensure alignment between those requirements and the project's plans and work products.

Example Work Products:
1. Lists of criteria for distinguishing appropriate requirements providers
2. Criteria for evaluation and acceptance of requirements
3. Results of analyses against criteria
4. A set of approved requirements

Specific Goal 1: Requirements are managed, and inconsistencies with plans and work products are identified.
• Specific Practice 1.1: Develop an understanding with the requirements providers on the meaning of the requirements.
  – Subpractices:
    1. Establish criteria for distinguishing appropriate requirements providers.
    2. Establish objective criteria for the evaluation and acceptance of requirements.
    3. Analyze requirements to ensure that established criteria are met.
    4. Reach an understanding of requirements with requirements providers so that project participants can commit to them.
• SP 1.2 ......... SP 1.5

Examples of evaluation and acceptance criteria include the following:
• Clearly and properly stated
• Complete
• Consistent with one another
• Uniquely identified
• ……………………
CMMI terminologies
CMMI ↔ ICSM:
• Process Area (e.g., Requirements Management, Project Planning) ↔ Practice (e.g., System and Software Requirements Dev Practice, Life Cycle Planning Practice)
• Specific goal ↔ Task
• Specific practice ↔ Step
• Subpractice ↔ Detailed step
• Work Product (e.g., a set of approved requirements) ↔ Work Product / Artifact / Output (e.g., Agreed Win Conditions)
Example of a CMMI Process Area
(Figure: an annotated process-area excerpt highlighting a Specific Goal, a Specific Practice, an Example Work Product, an Example Box, a Reference, and a Subpractice.)
CMMI-DEV, CMMI-SVC, and CMMI-ACQ share a core set of process areas:
Causal Analysis and Resolution (CAR), Configuration Management (CM), Decision Analysis and Resolution (DAR), Measurement and Analysis (MA), Organizational Process Definition (OPD), Organizational Process Focus (OPF), Organizational Performance Management (OPM), Organizational Process Performance (OPP), Organizational Training (OT), Process and Product Quality Assurance (PPQA), Requirements Management (REQM), and Risk Management (RSKM).

Some core areas are renamed per constellation:
• Project Planning (PP) in CMMI-DEV and CMMI-ACQ = Work Planning (WP) in CMMI-SVC
• Project Monitoring and Control (PMC) in CMMI-DEV and CMMI-ACQ = Work Monitoring and Control (WMC) in CMMI-SVC
• Integrated Project Management (IPM) in CMMI-DEV and CMMI-ACQ = Integrated Work Management (IWM) in CMMI-SVC
• Quantitative Project Management (QPM) in CMMI-DEV and CMMI-ACQ = Quantitative Work Management (QWM) in CMMI-SVC
• Supplier Agreement Management (SAM) appears in CMMI-DEV and CMMI-SVC

CMMI-DEV specific: Product Integration (PI), Requirements Development (RD), Technical Solution (TS), Validation (VAL), Verification (VER)

CMMI-SVC specific: Capacity and Availability Management (CAM), Incident Resolution and Prevention (IRP), Service Continuity (SCON), Service Delivery (SD), Service System Development (SSD), Service System Transition (SST), Strategic Service Management (STSM)

CMMI-ACQ specific: Agreement Management (AM), Acquisition Requirements Development (ARD), Acquisition Technical Management (ATM), Acquisition Validation (AVAL), Acquisition Verification (AVER), Solicitation and Supplier Agreement Development (SSAD)
Low Maturity Organizations
• Highly dependent on current
practitioners
• Improvised by practitioners and
management
• Not rigorously followed
• Results difficult to predict
• Low visibility into progress and
quality
• Compromise of product
functionality and quality to meet
schedule
• Use of new technology is risky
High Maturity Organizations
• A disciplined approach for
development and management
• Defined and continuously
improving
• Supported by management and
others
• Well controlled
• Supported by measurement
• Basis for disciplined use of technology
• Institutionalized
[Ref: Rudge]
Process Area
Information:
Purpose Statement,
Introductory Notes,
Related Process Areas
Specific Goals
Specific Practices
• Example Work Products
• Subpractices
Generic Goals
Generic Practices
• Subpractices
• Generic Practice Elaborations
The specific goals (SGs), and the number of SGs, differ from process area to process area.
The generic goals (GGs) are the same for every process area.
Two improvement paths using levels
• Capability levels,
– continuous representation
– process improvement achievement in individual process areas
– These levels are a means for incrementally improving the processes
corresponding to a given process area.
– 4 capability levels, numbered 0 through 3.
• Maturity levels
– staged representation
– process improvement achievement across multiple process areas
– These levels are a means of improving the processes corresponding to
a given set of process areas
– 5 maturity levels, numbered 1 through 5
Using Continuous Representation
• When you know the processes that need to
be improved
• Improve the performance of a single
process area (the trouble spot) or several
process areas
• Allows an organization to improve different processes at different rates.
[Ref: Lazaris]
Factors in your decision
• Business Factors
– Mature knowledge of its own business objectives
(continuous)
– Product-line focus; entire organization (staged)
• Cultural Factors
– Depends on the organization's capability
• Process-based culture – continuous
• Little experience in process improvement – staged
• Legacy
– Continuation from using another model
[Ref: Lazaris]
Comparison of Capability and Maturity Levels
Level     Continuous Representation: Capability Levels    Staged Representation: Maturity Levels
Level 0   Incomplete                                      (none)
Level 1   Performed                                       Initial
Level 2   Managed                                         Managed
Level 3   Defined                                         Defined
Level 4   (none)                                          Quantitatively Managed
Level 5   (none)                                          Optimizing
To achieve a capability level, pick a process area

Capability Level 1: Performed – accomplishes the needed work to produce work products; the specific goals of the process area are satisfied.

Capability Level 2: Managed – a managed process is a performed process that is planned and executed in accordance with policy; employs skilled people having adequate resources to produce controlled outputs; involves relevant stakeholders; is monitored, controlled, and reviewed; and is evaluated for adherence to its process description.

Capability Level 3: Defined – a defined process is a managed process that is tailored from the organization's set of standard processes according to the organization's tailoring guidelines; has a maintained process description; and contributes process-related experiences to the organizational process assets.
Capability Levels
• Capability Level 0: Incomplete
– not performed or is partially performed.
– One or more of the specific goals of the process area are
not satisfied
– no generic goals exist for this level since there is no
reason to institutionalize a partially performed process
• Capability Level 1: Performed
– accomplishes the needed work to produce work
products; the specific goals of the process area are
satisfied
– Although capability level 1 results in important
improvements, those improvements can be lost over time
if they are not institutionalized
[Ref: CMMI]
Capability Levels
• Capability Level 2: Managed
– A managed process is a performed process that is
planned and executed in accordance with policy; employs
skilled people having adequate resources to produce
controlled outputs; involves relevant stakeholders; is
monitored, controlled, and reviewed; and is evaluated for
adherence to its process description.
• Capability Level 3: Defined
– A defined process is a managed process that is tailored
from the organization’s set of standard processes
according to the organization’s tailoring guidelines; has a
maintained process description; and contributes process
related experiences to the organizational process assets
[Ref: CMMI]
CMMI Maturity Levels
[Ref: Buchholtz 2003]
Categories of Process Areas
Process Area                                     Category
Product Integration (PI)                         Engineering
Requirements Development (RD)                    Engineering
Technical Solution (TS)                          Engineering
Validation (VAL)                                 Engineering
Verification (VER)                               Engineering
Organizational Process Definition (OPD)          Process Management
Organizational Process Focus (OPF)               Process Management
Organizational Performance Management (OPM)      Process Management
Organizational Process Performance (OPP)         Process Management
Organizational Training (OT)                     Process Management
Integrated Project Management (IPM)              Project Management
Project Monitoring and Control (PMC)             Project Management
Project Planning (PP)                            Project Management
Quantitative Project Management (QPM)            Project Management
Requirements Management (REQM)                   Project Management
Risk Management (RSKM)                           Project Management
Supplier Agreement Management (SAM)              Project Management
Causal Analysis and Resolution (CAR)             Support
Configuration Management (CM)                    Support
Decision Analysis and Resolution (DAR)           Support
Measurement and Analysis (MA)                    Support
Process and Product Quality Assurance (PPQA)     Support
Process Areas by Maturity Levels
Process Area                                     Category             Maturity Level
Project Monitoring and Control (PMC)             Project Management   2
Project Planning (PP)                            Project Management   2
Requirements Management (REQM)                   Project Management   2
Supplier Agreement Management (SAM)              Project Management   2
Configuration Management (CM)                    Support              2
Measurement and Analysis (MA)                    Support              2
Process and Product Quality Assurance (PPQA)     Support              2
Product Integration (PI)                         Engineering          3
Requirements Development (RD)                    Engineering          3
Technical Solution (TS)                          Engineering          3
Validation (VAL)                                 Engineering          3
Verification (VER)                               Engineering          3
Organizational Process Definition (OPD)          Process Management   3
Organizational Process Focus (OPF)               Process Management   3
Organizational Training (OT)                     Process Management   3
Integrated Project Management (IPM)              Project Management   3
Risk Management (RSKM)                           Project Management   3
Decision Analysis and Resolution (DAR)           Support              3
Organizational Process Performance (OPP)         Process Management   4
Quantitative Project Management (QPM)            Project Management   4
Organizational Performance Management (OPM)      Process Management   5
Causal Analysis and Resolution (CAR)             Support              5
CMMI Process Areas (Staged)
Level 5 Optimizing – Process Management: OPM (Organizational Performance Management); Support: CAR (Causal Analysis and Resolution)
Level 4 Quantitatively Managed – Process Management: OPP (Organizational Process Performance); Project Management: QPM (Quantitative Project Management)
Level 3 Defined – Project Management: IPM (Integrated Project Management), RSKM (Risk Management); Engineering: RD (Requirements Development), TS (Technical Solution), PI (Product Integration), VAL (Validation), VER (Verification); Process Management: OT (Organizational Training), OPF (Organizational Process Focus), OPD (Organizational Process Definition); Support: DAR (Decision Analysis and Resolution)
Level 2 Managed – Project Management: PP (Project Planning), PMC (Project Monitoring and Control), SAM (Supplier Agreement Management), REQM (Requirements Management); Support: MA (Measurement and Analysis), PPQA (Process & Product Quality Assurance), CM (Configuration Management)
Level 1 Initial – (no process areas)
[Based on Ref: Rudge]
Categories of Process Areas
Process Area                              Category             Level
Integrated Project Management (IPM)       Project Management   Advanced - 3
Project Monitoring and Control (PMC)      Project Management   Basic - 2
Project Planning (PP)                     Project Management   Basic - 2
Quantitative Project Management (QPM)     Project Management   Advanced - 4
Requirements Management (REQM)            Project Management   Basic - 2
Risk Management (RSKM)                    Project Management   Advanced - 3
Supplier Agreement Management (SAM)       Project Management   Basic - 2
Categories of Process Areas
Process Area                     Category      Level
Product Integration (PI)         Engineering   3
Requirements Development (RD)    Engineering   3
Technical Solution (TS)          Engineering   3
Validation (VAL)                 Engineering   3
Verification (VER)               Engineering   3
Categories of Process Areas
Process Area                                     Category   Level
Causal Analysis and Resolution (CAR)             Support    Advanced - 5
Configuration Management (CM)                    Support    Basic - 2
Decision Analysis and Resolution (DAR)           Support    Advanced - 3
Measurement and Analysis (MA)                    Support    Basic - 2
Process and Product Quality Assurance (PPQA)     Support    Basic - 2
Categories of Process Areas
Process Area                                     Category             Level
Organizational Process Definition (OPD)          Process Management   Basic - 3
Organizational Process Focus (OPF)               Process Management   Basic - 3
Organizational Performance Management (OPM)      Process Management   Advanced - 5
Organizational Process Performance (OPP)         Process Management   Advanced - 4
Organizational Training (OT)                     Process Management   Basic - 3
CMMI Appraisal
• Perform a gap analysis between the given process areas and your process
– Practice
• Activities
• Action Items
– Work Product
• Artifacts
• Documents
• Reports
Requirements Management
Requirements Management
SP 1.1 Understand Requirements
SP 1.2 Obtain Commitment to Requirements
SP 1.3 Manage Requirements Changes
SP 1.4 Maintain Bidirectional Traceability of Requirements
SP 1.5 Ensure Alignment Between Project Work and Requirements
CMMI Appraisal - PPQA
Does our 577 process comply with CMMI requirements? If yes, please state the evidence.

Product and Process Quality Assurance (PPQA) Process Area
SG 1: Adherence of the performed process and associated work products and services to applicable process descriptions, standards, and procedures is objectively evaluated.
SP 1.1: Objectively evaluate the designated performed processes against the applicable process descriptions, standards, and procedures.
  SP 1.1-1: Promote an environment (created as part of project management) that encourages employee participation in identifying and reporting quality issues.
  SP 1.1-2: Establish and maintain clearly stated criteria for the evaluations.
  SP 1.1-3: Use the stated criteria to evaluate performed processes for adherence to process descriptions, standards, and procedures.
  SP 1.1-4: Identify each noncompliance found during the evaluation.
  SP 1.1-5: Identify lessons learned that could improve processes for future products and services.
Work products:
  SP 1.1-WP 1: Evaluation reports
  SP 1.1-WP 2: Noncompliance reports
  SP 1.1-WP 3: Corrective actions
CMMI Appraisal
Requirements Development
SP 1.1: Elicit stakeholder needs, expectations, constraints, and interfaces for all phases of the product lifecycle.
  SP 1.1-1: Engage relevant stakeholders using methods for eliciting needs, expectations, constraints, and external interfaces.
SP 1.2: Transform stakeholder needs, expectations, constraints, and interfaces into customer requirements.
  SP 1.2-1: Translate the stakeholder needs, expectations, constraints, and interfaces into documented customer requirements.
  SP 1.2-2: Define constraints for verification and validation.
Work products:
  SP 1.2-WP 1: Customer requirements
  SP 1.2-WP 2: Customer constraints on the conduct of verification
  SP 1.2-WP 3: Customer constraints on the conduct of validation
CMMI Appraisal
Requirements Development
SP 2.1: Establish and maintain product and product-component requirements, which are based on the customer requirements.
  SP 2.1-1: Develop requirements in technical terms necessary for product and product-component design.
  SP 2.1-2: Derive requirements that result from design decisions.
  SP 2.1-3: Establish and maintain relationships between requirements for consideration during change management and requirements allocation.
Work products:
  SP 2.1-WP 1: Derived requirements
  SP 2.1-WP 2: Product requirements
  SP 2.1-WP 3: Product component requirements
Now – workshop – CMMI Appraisal
• Form groups of 3 people
  – Try not to team with your own team members
• Perform a gap analysis between your project and a given process area:
  – Risk Management (RSKM)
  – Validation (VAL)
  – Verification (VER)
  – Process and Product Quality Assurance (PPQA)
• Off-campus students: pick one process area and complete the gap analysis
• Resources: http://www.sei.cmu.edu/reports/10tr033.pdf
What is Six Sigma?
What is six sigma?
• Sigma – a statistical term that measures how far a given process deviates from perfection.
• If you can measure how many "defects" you have in a process, you can systematically figure out how to eliminate them and get as close to "zero defects" as possible.
• To achieve Six Sigma, a process must not produce more than 3.4 defects per million opportunities (99.9997% defect-free).
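The 3.4-defects-per-million figure can be reproduced from the normal distribution, given the conventional assumption that the process mean drifts by 1.5 sigma over the long term. A sketch (`dpmo_at` is an illustrative name):

```python
from statistics import NormalDist

def dpmo_at(sigma_level: float, shift: float = 1.5) -> float:
    """Long-run defects per million opportunities at a given sigma level,
    using the conventional 1.5-sigma mean shift."""
    return (1 - NormalDist().cdf(sigma_level - shift)) * 1_000_000

print(round(dpmo_at(6), 1))  # -> 3.4    (Six Sigma's "3.4 defects per million")
print(round(dpmo_at(3)))     # -> 66807  (a 3-sigma process)
```

Without the 1.5-sigma shift, a 6-sigma process would show only about 0.001 defects per million; the famous 3.4 figure is the shifted, long-term number.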
Think about a pizza delivery
• Goal: deliver in no more than 25 minutes
• If you achieve this 68% of the time, you are running at 1 sigma
• If you achieve it 99.9997% of the time, then you are at 6 sigma
• Six Sigma measures quality: it measures the variance and does not rely on the mean.
Examples of the Sigma Scale
In a world at 3 sigma... vs. in a world at 6 sigma...
• There are 964 U.S. flight cancellations per day vs. 1 U.S. flight cancelled every 3 weeks.
• The police make 7 false arrests every 4 minutes vs. fewer than 4 false arrests per month.
• In MA, 5,390 newborns are dropped each year vs. 1 newborn dropped every 4 years.
• In one hour, 47,283 international long-distance calls are accidentally disconnected vs. more than 2 years to see the same number of dropped calls.
Lean Six Sigma
• Lean + Six Sigma
• Six Sigma
  – Recognize and eliminate defects and/or low profit margins
  – Recognize that variations in analyzing and measuring can hinder, or often block, the ability to deliver high-quality services
  – Focus on data
  – Needs a team of professionals (champion, black / green belts)
• Lean Six Sigma
  – Focus is on maximizing throughput, or performing things faster, by removing waste
  – Seven forms of waste (defects, overproduction, overprocessing, motion, transportation, inventory, and waiting)
  – Six Sigma quality + Lean speed
Lean Six Sigma
• The Measure activity of the Six Sigma DMAIC cycle takes a long time and lots of data
• Lean Six Sigma does not ignore measurement; it measures as necessary
Lean Thinking provides a sharp degree of focus on customer value, and provides
mechanisms for rapid improvement
Six Sigma is the statistical control and performance prediction capability associated with
stable processes
[Ref: CrossTalk2010]
ITIL - IT Infrastructure Library
• V3 consists of 5 volumes:
  – Service Strategy
  – Service Design
  – Service Transition
  – Service Operation
  – Continual Service Improvement
users.ox.ac.uk/~tony/itilv3.ppt
The Service Lifecycle
• Service Strategy
  – Strategy generation
  – Financial management
  – Service portfolio management
  – Demand management
• Service Design
  – Capacity, Availability, Info Security Management
  – Service level & Supplier Management
• Service Transition
  – Planning & Support
  – Change management
  – Release & Deployment
  – Asset & Config management
  – Knowledge Management
• Service Operation
  – Problem & Incident management
  – Request fulfilment
  – Event & Access management
• Continual Service Improvement
  – Service measurement & reporting
  – 7-step improvement process
users.ox.ac.uk/~tony/itilv3.ppt
How the Lifecycle stages fit together
users.ox.ac.uk/~tony/itilv3.ppt
Service Strategy
Service Strategy has four activities:
• Define the Market
• Develop the Offerings
• Develop Strategic Assets
• Prepare for Execution
users.ox.ac.uk/~tony/itilv3.ppt
Service Design
Service Design
• How are we going to provide it?
• How are we going to build it?
• How are we going to test it?
• How are we going to deploy it?
A holistic approach to determine the impact of change introduction on the existing services and management processes.
users.ox.ac.uk/~tony/itilv3.ppt
Service Transition
Service Transition
• Build
• Deployment
• Testing
• User acceptance
• Bed-in (phased or big bang)
users.ox.ac.uk/~tony/itilv3.ppt
Service Operation
CMMI vs ITIL
• Origins
– CMMI: Carnegie Mellon University (CMU)
– ITIL: United Kingdom's Office of Government Commerce
• Scope
– CMMI: a maturity model and best practices applied in the development of software
– ITIL: codes of best practice for controlling and managing all aspects of IT-related operations
• Application
– CMMI: focused on software development, maintenance, and product integration
– ITIL: broader in scope; provides a framework for IT service management and operations, including a hardware life cycle
• Structure
– CMMI: not a process, but a description of effective process characteristics
– ITIL: provides solutions on how to undertake each process area, e.g., how to do requirements management
http://www.brighthubpm.com/monitoring-projects/72298-differences-in-cmmi-vs-itil/
Space and Missile Systems Center (SMC)
• SMC Standard SMC-S-002 - Configuration Management
• SMC Standard SMC-S-012 - Software Development for Space Systems
• SMC Standard SMC-S-013 - Reliability Program for Space Systems
• SMC Standard SMC-S-016 - Test Requirements for Launch, Upper-Stage, and Space Vehicles
• SMC Standard SMC-S-023 - Human Computer Interface Design Criteria, Volume 1, User Interface Requirements
• Similar guidelines
– MIL-STD-498 ; http://en.wikipedia.org/wiki/MIL-STD-498
– IEEE Standard 12207 ; http://en.wikipedia.org/wiki/IEEE_12207
SMC S-012
• States the government's requirements and expectations for contractor performance in defense system acquisitions and technology developments
• Establishes uniform requirements for software development activities
SMC-S-012 coverage
Project Planning and Oversight
5.1 Project planning and oversight
The developer shall perform project planning and oversight in accordance with the following
requirements.
Note: If a system or software item is developed in multiple builds, planning for each build
should be interpreted to include:
a) Overall planning for the contract,
b) Detailed planning for the current build, and
c) Planning for future builds to a level of detail compatible with the information available.
5.1.2 Software item test planning
The developer shall develop and record plans for conducting software item qualification
testing. This planning shall include all applicable items in the Software Test Plan (STP)
DID, as defined in the SDP (see Section 5.1.1). The software test planning shall address all
verification methods (i.e., Inspection (I), Analysis (A), Demonstration (D), and Test (T)) and
levels (e.g., unit, item, subsystem, element, segment, or system) necessary to fully
verify the software requirements, including the software interface requirements.
Qualification testing covers:
– Nominal and off-nominal scenarios (extreme cases)
– Bi-directional traceability
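Bi-directional traceability means every requirement can be traced forward to at least one test case, and every test case can be traced back to valid requirements. A minimal sketch of such a check (the requirement IDs, test-case IDs, and I/A/D/T method tags are invented examples, not from the standard):

```python
# Illustrative sketch of checking bi-directional traceability between
# software requirements and qualification test cases. All identifiers
# and method tags (I/A/D/T) are hypothetical.

requirements = {"SRS-001", "SRS-002", "SRS-003"}

test_cases = {
    "TC-01": {"traces": {"SRS-001"}, "method": "T"},  # Test
    "TC-02": {"traces": {"SRS-002"}, "method": "D"},  # Demonstration
    "TC-03": {"traces": {"SRS-003"}, "method": "A"},  # Analysis
}

def untested_requirements(reqs, tests):
    """Forward trace: requirements with no test case covering them."""
    covered = set()
    for tc in tests.values():
        covered |= tc["traces"]
    return reqs - covered

def orphan_test_cases(reqs, tests):
    """Backward trace: test cases that point at unknown requirements."""
    return {tid for tid, tc in tests.items() if not tc["traces"] <= reqs}

# Empty sets in both directions mean the traceability matrix is closed.
print(untested_requirements(requirements, test_cases))  # set()
print(orphan_test_cases(requirements, test_cases))      # set()
```

Adding a requirement with no test, or a test citing a retired requirement, shows up immediately as a non-empty gap set.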
Software Design
5.6.3 Software item detailed design
The developer shall develop and record a description of each software
unit. The result shall include all applicable items in the detailed design
section of the Software Design Description (SDD) DID. Designs
pertaining to interfaces shall include all items in the Interface Design
Description (IDD). Designs of software units that are databases or are
service units that access or manipulate databases shall include all items
in the Database Design Description (DBDD).
Unit Testing
5.7.2 Preparing for unit testing
• The developer shall establish test cases (in terms of inputs, expected
results, and evaluation criteria), test procedures, and test data for testing
the software corresponding to each software unit.
• The test cases shall cover the unit’s design, including, as a minimum, the
correct execution of all statements and branches; all error and exception
handling; all software unit interfaces including limits and boundary
conditions; start-up, termination, and restart (when applicable); and all
algorithms.
• Legacy reuse software shall be unit tested for all modified reuse software
units, for all reuse software units where the track record indicates potential
problems (even if the reuse units have not been modified), and all critical
reuse software units (even if the reuse units have not been modified). The
developer shall record this information in the appropriate SDFs.
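A minimal sketch of what 5.7.2-style unit test cases look like in practice: inputs, expected results, and evaluation criteria covering nominal execution, limits and boundary conditions, and error/exception handling. The unit under test (`clamp`) is an invented example, not from the standard.

```python
# Hypothetical unit-test sketch covering nominal cases, boundary
# conditions, and exception handling, in the spirit of section 5.7.2.
# The clamp function is an invented example.
import unittest

def clamp(value, low, high):
    """Clamp value into [low, high]; reject an inverted range."""
    if low > high:
        raise ValueError("low must not exceed high")
    return max(low, min(value, high))

class ClampUnitTest(unittest.TestCase):
    def test_nominal(self):
        # Nominal scenario: value already inside the range.
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_limits_and_boundaries(self):
        # Exact limits and just-outside values.
        self.assertEqual(clamp(0, 0, 10), 0)
        self.assertEqual(clamp(10, 0, 10), 10)
        self.assertEqual(clamp(-1, 0, 10), 0)
        self.assertEqual(clamp(11, 0, 10), 10)

    def test_error_handling(self):
        # Off-nominal: inverted range raises an exception.
        with self.assertRaises(ValueError):
            clamp(5, 10, 0)

# Run with: python -m unittest <module name>
```

Note how each required coverage category from the bullet above (nominal execution, limits/boundary conditions, error and exception handling) maps to its own test case.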
SMC S-012 - Software Mission Assurance
• To ensure Software Mission Assurance
– Building quality in
• Focus on finding and removing defects within each
development activity
• Peer review, QFP, ARB
– Conducting a robust software test program
• Focus on finding defects that escaped the quality
gates for earlier software development activities
• Several kinds of testing
Now – workshop – SMC S-012 Appraisal
• Form groups of 3 people
– Try not to team with your own team member
• Perform a gap analysis between your project and the given process areas
– 5.3 System Requirements Analysis
– 5.6 Software Design
– 5.8 Unit Integration and Testing
– 5.16 Software Quality Assurance
• Off-campus students: pick one process area and complete the gap analysis
• Resources: http://everyspec.com/USAF/USAF-SMC/download.php?spec=SMC-S-012_13JUN2008.021535.PDF
References
• [CSSE 2002] USC CSE Annual Research Review 2002
• [CMMI] Software Engineering Institute's CMMI website: http://www.sei.cmu.edu/cmmi/
• [CMMIRocks] http://cmmirocks.ning.com/
• [CrossTalk 2010] CrossTalk Magazine, Jan/Feb 2010
• [Garcia 2002] S. Garcia, Are You Prepared for CMMI?
• [Garcia 2005] S. Garcia, SEI CMU, Why Should You Care about CMMI?
• [Garcia 2005b] S. Garcia, Thoughts on Applying CMMI in Small Settings
• [Rudge] T. Rudge, Thales, CMMI®: St George or the Dragon?
• [SMC S-012] http://everyspec.com/USAF/USAF-SMC/download.php?spec=SMC-S-012_13JUN2008.021535.PDF
• [Owen 2011] K. Owen et al., An Increase in Software Testing Robustness: Enhancing the Software Development Standard for Space Systems, http://gsaw.org/wp-content/uploads/2013/07/2011s12d_eslinger.pdf
Backup slides
When Project Planning isn’t done well…
What you’ll see…
• Poor estimates that lead to cost and schedule overruns
• Inability to discover deviations from undocumented plans
• Resources that aren't available or applied when needed
• Inability to meet commitments
Why Should You Care? Because….
• Customers don’t trust suppliers who waste their resources -loss of future business
• No lessons learned for future projects means making the
same mistakes on multiple projects
• Unhappy customers, employees ,and stockholders means a
short life for the business
• If you fail to plan then you plan to fail!
[Ref: Garcia 2005]
When Project Monitoring and Control isn't done well…
What you’ll see
•
•
•
•
•
Crisis management
High rework levels throughout the project
Lots of time spent in meetings trying to “discover” project status
rather than reporting on it
Data needed for management decisions is unavailable when needed
Actions that should have been taken early on aren’t discovered until
it’s too late
Why Should You Care? Because….
• If you don't know what's going on, corrective action can't be taken early, when it's least expensive
• Lack of management insight/oversight makes project results highly unpredictable, even later in the project
• If your confidence in the status you give to your customer is low, they probably perceive that!
[Ref: Garcia 2005]
When Requirements Management isn’t done well
What you’ll see:
• High levels of rework throughout the project
• Requirements accepted by staff from any source they deem to be authoritative
• "Galloping" requirements creep
• Inability to "prove" that the product meets the approved requirements
Why Should You Care? Because….
• Lack of agreement among stakeholders as to what the "real" requirements are increases the time and cost to complete the project
• You're highly likely to deliver an incorrect or incomplete product
• Revisiting requirements changes over and over is a waste of resources that is highly visible to the customer
[Ref: Garcia 2005]
Top 8 new concepts in CMMI 1.3
8. Organizational-level contracts
– Mention of preferred suppliers in SAM
7. Prioritized customer requirements
– Prioritized customer requirements in RD
6. Lifecycle needs and standards
– Acknowledging standards, e.g., ISO 12207, in OPD
5. Customer satisfaction
– Emphasizes the importance of customer satisfaction
[Ref: CMMIRocks]
Top 8 new concepts in CMMI 1.3
4. Causal analysis at low levels of maturity
– Explicit encouragement of using causal analysis
– New: QPM SP 2.3, Perform Root Cause Analysis
3. Teaming concepts
– IPPD (Integrated Process and Product Development) is gone
– "Teams" are no longer an optional addition
– New: IPM SP 1.6, Establish and Maintain Teams
[Ref: CMMIRocks]
Top 8 new concepts in CMMI 1.3
2. Modernized development practices
– Added concepts of LOS, product lines, release increments, architecture-centric development, and technology maturation
– Glossary updates
– Informative material updates in all three constellations (especially in RD, REQM, VAL, and VER) to bring more balance to functional vs. non-functional requirements (e.g., quality attributes)
[Ref: CMMIRocks]
Top 8 new concepts in CMMI 1.3
1. Agile interpretive guidance
– Helps those who use agile methods to interpret CMMI
– Adds introductory notes about agile to the following process areas: CM, PI, PMC, PP, PPQA, RD, REQM, RSKM, TS, and VER
– Example: "In agile environments... teams plan, monitor, and adjust plans in each iteration as often as it takes (e.g., daily). Commitments to plans are demonstrated when tasks are assigned and accepted during iteration planning, user stories are elaborated or estimated, and iterations are populated with tasks from a maintained backlog of work."
[Ref: CMMIRocks]
Basic Project Management Category
[Diagram: interactions among the Basic Project Management process areas. PP (Project Planning) establishes plans, commitments, and "what to do"; PMC (Project Monitoring and Control) receives "what to monitor" and the plans, reports status, issues, and results of reviews and monitoring, and drives corrective action and replanning; REQM (Requirements Management) supplies "what to build" as product and product component requirements; SAM (Supplier Agreement Management) manages the supplier agreement and the supplier. The Engineering and Support process areas exchange measurement needs; status, issues, and results of process and product evaluations; and product component requirements, technical issues, completed product components, and acceptance reviews and tests.]
PMC = Project Monitoring and Control
PP = Project Planning
REQM = Requirements Management
SAM = Supplier Agreement Management
Advanced Project Management Category
[Diagram: interactions among the Advanced Project Management process areas. QPM (Quantitative Project Management) statistically manages subprocesses against quantitative objectives using the project's composed and defined process, produces statistical management data, and reports risk exposure due to unstable processes. RSKM (Risk Management) provides risk taxonomies and parameters, risk status, risk mitigation plans, and corrective action. IPM (Integrated Project Management) builds the project's defined process and work environment from the organization's standard processes, work environment standards, and supporting assets (Process Management process areas), uses the product architecture for structuring teams, and coordinates commitments and issues to resolve with the teams performing engineering and support processes (Engineering and Support, and Basic Project Management, process areas).]
IPM= Integrated Project Management
QPM = Quantitative Project Management
RSKM = Risk Management
Engineering Category
[Diagram: flow among the Engineering process areas. RD (Requirements Development) turns customer needs into customer requirements and product and product component requirements, shared with the Project Management process areas. TS (Technical Solution) evaluates alternative solutions and builds product components; PI (Product Integration) assembles them into the product. VER (Verification) and VAL (Validation) examine requirements, product components, and work products, and produce verification and validation reports; VAL checks the product against customer needs.]
PI = Product Integration
RD = Requirements Development
TS = Technical Solution
VAL = Validation
VER = Verification
Basic Support Category
[Diagram: the Basic Support process areas serve all process areas. MA (Measurement and Analysis) turns information and measurement needs into measurements and analyses. PPQA (Process and Product Quality Assurance) evaluates processes and work products against standards and procedures and reports quality and noncompliance issues. CM (Configuration Management) receives configuration items and change requests and maintains controlled configuration items, baselines, and audit reports.]
CM = Configuration Management
MA = Measurement and Analysis
PPQA = Process and Product Quality Assurance
Advanced Support Category
[Diagram: the Advanced Support process areas serve all process areas. CAR (Causal Analysis and Resolution) analyzes defects, other problems, and successes, and produces process improvement proposals. DAR (Decision Analysis and Resolution) applies formal evaluations to selected issues.]
CAR = Causal Analysis and Resolution
DAR = Decision Analysis and Resolution
Basic Process Management Category
[Diagram: senior management supplies the organization's business objectives, and the Process Management areas receive the organization's process needs and objectives. OPD (Organizational Process Definition) maintains the standard process, work environment standards, and other assets. OPF (Organizational Process Focus) collects process improvement proposals, participation in defining, assessing, and deploying processes, and improvement information (e.g., lessons learned, data, and artifacts) from the Project Management, Support, and Engineering process areas, and provides resources and coordination. OT (Organizational Training) answers training needs with training for projects and support groups in the standard process and assets.]
OPD = Organizational Process Definition
OPF = Organizational Process Focus
OT = Organizational Training
Advanced Process Management Category
[Diagram: senior management sets business objectives and receives improvements. OPP (Organizational Process Performance) derives quality and process performance objectives, measures, baselines, and models from common measures supplied by the Project Management, Support, and Engineering process areas, and from the performance capability data and ability to develop and deploy standard processes and other assets provided by the Basic Process Management process areas. OPM (Organizational Performance Management) uses cost and benefit data from piloted improvements to manage performance against the quality and process objectives.]
OPM = Organizational Performance Management
OPP = Organizational Process Performance
CMMI vs ITIL (four comparison diagrams)
http://www.dtic.mil/ndia/2007cmmi/Tues7Mitryk_Presentation.pdf