OXFORD SOFTWARE ENGINEERING
Software Engineering Services & Consultancy
CMMI and Metrics
BCS SPIN SG, 19 February 2008
Clifford Shelley
OXFORD SOFTWARE ENGINEERING Ltd
9 Spinners Court, 53 West End,
Witney,
Oxfordshire
OX28 1NH
www.osel.co.uk
info@osel.co.uk
Tel. +44 (0) 1993 700878
© OSEL 2008
Slide 1
Objectives
1. Present the background and history of measurement within
CMMI
2. Place measurement within the context of CMMI (or CMMI within
the context of measurement?)
3. Identify common issues and concerns with measurement within
the context of CMMI, and their resolution
4. Look at some measurement approaches that can be really useful
(it is assumed here that ‘measurement’ is equivalent to ‘metrics’)
Slide 2
Background…
• Original SW CMM (a tool to measure software capability)
– Included measurement and analysis as a ‘common feature’ – briefly
described
– Expected product size estimates in planning
– Measurement of processes (problematic at L2, expected at L3)
– SPC-like KPA at L4
Slide 3
…Background…
Carried forward and developed to CMMI
• Staged Representation
– ML2
• Measurement and Analysis is a PA in its own right
– NB: Includes GQM as SP 1.1-1.4
• PP requires product sizing
– ML3
• OPD SP 1.4 - process data defined and collected (using MA)
• IPM SP 1.2 – use historical data for estimating
– ML4
• SPC expectations (OPP)
• Control common causes of variation?
• Exploited by QPM
– ML5
• Quantitative Process Improvement – pervasive measurement
• CAR uses measurement as an analysis tool
• OID uses measurement as an analysis tool
Slide 4
…Background
Continuous Representation
– CL2:
• All PAs expected to be monitored and controlled (GP2.8 –
measurement is expected)
– CL3:
• Standard process measures [defined and] stored (GP3.2)
– CL4:
• Quantitative objectives for processes established (GP4.1)
• SPC required – stabilize sub-process performance (GP4.2) – see the control-chart sketch after this list
– Control special causes of variation – part of process capability as
understood by production engineering
– CL5:
• Establish quantitative process improvement (GP5.1 sub-practice)
• Manage common causes too – to enable process capability
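
As an illustration (not part of the original slides) of what GP 4.2-style stabilization of a sub-process can look like in practice, here is a minimal XmR (individuals and moving range) control chart sketch in Python. The measure (review preparation rate) and all the data values are invented; the limits follow the usual XmR convention of mean plus or minus 2.66 times the average moving range.

```python
# Minimal XmR (individuals) chart sketch for stabilizing one sub-process.
# Illustrative only: the measure and the weekly values below are invented.
rates = [152, 161, 148, 170, 155, 149, 230, 158, 151, 163, 157, 150]

mean = sum(rates) / len(rates)
moving_ranges = [abs(b - a) for a, b in zip(rates, rates[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

# Natural process limits, standard XmR convention: mean +/- 2.66 * average moving range.
upper = mean + 2.66 * avg_mr
lower = mean - 2.66 * avg_mr

print(f"mean={mean:.1f}  UNPL={upper:.1f}  LNPL={lower:.1f}")
for week, rate in enumerate(rates, start=1):
    flag = "  <-- special cause?" if rate > upper or rate < lower else ""
    print(f"week {week:2d}: {rate}{flag}")
```

Points falling outside the natural process limits (week 7 in this made-up series) are candidates for special-cause investigation; only once those are dealt with does it make sense to talk about the common-cause behaviour addressed at CL5.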
Slide 5
Measurement within CMMI…
• MA PA is the enabler
• “…is to develop and sustain a measurement capability that is used to
support management information needs”
• Interpretation?
Slide 6
…Measurement within CMMI…
• MA Scope
– SG1 Align Measurement and Analysis Activities
• SP 1.1 Establish Measurement Objectives
• SP 1.2 Specify Measures
• SP 1.3 Specify Data Collection and Storage Procedures
• SP 1.4 Specify Analysis Procedures
– SG2 Provide Measurement Results
• SP 2.1 Collect Measurement Data (includes verification)
• SP 2.2 Analyze Measurement Data
• SP 2.3 Store Data and Results
• SP 2.4 Communicate Results (to aid decision making)
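
A sketch of how SG 1's outputs might be recorded so that SG 2 activities stay traceable to a stated objective. Nothing here is prescribed by CMMI; the class, field names and the example measure are illustrative assumptions only.

```python
# Illustrative (non-normative) record tying SG 1 specifications to SG 2 activities.
from dataclasses import dataclass, field
from statistics import median

@dataclass
class MeasureSpec:
    objective: str          # SP 1.1  why the measure exists
    measure: str            # SP 1.2  precise definition, including unit
    collection: str         # SP 1.3  who/when/how the data are collected
    analysis: str           # SP 1.4  how the data will be analysed and used
    data: list = field(default_factory=list)

    def collect(self, value):            # SP 2.1 (verification omitted here)
        self.data.append(value)

    def analyse(self):                   # SP 2.2
        return {"n": len(self.data), "median": median(self.data)}

spec = MeasureSpec(
    objective="Understand review effort to improve planning estimates",
    measure="Peer review preparation time per review, in person-hours",
    collection="Review moderator records hours at review close-out",
    analysis="Median per month, reported to the project manager",
)
for hours in (3.5, 2.0, 4.25, 3.0):
    spec.collect(hours)
print(spec.analyse())   # SP 2.4: communicate results to decision makers
```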
Slide 7
…Measurement within CMMI
• MA PA – Applicability for ML2
– “…support management information needs…”
– Project management (initially)
– “…at multiple levels in the organization…”
– and process management (GP 2.8)
– and products (product components provided by suppliers)
• Not considered explicitly
– testing? – testers tend to be a measurement ‘centre of excellence’
– development? – developers don’t (design not amenable to
measurement?)
Slide 8
Practical concerns 1:
• Fear
– ‘I don’t want to be measured’
• It is abstract and can be difficult (Pfleeger)
– Distinguish between metrics designers and metrics users
– and train accordingly
• Where does it fit?
– Tactical capability or organizational infrastructure, or mix?
Slide 9
Practical concerns 2:
• What should we measure?
– RTFM
– SG1
– Ask: what do you* need to know? Why?
• Who monitors and controls (measures) processes?
– At ML2, at ML3?
• Who monitors and controls the MA process?
• Metrics Repository
– Central/organization, or local/project?
* ‘you’, perhaps ‘we’, but not ‘they’
Slide 10
Common Circumstances 1:
• Product sizing rarely done, or done well
– Difficulty with identifying entities and their quantitative attributes
– Fix: analysts extract identities, counts and attributes of system elements from developer/estimators – start with the task-based estimation spreadsheet and work upstream: ‘What were they thinking?’
• Organizational measurement infrastructure in place
– Collects lots of data (if it moves, measure it)
– No traceability back to objectives (SG1 missing); data is orphaned from its rationale – can’t be reused
– Fix: discard collected data without definitions or rationale, then reverse engineer objectives (SG1) for existing data collection systems – usually reveals an opportunity to shed data collection activity and reduce costs, although this is rarely taken up
• Organizational measurement data is unverified – of unknown accuracy
– Used for admin/billing
– Not credible, known to be invalid (timesheets)
– Not used by collectors
– Data is for reporting, not using
– Fix: verify data – presumes SG1 (a minimal verification sketch follows this list)
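
A minimal sketch of the ‘verify data’ fix, assuming effort data recorded in person-hours against a measure definition of the SG1 kind. The record format, field names and credibility limits are invented for illustration; real checks would come from the measure’s own definition.

```python
# Illustrative verification of collected effort data against its definition.
# The range limits and record layout are assumptions, not from the slides.
def verify(records, lower=0.25, upper=40.0):
    """Return (accepted, rejected) effort records, values in person-hours."""
    accepted, rejected = [], []
    for rec in records:
        value = rec.get("hours")
        if value is None:
            rejected.append((rec, "missing value"))
        elif not (lower <= value <= upper):
            rejected.append((rec, "outside credible range"))
        else:
            accepted.append(rec)
    return accepted, rejected

records = [
    {"task": "code review", "hours": 3.5},
    {"task": "unit test", "hours": 0.0},      # timesheet filler, not credible
    {"task": "integration", "hours": None},   # never recorded
]
ok, bad = verify(records)
print(len(ok), "accepted;", [(r["task"], why) for r, why in bad])
```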
Slide 11
Common Circumstances 2:
• Good measurement
– Developed, owned and used locally, within teams
– Can be undervalued (seems obvious)
– MA SG1 implicit
• There is a limited ‘information horizon’
– Visibility is limited, and may be better that way
– Measurement data doesn’t travel well
– ‘Drill down’ is limited – even if it looks like it isn’t
Slide 12
Good Measurement 1:
1. Purpose clear and understood by collectors, analysts and decision makers
2. Measures are defined (not just described)
3. Data collectors are users (short feedback loops)
4. Accuracy and validity known (as minimal requirement)
5. Can stop collecting when no longer needed
Slide 13
Good Measurement 2:
1. KISS
– Minimal arithmetic, especially multiplication and division (this includes percentages)
– Arithmetic obscures much, reveals little
2. Non-parametric approaches
3. Consider ‘Exploratory Data Analysis’ (EDA) – see the sketch after this list
– Robust, widely applicable to ‘messy’ software engineering data
– Not the usual statistical approach
– Tukey
4. Use Graphics
– But not pie charts
– Let data show its information content – patterns, trends, outliers
– Tufte
5. GQMGtutu
– Goal Question Metric, Graphics – guided by Tufte and Tukey
– SEI GQ(I)M
6. SPC
– Later, much later
– SPC Rule #1: know what you’re doing
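
To make points 1 to 4 concrete, here is a small sketch of a Tukey-style exploratory summary: order statistics only (no means, no percentages) and a crude text plot that lets the data show its shape. The defect-fix-time data and the simple nearest-rank quartile rule are illustrative assumptions, not from the slides.

```python
# Sketch of non-parametric, EDA-flavoured summarisation with invented data.
defect_fix_days = [1, 1, 2, 2, 3, 3, 3, 4, 5, 5, 8, 13, 21]

def five_number_summary(values):
    s = sorted(values)
    n = len(s)
    def quartile(p):                      # simple nearest-rank rule, good enough for EDA
        return s[min(n - 1, int(p * n))]
    return {"min": s[0], "q1": quartile(0.25), "median": quartile(0.5),
            "q3": quartile(0.75), "max": s[-1]}

print(five_number_summary(defect_fix_days))

# Let the data show its shape: one line per value, no arithmetic at all.
for v in sorted(defect_fix_days):
    print(f"{v:3d} {'*' * v}")
```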
Slide 14
Slide 15
OXFORD SOFTWARE ENGINEERING LIMITED
9 Spinners Court, 53 West End,
Witney,
Oxfordshire
OX28 1NH
www.osel.co.uk
shelley@osel.netkonect.co.uk
Tel. +44 (0) 1993 700878
Slide 16
Slide 17
Supplementary Material…
Slide 18
[Figure, slides 19–24: a progressive build of the measurement model from Pfleeger (1998). Measurement maps an empirical relational system (the real world) onto a formal relational system (the mathematical world); mathematics and statistics produce results; interpretation turns those results into relevant empirical information, which supports decisions and actions. The final build shows refined measurement and improved interpretation leading to better decisions and actions.]
Slides 19–24
Slide 25
         I               II              III             IV
    X      Y        X      Y        X      Y        X      Y
  10.00   8.04    10.00   9.14    10.00   7.46     8.00   6.58
   8.00   6.95     8.00   8.14     8.00   6.77     8.00   5.76
  13.00   7.58    13.00   8.74    13.00  12.74     8.00   7.71
   9.00   8.81     9.00   8.77     9.00   7.11     8.00   8.84
  11.00   8.33    11.00   9.26    11.00   7.81     8.00   8.47
  14.00   9.96    14.00   8.10    14.00   8.84     8.00   7.04
   6.00   7.24     6.00   6.13     6.00   6.08     8.00   5.25
   4.00   4.26     4.00   3.10     4.00   5.39    19.00  12.50
  12.00  10.84    12.00   9.13    12.00   8.15     8.00   5.56
   7.00   4.82     7.00   7.26     7.00   6.42     8.00   7.91
   5.00   5.68     5.00   4.74     5.00   5.73     8.00   6.89

‘Anscombe’s Quartet’ – American Statistician, 1973
Slide 26
N = 11
Mean of X’s = 9.0
Mean of Y’s = 7.5
Regression line: Y = 3 + 0.5X
Standard error of estimate of slope = 0.118
t = 4.24
Sum of squares (X − X̄) = 110.0
Regression sum of squares = 27.50
Residual sum of squares of Y = 13.75
Correlation coefficient = 0.82
R² = 0.67
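
The statistics above hold for each of the four datasets. A few lines of Python (a sketch, using the data as tabulated on slide 26) recompute them for all four sets and show that they are essentially identical, which is exactly why the plots on slide 28 matter.

```python
# Recompute the slide's summary statistics for Anscombe's four datasets.
from statistics import mean

x123 = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
datasets = {
    "I":   (x123, [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]),
    "II":  (x123, [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]),
    "III": (x123, [7.46, 6.77, 12.74, 7.11, 7.81, 8.84, 6.08, 5.39, 8.15, 6.42, 5.73]),
    "IV":  ([8, 8, 8, 8, 8, 8, 8, 19, 8, 8, 8],
            [6.58, 5.76, 7.71, 8.84, 8.47, 7.04, 5.25, 12.50, 5.56, 7.91, 6.89]),
}

for name, (x, y) in datasets.items():
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    print(f"{name}: mean x={mx:.2f}  mean y={my:.2f}  "
          f"y = {intercept:.2f} + {slope:.3f}x  r={r:.3f}  R^2={r * r:.2f}")
```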
Slide 27
[Figure: the four Anscombe datasets (I–IV) plotted as X–Y scatter charts – identical summary statistics, four strikingly different patterns.]
Slide 28
Slide 29
Process Capability:
Indices for measuring process goodness
• Cp = (USL − LSL) / 6σ, or 2T / 6σ
– Cp < 1: process is incapable
– Cp > 1: process is capable (6σ processes have a Cp of 2)
– does not account for process drift, so...
• Cpk = the lesser of (USL − X̄) / 3σ and (X̄ − LSL) / 3σ
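
A small sketch of the Cp and Cpk calculations above. The specification limits, process mean and standard deviation are invented illustrative values; the point is that Cpk drops below Cp as the process centre drifts towards a specification limit.

```python
# Sketch of the capability indices on this slide with made-up numbers.
def cp(usl, lsl, sigma):
    """Potential capability: specification width over the six-sigma process spread."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl, lsl, mean, sigma):
    """Actual capability: penalises drift of the process mean towards either limit."""
    return min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))

usl, lsl = 20.0, 10.0        # upper / lower specification limits (assumed)
mean, sigma = 16.0, 1.2      # observed process centre and spread (assumed)

print(f"Cp  = {cp(usl, lsl, sigma):.2f}")          # > 1: potentially capable
print(f"Cpk = {cpk(usl, lsl, mean, sigma):.2f}")   # lower, because the mean has drifted
```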
Slide 30
Slide 31