Lecture 3 Lero SQ & PI - Measuring Processes


Software Quality & Process Improvement

Dr. Ita Richardson

Lero – the Irish Software Engineering Research Centre and Department of Computer Science & Information Systems

University of Limerick


LECTURE 3: MEASURING THE SOFTWARE PROCESS

Why Measure?

• Business & industry use a quantitative approach

• Measure:

– Product quality

– Productivity

– Efficiency

– Machine reliability

• Why should Software Engineering be different?


Quantitative Measures

• Length of project activity?

• Time remaining in project?

• Productivity of software engineers?

• Bottlenecks in the system?

• Percentage of time spent on testing vs development

• Number of errors detected

• Time spent fixing errors


Measurement as feedback

• Performance indicators

• Match against best practice

• Make required improvements

• Results in continuous improvement of process performance


Measurement

• Measurement in a chaotic environment is generally not useful

• Need a consistent and standard process

• Use a balanced set of indicators

• Measure against goals

Types of measures

• Process-related measures

– Number of change requests

– Cycle time

– Effectiveness of the process

• Project-related measures

– Project development time

– Cost of work to date

– Deviation of costs

– Staff productivity


Types of measures

• Product / customer-related measures

– User-reported bugs

– Customer satisfaction

– Source code size


Principles of Software Process Measurement

• Define clear objectives

– Key performance indicators

• Management support

– Supporting Business Goals

• Roles and responsibilities

• Cover both process and product

• Reflect business & engineering objectives


Issues to be considered

• How to measure business value of SPI?

– Increased productivity

– Early error detection and correction

– Cost of SPI

• How to measure qualitative benefits?

– Product availability

– Customer satisfaction

– Employee morale


Measurement Tools

• Understanding Data

– Histograms

– Bar Charts

– Pareto Charts

– Scatter diagrams

– Run charts

• Analyse Data

– Control Charts


Using the Data

• How do we analyse the data?

• What can we do within the organisation?

• Do we need more information?

• The data should prompt us to ask questions as well as answer them


Histograms

• Frequency counts

• Distribution of observed values

• Easy to compare distributions


Histogram

[Histogram: Number of Program Integrations (Pr. A) – x-axis: No. of Errors, in bins of 20 from 0 to 120; y-axis: No. of Programs, 0–18]
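
To make the binning step concrete, here is a minimal sketch that groups per-program error counts into the 20-error ranges shown above and prints a text histogram. The error counts are hypothetical, not taken from the slide.

```python
# Bin per-program error counts into 20-error ranges and print a text histogram.
from collections import Counter

error_counts = [5, 12, 18, 25, 31, 44, 47, 52, 58, 63, 71, 88, 95, 103]  # hypothetical data

# Map each count to the lower edge of its 20-wide bin, then tally the bins.
bins = Counter((e // 20) * 20 for e in error_counts)

for low in range(0, 120, 20):
    print(f"{low:>3}-{low + 20:<3} | {'#' * bins.get(low, 0)}")
```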

Bar Charts

• Sets of discrete values

• Data associated with individual entities


Bar Chart

[Bar chart: Change Requests by quarter (Q3 99 – Q2 01) – series: Requested, Changed, Refused, Future Versions]

[Bar chart: Change Requests by quarter (Q3 99 – Q2 01), 0–100% scale – series: % Changed, % Refused, % Future Versions]
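
As an illustration of plotting discrete values per entity, here is a minimal matplotlib sketch of a grouped bar chart in the style of the charts above. Only two of the series are shown, and the quarterly figures are assumed for illustration, not taken from the slide.

```python
# Grouped bar chart of change requests per quarter (hypothetical figures).
import matplotlib.pyplot as plt

quarters  = ["Q3 99", "Q4 99", "Q1 00", "Q2 00", "Q3 00", "Q4 00", "Q1 01", "Q2 01"]
requested = [20, 25, 30, 35, 28, 32, 38, 36]   # assumed values
changed   = [12, 15, 20, 22, 18, 20, 25, 24]   # assumed values

x = range(len(quarters))
width = 0.4
plt.bar([i - width / 2 for i in x], requested, width, label="Requested")
plt.bar([i + width / 2 for i in x], changed, width, label="Changed")
plt.xticks(list(x), quarters)
plt.xlabel("Quarter")
plt.ylabel("Change requests")
plt.legend()
plt.show()
```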

Scatter Diagrams

• Comparing one variable with another

• ‘Dependent’ vs ‘independent’ variables

• Cause and Effect


Scatter Diagram

LoC/Member vs Hours

Project   Duration (Hours)   LoC/Member
Alpha     150                43
Beta      264                117
Gamma     81                 50
Delta     426                65

[Scatter diagram: LoC/Member (y-axis, 0–140) vs Duration in Hours (x-axis, 0–500)]
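
A small sketch using the four project values from the table above: it plots the scatter and computes the Pearson correlation by hand, which is one way of checking whether a cause-and-effect relationship is plausible.

```python
# Scatter the project data (hours vs LoC/member) and compute Pearson's r.
import matplotlib.pyplot as plt

hours      = [150, 264, 81, 426]   # Alpha, Beta, Gamma, Delta
loc_member = [43, 117, 50, 65]

n = len(hours)
mx, my = sum(hours) / n, sum(loc_member) / n
cov = sum((x - mx) * (y - my) for x, y in zip(hours, loc_member))
sx = sum((x - mx) ** 2 for x in hours) ** 0.5
sy = sum((y - my) ** 2 for y in loc_member) ** 0.5
print(f"Pearson r = {cov / (sx * sy):.2f}")

plt.scatter(hours, loc_member)
plt.xlabel("Hours")
plt.ylabel("LoC / Member")
plt.show()
```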

Run Charts

• Values arranged in time sequence

• Basis for Control Charts

• Temporal

• Behaviour of Process


Run Chart

[Run chart: Cost of Test as % of Project Time – x-axis: Project (A–J); y-axis: % of total project time, 0–40]
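
A minimal run chart sketch in the spirit of the chart above. The percentages are hypothetical, and the median is drawn as a simple centre line, which is a common convention for run charts rather than anything stated on the slide.

```python
# Run chart: values plotted in sequence with the median as a centre line.
import matplotlib.pyplot as plt

projects = ["A", "B", "C", "D", "E", "F", "G", "H", "J"]
test_pct = [22, 30, 18, 35, 25, 28, 20, 33, 27]   # hypothetical percentages

median = sorted(test_pct)[len(test_pct) // 2]      # middle value of the odd-length list
plt.plot(projects, test_pct, marker="o")
plt.axhline(median, linestyle="--", label=f"Median = {median}")
plt.xlabel("Project")
plt.ylabel("% of total project time")
plt.legend()
plt.show()
```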

Sequential Run Chart

• As Run Charts

BUT

• some Grouping of Data exists


Sequential Run Chart

When did the training occur?

[Sequential run chart: Programmer Efficiency (y-axis, 0–0.9) by Project (M–Z)]

Pareto Charts

• Ranks outcomes

• Frequency counts in descending order

• Ordered by frequency, not time
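
A minimal sketch with hypothetical defect categories, ranking causes by frequency and reporting the cumulative percentage, as a Pareto chart would.

```python
# Pareto-style ranking: causes sorted by frequency with cumulative percentages.
defects = {"Logic": 42, "Interface": 25, "Requirements": 18, "Data handling": 9, "Other": 6}  # hypothetical

total = sum(defects.values())
cumulative = 0
for cause, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{cause:<14} {count:>3}  {100 * cumulative / total:5.1f}% cumulative")
```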


Variation

Total Variation = Common Cause Variation + Assignable Cause Variation
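
Read in terms of variances, and assuming the common and assignable components are independent (an assumption, not stated on the slide), the identity can be written as:

```latex
\sigma^2_{\text{total}} = \sigma^2_{\text{common}} + \sigma^2_{\text{assignable}}
```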


Common Cause Variation

• Natural

• Results are common across groups

• Stable consistent pattern

• Predictable

• Unexpected results are rare


Assignable Cause Variation

• Causes which could have been prevented

• Outside influences

• Shifts in quality

– People

– Processes

– Tools


Stable process

• In statistical control

• No assignable causes

– As they have been removed from the process

• Variability due to Common Causes

• Use Statistical Process Control

– Control Charts


Control Charts

• Show how the process has behaved

• R-chart (range chart)

– Plots the range of each sub-group, from smallest to largest value

– In control when the ranges do not exceed the limits of range variation

• X-chart (averages chart)

– In control when sub-group averages do not fall outside the upper/lower control limits

• Worked examples of control charts are widely available online


Statistical Process Control

• Improve the process through causal analysis

• Control limits

– Estimated from the process

– Cannot be assigned arbitrarily

– Set at ±3 sigma (standard deviations)

• Centreline

– The observed process average
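
To illustrate these calculations, here is a minimal sketch on hypothetical cycle-time data: the centreline is the observed average and the control limits sit at ±3 standard deviations. This is a simplified individuals-style calculation for illustration, not the sub-group method from the previous slide.

```python
# Estimate a centreline and +/- 3-sigma control limits from observed data.
from statistics import mean, stdev

cycle_times = [12.1, 11.8, 12.4, 12.0, 11.9, 12.6, 12.2, 11.7, 12.3, 12.1]  # hypothetical

centre = mean(cycle_times)
sigma = stdev(cycle_times)
ucl, lcl = centre + 3 * sigma, centre - 3 * sigma
print(f"Centreline = {centre:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")

out_of_control = [x for x in cycle_times if x > ucl or x < lcl]
print("Out-of-control points:", out_of_control or "none")
```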


Statistical Process Control

• Common cause variation

– Can only be studied once assignable causes have been eliminated

• Rational sub-grouping & rational sampling

– So that sub-groups contain only common cause variation

– To ensure narrow control limits


Out-of-control Tests

Instability exists where one of the following occurs:

• A single point falls outside the 3-sigma limits

• Two out of three successive values fall on the same side of the centreline and beyond 2 sigma

• Four out of five successive values fall on the same side of the centreline and beyond 1 sigma

• Eight successive values fall on the same side of the centreline

– Tests 2–4 apply to X-charts only
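
A minimal sketch of the four tests above, applied to a series of sub-group averages. The data are hypothetical, and the sigma zones are estimated from the series itself purely for illustration.

```python
# Check a series of sub-group averages against the four instability tests.
from statistics import mean, stdev

values = [10.2, 9.8, 10.5, 10.1, 9.9, 10.0, 10.3, 9.7, 10.4, 10.1, 10.6, 9.9]  # hypothetical
centre, sigma = mean(values), stdev(values)

def side(v):                # +1 above the centreline, -1 below
    return 1 if v > centre else -1

def beyond(v, k):           # point more than k sigma from the centreline
    return abs(v - centre) > k * sigma

def unstable(vals):
    for i, v in enumerate(vals):
        # Test 1: a single point outside 3 sigma
        if beyond(v, 3):
            return True
        # Test 2: two of three successive values, same side, beyond 2 sigma
        window = vals[max(0, i - 2): i + 1]
        if len(window) == 3 and any(
            sum(1 for w in window if side(w) == s and beyond(w, 2)) >= 2 for s in (1, -1)
        ):
            return True
        # Test 3: four of five successive values, same side, beyond 1 sigma
        window = vals[max(0, i - 4): i + 1]
        if len(window) == 5 and any(
            sum(1 for w in window if side(w) == s and beyond(w, 1)) >= 4 for s in (1, -1)
        ):
            return True
        # Test 4: eight successive values on the same side of the centreline
        window = vals[max(0, i - 7): i + 1]
        if len(window) == 8 and len({side(w) for w in window}) == 1:
            return True
    return False

print("Process unstable:", unstable(values))
```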


What do we do with Results?

• Remove assignable causes then

• Change the process then

• Continually improve


Remove assignable causes

• Find reasons for their existence

• Prevent their recurrence

• Introduces stability to process

• May use qualitative data

• May use other quantitative data


Change the process

Removing assignable causes requires:

• Identifying changes

• Designing changes

• Implementing changes


Continually improve

• Reducing variability

– Improves quality

– Reduces cost

– Improves time-to-market

The ultimate requirement is a STABLE & CAPABLE process

– Need to consider change management


Framework for Change Management

(Willman, 1996)

[Diagram: each row combines four elements – Pressure for Change, Leadership & Vision, Capable People and Effective Rewards. When all four are present the result is Successful Implementation; when an element is missing (shown as '?') the result is instead Evaporation, Frustration, Disengagement, Disillusionment or Disinterest.]


Acknowledgement

• The information presented in these slides has been collected from a variety of sources, including:

– Software Quality Assurance: From Theory to Implementation by Daniel Galin, 2003

– Software Process Improvement: Practical Guidelines for Business Success by Sami Zahran, 1998

– Research carried out by post-doctoral researchers and PhD students at Lero – the Irish Software Engineering Research Centre, Ireland, under the supervision of Dr. Ita Richardson

– Software Process: Improvement and Practice (journal)

– The SPIRE Handbook: Better, Faster, Cheaper Software Development in Small Organisations, edited by Marty Sanders (Version 1, 1998) and Jill Pritchet (Version 2, 2000)

• The research presented in this lecture has been partially supported by Science Foundation Ireland through the Global Software Development in SMEs Cluster Grant (no. 03/IN3/1408C) and Lero – the Irish Software Engineering Research Centre (CSET grant no. 03CE2/I303.1).
