Benchmarking

The value of benchmarking
IT projects
Harold van Heeringen
Software Cost Engineer, Sogeti Nederland B.V.
ISBSG president
NESMA board
The Russian Managers Association
Location: Moscow
Date: October 2013
Overview
Scope: Software Development
Benchmarking
Software Project Industry
Functional Size Measurement
Software Metrics
Historical Data: ISBSG
Project Benchmark Example
Organization Benchmark
Other uses for Benchmarking
Conclusions and final remarks
Benchmarking (wikipedia)
Benchmarking is the process of comparing one's business processes
and performance metrics to industry bests or best practices from other
industries.
Benchmarking is used to measure performance using a specific indicator (cost
per unit of measure, productivity per unit of measure, cycle time of x per unit
of measure or defects per unit of measure) resulting in a metric of
performance that is then compared to others.
This then allows organizations to develop plans on how to make
improvements or adapt specific best practices, usually with the aim of
increasing some aspect of performance. Benchmarking may be a one-off
event, but is often treated as a continuous process in which organizations
continually seek to improve their practices.
Where are we?
“Even the most detailed navigation map of an area is
useless if you don’t know where you are”
Benchmarking
Senior management of IT departments and organizations needs
to make decisions based on ‘where they are’ and ‘where they
want to go’.
Benchmarking is about determining ‘where you are’
compared to relevant peers, in order to make informed
decisions.
But how to measure and determine where you are?
Software project industry
Low ‘performance metrics’ maturity
Few performance measurement processes implemented
Few benchmarking processes implemented
Most organizations don’t know how good or how bad they
are in delivering or maintaining software.
These organizations are not able to assess their
competitive position, nor able to make informed strategic
decisions to improve their competitive position.
But…
Best in Class organizations deliver software up to 30 times
more productively than Worst in Class organizations
High Productivity, High Quality
More functionality for the users against lower costs
Shorter Time to Market – competitive advantage!
Worst in Class organizations will find themselves in trouble
in an increasingly competitive market
Outperformed by competition
Internal IT departments get outsourced
Commercial software houses fail to win new contracts
Important to know where you are! Benchmark is essential!
Performance means balance
Delivery to time and budget
Actual vs. Estimated Cost
Actual vs. Estimated Duration
Project Speed
Size / Duration
Project Productivity
Size / Effort
Project Quality
Post delivery defects / Size
Difficulty – low industry maturity
How to measure metrics like productivity, quality and
time-to-market in such a way that a meaningful comparison is
possible?
Comparing apples to apples
Functional Size Measurement
Function Point Analysis (NESMA, IFPUG or COSMIC)
Measure the functional user requirements – size in function points;
ISO standards – objective, independent, verifiable, repeatable;
Strong relation between functional size and project effort needed;
What to do with the results?
Project effort/duration/cost estimation
Benchmarking/performance measurement
Use in Request for Proposal management (answer price/FP questions)
What about historical data?
Company data (preferably for estimation)
Industry data (necessary for external benchmarking)
Unit of Measure (UoM)
Why are Function Points the best UoM to use in Benchmarking?
Functionality is of value to the client/business: more functionality means more
value. More lines of code (technical size) do not necessarily add value.
Function Points are measured independently of technical requirements
500 FP of functionality implemented in Java SOA architecture
=
500 FP of functionality implemented in Cobol mainframe
Function Points are measured independently of the implementation method
500 FP delivered in an agile development project
=
500 FP delivered in a COTS package implementation
Software metrics – some examples
Productivity
Productivity Rate: #Function points per staff month
PDR (Project Delivery Rate): #Effort hours per function point
Quality
Defect Density: #Defects delivered per 1000 function points
Time to Market
Speed: #Function points delivered per calendar month
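As a minimal sketch, the four metrics above are simple ratios of size, effort, duration and defect counts. The 140 hours/staff-month conversion below is an illustrative assumption, not an ISBSG figure:

```python
def productivity_rate(size_fp, effort_hours, hours_per_month=140):
    """Productivity Rate: function points delivered per staff month.
    hours_per_month is an assumed conversion factor."""
    return size_fp / (effort_hours / hours_per_month)

def pdr(size_fp, effort_hours):
    """Project Delivery Rate: effort hours per function point."""
    return effort_hours / size_fp

def defect_density(defects, size_fp):
    """Defects delivered per 1000 function points."""
    return defects / size_fp * 1000

def speed(size_fp, duration_months):
    """Function points delivered per calendar month."""
    return size_fp / duration_months

# Using the project figures that appear later in this deck:
print(pdr(411, 5730))            # ≈ 13.9 h/FP
print(speed(411, 11))            # ≈ 37.4 FP per calendar month
print(defect_density(23, 411))   # ≈ 56.0 defects/1000 FP
```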
Performance Measurement
Measure the size of completed projects
Project size in Function Points
Product size in Function Points
Collect and analyze the data
Effort hours, duration, defects
Normalize the data when necessary
Store the data in the corporate database
Benchmark the project, internally and externally
Report metrics and trends
Different reports for different stakeholders
Depending on goals of the stakeholder
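A minimal sketch of the kind of record such a corporate database might store per completed project; the field names and schema are illustrative assumptions, not a standard:

```python
from dataclasses import dataclass

@dataclass
class ProjectRecord:
    """One completed project, as stored in the corporate metrics
    database. Fields are illustrative, not a standard schema."""
    name: str
    size_fp: int               # functional size in function points
    effort_hours: float        # normalized effort across all phases
    duration_months: float
    post_delivery_defects: int

    @property
    def pdr(self) -> float:
        """Project Delivery Rate: effort hours per function point."""
        return self.effort_hours / self.size_fp

record = ProjectRecord("Project X", 411, 5730, 11, 23)
print(round(record.pdr, 1))  # 13.9
```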
Historical data: ISBSG repositories
International Software Benchmarking Standards Group
Independent and not-for-profit
Grows and exploits two repositories of software data:
New development projects and enhancements (> 6000 projects)
Maintenance and support (> 1000 applications)
Everybody can submit project data
Data Collection Questionnaire (DCQ) on the site
Anonymous
Free benchmark report in return
ISBSG
Mission: “To improve the management of IT resources by both
business and government, through the provision and exploitation of
public repositories of software engineering knowledge that are
standardized, verified, recent and representative of current
technologies”.
All ISBSG data is
validated and rated in accordance with its quality guidelines
current
representative of the industry
independent and trusted
captured from a range of organization sizes and industries
Industry leaders around the world contribute to the ISBSG’s
development, offering the highest metrics expertise worldwide
www.isbsg.org
Example – project benchmark
Project X was completed and the following data was collected:
Primary programming language: Java
Effort hours spent: 5730
Duration: 11 months
Defects found after delivery: 23
The functional size of the project was measured: 411 FP
Software metrics:
Project Delivery Rate: 5730/411 = 13.9 h/FP
Project Speed: 411/11 = 37.4 FP per calendar month
Defect Density: (23/411) × 1000 = 56.0 defects/1000 FP
Example: Benchmark
ISBSG ‘New Developments & Enhancements’
Select the right ‘peer group’
Data Quality A or B
Count approach: IFPUG 4.x or NESMA
Primary Programming Language = ‘Java’
300 FP < Project Size < 500 FP
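The peer-group selection above can be sketched as a filter over a project dataset. The field names and sample records below are illustrative assumptions, not the actual ISBSG schema:

```python
# Illustrative sample records; real ISBSG data has many more fields.
projects = [
    {"quality": "A", "approach": "IFPUG 4.x", "language": "Java",  "size_fp": 350, "pdr": 8.1},
    {"quality": "B", "approach": "NESMA",     "language": "Java",  "size_fp": 420, "pdr": 12.4},
    {"quality": "C", "approach": "IFPUG 4.x", "language": "Java",  "size_fp": 410, "pdr": 9.0},
    {"quality": "A", "approach": "IFPUG 4.x", "language": "COBOL", "size_fp": 380, "pdr": 15.2},
]

# Apply the peer-group criteria from this slide.
peer_group = [
    p for p in projects
    if p["quality"] in ("A", "B")
    and p["approach"] in ("IFPUG 4.x", "NESMA")
    and p["language"] == "Java"
    and 300 < p["size_fp"] < 500
]
print(len(peer_group))  # 2 projects match all criteria
```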
Results – project benchmark

                PDR (h/FP)   Speed (FP/month)   Defect Density (/1000 FP)
N                  488            428                 154
Minimum            0.1            9.4                 0.0
Percentile 10      2.5           23.1                 0.0
Percentile 25      4.7           32.5                 0.0
Median             9.8           53.8                 3.7
Percentile 75     18.4           95.4                17.9
Percentile 90     28.9          130.2                40.1
Maximum          621.3          476.0               366.5
Average           15.2           70.9                18.6
Project Delivery Rate: 5730/411 = 13.9 h/FP
Project Speed: 411/11 = 37.4 FP per calendar month
Defect Density: (23/411) × 1000 = 56.0 defects/1000 FP
This project was carried out less productively and more slowly than the
market average, and its quality is worse than average.
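The comparison against the peer group can be sketched as locating a project's value among the benchmark percentiles. The PDR thresholds below are taken from the results table on this slide; note that for PDR a lower value is better:

```python
# Percentile thresholds for PDR from the ISBSG peer group (h/FP).
pdr_percentiles = {10: 2.5, 25: 4.7, 50: 9.8, 75: 18.4, 90: 28.9}

def band(value, thresholds):
    """Name the highest benchmark percentile the value exceeds.
    The caller interprets direction (for PDR, lower is better)."""
    exceeded = [p for p, t in sorted(thresholds.items()) if value > t]
    return f"above P{max(exceeded)}" if exceeded else "below P10"

# Project X has a PDR of 13.9 h/FP:
print(band(13.9, pdr_percentiles))  # "above P50": worse than the median PDR
```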
Organization Benchmark
[Charts: Organization Y Productivity Index, Speed Index and Quality Index,
plotted per year (<2009 baseline, 2009–2012) against the industry level, a
target of baseline +50%, and a lower bound of baseline −40% (−50% for quality)]
Analysis:
- Until 2010, the organization was improving
- After 2010/2011, the trends went the wrong way
- Recommendation: find the cause and draw up an improvement plan
Other uses for Benchmarking
Vendor selection, based on productivity, speed or quality metrics
Definition of SLAs (or other KPIs) based on market
average performance
Establish a baseline from which to measure future improvement
Explain to the client/business that a project was carried out in a
‘better-than-average’ way, while the client may perceive otherwise
Conclusions and final remarks
Benchmarking is essential in the strategic management of an
organization. It helps understanding the competitive position and it
help to identify the ‘problem areas’, or ‘improvement areas’.
There are proven tools, techniques, models and historical data
available to carry out benchmarks in a fairly low-cost way.
This presentation focused on software development, because this
is usually the main concern of the business/client. Benchmarking
of other aspects of IT is similar, but with other performance metrics.
Harold van Heeringen
harold.van.heeringen@sogeti.nl
@haroldveendam
haroldveendam
haroldvanheeringen
Software Cost Engineer
NESMA board member
ISBSG president
COSMIC International Advisory Council, NL representative