
University of Southern California
Center for Systems and Software Engineering
Software Metrics and
Measurements
Supannika Koolmanojwong
CS510
Outline
• General Concepts about Metrics
• Example of Metrics
• Agile Metrics
• Metrics from Empirical Data
Measurements in daily life
Why do we measure?
Objectives of software measurement
• “You cannot control what you cannot measure.” – Tom DeMarco
• “Not everything that counts can be counted. Not everything that is counted counts.” – Albert Einstein
Software Metrics
• Numerical data related to software development
• Strongly support software project management activities
• Can be directly observable quantities or derived from them
A simplified measurement
information model
[Figure: measurement information model. Information needs, objectives, and control drive measurements of process attributes and work products; the measurements yield information products that support decisions and actions. Ref: Ebert and Dumke 2007]
How are software measurements used?
• Understand and communicate
• Specify and achieve objectives
• Identify and resolve problems
• Decide and improve
Measurement Standard
How to do:
– ISO/IEC 12207 – Software Life Cycle Processes
– ISO/IEC 15288 – System Life Cycle Processes
– SWEBOK – Software Engineering Body of Knowledge
– PMBOK – Project Management Body of Knowledge
How to do better:
– CMMI – Capability Maturity Model Integration
– ISO/IEC 15504 – Software Process Capability Determination
– ISO 9001 – Quality Management System
– ISO/IEC 9126 – Software Product Quality
– TL 9000, AS 9100, etc. – Objectives adaptations
How to measure what you are doing:
– ISO/IEC 15939:2002 – Software Measurement Process
Ground rules for metrics
• Metrics must be:
– Understandable to be useful
– Economical
– Field tested
– Highly leveraged
– Timely
– Giving proper incentives for process improvement
– Evenly spaced throughout all phases of development
– Useful at multiple levels
http://www.stsc.hill.af.mil/resources/tech_docs/gsam3/chap13.pdf
Measurements for Senior Management
• Easy and reliable visibility of business performance
• Forecasts and indicators where action is needed
• Drill-down into underlying information and commitments
• Flexible resource refocus
Measurements for Project Management
• Immediate project reviews
• Status and forecasts for quality, schedule, and budget
• Follow-up action points
• Reports based on consistent raw data
Project management supporting metrics
1. Planning – Metrics serve as a basis for cost estimating, training planning, resource planning, scheduling, and budgeting.
2. Organizing – Size and schedule metrics influence a project's organization.
3. Controlling – Metrics are used to status and track software development activities for compliance to plans.
4. Improving – Metrics are used as a tool for process improvement, to identify where improvement efforts should be concentrated, and to measure the effects of process improvement efforts.
Measurements for Engineers
• Immediate access to team planning and progress
• Get visibility into own performance and how it can be improved
• Indicators that show weak spots in deliverables
• Focus energy on software development
The E4-Measurement Process
[Figure: the E4 measurement process. Business-process objectives, needs, environment, and resources feed four steps: 1. Establish, 2. Extract, 3. Evaluate, 4. Execute, producing decisions, redirection, and updated plans. Ref: Ebert and Dumke 2007]
Aggregation of information
• Enterprise – cash flow, shareholder value, operations cost
• Division – cost reduction, sales, margins, customer service
• Product line / department – sales, cost reduction, innovative products, level of customization
• Projects – cycle time, quality, cost, productivity, customer satisfaction, resources, skills
SMART goals
• Specific – precise
• Measurable – tangible
• Accountable – in line with individual responsibilities
• Realistic – achievable
• Timely – suitable for the current needs
What do you want to measure?
• Processes
– Software-related activities
• Products
– Artifacts, deliverables, documents
• Resources
– The items which are inputs to the process
Components of software measurements
Example of Metrics
• Progress / effort / cost indicators
• Earned value management
• Requirements / code churn
• Defect-related metrics
• Test-related metrics
Size
• How big is the healthcare.gov website?
– http://www.informationisbeautiful.net/visualizations/million-lines-of-code/
Size
• Earth System Modeling Framework Project
http://www.earthsystemmodeling.org/metrics/sloc.shtml
Progress Indicator
Effort Indicator
Cost Indicator
Earned value management
• Planned Value (PV) or Budgeted Cost of Work Scheduled (BCWS)
• Earned Value (EV) or Budgeted Cost of Work Performed (BCWP)
http://en.wikipedia.org/wiki/Earned_value_management
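With Actual Cost (AC, or ACWP) added to PV and EV, the standard earned-value variances and indices can be sketched as below; the dollar figures are made-up illustrations, not real project data.

```python
# Standard earned-value formulas; inputs are illustrative assumptions.
def earned_value_indicators(pv, ev, ac):
    """pv: Planned Value (BCWS), ev: Earned Value (BCWP), ac: Actual Cost (ACWP)."""
    return {
        "schedule_variance": ev - pv,  # SV < 0 means behind schedule
        "cost_variance": ev - ac,      # CV < 0 means over budget
        "spi": ev / pv,                # Schedule Performance Index
        "cpi": ev / ac,                # Cost Performance Index
    }

status = earned_value_indicators(pv=100_000, ev=80_000, ac=90_000)
print(status["spi"], status["cpi"])  # SPI 0.8: behind schedule; CPI < 1: over budget
```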
Burndown Chart
http://en.wikipedia.org/wiki/Burn_down_chart
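A burndown chart plots remaining work against time. A minimal sketch of computing that series, assuming hypothetical daily story-point completions for one sprint:

```python
# A sprint starts with `total_points` of work; each day some points complete.
total_points = 50
completed_per_day = [5, 8, 0, 6, 10, 7, 9, 5]  # hypothetical daily completions

remaining = []
left = total_points
for done in completed_per_day:
    left -= done
    remaining.append(left)

print(remaining)  # the series the burndown chart plots against the day axis
```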
Requirements Churn/
Requirements Creep/
Requirements Volatility
• Number of changes to system requirements in each phase/week/increment
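Measuring volatility this way amounts to counting change-log entries per increment; a small sketch with a made-up change log:

```python
from collections import Counter

# Hypothetical change log: (increment, requirement_id, change_type).
change_log = [
    (1, "REQ-01", "added"), (1, "REQ-02", "added"),
    (2, "REQ-01", "modified"), (2, "REQ-03", "added"), (2, "REQ-02", "deleted"),
    (3, "REQ-03", "modified"),
]

# Requirements churn per increment = changes logged in that increment.
churn_per_increment = Counter(inc for inc, _, _ in change_log)
print(dict(churn_per_increment))  # {1: 2, 2: 3, 3: 1}
```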
Code Churn
• Software change history
• Large / recent changes
• Total added, modified, and deleted LOC
• Number of times that a binary was edited
• Number of consecutive edits
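A simple churn sketch over the change history, assuming per-commit added/deleted line counts of the kind `git log --numstat` reports for a file:

```python
# Hypothetical per-commit diff stats for one file (assumed numbers).
commits = [
    {"added": 120, "deleted": 10},
    {"added": 35, "deleted": 60},
    {"added": 8, "deleted": 2},
]

total_churn = sum(c["added"] + c["deleted"] for c in commits)  # churned LOC
edit_count = len(commits)                                      # times the file was edited
print(total_churn, edit_count)  # 235 3
```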
Code Complexity
• Gathered from the code itself
• Multiple complexity values
• Cyclomatic complexity
• Fan-in / fan-out of functions
• Lines of code
• Weighted methods per class
• Depth of inheritance
• Coupling between objects
• Number of subclasses
• Total global variables
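As a rough sketch, McCabe's cyclomatic complexity can be approximated as 1 plus the number of decision points in a function. The AST-based counter below is a simplification (production analyzers count more constructs, e.g. each boolean operand and comprehension condition), and the `classify` sample is a hypothetical example:

```python
import ast

# Simplified set of branching constructs; real tools count more cases.
DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp)

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe complexity: 1 + number of decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))

sample = """
def classify(x):
    if x < 0:
        return "negative"
    for i in range(x):
        if i % 2 == 0:
            continue
    return "done"
"""
print(cyclomatic_complexity(sample))  # 4: base path plus if, for, nested if
```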
Code coverage
• Degree to which the source code is tested
• Statement coverage – has each node in the program been executed?
• Branch coverage – has each control structure been evaluated both to true and false?
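Statement coverage can be sketched with Python's tracing hook, which records each executed line; real tools such as coverage.py are far more thorough. The `absolute` function under test here is a hypothetical example:

```python
import sys

executed = set()  # line numbers of the function under test that actually ran

def tracer(frame, event, arg):
    # Record 'line' events only for the function under test.
    if event == "line" and frame.f_code.co_name == "absolute":
        executed.add(frame.f_lineno)
    return tracer

def absolute(x):
    if x < 0:
        return -x
    return x

sys.settrace(tracer)
absolute(5)          # exercises only the non-negative branch
sys.settrace(None)

print(len(executed))  # 2: the `return -x` statement was never reached
```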
Code Coverage
http://www.firstlinesoftware.com/metrics_group2.html
JUnit Code Coverage
The tool instruments byte code with extra code to measure which statements are and are not reached.
[Figure: a code coverage report showing line-level and package-level coverage]
http://www.cafeaulait.org/slides/albany/codecoverage/Measuring_JUnit_Code_Coverage.html
Defect reporting metric
• Can be categorized by:
– Status: remaining / resolved / found
– Defect source: requirements / design / development
– Where found: peer review / unit testing / sanity check
– Time: defect arrival rate / defect age
Defect Status
Defect Density
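Defect density is commonly reported as defects per thousand source lines of code (KSLOC); a minimal sketch with assumed counts:

```python
# Defect density = defects found / size in KSLOC; the counts are assumptions.
def defect_density(defects: int, sloc: int) -> float:
    return defects / (sloc / 1000)

print(defect_density(defects=45, sloc=30_000))  # 1.5 defects per KSLOC
```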
Test Pass Coverage
http://www.jrothman.com/Papers/QW96.html
Defect Density
Defects per LOC
Developer Code Review
After 60–90 minutes, our ability to find defects drops off precipitously.
http://answers.oreilly.com/topic/2265-best-practices-for-developer-code-review/
As the size of the code under review increases, our ability to find all the defects decreases. Don't review more than 400 lines of code at a time.
http://answers.oreilly.com/topic/2265-best-practices-for-developer-code-review/
Top 6 Agile Metrics
Ref: Measuring Agility, Peter Behrens
Velocity = Work Completed per sprint
Ref: Measuring Agility, Peter Behrens
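Since velocity is simply points completed per sprint, a rolling average supports forecasting how many sprints the remaining backlog needs; the sprint data below is hypothetical:

```python
# Hypothetical story points completed in the last five sprints.
sprints = [23, 30, 27, 25, 31]

average_velocity = sum(sprints) / len(sprints)      # rolling-average velocity
backlog_remaining = 136                             # assumed remaining story points
sprints_to_finish = backlog_remaining / average_velocity
print(average_velocity, round(sprints_to_finish, 1))  # 27.2 5.0
```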
Measurements at the organizational level
• Empirical analysis
• Change from the top
Richard W. Selby, Northrop Grumman Space Technology, ICSP '09
Title: "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems"
Measurements for progress vs predictions
Project Management
– For measurements: effort and budget tracking; requirements status; task status; top 10 risks
– For predictions: cost to complete; schedule evolution
Quality Management
– For measurements: code stability; open defects; review status and follow-up
– For predictions: residual defects; reliability; customer satisfaction
Requirements Management
– For measurements: analysis status; specification progress
– For predictions: requirements volatility / completeness
Construction
– For measurements: status of documents; change requests; review status; design progress of requirements
– For predictions: cost to complete; time to complete
Test
– For measurements: test progress (defects, coverage, efficiency, stability)
– For predictions: residual defects; reliability
Transition, deployment
– For measurements: field performance (failures, corrections); maintenance effort
– For predictions: reliability; maintenance effort
Ref: Ebert and Dumke, 2007
Recommended books
• Practical Software Measurement: Objective Information for Decision Makers, by John McGarry, David Card, Cheryl Jones, and Beth Layman (2001)
• Software Measurement: Establish – Extract – Evaluate – Execute, by Christof Ebert and Reiner Dumke (2010)
References
• http://sunset.usc.edu/classes/cs577b_2001/metricsguide/metrics.html
• Fenton, N.E., Software Metrics: A Rigorous Approach, Chapman and Hall, 1991.
• http://www.stsc.hill.af.mil/resources/tech_docs/gsam3/chap13.pdf
• Christof Ebert and Reiner Dumke, Software Measurement: Establish, Extract, Evaluate, Execute, Springer, 2007.
• http://se.inf.ethz.ch/old/teaching/2010-S/0276/slides/kissling.pdf
• Richard W. Selby, Northrop Grumman Space Technology, "Synthesis, Analysis, and Modeling of Large-Scale Mission-Critical Embedded Software Systems," ICSP '09.
• Peter Behrens, Measuring Agility.
• [Nikora 91] Nikora, Allen P., Error Discovery Rate by Severity Category and Time to Repair Software Failures for Three JPL Flight Projects, Software Product Assurance Section, Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099, November 5, 1991.