The Economics of Systems and Software Reliability Assurance
Barry Boehm, USC
CS 510, Fall 2015
©USC-CSSE
Outline
• The business case for systems and software
reliability
– Example: net value of test aids
– Value depends on stakeholder value propositions
• What qualities do stakeholders really value?
– Safety, accuracy, response time, …
– Attributes often conflict
• Systems and software reliability in a
competitive world: values and tradespaces
• Conclusions
Software Reliability Business Case
• Systems and software reliability investments
compete for resources
– With investments in functionality, response time,
adaptability, speed of development, …
• Each investment option generates curves of net
value and ROI (see the sketch below)
– Net Value: NV = PV(benefit flows - cost flows)
– Present Value: PV discounts benefit and cost flows at future times back to the present
– Return on Investment: ROI = NV / PV(cost flows)
• Value of benefits varies by stakeholder and role
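To make these definitions concrete, here is a minimal Python sketch of NV and ROI computed from discounted cash flows; the 8% discount rate and the yearly flows are illustrative assumptions, not figures from the course.

```python
# Sketch: net value and ROI from discounted benefit and cost flows.
# The discount rate and the flows are illustrative assumptions.

def present_value(flows, rate=0.08):
    """Discount a list of (year, amount) flows back to year 0."""
    return sum(amount / (1 + rate) ** year for year, amount in flows)

# Hypothetical reliability investment: costs up front, benefits over 3 years.
cost_flows = [(0, 100.0), (1, 20.0)]
benefit_flows = [(1, 60.0), (2, 70.0), (3, 80.0)]

pv_benefits = present_value(benefit_flows)
pv_costs = present_value(cost_flows)

nv = pv_benefits - pv_costs     # NV = PV(benefit flows - cost flows)
roi = nv / pv_costs             # ROI = NV / PV(cost flows)

print(f"PV(benefits) = {pv_benefits:.1f}, PV(costs) = {pv_costs:.1f}")
print(f"NV = {nv:.1f}, ROI = {roi:.2f}")
```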
Testing Business Case
• Vendor proposition
– Our test data generator will cut your test costs in half
– We’ll provide it to you for 30% of your test costs
– After you run all your tests for 50% of your original costs,
you’re 20% ahead
• Any concerns with vendor proposition?
– The test data generator is value-neutral*
– It treats every test case and every defect as equally important
– Usually, 20% of the test cases cover 80% of the business value
* As are most current engineering techniques
20% of Features Provide 80% of Value:
Focus Testing on These (Bullock, 2000)
[Figure: % of value for correct customer billing (0 to 100) vs. customer type (1 to 15). Roughly 20% of the customer types account for about 80% of the billing value; an automated test generation tool treats all tests as having equal value.]
Value-Based Testing Provides More Net Value
[Figure: net value NV vs. percent of tests run. Value-based testing peaks at (30, 58); the test data generator climbs steadily to (100, 20).]

% Tests   Test Data Generator     Value-Based Testing
          Cost   Value   NV       Cost   Value   NV
0         30     0       -30      0      0       0
10        35     10      -25      10     50      40
20        40     20      -20      20     75      55
30        45     30      -15      30     88      58
40        50     40      -10      40     94      54
…         …      …       …        …      …       …
100       80     100     +20      100    100     0
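A small Python sketch reproducing the table's two net-value curves; the cost models (a 30% up-front tool fee plus half-cost test runs for the generator; full-cost runs for value-based testing) and the value-based value percentages are read directly from the table above.

```python
# Sketch: net value vs. percent of tests run for the two strategies.
# Value-based value percentages are taken from the table above.

pcts      = [0, 10, 20, 30, 40, 100]
vbt_value = [0, 50, 75, 88, 94, 100]   # highest-value tests run first

for pct, value in zip(pcts, vbt_value):
    tdg_cost = 30 + 0.5 * pct   # 30% tool fee, then tests at half cost
    tdg_value = pct             # value-neutral: value accrues linearly
    vbt_cost = pct              # full cost per test, no tool fee
    print(f"{pct:3d}% tests: TDG NV = {tdg_value - tdg_cost:6.1f}, "
          f"VBT NV = {value - vbt_cost:6.1f}")
```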
Value-Neutral Defect Fixing Is Even Worse
Pareto 80-20 Business Value
[Figure: % of value for correct customer billing vs. customer type, as before. Annotations: automated test generation tool (all tests have equal value); value-neutral defect fixing (quickly reduce # of defects).]
Industrial Research Results
Study 2: Prioritize testing scenarios to be
executed [Li TR1 2011]
• Case Study:
– Galorath Inc. (2011 Summer)
– Project: Installation Process Automation
– Challenge: testing all 69 scenarios is infeasible under limited testing resources
Industrial Research Results
Study 2: Prioritize testing scenarios to be executed
APBIE: Accumulated Percentage of Business Importance Earned
Case study results:
[Figure: PBIE (percentage of business importance earned) vs. number of scenarios executed (8 to 30), with a "stop testing" point marked. PBIE-1 (value-based) climbs fastest, reaching 100%; PBIE-3 (value-neutral) trails it; PBIE-2 (value-neutral worst case) is lowest. APBIE-1 = 70.99%, APBIE-2 = 10.08%, APBIE-3 = 32.10%.]
– H-t1: value-based prioritization does not increase APBIE
– Reject H-t1
– Value-based prioritization can improve the cost-effectiveness of testing
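The metric is easy to compute once each scenario carries a business-importance weight. Here is a sketch with illustrative weights (not the Galorath case-study data), taking APBIE as the average PBIE across execution points, which is an assumption about the aggregation:

```python
# Sketch: PBIE/APBIE for a test-scenario ordering.
# Scenario weights are illustrative, not the actual case-study data.

scenarios = [("s1", 30), ("s2", 25), ("s3", 20), ("s4", 15), ("s5", 10)]
total_bi = sum(bi for _, bi in scenarios)

def pbie_curve(ordering):
    """Percentage of business importance earned after each scenario."""
    earned, curve = 0, []
    for _, bi in ordering:
        earned += bi
        curve.append(earned / total_bi)
    return curve

value_based = sorted(scenarios, key=lambda s: s[1], reverse=True)
worst_case  = sorted(scenarios, key=lambda s: s[1])

for label, ordering in [("value-based", value_based),
                        ("value-neutral worst case", worst_case)]:
    curve = pbie_curve(ordering)
    apbie = sum(curve) / len(curve)   # assumed: average PBIE over the run
    print(f"{label:24s}: APBIE = {apbie:.1%}")
```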
There is No Universal Reliability Metric
• Different stakeholders rely on different value attributes
– Protection: safety, security, privacy
– Robustness: reliability, availability, survivability
– Quality of Service: performance, accuracy, ease of use
– Adaptability: evolvability, interoperability
– Affordability: cost, schedule, reusability, maintainability
• Value attributes continue to tier down
– Performance: response time, resource consumption (CPU,
memory, communications, power, fuel)
• Value attributes are scenario-dependent
– 5 seconds normal response time; 2 seconds in crisis
• Value attributes often conflict
– Most often with performance and affordability
Implications for Reliability Engineering
• There is no universal quality metric to optimize
• Need to identify system’s success-critical stakeholders
– And their quality priorities
• Need to balance satisfaction of stakeholder dependencies
– Stakeholder win-win negotiation
– Quality attribute tradeoff analysis
• Need value-of-quality models, methods, and tools
Competing on Cost and Quality
- adapted from Michael Porter, Harvard
[Figure: ROI vs. price or cost point. Zones from low to high price: too cheap (low quality); cost leader (acceptable quality); stuck in the middle; quality leader (acceptable cost); overpriced.]
Competing on Schedule and Quality
- A risk analysis approach
• Risk Exposure RE = Prob (Loss) * Size (Loss)
– “Loss” – financial; reputation; future prospects, …
• For multiple sources of loss:
RE = Σ_sources [Prob(Loss) * Size(Loss)]_source
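A minimal sketch of the summation; the loss sources, probabilities, and sizes below are illustrative assumptions:

```python
# Sketch: total risk exposure summed over loss sources.
# All probabilities and loss sizes are illustrative assumptions.

loss_sources = {
    # source:           (P(Loss), Size(Loss) in $M)
    "critical defects":  (0.05, 10.0),
    "minor defects":     (0.40, 0.2),
    "market-share loss": (0.20, 5.0),
}

re_total = 0.0
for name, (p, s) in loss_sources.items():
    re = p * s
    re_total += re
    print(f"RE({name}) = {p:.2f} * {s:4.1f} = {re:.2f} $M")
print(f"Total RE = {re_total:.2f} $M")
```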
Example RE Profile: Time to Ship
- Loss due to unacceptable quality
[Figure: RE = P(L) * S(L) vs. time to ship (amount of testing). Shipping early means many defects (high P(L)) and critical defects (high S(L)); shipping late means few defects (low P(L)) and minor defects (low S(L)).]
Example RE Profile: Time to Ship
- Loss due to unacceptable quality
- Loss due to market share erosion
[Figure: as above, plus a market-share-erosion RE curve that rises with time to ship: few, weak rivals early (low P(L), low S(L)); many, strong rivals late (high P(L), high S(L)).]
Example RE Profile: Time to Ship
- Sum of Risk Exposures
[Figure: the sum of the two RE curves is U-shaped; its minimum marks the sweet spot for time to ship.]
Comparative RE Profile:
Safety-Critical System
[Figure: for a safety-critical system, higher S(L) for defects shifts the high-Q sweet spot rightward, toward more testing than the mainstream sweet spot.]
Comparative RE Profile:
Internet Startup
[Figure: for an internet startup, higher S(L) for delays shifts the low-TTM (time to market) sweet spot leftward, toward less testing than the mainstream sweet spot.]
How Much Testing is Enough?
- Early Startup: Risk due to low dependability
- Commercial: Risk due to low dependability
- High Finance: Risk due to low dependability
- Risk due to market share erosion
[Figure: combined risk exposure RE = P(L) * S(L) (0 to 1) vs. RELY rating, with curves for market-share erosion, Early Startup, Commercial, and High Finance; each sector's sweet spot lies at the minimum of its combined curve.]

RELY rating:                     VL      L       N       H       VH
Added % test time (COCOMO II):   0       12      22      34      54
P(L) (COQUALMO):                 1.0     .475    .24     .125    .06
S(L), Early Startup:             .33     .19     .11     .06     .03
S(L), Commercial:                1.0     .56     .32     .18     .10
S(L), High Finance:              3.0     1.68    .96     .54     .30
RE(market), REm:                 .008    .027    .09     .30     1.0
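Using the values in the table, a short sketch computes each sector's combined risk exposure and locates its sweet spot (the RELY rating minimizing RE); this assumes the combined RE is simply the dependability RE plus the market RE, as the earlier profile slides suggest.

```python
# Sketch: combined RE = P(L) * S(L) + RE(market) per RELY rating,
# using the values from the table above.

rely      = ["VL", "L", "N", "H", "VH"]
p_loss    = [1.0, 0.475, 0.24, 0.125, 0.06]    # COQUALMO P(L)
re_market = [0.008, 0.027, 0.09, 0.30, 1.0]    # market-share erosion RE
s_loss = {
    "Early Startup": [0.33, 0.19, 0.11, 0.06, 0.03],
    "Commercial":    [1.0, 0.56, 0.32, 0.18, 0.10],
    "High Finance":  [3.0, 1.68, 0.96, 0.54, 0.30],
}

for sector, sl in s_loss.items():
    combined = [p * s + m for p, s, m in zip(p_loss, sl, re_market)]
    sweet = min(range(len(rely)), key=lambda i: combined[i])
    curve = ", ".join(f"{re:.3f}" for re in combined)
    print(f"{sector:13s}: RE = [{curve}]; sweet spot at RELY = {rely[sweet]}")
```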
Current COQUALMO System
[Diagram: COCOMO II and COQUALMO. Inputs: software size estimate; software platform, project, product, and personnel attributes. COCOMO II produces the software development effort, cost, and schedule estimate. The same inputs drive COQUALMO's Defect Introduction Model; defect removal profile levels (automation, reviews, testing) drive its Defect Removal Model. Outputs: number of residual defects and defect density per unit of size.]
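A sketch of the COQUALMO arithmetic: defects introduced per artifact type are thinned by each removal profile in turn. The total introduction rate of 60 defects/KSLOC matches the next chart's Very Low removal point; the split across artifact types and the removal fractions are illustrative assumptions, not calibrated COQUALMO values.

```python
# Sketch: residual defects = introduced defects thinned by three removal
# profiles. The split and removal fractions are illustrative, not calibrated.

introduced_per_ksloc = {"requirements": 10, "design": 20, "code": 30}  # sums to 60

removal_fraction = {        # fraction removed at some assumed rating level
    "automated analysis": 0.10,
    "peer reviews":       0.30,
    "execution testing":  0.40,
}

size_ksloc = 100
residual = 0.0
for artifact, rate in introduced_per_ksloc.items():
    remaining = rate * size_ksloc
    for frac in removal_fraction.values():
        remaining *= (1 - frac)     # each profile removes its fraction
    residual += remaining

print(f"Residual defects: {residual:.0f} ({residual / size_ksloc:.1f}/KSLOC)")
```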
Defect Removal Rating Scales
COCOMO II p.263
Automated Analysis
– Very Low: simple compiler syntax checking
– Low: basic compiler capabilities
– Nominal: compiler extensions; basic requirements and design consistency checking
– High: intermediate-level module checking; simple requirements/design checking
– Very High: more elaborate requirements/design checking; basic distributed-processing analysis
– Extra High: formalized specification and verification; advanced distributed-processing analysis

Peer Reviews
– Very Low: no peer review
– Low: ad-hoc informal walk-throughs
– Nominal: well-defined preparation and review, minimal follow-up
– High: formal review roles, well-trained people, and basic checklists
– Very High: root-cause analysis, formal follow-up using historical data
– Extra High: extensive review checklists; statistical control

Execution Testing and Tools
– Very Low: no testing
– Low: ad-hoc test and debug
– Nominal: basic testing; test criteria based on checklists
– High: well-defined test sequences and a basic test-coverage tool system
– Very High: more advanced test tools and preparation; distributed monitoring
– Extra High: highly advanced tools; model-based testing
Defect Removal Estimates
- Nominal Defect Introduction Rates
Composite defect removal rating:   VL    Low    Nom    High   VH    XH
Delivered defects / KSLOC:         60    28.5   14.3   7.5    3.5   1.6
Interim Conclusions
• Unwise to try to compete on both cost/schedule
and reliability
– Some exceptions: major technology or marketplace
edge
• There are no one-size-fits-all
cost/schedule/reliability strategies
• Risk analysis helps determine how much testing
(prototyping, formal verification, etc.) is enough
– Buying information to reduce risk
• Important to understand tradeoffs among ilities
– Some recent research discussed next
SERC Value-Based ilities Hierarchy
Based on ISO/IEC 9126, 25030; JCIDS; previous SERC research
• Individual ilities
– Mission Effectiveness: Speed, Physical Capability, Cyber Capability, Usability, Accuracy, Impact, Endurability, Maneuverability, Scalability, Versatility
– Resource Utilization: Cost, Duration, Personnel, Scarce Quantities (capacity, weight, energy, …); Manufacturability, Sustainability
– Protection: Security, Safety
– Robustness: Reliability, Availability, Maintainability, Survivability
– Flexibility: Modifiability, Tailorability, Adaptability
– Composability: Interoperability, Openness, Service-Orientation
• Composite ilities
– Comprehensiveness/Suitability: all of the above
– Dependability: Mission Effectiveness, Protection, Robustness
– Resilience: Protection, Robustness, Flexibility
– Affordability: Mission Effectiveness, Resource Utilization
Means-Ends Framework: Affordability
[Means-ends hierarchy. End: Affordability Improvements and Tradeoffs.
Means: Get the Best from People; Make Tasks More Efficient; Eliminate Tasks; Eliminate Scrap, Rework; Simplify Products (KISS); Reuse Components; Reduce Operations, Support Costs; Anticipate, Prepare for Change.
Contributing strategies: Staffing, Incentivizing, Teambuilding; Facilities, Support Services; Kaizen (continuous improvement); Tools and Automation; Work and Oversight Streamlining; Collaboration Technology; Lean and Agile Methods; Task Automation; Model-Based Product Generation; Early Risk and Defect Elimination; Evidence-Based Decision Gates; Modularity Around Sources of Change; Incremental, Evolutionary Development; Value-Based, Agile Process Maturity; Risk-Based Prototyping; Value-Based Capability Prioritization; Satisficing vs. Optimizing Performance; Domain Engineering and Architecture; Composable Components, Services, COTS; Legacy System Repurposing; Automate Operations Elements; Design for Maintainability, Evolvability; Streamline Supply Chain; Value- and Architecture-Based Tradeoffs and Balancing.]
Architecture Strategy
Synergy-Conflict Matrix
[Figure: matrix of synergies and conflicts among architecture strategies.]
Software Development Cost vs. Reliability
COCOMO II RELY rating:     Very Low   Low    Nominal   High     Very High
Relative cost to develop:  0.82       0.92   1.00      1.10     1.26
MTBF (hours):              1          10     300       10,000   300,000
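A tiny sketch applying the chart's RELY cost ratios to a project; the $5M Nominal-rating baseline cost is an illustrative assumption.

```python
# Sketch: relative development cost across COCOMO II RELY ratings,
# using the multipliers and MTBF values from the chart above.

rely_multiplier = {"Very Low": 0.82, "Low": 0.92, "Nominal": 1.00,
                   "High": 1.10, "Very High": 1.26}
mtbf_hours = {"Very Low": 1, "Low": 10, "Nominal": 300,
              "High": 10_000, "Very High": 300_000}

nominal_cost_musd = 5.0   # assumed cost of the Nominal-RELY project
for rating, em in rely_multiplier.items():
    print(f"RELY {rating:9s}: MTBF {mtbf_hours[rating]:>7,} h, "
          f"cost = {nominal_cost_musd * em:.2f} $M")
```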
“Quality is Free”
• Did Philip Crosby’s book get it all wrong?
• Investments in reliable systems
– Cost extra for simple, short-life systems
– Pay off for high-value, long-life systems
Software Ownership Cost vs. Reliability
[Figure: relative cost to develop, maintain, own, and operate vs. COCOMO II RELY rating, with MTBF ranging from 1 hour (Very Low) to 300,000 hours (Very High). Development-only costs run from 0.82 to 1.26, as on the previous chart; maintenance is taken as 70% of life-cycle cost. With operational-defect cost of zero, ownership cost remains lowest at the low RELY ratings; when operational-defect cost at Nominal dependability equals the software life-cycle cost, the low ratings become far more expensive (Very Low = 2.55, Low = 1.52), moving the ownership sweet spot toward High and Very High RELY.]
Outline
• The business case for systems and software quality
• What qualities do stakeholders really value?
• Systems and software quality in a competitive world
– Is market success a monotone function of quality?
– Is quality really free?
– Is "faster, better, cheaper" really achievable?
– Value-based vs. value-neutral methods
• Conclusions
Tradeoffs Among Cost, Schedule, and
Reliability: COCOMO II
[Figure: cost ($M, 0 to 9) vs. development time (months, 0 to 50), with points labeled (RELY, MTBF (hours)): (VL, 1), (L, 10), (N, 300), (H, 10K), (VH, 300K). For a 100-KSLOC set of features, the cost/schedule/RELY points form a "pick any two" frontier; one can "pick all three" with a 77-KSLOC set of features.]
Value-Based Ontology
• Critical nature of system qualities (SQs)
– Also called non-functional requirements (NFRs) or ilities
– Major source of project overruns and failures
– Significant source of stakeholder value conflicts
– Poorly defined and understood
– Underemphasized in project management
• Need for and nature of an SQs ontology
– Nature of an ontology; choice of IDEF5 structure
– Stakeholder value-based, means-ends hierarchy
– Synergies and Conflicts matrix and expansions
• Example means-ends hierarchy: Affordability
Example of SQ Value Conflicts:
Security IPT
• Single-agent key distribution; single data copy
– Reliability: single points of failure
• Elaborate multilayer defense
– Performance: 50% overhead; real-time deadline problems
• Elaborate authentication
– Usability: delays, delegation problems; GUI complexity
• Everything at highest level
– Modifiability: overly complex changes, recertification
Example of Current Practice
• “The system shall have a Mean Time Between
Failures of 10,000 hours”
• What is a “failure?”
– 10,000 hours on liveness
– But several dropped or garbled messages per hour?
• What is the operational context?
– Base operations? Field operations? Conflict operations?
• Most management practices focused on functions
– Requirements, design reviews; traceability matrices; work
breakdown structures; data item descriptions; earned value
management
• What are the effects of or on other SQs?
– Cost, schedule, performance, maintainability?
Need for SQs Ontology
• Oversimplified, one-size-fits-all definitions
– ISO/IEC 25010, Reliability: the degree to which a system, product, or component performs specified functions under specified conditions for a specified period of time
– OK if specifications are precise, but increasingly the "specified conditions" are informal, sunny-day user stories
• Satisfying just these will pass "ISO/IEC Reliability," even if the system fails on rainy-day user stories
– Need to reflect that different stakeholders rely on different capabilities (functions, performance, flexibility, etc.) at different times and in different environments
• Proliferation of definitions, as with Resilience
• Weak understanding of inter-SQ relationships
– Security Synergies and Conflicts with other qualities
Initial SERC SQs Ontology
• Modified version of IDEF5 ontology framework
– Classes, Subclasses, and Individuals
– Referents, States, Processes, and Relations
• Top classes cover stakeholder value propositions
– Mission Effectiveness, Resource Utilization, Dependability, Flexibility
• Subclasses identify means for achieving higher-class ends
– Means-ends one-to-many for top classes
– Ideally mutually exclusive and exhaustive, but some exceptions
– Many-to-many for lower-level subclasses
• Referents, States, Processes, and Relations cover SQ variation
– Referents: sources of variation by stakeholder value
– States: Internal (beta-test); External (rural, temperate, sunny)
– Processes: operational scenarios (normal vs. crisis; experts vs. novices)
– Relations: impact of other SQs (security as above; synergies and conflicts)
Matrix Expanded to 7x7,
176 Strategies
Conclusions: System/Software Reliability (SSR)
• There is no universal SSR metric to optimize
– Need to balance stakeholder SSR value propositions
• Increasing need for value-based approaches to SSR
– Methods and models emerging to address needs
• “Faster, better, cheaper” is feasible
– If feature content can be a dependent variable
• “Quality is free” for stable, high-value, long-life systems
– But not for dynamic, lower-value, short-life systems
• Future trends intensify SSR needs and challenges
– Criticality, complexity, decreased control, faster change
• System Qualities Ontology addresses challenges
Backup Charts
Major Information System Reliability Stakeholders
[Diagram: stakeholder classes surrounding the Information System:]
• Dependents: passengers, patients
• Information Suppliers: citizens, companies
• Information Brokers: financial services, news media
• Information Consumers: decisions, education, entertainment
• Mission Controllers: pilots, distribution controllers
• Developers, Acquirers, Administrators
Overview of Stakeholder/Value
Dependencies
• Strength of direct dependency on value attribute: ** = critical; * = significant; blank = insignificant or indirect
[Table: dependency strengths across value attributes for each stakeholder class: Info. Suppliers and Dependents; Info. Brokers; Info. Consumers; Mission Controllers and Administrators; Developers and Acquirers.]
Cost, Schedule, Quality: Pick any Two?
• Consider C, S, Q as Independent Variable
– Feature Set as Dependent Variable
[Diagram: two C-S-Q triangles, treating cost, schedule, and quality as the fixed variables.]
C, S, Q as Independent Variable
• Determine Desired Delivered Defect Density (D4)
– Or a value-based equivalent
• Prioritize desired features
– Via QFD, IPT, stakeholder win-win
• Determine core capability (see the sketch below)
– 90% confidence of achieving D4 within cost and schedule
– Balance parametric models and expert judgment
• Architect for ease of adding next-priority features
– Hide sources of change within modules (Parnas)
• Develop the core capability to the D4 quality level
– Usually in less than the available cost and schedule
• Add next-priority features as resources permit
• Versions of this approach were used successfully on 92% of over 100 USC web services projects
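A minimal sketch of the core-capability step: walk the prioritized feature list and stop when the risk-adjusted cost estimate would exceed the budget. The feature names, costs, and the 1.3 risk margin (standing in for the 90%-confidence inflation) are illustrative assumptions.

```python
# Sketch: select a core capability from a prioritized feature list so that
# its risk-adjusted cost fits the budget. All numbers are illustrative.

features = [               # (name, most-likely cost), in priority order
    ("order entry", 30), ("billing", 25), ("inventory", 20),
    ("reporting", 15), ("analytics", 10),
]
budget = 100
risk_margin = 1.3          # inflate costs toward a ~90%-confidence estimate

core, spent = [], 0.0
for name, cost in features:
    if spent + cost * risk_margin > budget:
        break              # stop at the first feature that no longer fits
    core.append(name)
    spent += cost * risk_margin

print(f"Core capability: {core} (risk-adjusted cost {spent:.0f} of {budget})")
print("Remaining features are added later, as resources permit.")
```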
iDAVE Model
[Figure: structure of the iDAVE (Information Dependability Attribute Value Estimation) model.]
ROI Analysis Results Comparison
- Business Application vs. Space Application
[Figure: ROI (about -2 to 16) of dependability investment steps N->H, H->VH, and VH->XH, comparing Sierra Order Processing (business application) with Planetary Rover (space application); labeled values include 0.32 and -0.98.]
Cost of Downtime Survey
Industry Sector          Revenue/Hour
Energy                   $2.8 million
Telecommunications       $2.0 million
Manufacturing            $1.6 million
Financial Institutions   $1.4 million
Information Technology   $1.3 million
Insurance                $1.2 million
Retail                   $1.1 million
Pharmaceuticals          $1.0 million
Banking                  $996,000

Source: IT Performance Engineering & Measurement Strategies: Quantifying Performance Loss, Meta Group, October 2000.
Examples of Utility Functions: Response Time
[Figure: four value-vs.-time utility-function shapes for response time: Real-Time Control and Event Support (with a critical region); Mission Planning and Competitive Time-to-Market; Event Prediction (weather, software size); Data Archiving and Priced Quality of Service.]
How much Architecting is Enough?
- A COCOMO II Analysis
[Figure: percent of time added to overall schedule vs. percent of time added for architecture and risk resolution, for 10-, 100-, and 10,000-KSLOC projects. Total added schedule = initial architecting time + rework time (COCOMO II RESL factor); each curve has a sweet spot, which moves to higher architecting investment as project size grows. Sweet-spot drivers: rapid change moves it leftward; high assurance moves it rightward.]
Effect of Volatility and Criticality on Sweet Spots
Value-Based vs. Value-Neutral Methods
• Value-based defect reduction
• Constructive Quality Model (COQUALMO)
• Information Dependability Attribute Value
Estimation (iDAVE) model
Value-Based Defect Reduction
Example:
Goal-Question-Metric (GQM) Approach
Goal: Our supply chain software packages have too many defects. We need to get their defect rates down.
Question: ?
Value-Based GQM Approach – I
Q: How do software defects affect system value goals? (Ask why the initiative is needed.)
Example: order processing
– Too much downtime on the operations critical path
– Too many defects in operational plans
– Too many new-release operational problems
G: New system-level goal: decrease software-defect-related losses in operational effectiveness
– With the high-leverage problem areas above as specific subgoals
New Q: ?
Value-Based GQM Approach – II
New Q: Perform system problem-area root-cause analysis
– Ask why problems are happening, via models
Example: downtime on the critical path
Order → Validate order items → Validate items in stock → Schedule packaging, delivery → Produce status reports → Prepare delivery packages → Deliver order
• Where are the primary software-defect-related delays?
• Where are the biggest improvement-leverage areas?
– Reducing software defects in the Scheduling module
– Reducing non-software order-validation delays
– Taking Status Reporting off the critical path
– Downstream, getting a new Web-based order entry system
• Ask "why not?" as well as "why?"
Value-Based GQM Results
• Defect tracking weighted by system-value priority
– Focuses defect removal on highest-value effort
• Significantly higher effect on bottom-line business value
– And on customer satisfaction levels
• Engages software engineers in system issues
– Fits increasing system-criticality of software
• Strategies often helped by quantitative models
– COQUALMO, iDAVE