Improving Affordability via Value-Based Testing

University of Southern California
Center for Systems and Software Engineering

27th International Forum on COCOMO® and Systems/Software Cost Modeling
USC-CSSE/ISCAS
Qi Li, Barry Boehm, Qing Wang, Ye Yang, Song Wang
October 16, 2012
Outline
• Research Motivation
• Research Method
• Case Studies
• Future Work
• Conclusion
Research Motivation
• Value-neutral SE methods are increasingly risky [Boehm, 2003]
  – Every requirement, use case, object, test case, and defect is treated as equally important
  – System value-domain problems are the chief sources of software project failures
  – Existing prioritization is intuitive, not systematic and comprehensive
  – "Earned Value" systems don't track business value
• Testing & inspection resources are expensive and scarce
  – 30%-50% of development cost, and even higher for high-reliability projects [Ramler, 2005]
  – Time-to-market pressure [Boehm, Huang, 2005]
• Empirical findings [Bullock 2000, Boehm & Basili 2001]
  – About 20 percent of the features provide 80 percent of the business value
  – About 80 percent of the defects come from 20 percent of the modules
  – …
• Value-based Software Engineering 4+1 theorem [Boehm, 2005]
Value-Based Software Test Prioritization
What is to be prioritized?
• Testing items: testing scenarios, testing features, test cases
How to prioritize?
• Value-based (business importance, risk, cost)
• Dependency-aware
How to measure?
• Average Percentage of Business Importance Earned (APBIE)
Research Method: Value-Based
• Risk Exposure (RE)
  RE = Prob(Loss) × Size(Loss)
  – where Size(Loss) is the risk impact (the size of the loss if the outcome is unsatisfactory) and Prob(Loss) is the probability of an unsatisfactory outcome
• Risk Reduction Leverage (RRL)
  RRL = (RE_before − RE_after) / Cost
  – where RE_before is the RE before initiating the risk reduction effort, RE_after is the RE afterwards, and Cost is the cost of the risk reduction activity
  – RRL is a measure of the cost-benefit ratio of performing a candidate risk reduction or defect removal activity
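
To make these two formulas concrete, here is a minimal Python sketch; the function names and the example figures are illustrative, not from the original slides:

    # Risk Exposure and Risk Reduction Leverage as defined above.
    def risk_exposure(prob_loss: float, size_loss: float) -> float:
        """RE = Prob(Loss) x Size(Loss)."""
        return prob_loss * size_loss

    def risk_reduction_leverage(re_before: float, re_after: float, cost: float) -> float:
        """RRL = (RE_before - RE_after) / cost of the risk reduction activity."""
        return (re_before - re_after) / cost

    # Example: a test that cuts the probability of a $100K loss from 0.4 to 0.1
    # at a cost of $5K has RRL = (40K - 10K) / 5K = 6.
    re_before = risk_exposure(0.4, 100_000)
    re_after = risk_exposure(0.1, 100_000)
    print(risk_reduction_leverage(re_before, re_after, 5_000))  # 6.0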
Research Method: Value-Based
• Value-Based Prioritization Drivers:
  – Business Case Analysis and Stakeholder Prioritization -> Business Value
  – Impact of Defect -> Defect Criticality
  – Business Value combined with Defect Criticality -> Size of Loss
  – Experience Base (defect-prone components, performers) -> Probability of Loss
  – Size of Loss × Probability of Loss -> Risk Exposure
• Testing items are to be ranked by how well they can reduce RE
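
As one illustration of how these drivers might be combined into a per-item score, here is a hedged Python sketch; the field names, the 0-to-1 scales, and the multiplicative combination are assumptions for illustration, not the authors' exact model:

    from dataclasses import dataclass

    @dataclass
    class TestingItem:
        name: str
        business_value: float      # from business case analysis / stakeholder prioritization
        defect_criticality: float  # impact of a defect slipping through (0..1)
        prob_of_loss: float        # estimated from the experience base (0..1)

        def size_of_loss(self) -> float:
            # Business value and defect criticality jointly determine the size of loss.
            return self.business_value * self.defect_criticality

        def risk_exposure(self) -> float:
            return self.size_of_loss() * self.prob_of_loss

    items = [
        TestingItem("login", business_value=9, defect_criticality=0.9, prob_of_loss=0.3),
        TestingItem("report export", business_value=4, defect_criticality=0.5, prob_of_loss=0.2),
    ]
    # Items with higher RE are better candidates for early testing.
    print(max(items, key=lambda t: t.risk_exposure()).name)  # login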
Research Method: Value-Based
• Combining risk exposure reduction with the testing items' relative costs gives the priority trigger:
  Priority = RRL = (RE_before − RE_after) / Cost
• This proposed strategy enables testing items to be prioritized in terms of Risk Reduction Leverage (RRL), i.e., ROI
• It is expected to improve the lifecycle cost-effectiveness of defect removal techniques
Research Method: Dependency Aware
[Figure: the production function for software products (Boehm, 1981): value of the software product to the organization vs. cost of the software product, rising through an investment segment (operating system, data management system, basic application functions), a high-payoff segment (main and secondary application functions), and a diminishing-returns segment (user amenities, animated displays, tertiary application functions, natural speech input)]
Research Method: Dependency Aware
• Dependency:
  – Example: dependencies among test cases to be executed
  – Solution: a greedy prioritization algorithm
    • Select the item with the highest RRL
    • Check its dependencies; execute unmet prerequisites first
  – Selection trace in the example: 9->3->9->5->9->4->7
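
A small Python sketch of this greedy, dependency-aware selection, assuming (as the trace above suggests) that an item's unmet prerequisites are executed before it; the RRL values and dependency data are invented for illustration:

    def prioritize(rrl: dict, deps: dict) -> list:
        """Execution order: highest RRL first, prerequisites before dependents."""
        order, done = [], set()

        def run(item):
            for dep in deps.get(item, []):
                if dep not in done:
                    run(dep)          # satisfy dependencies first
            if item not in done:
                order.append(item)
                done.add(item)

        for item in sorted(rrl, key=rrl.get, reverse=True):
            run(item)
        return order

    # Toy data: item 9 has the highest RRL but depends on items 3 and 5,
    # so the walk mirrors the slide's 9->3->9->5->9->... pattern of attempts.
    print(prioritize({9: 10.0, 4: 6.0, 7: 3.0, 3: 2.0, 5: 1.0}, {9: [3, 5]}))
    # -> [3, 5, 9, 4, 7]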
Research Method: Metrics
• Testing cost-effectiveness
  – Average Percentage of Business Importance Earned (APBIE)
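
The slide does not reproduce the APBIE formula; assuming PBIE after the i-th executed item is the cumulative business importance (BI) earned divided by the total BI, and APBIE is the mean PBIE over all items (analogous to APFD), a sketch looks like this:

    from itertools import accumulate

    def apbie(bi_in_execution_order: list) -> float:
        """Mean of the PBIE values observed after each executed item."""
        total = sum(bi_in_execution_order)
        pbie = [c / total for c in accumulate(bi_in_execution_order)]
        return sum(pbie) / len(pbie)

    # Executing high-BI items first raises APBIE for the same test set:
    print(apbie([50, 30, 15, 5]))  # value-based order   -> 0.8125
    print(apbie([5, 15, 30, 50]))  # value-inverse order -> 0.4375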
Case Studies Results
• Exercised test prioritization based on Risk Reduction Leverage (RRL) on:
  – software testing scenarios to be walked through at Galorath Inc.
  – software features to be tested at a Chinese company
  – software test cases to be executed in USC SE course projects
• All of them show positive results
Case Studies Results (Galorath Inc.)
Prioritize testing scenarios to be walked through
• Case study:
  – Galorath Inc. (Summer 2011)
  – Project: Installation Process Automation
  – Challenge: testing all 69 scenarios is impossible under limited testing resources
Case Studies Results (Galorath Inc.)
Prioritize testing scenarios to be walked through
[Chart: PBIE vs. number of scenarios tested, comparing value-based, value-neutral, and value-inverse (worst case) orderings, labeled PBIE-1 through PBIE-3, with the "Stop Testing" point marked; reported averages: APBIE-1 = 70.99%, APBIE-2 = 10.08%, APBIE-3 = 32.10%]
• Value-based prioritization can improve the cost-effectiveness of testing
Case Studies Results (USC Course Projects)
Prioritize software test cases to be executed
• Experiment:
  – USC CSCI 577ab
  – 18 teams (5 teams from Spring 2011 + 13 teams from Fall 2011)
  – Acceptance testing phase
Case Studies Results (USC Course Projects)
Prioritize software test cases to be executed
• Experiment results (quantitative)
  – Taking Project 1 as an example, value-based prioritization can improve the cost-effectiveness of testing
[Chart: PBIE vs. test case order (1-28) for Project 1, comparing the value-based and value-neutral orderings; APBIE-1 = 81.9%, APBIE-2 = 52%, APBIE-3 = 46%]
Case Studies Results (USC Course Projects)
Prioritize software test cases to be executed
• An automated tool for facilitating prioritization:
  http://greenbay.usc.edu/dacs/vbt/testlink/index.php
Case Studies Results (USC Course Projects)
Prioritize software test cases to be executed (quantitative)

Delivered value comparison when cost is fixed.

APBIE over all test cases:
Team      | # of TCs | Value-Based | Value-Neutral | Improvement
2011S_T01 | 28       | 56.41%      | 46.38%        | 10.03%
2011S_T02 | 29       | 54.94%      | 53.80%        | 1.14%
2011S_T03 | 22       | 51.76%      | 50.75%        | 1.01%
2011S_T05 | 31       | 54.36%      | 51.87%        | 2.49%
2011S_T06 | 39       | 53.07%      | 50.40%        | 2.67%
2011F_T01 | 19       | 51.93%      | 45.98%        | 5.95%
2011F_T03 | 14       | 52.15%      | 50.33%        | 1.82%
2011F_T04 | 24       | 61.95%      | 53.62%        | 8.33%
2011F_T05 | 77       | 63.21%      | 42.07%        | 21.14%
2011F_T06 | 31       | 59.22%      | 53.31%        | 5.91%
2011F_T07 | 10       | 57.25%      | 56.25%        | 1.00%
2011F_T08 | 7        | 55.71%      | 54.76%        | 0.95%
2011F_T09 | 10       | 57.27%      | 51.51%        | 5.76%
2011F_T10 | 18       | 62.08%      | 57.23%        | 4.85%
2011F_T11 | 25       | 53.16%      | 51.39%        | 1.77%
2011F_T12 | 6        | 58.33%      | 58.33%        | 0.00%
2011F_T13 | 31       | 53.64%      | 53.25%        | 0.39%
2011F_T14 | 29       | 57.24%      | 48.17%        | 9.07%
Average   |          | 56.32%      | 51.63%        | 4.68%
F-test: 0.5745    T-test: 0.000661

PBIE after executing half of the test cases:
Team      | 1/2 # of TCs | Value-Based | Value-Neutral | Improvement
2011S_T01 | 14           | 60%         | 40%           | 20.00%
2011S_T02 | 15           | 61%         | 58%           | 3.00%
2011S_T03 | 11           | 52%         | 50%           | 2.00%
2011S_T05 | 16           | 56%         | 50%           | 6.00%
2011S_T06 | 20           | 59%         | 51%           | 8.00%
2011F_T01 | 10           | 60%         | 45%           | 15.00%
2011F_T03 | 7            | 50%         | 50%           | 0.00%
2011F_T04 | 12           | 70%         | 50%           | 20.00%
2011F_T05 | 39           | 70%         | 40%           | 30.00%
2011F_T06 | 16           | 65%         | 50%           | 15.00%
2011F_T07 | 5            | 53%         | 52%           | 1.00%
2011F_T08 | 4            | 60%         | 50%           | 10.00%
2011F_T09 | 5            | 58%         | 45%           | 13.00%
2011F_T10 | 9            | 63%         | 55%           | 8.00%
2011F_T11 | 13           | 55%         | 50%           | 5.00%
2011F_T12 | 3            | 50%         | 50%           | 0.00%
2011F_T13 | 16           | 51%         | 50%           | 1.00%
2011F_T14 | 15           | 60%         | 40%           | 20.00%
Average   |              | 58.50%      | 48.67%        | 9.83%
F-test: 0.3822    T-test: 0.000083
Case Studies Results (USC Course Projects)
Prioritize software test cases to be executed (quantitative)

Cost comparison when delivered value is fixed (# of TCs needed to earn 50% of business importance):

Team      | # of TCs (Value-Based) | # of TCs (Value-Neutral) | Total # of TCs | Value-Based Cost% | Value-Neutral Cost% | Cost Saving %
2011S_T01 | 12 | 17 | 28 | 42.86% | 60.71% | 17.86%
2011S_T02 | 13 | 13 | 29 | 44.83% | 44.83% | 0.00%
2011S_T03 | 11 | 11 | 22 | 50.00% | 50.00% | 0.00%
2011S_T05 | 13 | 16 | 31 | 41.94% | 51.61% | 9.68%
2011S_T06 | 18 | 21 | 39 | 46.15% | 53.85% | 7.69%
2011F_T01 | 9  | 11 | 19 | 47.37% | 57.89% | 10.53%
2011F_T03 | 7  | 7  | 14 | 50.00% | 50.00% | 0.00%
2011F_T04 | 8  | 14 | 24 | 33.33% | 58.33% | 25.00%
2011F_T05 | 21 | 51 | 77 | 27.27% | 66.23% | 38.96%
2011F_T06 | 11 | 16 | 31 | 35.48% | 51.61% | 16.13%
2011F_T07 | 5  | 5  | 10 | 50.00% | 50.00% | 0.00%
2011F_T08 | 4  | 4  | 7  | 57.14% | 57.14% | 0.00%
2011F_T09 | 4  | 6  | 10 | 40.00% | 60.00% | 20.00%
2011F_T10 | 7  | 9  | 18 | 38.89% | 50.00% | 11.11%
2011F_T11 | 11 | 13 | 25 | 44.00% | 52.00% | 8.00%
2011F_T12 | 3  | 3  | 6  | 50.00% | 50.00% | 0.00%
2011F_T13 | 16 | 16 | 31 | 51.61% | 51.61% | 0.00%
2011F_T14 | 12 | 18 | 29 | 41.38% | 62.07% | 20.69%
Average   |    |    |    | 44.01% | 54.33% | 10.31%
F-test: 0.2616    T-test: 0.000517
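
As a rough indication of how significance figures like these could be reproduced, here is a Python sketch using numpy and scipy; which exact F-test and t-test variants the authors ran is an assumption, and only the first five teams' APBIE values are shown:

    import numpy as np
    from scipy import stats

    value_based   = np.array([56.41, 54.94, 51.76, 54.36, 53.07])  # APBIE, %
    value_neutral = np.array([46.38, 53.80, 50.75, 51.87, 50.40])

    # Two-sided F-test for equal variances, built from the F distribution:
    f = value_based.var(ddof=1) / value_neutral.var(ddof=1)
    df = len(value_based) - 1
    p_f = 2 * min(stats.f.cdf(f, df, df), stats.f.sf(f, df, df))

    # Paired t-test on the per-team differences:
    t_stat, p_t = stats.ttest_rel(value_based, value_neutral)
    print(f"F-test p = {p_f:.4f}, t-test p = {p_t:.6f}")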
Case Studies Results (USC Course Projects)
Prioritize software test cases to be executed
• Experiment results (qualitative): participant feedback
“Before doing the prioritization, I had a vague idea of which test cases are
important to clients. But after going through the Value-Based testing, I had a
better picture as to which ones are of critical importance to the client.”
“I prioritized test cases mainly based on the sequence of the system work
flow, which is performing test cases with lower dependencies at first before
using value-based testing. I like the value-based process because it can
save time by letting me focus on more valuable test cases or risky ones.
Therefore, it improves testing efficiency.”
“Value-based testing is very useful in complex systems with hundreds or
thousands of test-cases. However in 577 it should not be difficult to run
every test-case in every test iteration, making the prioritization less useful.
The impact of value-based testing and automated test management on
software quality is entirely dependent on the complexity of the project. If
complete test coverage is possible in the time given, the benefit of VBST to
software quality is minimal.”
Case Studies Results (USC Course Projects)
Prioritize software test cases to be executed
• Some lessons learned from this case study:
  – Differentiating the priority factor levels is a prerequisite for value-based prioritization
  – Small projects may have difficulty prioritizing equally important core capabilities
  – There is a strong positive correlation between the number of test cases and the size of the improvement
  – Even a small percentage improvement can translate into substantial budget savings in practice
Conclusion
• Proposed a real "earned value" system that tracks the business value of testing and measures testing efficiency in terms of APBIE
• Proposed a systematic strategy for value-based, dependency-aware test processes
• Applied this strategy to a series of empirical studies with different granularities of prioritization
• Elaborated decision criteria for testing priorities per project context, which are helpful for real industry practice
• Implemented an automated tool for applying the approach to large-scale industrial projects
Question and Answer