University of Southern California
Center for Systems and Software Engineering
Software Testing
Supannika Koolmanojwong
CSCI 577
Ref: Qi Li’s lectures
[Timeline of software testing history, with milestones such as Continuous Integration, HCI, Software Metrics, MoSCoW, Software Reliability, Software Test Automation, TDD, "Bug" is found, Code Inspection, V&V standard, Walkthrough, Agile Testing, Defect Tracking Tool, Personas, CRUD, Analysis, SW design pattern.
Source: http://www.testingreferences.com/testingtimeline/testingtimeline.jpg]
Positive and Negative Testing
• Positive Testing
– Do "normal" user actions
– Find cases where the program does not do what it is supposed to do
– Test with valid input
• Negative Testing
– Do "abnormal" user actions
– Find cases where the program does things it is not supposed to do
– Test with invalid input
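A minimal sketch of both styles in pytest form; the parse_age function and its 0-130 rule are invented for illustration, not part of the slides.

    import pytest

    # Hypothetical function under test: parses an age string; accepts 0-130.
    def parse_age(text):
        age = int(text)                 # raises ValueError on non-numeric input
        if not 0 <= age <= 130:
            raise ValueError("age out of range")
        return age

    def test_parse_age_positive():
        # Positive testing: a "normal" user action with valid input.
        assert parse_age("42") == 42

    def test_parse_age_negative():
        # Negative testing: "abnormal" actions with invalid input; the
        # program must reject them rather than do something unexpected.
        with pytest.raises(ValueError):
            parse_age("abc")
        with pytest.raises(ValueError):
            parse_age("-1")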
Outline
• Software Test in General
• Value-based Software Test
Most Common Software Problems
 Incorrect calculations
 Incorrect data edits & ineffective data edits
 Incorrect matching and merging of data
 Data searches that yield incorrect results
 Incorrect processing of data relationships
 Incorrect coding / implementation of business rules
 Inadequate software performance
 Confusing or misleading data
 Poor software usability for end users & obsolete software
 Inconsistent processing
 Unreliable results or performance
 Inadequate support of business needs
 Incorrect or inadequate interfaces with other systems
 Inadequate performance and security controls
 Incorrect file handling
Cost to fix faults
[Chart: relative cost to fix a fault is 1x when found during definition, 1.5x to 6x during development, and 60x to 100x post-release.]
Objectives of testing
• Executing a program with the intent of finding an error.
• To check if the system meets the requirements and can be executed successfully in the intended environment.
• To check if the system is "fit for purpose".
• To check if the system does what it is expected to do.
A good test:
• A good test case is one that has a high probability of finding an as-yet-undiscovered error.
• A successful test is one that uncovers an as-yet-undiscovered error.
• A good test is not redundant.
• A good test should be "best of breed".
• A good test should be neither too simple nor too complex.
Objective of a Software Tester
• Find bugs as early as possible and make sure they get fixed.
• Understand the application well.
• Study the functionality in detail to find where bugs are likely to occur.
• Study the code to ensure that each and every line of code is tested.
• Create test cases in such a way that testing uncovers hidden bugs and also ensures that the software is usable and reliable.
How do you know you are a good tester?
[Two humorous images; source: Google Images]
How do you know you are a good tester?
Signs that you are dating a tester:
• Your love letters get returned to you marked up with red ink, highlighting your grammar and spelling mistakes.
• When you ask him how you look in a dress, he'll actually tell you.
• He won't help you change a broken light bulb because his job is simply to report and not to fix.
• He'll keep bringing up old problems that you've since resolved just to make sure that they're truly gone.
Static and Dynamic Verification
• Software reviews, inspections and walkthroughs are concerned with analysis of the static system representation to discover problems (static verification).
• Software testing with test cases is concerned with exercising and observing product behaviour (dynamic verification).
– The system is executed with test data and its operational behaviour is observed.
Inspection
• Fagan Inspection
– Process
• Planning → Overview meeting → Preparation → Inspection meeting → Rework → Follow-up
– Inspection roles
• Author, Moderator, Reader, Recorder, Inspector
– Inspection types
• Code review, peer review
– Inspect
• Req Spec, Sys Arch, Programming, Test scripts
Inspections and testing
• Inspections and testing are complementary and
not opposing verification techniques
• Both should be used during the V & V process
• Inspections can check conformance with a
specification but not conformance with the
customer’s real requirements
• Inspections cannot check non-functional
characteristics such as performance, usability,
etc.
Test data and test cases
• Test data: inputs which have been devised to test the system.
• Test cases: inputs to test the system together with the predicted outputs from these inputs if the system operates according to its specification.
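To make the distinction concrete, a small sketch; the inputs and expected values follow the hypothetical parse_age rule used earlier and are not from the slides.

    # Test data: inputs alone, devised to exercise the system.
    test_data = ["42", "130", "abc", "-1"]

    # Test cases: each input paired with the output the specification
    # predicts (values follow the hypothetical 0-130 age rule above).
    test_cases = [
        ("42", 42),            # valid value -> parsed integer
        ("130", 130),          # boundary value -> still accepted
        ("abc", ValueError),   # non-numeric -> rejected
        ("-1", ValueError),    # below range -> rejected
    ]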
Methods of testing
• Test to specification:
– Black box
– Data driven
– Functional testing
– Code is ignored: only use the specification document to develop test cases
• Test to code:
– Glass box / white box
– Logic-driven testing
– Ignore the specification and only examine the code.
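A small sketch of the two methods side by side, with an invented shipping_fee specification: the black-box tests come only from the spec's boundaries, while the white-box test is chosen by reading the code so that every branch executes.

    def shipping_fee(weight_kg):
        # Hypothetical spec: free under 1 kg, flat 5.00 from 1 kg to 10 kg,
        # then 5.00 plus 0.50 per kg above 10 kg.
        if weight_kg < 1:
            return 0.0
        if weight_kg <= 10:
            return 5.0
        return 5.0 + 0.5 * (weight_kg - 10)

    def test_black_box_boundaries():
        # Black box / functional: derived from the spec document only.
        assert shipping_fee(0.99) == 0.0
        assert shipping_fee(1.0) == 5.0
        assert shipping_fee(10.0) == 5.0

    def test_white_box_branches():
        # Glass box / logic driven: chosen by examining the code so the
        # surcharge branch is also executed.
        assert shipping_fee(12.0) == 6.0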
Types of Testing – (Jokes)
• Aggression Testing: If this doesn’t work, I’m gonna kill somebody.
• Compression Testing: []
• Confession Testing: Okay, okay, I did program that bug.
• Congressional Testing: Are you now, or have you ever been a bug?
• Depression Testing: If this doesn’t work, I’m gonna kill myself.
• Egression Testing: Uh-oh, a bug… I’m outta here.
• Digression Testing: Well, it works, but can I tell you about my truck…
• Expression Testing: #@%^&*!!!, a bug.
• Obsession Testing: I’ll find this bug if it’s the last thing I do.
• Oppression Testing: Test this now!
• Poisson Testing: Alors! Regardez le poisson!
• Repression Testing: It’s not a bug, it’s a feature.
• Secession Testing: The bug is dead! Long live the bug!
• Suggestion Testing: Well, it works, but wouldn’t it be better if…
Ref: netfunny.com
Testing Levels
• Unit testing
• Integration testing
• System testing
• Acceptance testing
Unit testing
• The most ‘micro’ scale of testing.
• Tests done on particular functions or code modules.
• Requires knowledge of the internal program design and code.
• Done by programmers (not by testers).
• Unit testing frameworks: http://en.wikipedia.org/wiki/List_of_unit_testing_frameworks
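As one example from that list, a self-contained test using Python's built-in unittest framework; the discount function is a hypothetical unit under test, not from the slides.

    import unittest

    def discount(price, percent):
        # Hypothetical unit under test.
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100), 2)

    class DiscountTest(unittest.TestCase):
        def test_typical_discount(self):
            self.assertEqual(discount(200.0, 25), 150.0)

        def test_invalid_percent_rejected(self):
            with self.assertRaises(ValueError):
                discount(80.0, 120)

    if __name__ == "__main__":
        unittest.main()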
Integration Testing
• Testing of combined parts of an application to determine their functional correctness.
• ‘Parts’ can be
– code modules
– individual applications
– client/server applications on a network.
• Types of Integration Testing
– Top-down
– Bottom-up
– Sandwich
– Big-bang
Top-down Integration Testing
http://www.site.uottawa.ca/~ssome/Cours/CSI5118/Integration.pdf
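In top-down integration the upper modules are tested first, with not-yet-integrated lower modules replaced by stubs. A minimal sketch with invented names:

    # Stub standing in for a lower-level pricing module that is not yet
    # integrated; it returns canned answers.
    def price_lookup_stub(item_id):
        return {"A1": 10.0, "B2": 2.5}[item_id]

    # Top-level module under test; the stub is swapped for the real
    # pricing module in a later integration step.
    def order_total(item_ids, lookup=price_lookup_stub):
        return sum(lookup(i) for i in item_ids)

    def test_order_total_with_stub():
        assert order_total(["A1", "B2", "B2"]) == 15.0

Bottom-up integration reverses this: lower-level modules are tested first under throwaway drivers.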
Systems Testing
• To test the co-existence of products and applications that are required to perform together in the production-like operational environment (hardware, software, network)
• To ensure that the system functions together with all the components of its environment as a total system
• To ensure that the system releases can be deployed in the current environment
Acceptance Testing
Objectives: To verify that the system meets the user requirements
When: After System Testing
Input: Business Needs & Detailed Requirements; Master Test Plan
Output: User Acceptance Test Plan; User Acceptance Test report
Load testing
– Testing an application under heavy loads.
– E.g., testing a web site under a range of loads to determine when the system’s response time degrades or fails.
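A minimal, standard-library-only sketch of the idea: fire increasing numbers of concurrent requests at a URL and watch response times. The URL is a placeholder; real load tests normally use a dedicated tool such as JMeter or Locust.

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://localhost:8000/"   # hypothetical system under test

    def timed_request(_):
        start = time.perf_counter()
        with urllib.request.urlopen(URL, timeout=10) as resp:
            resp.read()
        return time.perf_counter() - start

    def run_load(users):
        # One thread per simulated user; report the worst response time.
        with ThreadPoolExecutor(max_workers=users) as pool:
            latencies = list(pool.map(timed_request, range(users)))
        print(f"{users:4d} users: worst {max(latencies):.3f}s")

    for load in (1, 10, 50, 100):    # a range of loads
        run_load(load)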
Stress Testing
– Testing under unusually heavy loads, heavy repetition of certain actions or inputs, input of large numerical values, large complex queries to a database, etc.
– The term is often used interchangeably with ‘load’ and ‘performance’ testing.
Performance testing
– Testing how well an application complies with its performance requirements.
Alpha testing
Testing done when development is nearing completion; minor design changes may still be made as a result of such testing.
Beta testing
• Testing when development and testing are essentially completed and final bugs and problems need to be found before release.
Good Test Plans (1/2)
• Developed and Reviewed early.
• Clear, Complete and Specific
• Specifies tangible deliverables that can be
inspected.
• Staff knows what to expect and when to expect it.
Good Test Plans (2/2)
• Realistic quality levels for goals
• Includes time for planning
• Can be monitored and updated
• Includes user responsibilities
• Based on past experience
• Recognizes learning curves
Test Cases
Contents
– Test plan reference id
– Test case
– Test condition
– Expected behavior
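The same contents expressed as a small data structure; the ids and field values are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class TestCase:
        test_plan_ref: str   # test plan reference id
        case_id: str         # test case
        condition: str       # test condition
        expected: str        # expected behavior

    tc = TestCase(
        test_plan_ref="TP-01",
        case_id="TC-03",
        condition="Login attempted with an expired password",
        expected="Login is rejected and a password-reset prompt appears",
    )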
Good Test Cases
Find Defects
• Have a high probability of finding a new defect.
• Unambiguous tangible result that can be
inspected.
• Repeatable and predictable.
Good Test Cases
• Traceable to requirements or design
documents
• Push the system to its limits
• Execution and tracking can be automated
• Do not mislead
• Feasible
Bug prioritization
Outline
• Software Test in General
• Value-based Software Test
Tester’s Attitude and Mindset
“The job of tests,
and the people that develop and run tests,
is to prevent defects,
not to find them.”
- Mary Poppendieck
Pareto 80-20 distribution of test case value [Bullock, 2000]
[Chart: % of value for correct customer billing (0–100) vs. customer type (5, 10, 15). The actual business value curve concentrates most of the value in a few customer types, whereas an automated test generation tool treats all tests as having equal value*.]
*Usual SwE assumption for all requirements, objects, defects, …
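The effect is easy to reproduce with invented numbers in which a few tests carry most of the value: running tests in descending value order recovers value far faster than the equal-value assumption predicts.

    # Ten hypothetical test values summing to 100; the top two carry 60%.
    values = [40, 20, 10, 8, 6, 5, 4, 3, 2, 2]

    cumulative = 0
    for i, v in enumerate(sorted(values, reverse=True), start=1):
        cumulative += v
        print(f"after {i:2d}/10 tests: {cumulative:3d}% of value")
    # The equal-value assumption would predict only 10% per test instead.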
Business Case for Value-Based Testing
[Chart: Return on Investment (ROI, -1 to 2) vs. % of tests run (0–100), comparing Pareto testing with ATG testing.]
• How can we compare the value of test cases?
• How do we prioritize test cases?
• How do we measure the value of test cases?
Value-based Software Testing Framework: Feature Prioritization
How much test is enough?
Li, Q., Yang, Y., Li, M., Wang, Q., Boehm, B. W., and Hu, C., “Improving software testing process: feature prioritization to make winners of success-critical stakeholders,” Journal of Software Maintenance and Evolution: Research and Practice. doi: 10.1002/smr.512
Value-based Test Case Prioritization
[State diagram: a test case starts as Not-Tested-Yet and becomes Ready-to-Test when it has no dependencies or all test cases in its Dependencies Set have passed; it then ends as Passed or Failed. When a test case fails, the status of all test cases that depend on it is changed to NA.]
Value-based Test Order Logic
• Value first: test the case with the highest value.
• Dependency second: if the test case with the highest value is not “Ready-to-Test”, at least one of the test cases in its Dependencies Set is “Not-Tested-Yet”. In that situation, prioritize the “Not-Tested-Yet” test cases in the Dependencies Set by value and test them until all test cases in the Dependencies Set have “Passed”; the test case with the highest value is then “Ready-to-Test”.
• Shrink the prioritization set ASAP: exclude each tested case from the prioritization set.
A code sketch of this logic follows.
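A minimal sketch of the ordering logic above; the class, the status strings, and the value field are stand-ins (the course's EP-17/EP-18 artifacts define the real process).

    from dataclasses import dataclass, field

    @dataclass
    class TestCase:
        name: str
        value: float                      # business value of the test case
        dependencies: list = field(default_factory=list)
        status: str = "Not-Tested-Yet"    # or Ready-to-Test/Passed/Failed/NA

    def next_to_run(cases):
        # Value first: pick the highest-value case still open.  Dependency
        # second: if it has Not-Tested-Yet dependencies, recurse into the
        # Dependencies Set and test those first, again by value.
        open_cases = [c for c in cases if c.status == "Not-Tested-Yet"]
        if not open_cases:
            return None
        best = max(open_cases, key=lambda c: c.value)
        pending = [d for d in best.dependencies if d.status == "Not-Tested-Yet"]
        return next_to_run(pending) if pending else best

    def exclude(tc, all_cases, status="Failed"):
        # Shrink the prioritization set: a failed case is excluded, and
        # every case that depends on it (transitively) becomes NA.
        tc.status = status
        for other in all_cases:
            if tc in other.dependencies and other.status == "Not-Tested-Yet":
                exclude(other, all_cases, "NA")

Each regression round calls next_to_run until it returns None; a passed case simply gets status "Passed", which removes it from the open set.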
Value-based Test Order Logic
Value-based prioritization for one regression testing round:
[Flowchart: in the whole set, pick the case with the highest value, excluding each “Passed” case from prioritization. If the chosen case has dependencies that have not all passed, prioritize within its Dependencies Set first; once it is “Ready-to-Test”, start to test. If it fails, exclude the “Failed” case, and the other “NA” cases that depend on it, from prioritization. Regression rounds repeat until all test cases have “Passed”.]
Test Case Dependency Tree
[Dependency tree rooted at Start; each test case is annotated with two numbers: TC1.1.1 (8, 8), TC1.1.2 (1, 1), TC1.1.3 (1, 4.5), TC1.2.1 (8, 8), TC1.2.2 (1, 4.5), TC1.2.3 (1, 1), TC1.2.4 (1, 4.5), TC2.1.1 (12, 9.3), TC3.1.1 (16, 11), TC3.2.1 (12, 11.2), TC3.2.1 (4, 10), TC3.3.1 (4, 9.6), TC4.1.1 (15, 11.8), TC4.2.1 (10, 11.5), TC5.1.1 (2, 6), TC5.2.1 (2, 10.1)]
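Reusing the TestCase/next_to_run sketch above on a tiny slice of this tree; the dependency edges chosen here, and reading the second number of each pair as the value, are assumptions for illustration only.

    tc111 = TestCase("TC1.1.1", 8)
    tc121 = TestCase("TC1.2.1", 8, dependencies=[tc111])
    tc311 = TestCase("TC3.1.1", 11, dependencies=[tc121])
    cases = [tc111, tc121, tc311]

    while (nxt := next_to_run(cases)) is not None:
        print("run", nxt.name)
        nxt.status = "Passed"   # on a failure, call exclude(nxt, cases) instead

    # Prints TC1.1.1, TC1.2.1, TC3.1.1: the most valuable case pulls its
    # whole dependency chain forward.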
Accumulated Cost-Effectiveness (ACE) of Test
[Chart: accumulated cost-effectiveness (0–70) over test cases 1–15, comparing Test Case Prioritization with Feature Prioritization.]
Testing in 577
• EP-17
– Value-Based Testing Process
• EP-18
– Value-Based Software Testing Guideline
• EP-19
– Test Plan and Cases Template
• EP-20
– VB Test Procedure and Result Template
Test Plan
• What, when, where, how, by whom?
– Type of testing
– Timeline
– Developers’ machine / server
– Tools, HW, SW
– Responsible person
• Traceability Matrix (a small sketch follows)
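A traceability matrix can be as simple as a mapping from requirements to the test cases that cover them; the requirement and test-case ids here are hypothetical.

    traceability = {
        "REQ-01 user login":     ["TC-01", "TC-02"],
        "REQ-02 password reset": ["TC-03"],
        "REQ-03 audit logging":  [],          # gap: not yet covered
    }

    for req, tcs in traceability.items():
        print(f"{req:24s} -> {', '.join(tcs) if tcs else 'NOT COVERED'}")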
Test Cases
Test Procedure
Found bugs, then what?
http://softwaretestingandqa.blogspot.com/
Developer: There is no I in TEAM
Tester: We cannot spell BUGS without U