Object-Oriented Software Engineering: Using UML, Patterns, and Java

Chapter 11. Testing
Outline
♦ Terminology
♦ Types of errors
♦ Dealing with errors
♦ Quality assurance vs Testing
♦ Component Testing
  Unit testing
  Integration testing
♦ System testing
  Function testing
  Structure Testing
  Performance testing
  Acceptance testing
  Installation testing
♦ Testing Strategy
♦ Design Patterns & Testing
Quality of today’s software….
The average software product released on the market is not error
free.
What is this?
A failure?
An error?
A fault?
Need to specify
the desired behavior first!
Erroneous State (“Error”)
- Stress or overload errors
- Capacity or boundary errors
- Timing errors
- Throughput or performance errors
Algorithmic Fault
- Missing initialization
- Branching errors (too soon, too late)
- Missing test for nil
Mechanical Fault (very hard to find)
- Documentation does not match actual conditions or operating procedures
Terminology
♦ Reliability: The measure of success with which the observed behavior of a system conforms to some specification of its behavior.
♦ Failure: Any deviation of the observed behavior from the specified behavior.
♦ Error: The system is in an erroneous state such that further processing by the system will lead to a failure.
♦ Fault (Bug or Defect): The mechanical or algorithmic cause of an error.
There are many different types of errors and many different ways to deal with them.
Terminology
♦ Software Reliability: The probability that a software system will not cause system failure for a specified time under specified conditions. [IEEE Std. 982-1989]
♦ The goal of testing is to maximize the number of discovered faults, which then allows developers to correct them and increase the reliability of the system.
♦ Testing is the systematic attempt to find faults in a planned way in the implemented software.
♦ Testing is the process of demonstrating that faults are not present.
How do we deal with Errors and Faults?
Verification?
Modular Redundancy?
Declaring the Bug as a Feature?
Patching?
Testing?
Examples of Faults and Errors
♦ Faults in the Interface specification
  Mismatch between what the client needs and what the server offers
  Mismatch between requirements and implementation
♦ Algorithmic Faults
  Missing initialization
  Branching errors (too soon, too late)
  Missing test for nil
♦ Mechanical Faults (very hard to find)
  Documentation does not match actual conditions or operating procedures
♦ Errors
  Stress or overload errors
  Capacity or boundary errors
  Timing errors
  Throughput or performance errors
Dealing with Errors
♦ Verification:
  Assumes a hypothetical environment that does not match the real environment
  Proof might be buggy (omits important constraints; simply wrong)
♦ Modular redundancy:
  Expensive
♦ Declaring a bug to be a “feature”:
  Bad practice
♦ Patching:
  Slows down performance
♦ Testing (this lecture):
  Testing is never good enough
Another View on How to Deal with Errors
♦ Error prevention (before the system is released):
  Use good programming methodology to reduce complexity
  Use version control to prevent inconsistent systems
  Apply verification to prevent algorithmic bugs
♦ Error detection (while the system is running):
  Testing: Create failures in a planned way
  Debugging: Start with an unplanned failure
  Monitoring: Deliver information about the system state; find performance bugs
♦ Error recovery (recover from a failure once the system is released):
  Database systems (atomic transactions)
  Modular redundancy
  Recovery blocks
Some Observations
♦ It is impossible to completely test any nontrivial module or any system
  Theoretical limitations: Halting problem
  Practical limitations: Prohibitive in time and cost
♦ Testing can only show the presence of bugs, not their absence (Dijkstra)
Testing takes creativity
♦ To develop an effective test, one must have:
  Detailed understanding of the system
  Knowledge of the testing techniques
  Skill to apply these techniques in an effective and efficient manner
♦ Testing is done best by independent testers
  We often develop a certain mental attitude that the program should work in a certain way when in fact it does not.
♦ Programmers often stick to the data set that makes the program work
  "Don’t mess up my code!"
♦ A program often does not work when tried by somebody else.
  Don't let this be the end user.
Testing Activities
[Diagram: For each subsystem, Subsystem Code → Unit Test → Tested Subsystem. The tested subsystems → Integration Test → Integrated Subsystems → Functional Test → Functioning System. The System Design Document drives the unit and integration tests; the Requirements Analysis Document and the User Manual drive the functional test.]
All tests by developer.
Testing Activities continued
[Diagram: Functioning System → Performance Test (against the Global Requirements) → Validated System → Acceptance Test (against the Client’s Understanding of Requirements) → Accepted System → Installation Test (in the User Environment) → Usable System → System in Use (against the User’s understanding).]
Performance test by developer; acceptance and installation tests by client; tests (?) by user.
Fault Handling Techniques
[Taxonomy diagram]
Fault Handling
  Fault Avoidance: Design Methodology, Reviews, Verification, Configuration Management
  Fault Detection
    Debugging: Correctness Debugging, Performance Debugging
    Testing: Unit Testing, Integration Testing, System Testing
  Fault Tolerance: Atomic Transactions, Modular Redundancy
Quality Assurance encompasses Testing
[Taxonomy diagram]
Quality Assurance
  Usability Testing: Scenario Testing, Prototype Testing, Product Testing
  Fault Avoidance: Verification, Configuration Management
  Fault Detection
    Reviews: Walkthrough, Inspection
    Debugging: Correctness Debugging, Performance Debugging
    Testing: Unit Testing, Integration Testing, System Testing
  Fault Tolerance: Atomic Transactions, Modular Redundancy
Types of Testing
♦ Unit Testing:
  Individual subsystem
  Carried out by developers
  Goal: Confirm that each subsystem is correctly coded and carries out the intended functionality
♦ Integration Testing:
  Groups of subsystems (collections of classes) and eventually the entire system
  Carried out by developers
  Goal: Test the interfaces among the subsystems
System Testing
♦ System Testing:
  The entire system
  Carried out by developers
  Goal: Determine if the system meets the requirements (functional and global)
♦ Acceptance Testing:
  Evaluates the system delivered by developers
  Carried out by the client
  May involve executing typical transactions on site on a trial basis
  Goal: Demonstrate that the system meets customer requirements and is ready to use
♦ Implementation (coding) and testing go hand in hand
Unit Testing
♦ Informal:
  Incremental coding
♦ Static Analysis:
  Hand execution: Reading the source code
  Walk-Through (informal presentation to others)
  Code Inspection (formal presentation to others)
  Automated tools checking for
    syntactic and semantic errors
    departure from coding standards
♦ Dynamic Analysis:
  Black-box testing (test the input/output behavior)
  White-box testing (test the internal logic of the subsystem or object)
  Data-structure based testing (data types determine test cases)
Black-box Testing
♦ Focus: I/O behavior. If, for any given input, we can predict the output, then the module passes the test.
  Almost always impossible to generate all possible inputs ("test cases")
♦ Goal: Reduce the number of test cases by equivalence partitioning:
  Divide input conditions into equivalence classes
  Choose test cases for each equivalence class. (Example: If an object is supposed to accept a negative number, testing one negative number is enough; see the sketch below)
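The equivalence-partitioning example can be sketched as a unit test. This is a minimal illustration; the Adjustment class, its isValid method, and the use of JUnit 4 are assumptions, not part of the slides.

import static org.junit.Assert.*;
import org.junit.Test;

// Hypothetical unit under test: accepts negative and positive adjustments, rejects zero.
class Adjustment {
    static boolean isValid(int amount) {
        return amount != 0;
    }
}

public class AdjustmentTest {
    @Test
    public void oneNegativeRepresentativeIsEnough() {
        assertTrue(Adjustment.isValid(-42));   // equivalence class: negative numbers
    }

    @Test
    public void onePositiveRepresentativeIsEnough() {
        assertTrue(Adjustment.isValid(17));    // equivalence class: positive numbers
    }

    @Test
    public void zeroIsItsOwnEquivalenceClass() {
        assertFalse(Adjustment.isValid(0));    // the invalid input
    }
}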
Black-box Testing (Continued)
♦ Selection of equivalence classes (no rules, only guidelines):
  Input is valid across a range of values. Select test cases from 3 equivalence classes:
    Below the range
    Within the range
    Above the range
  Input is valid if it is from a discrete set. Select test cases from 2 equivalence classes:
    Valid discrete value
    Invalid discrete value
  (A range example is sketched after this slide)
♦ Another solution to select only a limited number of test cases:
  Get knowledge about the inner workings of the unit being tested => white-box testing
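As an illustration of the range guideline, the sketch below tests a hypothetical month validator with one case below, one within, and one above the valid range; the MonthValidator class and the JUnit 4 usage are assumptions, not from the slides.

import static org.junit.Assert.*;
import org.junit.Test;

// Hypothetical unit under test: months are valid in the range 1..12.
class MonthValidator {
    static boolean isValidMonth(int month) {
        return month >= 1 && month <= 12;
    }
}

public class MonthRangeTest {
    @Test public void belowTheRange()  { assertFalse(MonthValidator.isValidMonth(0));  }
    @Test public void withinTheRange() { assertTrue(MonthValidator.isValidMonth(6));   }
    @Test public void aboveTheRange()  { assertFalse(MonthValidator.isValidMonth(13)); }
}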
White-box Testing
♦ Focus: Thoroughness (coverage). Every statement in the component is executed at least once.
♦ Four types of white-box testing:
  Statement Testing
  Loop Testing
  Path Testing
  Branch Testing
Terms and Notions (2/2)
♦ Decision
  A program point at which the control flow can diverge
  A case statement is a multi-way branch or decision
♦ Junction
  A point in the program where control flow can merge
♦ Flowcharts vs. Control Flow Graphs
  Flowcharts focus on the process steps
  Control flow graphs ignore the process steps
(A small fragment illustrating a decision and a junction follows)
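A tiny fragment marking these notions in code; the abs method is an illustrative assumption, not from the slides.

public class DecisionJunctionExample {
    static int abs(int x) {
        int result;
        if (x < 0) {          // decision: control flow can diverge here
            result = -x;
        } else {
            result = x;
        }
        return result;        // junction: both branches merge before this statement
    }

    public static void main(String[] args) {
        System.out.println(abs(-7)); // prints 7
    }
}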
Flow Graphs (1/2)
INPUT X, Y
Z := X+Y
V := X-Y
IF Z>0 GOTO SAM
JOE: Z := Z-1
SAM: Z := Z+V
FOR U=0 to Z
V(U),U(V) := (Z+V)*U
IF V(U) = 0 GOTO JOE
Z := Z-1
IF Z=0 GOTO ELL
U := U+1
NEXT U
V(U-1) := V(U+1) + U(V-1)
ELL: V(U+U(V)) := U+V
IF U=V GOTO JOE
IF U > V THEN U := Z
Z := U
END
Flow Graphs (2/2)
♦ Flow graph vs. program: not a 1-to-1 mapping
Path Testing (1/2)
♦ Path
  A sequence of instructions/statements that starts at an entry, junction, or decision and ends at a junction, decision, or exit
  Usually restricted to entry/exit paths (complete paths)
♦ Multi-Entry/Multi-Exit Routines
  Provide an entry/exit parameter to encapsulate the common parts into subroutines
♦ Fundamental Path Selection Criteria
  Every entry/exit path
  Every statement at least once
  Every branch at least once
Path Testing (2/2)
♦ Reachability of Code
  Cannot be decided by static analysis
♦ Path-Testing Criteria
  Pn: test all segments of length n at least once
  P1: statement coverage, node coverage
  P2: branch coverage, link/edge coverage
  P∞: path coverage
♦ Common Sense and Strategy
  Branch and statement coverage: the minimum requirement
  Path testing is not sufficient for data-flow errors
♦ Which paths? Never mind the number of paths:
  Choose simple and meaningful paths (see the sketch below for P1 vs. P2)
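The difference between P1 (statement coverage) and P2 (branch coverage) can be made concrete with a small sketch; the clampToZero method and the chosen inputs are illustrative assumptions, not from the slides.

// A single test with x = -1 executes every statement (P1 is satisfied),
// but the false outcome of the if is never taken, so P2 needs a second test.
public class CoverageExample {

    static int clampToZero(int x) {
        int result = x;
        if (x < 0) {        // P2 requires both the true and the false outcome
            result = 0;
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(clampToZero(-1)); // covers all statements (P1) and the true branch
        System.out.println(clampToZero(5));  // needed for the false branch (P2)
    }
}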
Loop
♦ Single Loop
  Testing only the looping and non-looping cases is insufficient
  Test bypassing the loop, the minimum, a typical, and the maximum number of iterations (see the sketch below)
♦ Nested Loop
  Impossible to test all combinations of nested loops
  Start from the innermost loop
♦ Concatenated Loop
  If the iteration numbers are independent: treat as two individual loops
  Otherwise: treat as a nested loop
♦ Horrible Loop: avoid it
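A minimal sketch of the single-loop guideline; the sumFirst method and the data values are assumptions for illustration, not from the slides.

// Hypothetical unit under test: sums the first n elements of an array.
public class LoopTestingExample {

    static int sumFirst(int[] values, int n) {
        int sum = 0;
        for (int i = 0; i < n && i < values.length; i++) {
            sum += values[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        int[] data = {1, 2, 3, 4};
        System.out.println(sumFirst(data, 0));           // bypass the loop (0 iterations)
        System.out.println(sumFirst(data, 1));           // minimum (1 iteration)
        System.out.println(sumFirst(data, 2));           // typical number of iterations
        System.out.println(sumFirst(data, data.length)); // maximum number of iterations
    }
}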
Multi-Entry/Multi-Exit Routines
♦ Testing multi-entry/multi-exit routines:
  Get rid of them
  Augment the flow graph with an input case statement and exit parameters
  Do stronger unit testing for each entry/exit combination
White-box Testing (Continued)
♦ Statement Testing (Algebraic Testing): Test single statements (choice of operators in polynomials, etc.)
♦ Loop Testing:
  Cause execution of the loop to be skipped completely (exception: repeat loops)
  Loop to be executed exactly once
  Loop to be executed more than once
♦ Path Testing:
  Make sure all paths in the program are executed
♦ Branch Testing (Conditional Testing): Make sure that each possible outcome from a condition is tested at least once
  if (i == TRUE) printf("YES\n"); else printf("NO\n");
  Test cases: 1) i = TRUE; 2) i = FALSE
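The same branch-testing idea expressed as a Java unit test: one test case per outcome of the condition. The Greeter class and the JUnit 4 usage are assumptions for illustration, not from the slides.

import static org.junit.Assert.*;
import org.junit.Test;

// Hypothetical unit under test with a single two-way branch.
class Greeter {
    static String answer(boolean yes) {
        if (yes) {
            return "YES";
        } else {
            return "NO";
        }
    }
}

public class GreeterBranchTest {
    @Test public void trueBranch()  { assertEquals("YES", Greeter.answer(true));  } // outcome 1
    @Test public void falseBranch() { assertEquals("NO",  Greeter.answer(false)); } // outcome 2
}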
White-box Testing Example
void FindMean(FILE ScoreFile)
{  float SumOfScores = 0.0; int NumberOfScores = 0; float Mean = 0.0; float Score;
   Read(ScoreFile, Score);               /* Read in and sum the scores */
   while (!EOF(ScoreFile)) {
      if (Score > 0.0) {
         SumOfScores = SumOfScores + Score;
         NumberOfScores++;
      }
      Read(ScoreFile, Score);
   }
   /* Compute the mean and print the result */
   if (NumberOfScores > 0) {
      Mean = SumOfScores / NumberOfScores;
      printf("The mean score is %f\n", Mean);
   } else
      printf("No scores found in file\n");
}
White-box Testing Example: Determining the Paths
void FindMean(FILE ScoreFile)
{  float SumOfScores = 0.0;
   int NumberOfScores = 0;
   float Mean = 0.0;
   float Score;
   Read(ScoreFile, Score);                  /* node 1 */
   while (!EOF(ScoreFile)) {                /* node 2 */
      if (Score > 0.0) {                    /* node 3 */
         SumOfScores = SumOfScores + Score;
         NumberOfScores++;                  /* node 4 */
      }                                     /* node 5 */
      Read(ScoreFile, Score);               /* node 6 */
   }
   /* Compute the mean and print the result */
   if (NumberOfScores > 0) {                /* node 7 */
      Mean = SumOfScores / NumberOfScores;
      printf("The mean score is %f\n", Mean);    /* node 8 */
   } else
      printf("No scores found in file\n");       /* node 9 */
}
Constructing the Logic Flow Diagram
[Flow graph built from the numbered statements above: Start → 1 → 2; decisions (T/F) at nodes 2, 3, and 7; the loop body nodes 3, 4, 5, 6 return to 2; nodes 8 and 9 lead to Exit.]
Candidate control-flow paths: 1-2-7-8, 1-2-7-9, 1-2-3-4-2-7-8, 1-2-3*-4*-2*-
Finding the Test Cases
[The flow graph from the previous slide with labeled edges: a (covered by any data), b (data set must contain at least one value), c, d (positive score), e (negative score), f (data set must be empty), g, h (reached if either f or e is reached), i (total score < 0.0), j (total score > 0.0), k, l.]
Test Cases
♦ Test case 1: ? (to execute the loop exactly once)
♦ Test case 2: ? (to skip the loop body)
♦ Test case 3: ?, ? (to execute the loop more than once)
These 3 test cases cover all control flow paths (one possible set of data sets is sketched below).
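One way the question marks could be filled in, sketched against a hypothetical Java port of FindMean; the port and the concrete score values are assumptions for illustration, not the slides' answer.

import java.util.List;

// Hypothetical Java port of FindMean, used only to show data sets for the three test cases.
public class FindMeanLoopCoverage {

    static String findMean(List<Double> scores) {
        double sum = 0.0;
        int count = 0;
        for (double score : scores) {            // the loop under test
            if (score > 0.0) {
                sum += score;
                count++;
            }
        }
        return (count > 0) ? "The mean score is " + (sum / count)
                           : "No scores found in file";
    }

    public static void main(String[] args) {
        // Test case 1: exactly one score -> the loop body executes exactly once.
        System.out.println(findMean(List.of(5.0)));
        // Test case 2: an empty score file -> the loop body is skipped.
        System.out.println(findMean(List.of()));
        // Test case 3: several scores, including a non-positive one -> more than one iteration.
        System.out.println(findMean(List.of(4.0, -1.0, 6.0)));
    }
}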
Comparison of White- & Black-box Testing
♦ White-box Testing:
  Potentially infinite number of paths have to be tested
  White-box testing often tests what is done, instead of what should be done
  Cannot detect missing use cases
♦ Black-box Testing:
  Potential combinatorial explosion of test cases (valid & invalid data)
  Often not clear whether the selected test cases uncover a particular error
  Does not discover extraneous use cases ("features")
♦ Both types of testing are needed
♦ White-box testing and black-box testing are the extreme ends of a testing continuum.
♦ Any choice of test case lies in between and depends on the following:
  Number of possible logical paths
  Nature of the input data
  Amount of computation
  Complexity of algorithms and data structures
The 4 Testing Steps
1. Select what has to be measured
  Analysis: Completeness of requirements
  Design: Tested for cohesion
  Implementation: Code tests
2. Decide how the testing is done
  Code inspection
  Proofs (Design by Contract)
  Black-box, white-box
  Select an integration testing strategy (big bang, bottom up, top down, sandwich)
3. Develop test cases
  A test case is a set of test data or situations that will be used to exercise the unit (code, module, system) being tested or about the attribute being measured
4. Create the test oracle
  An oracle contains the predicted results for a set of test cases
  The test oracle has to be written down before the actual testing takes place
  (A sketch of a test case paired with its oracle follows this list)
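As an illustration of steps 3 and 4, the sketch below pairs test cases with a pre-written oracle; the squareRoot method and the chosen values are assumptions for illustration, not from the slides.

// Hypothetical unit under test and its oracle: the expected results are
// written down in the table before the tests are executed.
public class OracleExample {

    static double squareRoot(double x) {
        return Math.sqrt(x);
    }

    public static void main(String[] args) {
        // Test oracle: input -> predicted result, fixed before running the tests.
        double[][] oracle = {
            {0.0, 0.0},
            {1.0, 1.0},
            {4.0, 2.0},
            {9.0, 3.0},
        };
        for (double[] entry : oracle) {
            double actual = squareRoot(entry[0]);
            // Compare the actual result with the oracle's prediction.
            System.out.printf("input=%.1f expected=%.1f actual=%.1f %s%n",
                    entry[0], entry[1], actual,
                    Math.abs(actual - entry[1]) < 1e-9 ? "OK" : "FAIL");
        }
    }
}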
Guidance for Test Case Selection
♦ Use analysis knowledge about functional requirements (black-box testing):
  Use cases
  Expected input data
  Invalid input data
♦ Use design knowledge about system structure, algorithms, data structures (white-box testing):
  Control structures
    Test branches, loops, ...
  Data structures
    Test record fields, arrays, ...
♦ Use implementation knowledge about algorithms. Examples:
  Force division by zero (see the sketch after this list)
  Use a sequence of test cases for an interrupt handler
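A minimal sketch of the "force division by zero" guideline: knowledge of the implementation reveals that an empty input would make the division divide by zero, so a test case targets exactly that path. The Statistics class and the JUnit 4 usage are assumptions, not from the slides.

import static org.junit.Assert.*;
import org.junit.Test;

// Hypothetical unit under test: implementation knowledge tells us that an
// empty array would make 'sum / values.length' divide by zero.
class Statistics {
    static double average(int[] values) {
        if (values.length == 0) {
            throw new IllegalArgumentException("no values");
        }
        int sum = 0;
        for (int v : values) {
            sum += v;
        }
        return (double) sum / values.length;
    }
}

public class StatisticsTest {
    @Test(expected = IllegalArgumentException.class)
    public void emptyInputForcesTheDivisionByZeroPath() {
        Statistics.average(new int[0]);   // test case derived from the implementation
    }

    @Test
    public void typicalInput() {
        assertEquals(2.0, Statistics.average(new int[] {1, 2, 3}), 1e-9);
    }
}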
Unit-testing Heuristics
1. Create unit tests as soon as the object design is completed:
  Black-box test: Test the use cases & functional model
  White-box test: Test the dynamic model
  Data-structure test: Test the object model
2. Develop the test cases
  Goal: Find the minimal number of test cases to cover as many paths as possible
3. Cross-check the test cases to eliminate duplicates
  Don't waste your time!
4. Desk check your source code
  Reduces testing time
5. Create a test harness
  Test drivers and test stubs are needed for integration testing (see the sketch after this list)
6. Describe the test oracle
  Often the result of the first successfully executed test
7. Execute the test cases
  Don't forget regression testing
  Re-execute test cases every time a change is made
8. Compare the results of the test with the test oracle
  Automate as much as possible
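A minimal sketch of heuristic 5, a test harness consisting of a test driver and a test stub; the interface and class names are assumptions for illustration, not from the slides.

// Hypothetical subsystem under test: it depends on a storage subsystem that is
// not yet integrated, so the harness replaces that dependency with a stub.
interface ScoreStorage {
    int[] loadScores();
}

class MeanCalculator {
    private final ScoreStorage storage;

    MeanCalculator(ScoreStorage storage) {
        this.storage = storage;
    }

    double mean() {
        int[] scores = storage.loadScores();
        if (scores.length == 0) {
            return 0.0;
        }
        int sum = 0;
        for (int s : scores) {
            sum += s;
        }
        return (double) sum / scores.length;
    }
}

// Test stub: stands in for the real storage subsystem.
class ScoreStorageStub implements ScoreStorage {
    public int[] loadScores() {
        return new int[] {80, 90, 100};
    }
}

// Test driver: exercises the unit under test and checks the result.
public class MeanCalculatorDriver {
    public static void main(String[] args) {
        MeanCalculator calculator = new MeanCalculator(new ScoreStorageStub());
        double mean = calculator.mean();
        System.out.println(mean == 90.0 ? "PASS" : "FAIL: " + mean);
    }
}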