Software Testing
PUM, Mariam Kamkar, IDA, LiU, 21 September 2003

Goal: develop software that meets its intended use!
But: human beings make mistakes!

"No issue is meaningful unless it can be put to the test of decisive verification." C.S. Lewis, 1934

Bridge, automobile, television, word processor
⇒ The product of any engineering activity must be verified against its requirements throughout its development.

• Verifying a bridge means verifying its design, its construction, its process, …
• Software must be verified in much the same spirit. In this lecture, however, we shall learn that verifying software is perhaps more difficult than verifying other engineering products. We shall try to clarify why this is so.

From customer to developer, the development chain runs: requirements definition → requirements specification (functional requirements, nonfunctional requirements) → design → specification → code = system.

Outline
• Component/Unit/Module/Basic testing
• Integration testing
• Function testing
• Performance testing
• Acceptance testing
• Installation testing

Objective
• Component/Unit/Module/Basic testing
• Integration testing
Objective: to ensure that the code implements the design properly.
Process: component code is checked against the design specification in the unit test, yielding tested components; the tested components are then combined in the integration test, yielding integrated modules.

Error, Fault, Failure
A human error (bug) can lead to a fault, which can lead to a failure.

Debugging vs Testing
• Debugging: to find the bug
  – fault identification
  – fault correction / removal
• Testing: to demonstrate the existence of a fault

Types of Faults
• Algorithmic: division by zero
• Computation & Precision: order of operations
• Documentation: documentation does not match the code
• Stress/Overload: data-structure sizes (dimensions of tables, size of buffers)
• Capacity/Boundary: x devices, y parallel tasks, z interrupts
• Timing/Coordination: real-time systems
• Throughput/Performance: speed stated in the requirements

Types of Faults (cont.)
• Recovery: power failure
• Hardware & System Software: modem
• Standards & Procedures: organizational standards; difficult for programmers to follow each other

Unit Testing
• Code inspections
• Code walkthroughs
• Open box testing
• Black box testing
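To make the unit-testing idea concrete before the input/test-object/output/oracle picture that follows, here is a minimal C sketch of a test driver: it feeds inputs to a test object and uses a table of expected outputs as the oracle. The function abs_val and the chosen test values are illustrative assumptions, not code from the lecture.

```c
#include <stdio.h>

/* Hypothetical component under test (the "test object"). */
static int abs_val(int y)
{
    if (y < 0)
        y = -y;
    return y;
}

/* A test case pairs an input with the output the oracle expects. */
struct test_case {
    int input;
    int expected;
};

int main(void)
{
    /* The oracle: expected results for each chosen input. */
    struct test_case cases[] = {
        {  5, 5 },   /* valid, positive  */
        { -5, 5 },   /* valid, negative  */
        {  0, 0 }    /* boundary value   */
    };
    int failures = 0;

    for (size_t i = 0; i < sizeof cases / sizeof cases[0]; i++) {
        int actual = abs_val(cases[i].input);
        if (actual != cases[i].expected) {          /* failure? */
            printf("FAIL: input %d, expected %d, got %d\n",
                   cases[i].input, cases[i].expected, actual);
            failures++;
        }
    }
    printf("%d failure(s)\n", failures);
    return failures != 0;
}
```

The driver plays the role of the test harness and the table plays the role of the oracle; in practice the oracle may also be a reference implementation or a human judgment.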
Unit testing in practice: the input is fed to the test object, the resulting output is compared with an oracle, and the oracle's verdict answers the question: failure?

Inspection
• overview
• preparation
• inspection
• rework
• follow-up

Inspection (cont.): some classical programming errors
• Use of uninitialized variables
• Jumps into loops
• Non-terminating loops
• Incompatible assignments
• Array indexes out of bounds
• Off-by-one errors
• Improper storage allocation or de-allocation
• Mismatches between actual and formal parameters in procedure calls

Walkthroughs
• Object under review: design, code, a chapter of the user's guide, …
• Roles:
  – presenter
  – coordinator
  – secretary
  – maintenance oracle
  – standards bearer
  – user representative

Discovery activity
Faults found per thousand lines of code:
• Requirements review: 2.5
• Design review: 5.0
• Code inspection: 10.0
• Integration test: 3.0
• Acceptance test: 2.0

Black box / Closed box testing
Only the input and the output are visible; the tests look for
• incorrect or missing functions
• interface errors
• performance errors

Black box testing techniques
• Equivalence partitioning
• Boundary value analysis
• Decision table
• Cause-effect

Equivalence partitioning
Instead of exhaustive testing, the input domain is partitioned into valid and invalid equivalence classes (with the corresponding outputs), and representatives of each class are tested.
Example specification: the program accepts four to eight inputs which are 5-digit integers greater than 10000.

Guidelines
If an input condition specifies
• a range: one valid and two invalid equivalence classes
• a specific value: one valid and two invalid equivalence classes
• a member of a set: one valid and one invalid equivalence class
• a boolean: one valid and one invalid class

Boundary value analysis
Choose test cases at the edges of the equivalence classes.

Glass box testing! White box testing! Open box testing! Clear box testing!
All names for the same thing; the tests exercise
• logical decisions
• loops
• internal data structures
• paths
• ...

Statement Coverage
  begin
    if (y >= 0) then
      y = 0;
    abs = y;
  end;
Flow graph: Begin → decision (y >= 0) → yes: y = 0 → abs = y.
Statement coverage is reached with one test case that takes the yes-branch:
• test case 1 (yes): input: y = ?  expected result: ?  actual result: ?

Branch Coverage
Same fragment; branch coverage also requires the no-branch of the decision:
• test case 1 (yes): input: y = ?  expected result: ?  actual result: ?
• test case 2 (no):  input: y = ?  expected result: ?  actual result: ?
Second example:
  begin
    if (x < 10 && y > 20)
      z = foo(x, y);
    else
      z = fie(x, y);
  end;
Flow graph: decision (x < 10 && y > 20) → yes: z = foo(x, y); no: z = fie(x, y).
• test case 1 (yes): input: x = ?, y = ?  expected result: ...  actual result: ?
• test case 2 (no):  input: x = ?, y = ?  expected result: ...  actual result: ?

Condition - Branch Coverage
For the same fragment, the decision is split into its two conditions, x < 10 and y > 20, and every true/false combination is exercised:
• test case 1: t t
• test case 2: t f
• test case 3: f t
• test case 4: f f

Path Coverage
Flow graph with two decisions, x <> 0 and z > 10, and the statements z = z - x, z = sin(x), z = z / x, and z = 0. Path coverage requires a test case for each combination of decision outcomes:
• (n, n): x = ?, z = ?
• (n, y): x = ?, z = ?
• (y, n): x = ?, z = ?
• (y, y): x = ?, z = ?
Paths with loops: a loop in the graph multiplies the number of paths, since each number of iterations gives a different path.
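Before moving on to data-flow testing, here is a small, self-contained C sketch of the coverage examples above. It assumes placeholder implementations for foo and fie (they are only names on the slides): two test cases already give branch coverage of the decision x < 10 && y > 20, while condition coverage asks for all four true/false combinations.

```c
#include <stdio.h>

/* Placeholder computations standing in for the slide's foo and fie. */
static int foo(int x, int y) { return x + y; }
static int fie(int x, int y) { return x - y; }

/* The decision from the coverage slides: two conditions, one branch. */
static int decide(int x, int y)
{
    int z;
    if (x < 10 && y > 20)
        z = foo(x, y);
    else
        z = fie(x, y);
    return z;
}

int main(void)
{
    /* Branch coverage needs only two cases (decision true, decision false);
       condition coverage asks for every combination of x<10 and y>20. */
    struct { int x, y; } cases[] = {
        {  5, 30 },   /* x<10 true,  y>20 true   -> then-branch */
        {  5, 10 },   /* x<10 true,  y>20 false  -> else-branch */
        { 15, 30 },   /* x<10 false, y>20 true   -> else-branch */
        { 15, 10 }    /* x<10 false, y>20 false  -> else-branch */
    };
    for (int i = 0; i < 4; i++)
        printf("x=%d y=%d -> z=%d\n",
               cases[i].x, cases[i].y, decide(cases[i].x, cases[i].y));
    return 0;
}
```

Note that with C's short-circuit &&, the condition y > 20 is never evaluated once x < 10 is false, and the last three cases all execute the same else-branch; this is why condition combinations and paths are counted separately from branches.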
Data Flow Testing
DEF(S) = {x | statement S contains a definition of variable x}
USE(S) = {x | statement S contains a use of variable x}
A def-use chain (du chain) is a triple [x, S, S'].
Example (flow graph with nodes a, b, c, d, e):
  S1: i = 1;            DEF(S1) = {?} (?)
  S2: while (i <= n)    USE(S2) = {?} (?)
  du-1: [i, S1, S2] (?), (?)

Data Flow Testing (example)
  s = 0;                (d1-s)
  i = 1;                (d2-i)
  while (i <= n) {      (u1-i, u2-n)
    s += i;             (u3-s, u4-i, d3-s)
    i++;                (u5-i, d4-i)
  }
  print (s);            (u6-s)
  print (i);            (u7-i)
  print (n);            (u8-n)
The du-s arrows on the slide connect the definitions of s (d1-s, d3-s) to their uses.
Kinds of data-flow pairs:
• dd: def-def
• dk: def-kill
• du: def-use
• ...

Program Slicing
The slice of the example program with respect to variable i at print(i):
  i = 1;
  while (i <= n) {
    i++;
  }
  print (i);

Relative strengths of test strategies (B. Beizer, 1990)
Stronger criteria appear above weaker ones (items on the same line are not comparable):
• All paths
• All definition-use paths
• All uses
• All computational / some predicate uses; All predicate / some computational uses
• All computational uses; All predicate uses
• All definitions; Branch
• Statement

Components
The unit-test environment: a driver calls the component to be tested, and stubs stand in for the components it calls; the test cases exercise
• the interface
• boundary conditions
• independent paths
• ...

Classes of Integration Testing
• Bottom-up
• Top-down
• Big bang
• Sandwich

Bottom-up / Top-down / Big-bang / Sandwich
(The slides show the corresponding integration orders for a module hierarchy with A at the top, B, C, D below it, and E, F, G as leaves.)

Test Planning
• Establishing test objectives
• Designing test cases
• Writing test cases
• Testing test cases
• Executing tests
• Evaluating test results

Termination Problem
How to decide when to stop testing: the main problem for managers!
Termination takes place when
• resources (time & budget) are exhausted
• the seeded faults have been found
• some coverage level is reached

Automated Testing Tools
• Code analysis tools
  – static, dynamic
• Test execution tools
  – capture-and-replay
  – stubs & drivers
• Test case generators

What can be automated?
• Scaffolding
• Oracle
• Test case generation
• Termination
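As a sketch of the scaffolding mentioned above, and of the stubs and drivers used in integration testing, here is a hypothetical C example: a stub replaces a lower-level module that is not yet integrated, and a driver replaces the missing caller. The component, the names, and the canned values are assumptions for illustration only.

```c
#include <stdio.h>

/* Stub: replaces a lower-level module (e.g., a database lookup) that the
   component under test depends on but that is not yet integrated. */
static int lookup_balance_stub(int account)
{
    (void)account;
    return 100;              /* canned answer instead of a real query */
}

/* Component under test (hypothetical). */
static int can_withdraw(int account, int amount)
{
    return amount <= lookup_balance_stub(account);
}

/* Driver: replaces the not-yet-written caller and feeds the test inputs. */
int main(void)
{
    printf("withdraw 50:  %s\n", can_withdraw(1, 50)  ? "allowed" : "denied");
    printf("withdraw 150: %s\n", can_withdraw(1, 150) ? "allowed" : "denied");
    return 0;
}
```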
Outline
• Component/Unit/Module/Basic testing
• Integration testing
• Function testing
• Performance testing
• Acceptance testing
• Installation testing

Objective
• Function testing
• Performance testing
• Acceptance testing
• Installation testing
Objective: to ensure that the system does what the customer wants it to do.

The full testing process:
• Component code + design specification → unit test → tested components
• Tested components → integration test → integrated modules
• Integrated modules + system functional requirements → function test → functioning system
• Functioning system + other software requirements → performance test → verified, validated software
• Verified, validated software + customer requirements specification → acceptance test → accepted system
• Accepted system + user environment → installation test → system in use!

Function testing (functional requirements)
• have a high probability of detecting a fault
• use a test team independent of the designers and programmers
• know the expected actions and output
• test both valid and invalid input
• never modify the system just to make testing easier
• have stopping criteria

Cause-Effect (test case generation from requirements)
Causes:
• C1: command is credit
• C2: command is debit
• C3: account number is valid
• C4: transaction amount is valid
Effects:
• E1: print "invalid command"
• E2: print "invalid account number"
• E3: print "debit amount not valid"
• E4: debit account, print
• E5: credit account, print
Example rule: E3 = C2 and C3 and not C4.

Performance testing (nonfunctional requirements)
• Security
• Accuracy
• Speed
• Recovery
• Stress test
• Volume test
• …

Acceptance testing (customers' and users' needs)
• Benchmark test: a set of special test cases
• Pilot test: everyday working
  – Alpha test: at the developer's site, in a controlled environment
  – Beta test: at one or more customer sites
• Parallel test: the new system runs in parallel with the previous one

Installation testing (at the user's site)
Acceptance test at the developer's site → installation test at the user's site; otherwise it may not be needed!

Real-life examples
• The first U.S. space mission to Venus failed (reason: a missing comma in a Fortran DO loop).
• December 1995: an American Airlines Boeing 757 crashed into a mountain in Colombia, killing 159 people. The cause was an incorrect one-letter computer command (Cali and Bogotá, 132 miles in the opposite direction, have the same coordinate code).
• June 1996: the Ariane 5 space rocket self-destructed, at a cost of $500 million (reason: reuse of software from Ariane 4 without the recommended testing).

Real-life examples (cont.)
• Australia: a man was jailed because of a computer glitch. He was jailed for a traffic fine although he had actually paid it five years earlier.
• Dallas: a prisoner was released due to a program design flaw. He had been temporarily transferred from one prison to another (as a witness), and the computer gave him a "temporary assignment".

Goals of software testing: historical evolution
The goal has shifted over the decades, from "not distinguished from debugging" through "establish confidence" and "find faults" to "prevent software faults" and "measure SQA test objectives", while cost, complexity, and the number of applications kept growing. Milestones:
• 1957: Charles Baker distinguished debugging from testing
• June 1972: first formal conference on software testing, University of North Carolina, Bill Hetzel
• 1979: Myers, "The Art of Software Testing"
• 1981: Deutsch, software project V&V

And …
"Testing can show the presence, but never the absence of errors in software." E. Dijkstra, 1969
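As a closing illustration of the cause-effect technique from the function-testing slides, here is a small C sketch that represents the causes C1-C4 as booleans and derives one test case per effect. Only the rule for E3 (C2 and C3 and not C4) is given on the slide; the remaining rules are plausible completions, not the lecture's specification.

```c
#include <stdio.h>
#include <stdbool.h>

/* Causes from the cause-effect slide. */
struct causes {
    bool credit;        /* C1: command is credit            */
    bool debit;         /* C2: command is debit             */
    bool account_valid; /* C3: account number is valid      */
    bool amount_valid;  /* C4: transaction amount is valid  */
};

/* Effects derived from the causes. Only E3 (= C2 and C3 and not C4) is spelled
   out on the slide; the other rules are assumptions for illustration. */
static void effects(struct causes c)
{
    if (!c.credit && !c.debit)            printf("E1: invalid command\n");
    else if (!c.account_valid)            printf("E2: invalid account number\n");
    else if (c.debit && !c.amount_valid)  printf("E3: debit amount not valid\n");
    else if (c.debit)                     printf("E4: debit account\n");
    else                                  printf("E5: credit account\n");
}

int main(void)
{
    /* One test case per effect, read straight off the cause-effect rules. */
    effects((struct causes){ false, false, true,  true  });  /* E1 */
    effects((struct causes){ true,  false, false, true  });  /* E2 */
    effects((struct causes){ false, true,  true,  false });  /* E3 */
    effects((struct causes){ false, true,  true,  true  });  /* E4 */
    effects((struct causes){ true,  false, true,  true  });  /* E5 */
    return 0;
}
```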