
Object-Oriented Software Engineering

Using UML, Patterns, and Java

Chapter 11, Testing

Outline

Terminology
Types of errors
Dealing with errors
Quality assurance vs. testing
Component testing
  - Unit testing
  - Integration testing
System testing
  - Function testing
  - Structure testing
  - Performance testing
  - Acceptance testing
  - Installation testing
Testing strategy
Design patterns & testing


Quality of today's software

The average software product released on the market is not error free.



What is this?

A failure? An error? A fault? Need to specify the desired behavior first!


Erroneous State (Error)


Algorithmic Fault


Mechanical Fault


Terminology

Reliability: The measure of success with which the observed behavior of a system conforms to some specification of its behavior.
Failure: Any deviation of the observed behavior from the specified behavior.
Error: The system is in a state such that further processing by the system will lead to a failure.
Fault (bug): The mechanical or algorithmic cause of an error.

There are many different types of errors and many different ways to deal with them.
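A minimal C sketch (not from the original slides; names and values are illustrative) showing how these terms relate: a fault in the code puts the running system into an erroneous state, which then surfaces as an observable failure.

#include <stdio.h>

/* Fault (bug): "sum" is never initialized. The fault sits in the
   source code whether or not this function is ever executed.      */
int mean(const int *values, int n) {
    int sum;                      /* fault: missing "= 0"           */
    for (int i = 0; i < n; i++)
        sum += values[i];         /* error: sum holds garbage, so the
                                     system is in an erroneous state */
    return sum / n;
}

int main(void) {
    int scores[] = { 10, 20, 30 };
    /* Failure: the observed output deviates from the specified
       behavior (the mean of 10, 20, 30 should be 20).            */
    printf("mean = %d\n", mean(scores, 3));
    return 0;
}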


How do we deal with Errors and Faults?


Verification?


Modular Redundancy?


Declaring the Bug as a Feature?


Patching?


Testing?


Examples of Faults and Errors


Faults in the interface specification
  - Mismatch between what the client needs and what the server offers
  - Mismatch between requirements and implementation

Algorithmic faults
  - Missing initialization
  - Branching errors (too soon, too late)
  - Missing test for nil

Mechanical faults (very hard to find)
  - Documentation does not match actual conditions or operating procedures

Errors
  - Stress or overload errors
  - Capacity or boundary errors
  - Timing errors
  - Throughput or performance errors
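To make one of these concrete, a hedged C sketch of the "missing test for nil" fault listed above (the function names are illustrative):

#include <stddef.h>
#include <string.h>

/* Algorithmic fault: missing test for nil (NULL). Passing NULL
   makes strlen dereference a null pointer and the program fails. */
size_t name_length_buggy(const char *name) {
    return strlen(name);          /* fault: no NULL check */
}

/* Repaired version: the boundary case is tested explicitly. */
size_t name_length(const char *name) {
    if (name == NULL)             /* the missing test, restored */
        return 0;
    return strlen(name);
}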


Dealing with Errors

Verification:
  - Assumes a hypothetical environment that does not match the real environment
  - The proof might itself be buggy (omits important constraints; simply wrong)

Modular redundancy:
  - Expensive

Declaring a bug to be a feature:
  - Bad practice

Patching:
  - Slows down performance

Testing (this lecture):
  - Testing is never good enough


Another View on How to Deal with Errors

Error prevention (before the system is released):
  - Use good programming methodology to reduce complexity
  - Use version control to prevent inconsistent versions of the system
  - Apply verification to prevent algorithmic bugs

Error detection (while the system is running):
  - Testing: create failures in a planned way
  - Debugging: start with an unplanned failure
  - Monitoring: deliver information about the internal state; find performance bugs (a minimal monitoring macro is sketched after this list)

Error recovery (recover from a failure once the system is released):
  - Database systems (atomic transactions)
  - Modular redundancy
  - Recovery blocks
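A minimal sketch of such monitoring instrumentation in C, in the spirit of the dprint* macros used in the revised FindMean later in this lecture (the macro name is an illustrative assumption):

#include <stdio.h>

/* Monitoring: deliver information about internal state while the
   system runs. Compiled in only when DEBUG is defined, so release
   builds pay no runtime cost.                                     */
#ifdef DEBUG
#define MONITOR(...) fprintf(stderr, __VA_ARGS__)
#else
#define MONITOR(...) ((void)0)
#endif

/* Usage: MONITOR("Score=%f; Sum=%f; Num=%d\n", Score, Sum, Num); */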


Some Observations

It is impossible to completely test any nontrivial module or any system:
  - Theoretical limitations: the halting problem
  - Practical limitations: prohibitive in time and cost

Testing can only show the presence of bugs, not their absence (Dijkstra).


Testing takes creativity


Testing is often viewed as dirty work. To develop an effective test, one must have:
  - A detailed understanding of the system
  - Knowledge of the testing techniques
  - Skill to apply these techniques in an effective and efficient manner

Testing is done best by independent testers:
  - We often develop a certain mental attitude that the program should behave in a certain way, when in fact it does not
  - Programmers often stick to the data set that makes the program work ("Don't mess up my code!")
  - A program often does not work when tried by somebody else; don't let this be the end user

Testing Activities
[Flattened diagram. The testing activities and their work products flow as follows:

  Subsystem Code -> Unit Test -> Tested Subsystem      (one unit test per subsystem)
  Tested Subsystems -> Integration Test -> Integrated Subsystems
      (driven by the System Design Document)
  Integrated Subsystems -> Functional Test -> Functioning System
      (driven by the Requirements Analysis Document and the User Manual)

All of these tests are performed by the developer.]


Testing Activities continued


[Flattened diagram. The flow continues:

  Functioning System -> Performance Test -> Validated System
      (against the Global Requirements; tests by developer)
  Validated System -> Acceptance Test -> Accepted System
      (against the client's understanding of the requirements; tests by client)
  Accepted System -> Installation Test -> Usable System
      (in the User Environment; tests by client)
  Usable System -> System in Use      (tests (?) by user)]

Fault Handling Techniques


Fault Handling
  Fault Avoidance
    Design Methodology
    Verification
    Reviews
    Configuration Management
  Fault Detection
    Testing
      Unit Testing
      Integration Testing
      System Testing
    Debugging
      Correctness Debugging
      Performance Debugging
  Fault Tolerance
    Atomic Transactions
    Modular Redundancy


Quality Assurance encompasses Testing


Quality Assurance
  Usability Testing
    Scenario Testing
    Prototype Testing
    Product Testing
  Fault Avoidance
    Verification
    Configuration Management
  Fault Detection
    Reviews
      Walkthrough
      Inspection
    Debugging
      Correctness Debugging
      Performance Debugging
    Testing
      Unit Testing
      Integration Testing
      System Testing
  Fault Tolerance
    Atomic Transactions
    Modular Redundancy

Types of Testing

Unit Testing:
  - Individual subsystem
  - Carried out by developers
  - Goal: Confirm that the subsystem is correctly coded and carries out the intended functionality

Integration Testing:
  - Groups of subsystems (collections of classes) and eventually the entire system
  - Carried out by developers
  - Goal: Test the interfaces among the subsystems


System Testing

System Testing:
  - The entire system
  - Carried out by developers
  - Goal: Determine if the system meets the requirements (functional and global)

Acceptance Testing:
  - Evaluates the system delivered by developers
  - Carried out by the client; may involve executing typical transactions on site on a trial basis
  - Goal: Demonstrate that the system meets customer requirements and is ready to use

Implementation (coding) and testing go hand in hand.


Unit Testing

Informal:
  - Incremental coding

Static analysis:
  - Hand execution: reading the source code
  - Walk-through (informal presentation to others)
  - Code inspection (formal presentation to others)
  - Automated tools checking for syntactic and semantic errors, and departures from coding standards

Dynamic analysis:
  - Black-box testing (test the input/output behavior)
  - White-box testing (test the internal logic of the subsystem or object)
  - Data-structure based testing (data types determine test cases)

Black-box Testing

Focus: I/O behavior. If, for any given input, we can predict the output, then the module passes the test.
  - It is almost always impossible to generate all possible inputs ("test cases")

Goal: Reduce the number of test cases by equivalence partitioning:
  - Divide the input conditions into equivalence classes
  - Choose test cases for each equivalence class (example: if an object is supposed to accept a negative number, testing one negative number is enough)
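A minimal sketch of the negative-number example as a C test, assuming a hypothetical unit accept() specified to accept exactly the negative numbers:

#include <assert.h>

/* Hypothetical unit under test: accepts exactly the negative numbers. */
int accept(int x) { return x < 0; }

int main(void) {
    /* One representative per equivalence class is enough:            */
    assert(accept(-7) == 1);   /* class 1: negative numbers (valid)   */
    assert(accept(42) == 0);   /* class 2: non-negative numbers       */
    assert(accept(0)  == 0);   /* the boundary between the classes    */
    return 0;
}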


Black-box Testing (Continued)

Selection of equivalence classes (no rules, only guidelines):
  - Input is valid across a range of values. Select test cases from 3 equivalence classes (a sketch follows this slide):
      Below the range
      Within the range
      Above the range
  - Input is valid if it is from a discrete set. Select test cases from 2 equivalence classes:
      Valid discrete value
      Invalid discrete value

Another solution to select only a limited number of test cases:
  - Get knowledge about the inner workings of the unit being tested => white-box testing
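A hedged sketch of the range guideline above, assuming a hypothetical unit valid_score() whose input is specified to be valid across the range 0..100:

#include <assert.h>

/* Hypothetical unit under test: input is valid across 0..100. */
int valid_score(int s) { return s >= 0 && s <= 100; }

int main(void) {
    assert(valid_score(-5)  == 0);   /* class 1: below the range  */
    assert(valid_score(50)  == 1);   /* class 2: within the range */
    assert(valid_score(150) == 0);   /* class 3: above the range  */
    return 0;
}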


White-box Testing

Focus: Thoroughness (coverage). Every statement in the component is executed at least once.

Four types of white-box testing:
  - Statement testing
  - Loop testing
  - Path testing
  - Branch testing


White-box Testing (Continued)


Statement testing (algebraic testing):
  - Test single statements (choice of operators in polynomials, etc.)

Loop testing:
  - Cause execution of the loop to be skipped completely (exception: repeat loops)
  - Loop to be executed exactly once
  - Loop to be executed more than once

Path testing:
  - Make sure all paths in the program are executed

Branch testing (conditional testing):
  - Make sure that each possible outcome from a condition is tested at least once:

    if (i == TRUE) printf("YES\n"); else printf("NO\n");

    Test cases: 1) i = TRUE; 2) i = FALSE

White-box Testing Example


FindMean(float Mean, FILE ScoreFile) {
    SumOfScores = 0.0;
    NumberOfScores = 0;
    Mean = 0;

    /* Read in and sum the scores */
    Read(ScoreFile, Score);
    while (!EOF(ScoreFile)) {
        if (Score > 0.0) {
            SumOfScores = SumOfScores + Score;
            NumberOfScores++;
        }
        Read(ScoreFile, Score);
    }

    /* Compute the mean and print the result */
    if (NumberOfScores > 0) {
        Mean = SumOfScores / NumberOfScores;
        printf("The mean score is %f\n", Mean);
    } else
        printf("No scores found in file\n");
}

White-box Testing Example: Determining the Paths


FindMean (FILE ScoreFile) {
    float SumOfScores = 0.0;
    int NumberOfScores = 0;                      /* block 1 */
    float Mean = 0.0;
    float Score;
    Read(ScoreFile, Score);

    while (!EOF(ScoreFile)) {                    /* block 2 */
        if (Score > 0.0) {                       /* block 3 */
            SumOfScores = SumOfScores + Score;   /* block 4 */
            NumberOfScores++;
        }
        Read(ScoreFile, Score);                  /* block 5 */
    }                                            /* block 6 */

    /* Compute the mean and print the result */
    if (NumberOfScores > 0) {                    /* block 7 */
        Mean = SumOfScores / NumberOfScores;     /* block 8 */
        printf("The mean score is %f\n", Mean);
    } else
        printf("No scores found in file\n");     /* block 9 */
}

RJLQ: Why are the rectangles called "basic blocks" in assemblers/compilers?

Constructing the Logic Flow Diagram


RJLQ: Can you write a regexpr for all paths through the tests and basic blocks in this flowchart?

[Flattened flow graph: Start -> 1 -> 2; from the loop test 2, T leads to 3 and F leads to 7; from the test 3, T leads to 4 and F leads to 5; 4 and 5 both lead to 6, which returns to 2; from the test 7, T leads to 8 and F leads to 9; 8 and 9 lead to Exit.]

Finding the Test Cases


[Flattened flow graph with edges labeled a-l. The recoverable edge annotations:

  a (Start -> 1): covered by any data
  b (1 -> 2): data set must contain at least one value
  c (2 -> 7): data set must be empty
  d (3 -> 4): positive score
  e (3 -> 5): negative score
  f (4 -> 6), g (5 -> 6)
  h (6 -> 2): reached if either 4 or 5 is reached
  j (7 -> 8): total score > 0.0
  k (7 -> 9): total score < 0.0
  i, l: edges to Exit]

[RJLQ: What is ambiguous about the label on link c?]

Test Cases

Test case 1: ? (to execute the loop exactly once)
Test case 2: ? (to skip the loop body)
Test case 3: ?, ? (to execute the loop more than once)

These 3 test cases cover all control-flow paths.
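One possible set of answers for the FindMean flow graph (these concrete score files are an illustrative assumption, not from the original slide):

Test case 1: a file containing a single score, e.g. "5"
             (loop body executed exactly once: path 1 2 3 4 6 2 7 8)
Test case 2: an empty file
             (loop skipped completely: path 1 2 7 9)
Test case 3: a file such as "5 -1 10"
             (loop executed more than once, covering both branches
              of the inner if: 3 -> 4 and 3 -> 5)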


White-box Testing Example: Trial State Transition Diagram for the Program*

[Flattened state diagram for FindMean(FILE* fp):
  State 1: float Sum = 0.0; int Num = 0; float Mean = 0.0; float Score; read(fp, Score);
  State 2: guard (!EOF(ScoreFile))? -- F exits the loop toward state 7
  State 3: guard (Score > 0.0)? -- T leads to state 4, F (else) to state 5
  State 4: Sum += Score; Num++;
  State 5: (else branch)
  State 6: Read(fp, Score); then back to state 2
  State 7: guard (NumberOfScores > 0)? -- T leads to state 8, F (else) to state 9
  State 8: Mean = (Sum/Num); printf("Mean score: %f\n", Mean);
  State 9: printf("No scores found in file\n");
States 8 and 9 are both PASS exits (return True to the caller).]

RJLQ1: Why are there no FAIL exits?
RJLQ2: Can you merge any states or call statements?

* Modified for C; pre-condition: (fp = fopen(ScoreFile, "read"))

White-box Testing: RegExpr for Program Traces

[Flattened, annotated state diagram for FindMean(FILE* fp). On entry, test state K's guard condition. If TRUE, execute the ActionK routine and, on a trigger event, go to the PASS-exit state. If FALSE, skip ActionK and go to the FAIL-exit state (now or on trigger?). States 8 and 9 are both PASS exits (return True to the caller).]

RegExpr for the program traces: 0 1 2 {3 (4|5) 6 2}* 7 (8|9)
Merging states <2, 3, 7> => (1 ([4] 6)* (8|9))

White-box Testing Example: Revised State Transition Diagram for the Program*

[Flattened state diagram with merged states:
  State 1: float Sum = 0.0; int Num = 0; float Mean = 0.0; float Score; read(fp, Score);
  State 2: guard (!EOF(ScoreFile))? -- F exits the loop toward state 7
  State 3: guard (Score > 0.0)? -- T leads to state 4, F (else) to state 6
  State 4: Num++; Sum += Score; Read(fp, Score)
  State 6: Read(fp, Score)
  State 7: guard (NumberOfScores > 0)? -- T leads to state 8, F (else) to state 9
  State 8: Mean = (Sum/Num); printf("Mean score: %f\n", Mean);
  State 9: printf("No scores found in file\n")
States 8 and 9 are both PASS exits (return True to the caller).
On entry, test state K's guard condition. If TRUE, execute the ActionK routine and, on a trigger event, go to the PASS-exit state. If FALSE, skip ActionK and go to the FAIL-exit state (now or on trigger?).]

RJLQ: Why are there no FAIL exits?

* Pre-condition: (fp = fopen(ScoreFile, "read"))

Revised function FindMean(Mean, fp)


#include <stdio.h>
#include <stdlib.h>

int FindMean(float* Mean, FILE* fp)    /* returns Num and updates the mean value */
{
    float Sum = 0.0;
    float Score = 0.0;
    int Num = 0;

    fscanf(fp, "%f", &Score);          /* read and parse into Score */
    while (!feof(fp)) {
        /* accumulate the positive scores: */
        if (Score > 0.0) {
            Sum += Score;
            Num++;
            dprintffd("Score=%f; Sum = %f; Num = %d\n", Score, Sum, Num);
        }
        fscanf(fp, "%f", &Score);
    }
    /* compute the mean and return the result: */
    if (Num > 0)
        *Mean = (Sum / Num);
    return Num;
}

/* The printout was moved out to the caller: FindMean was over-constrained. */
/* The dprint* macros expand to printf() only #ifdef DEBUG.                 */

Main: Test Driver for Function Under Test:


int main (int argc, char* argv[])
{
    FILE* fp;
    float mean;
    int num = 0;

    if (argc < 2) {
        printf("Usage: meantest scorefile\n");
        return 0;
    }
    if ((fp = fopen(argv[1], "r"))) {              /* pre-condition */
        num = FindMean(&mean, fp);                 /* <- function under test */
        if (num > 0)
            printf("The mean score is %f\n", mean);
        else
            printf("No scores in file\n");         /* i.e., no scores > 0 */
        return 1;
    } else {                                       /* not part of Slide 31 */
        printf("Can't open file %s\n", argv[1]);
        return 0;
    }
}

Build and run meantest on scorefile2:


saturn.cs.uml.edu(269)> cat scorefile2
32 12 14 20 0 5 0 12 20 2
saturn.cs.uml.edu(270)> meantest scorefile2
Score=32.000000; Sum = 32.000000; Num = 1
Score=12.000000; Sum = 44.000000; Num = 2
Score=14.000000; Sum = 58.000000; Num = 3
Score=20.000000; Sum = 78.000000; Num = 4
Score=5.000000; Sum = 83.000000; Num = 5
Score=12.000000; Sum = 95.000000; Num = 6
Score=20.000000; Sum = 115.000000; Num = 7
Score=2.000000; Sum = 117.000000; Num = 8
The mean score is 14.625000


More usage tests:


saturn.cs.uml.edu(283)> pwd
/tmp_mnt/nfs/galaxy/faculty/fac1/lechner/public_html/05f523
saturn.cs.uml.edu(282)> ls scorefile*
scorefile1 scorefile2 scorefile3 scorefile4
saturn.cs.uml.edu(271)> cat scorefile3
0000
saturn.cs.uml.edu(273)> meantest scorefile3
No scores in file

Driver usage tests:

saturn.cs.uml.edu(274)> meantest scorefile
Can't open file scorefile
saturn.cs.uml.edu(275)> meantest
Usage: meantest scorefile

Comparison of White-box & Black-box Testing

White-box testing:
  - A potentially infinite number of paths has to be tested
  - White-box testing often tests what is done, instead of what should be done
  - Cannot detect missing use cases

Black-box testing:
  - Potential combinatorial explosion of test cases (valid & invalid data)
  - Often not clear whether the selected test cases uncover a particular error
  - Does not discover extraneous use cases ("features")

Both types of testing are needed: white-box testing and black-box testing are the extreme ends of a testing continuum. Any choice of test case lies in between and depends on the following:
  - Number of possible logical paths
  - Nature of the input data
  - Amount of computation
  - Complexity of algorithms and data structures

The 4 Testing Steps


1. Select what has to be measured:
   - Analysis: completeness of requirements
   - Design: cohesion
   - Implementation: code tests

2. Decide how the testing is done:
   - Code inspection
   - Proofs (design by contract)
   - Black-box, white-box
   - Select an integration testing strategy (big bang, bottom up, top down, sandwich)

3. Develop test cases:
   - A test case is a set of test data or situations that will be used to exercise the unit (code, module, system) being tested or about the attribute being measured

4. Create the test oracle:
   - An oracle contains the predicted results for a set of test cases
   - The test oracle has to be written down before the actual testing takes place (a minimal sketch follows this slide)
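A minimal C sketch of such a written-down oracle for the FindMean example from this lecture. The expected values come from the sample runs shown later; the table layout, tolerance, and comparison logic are illustrative assumptions:

#include <math.h>
#include <stdio.h>

int FindMean(float* Mean, FILE* fp);   /* the function under test */

/* The oracle: predicted results, recorded before testing starts. */
struct oracle_entry { const char *scorefile; int exp_num; float exp_mean; };

static const struct oracle_entry oracle[] = {
    { "scorefile2", 8, 14.625f },      /* mean of the 8 positive scores */
    { "scorefile3", 0,  0.0f   },      /* no positive scores            */
};

int main(void) {
    int failed = 0;
    for (size_t i = 0; i < sizeof oracle / sizeof oracle[0]; i++) {
        FILE *fp = fopen(oracle[i].scorefile, "r");
        float mean = 0.0f;
        if (!fp) { printf("cannot open %s\n", oracle[i].scorefile); failed++; continue; }
        int num = FindMean(&mean, fp);
        /* compare the actual result with the oracle's prediction: */
        if (num != oracle[i].exp_num ||
            (num > 0 && fabsf(mean - oracle[i].exp_mean) > 0.0005f)) {
            printf("FAIL: %s\n", oracle[i].scorefile);
            failed++;
        }
        fclose(fp);
    }
    return failed;
}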


Guidance for Test Case Selection


Use analysis knowledge about functional requirements (black-box testing):
  - Use cases
  - Expected input data
  - Invalid input data

Use design knowledge about system structure, algorithms, data structures (white-box testing):
  - Control structures: test branches, loops, ...
  - Data structures: test record fields, arrays, ...

Use implementation knowledge about algorithms. Examples:
  - Force division by zero
  - Use a sequence of test cases for an interrupt handler

Unit-testing Heuristics
1. Create unit tests as soon as the object design is completed:
   - Black-box test: test the use cases & functional model
   - White-box test: test the dynamic model
   - Data-structure test: test the object model

2. Develop the test cases:
   - Goal: find the minimal number of test cases to cover as many paths as possible

3. Cross-check the test cases to eliminate duplicates:
   - Don't waste your time!

4. Desk check your source code:
   - Reduces testing time

5. Create a test harness:
   - Test drivers and test stubs are needed for integration testing (a minimal regression-style harness is sketched below)

6. Describe the test oracle:
   - Often the result of the first successfully executed test

7. Execute the test cases:
   - Don't forget regression testing: re-execute the test cases every time a change is made

8. Compare the results of the test with the test oracle:
   - Automate as much as possible
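A minimal regression-style harness sketch in C, combining heuristics 5, 7, and 8; the test names and stub bodies are illustrative placeholders for real test cases:

#include <stdio.h>

typedef int (*test_fn)(void);   /* each test returns 1 = pass, 0 = fail */

/* Illustrative stubs: each would set up its test data, run the unit
   under test, and compare the result against the written oracle.    */
static int test_loop_skipped(void) { return 1; }
static int test_loop_once(void)    { return 1; }
static int test_loop_many(void)    { return 1; }

int main(void) {
    test_fn tests[] = { test_loop_skipped, test_loop_once, test_loop_many };
    int failed = 0;
    /* Regression testing: re-execute every test after every change. */
    for (size_t i = 0; i < sizeof tests / sizeof tests[0]; i++)
        if (!tests[i]()) { printf("test %zu FAILED\n", i); failed++; }
    printf("%d failure(s)\n", failed);
    return failed != 0;
}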
