Outline
Overview of FTR and relationship to software quality improvement
History of software quality improvement
Impact of quality on software products
The FTR process
Beyond FTR
Discussion and questions
Code reviews pay off even if the code will be tested later (Fagan)
To raise the quality of the finished product
To improve developer skills
3: Defined
4: Managed
5: Optimizing
Are peer reviews planned?
Are actions associated with defects that are identified during peer reviews tracked until they are resolved?
Does the project follow a written organizational policy for performing peer reviews?
Do participants of peer reviews receive the training required to perform their roles?
Are measurements used to determine the status of peer review activities?
Are peer review activities and work products subjected to Software Quality Assurance review and audit?
Informal
Spontaneous, ad hoc, no artifacts produced
Formal
In reality, there is also a middle ground between informal and formal techniques
Cost-Benefit Analysis
Fagan reported that IBM inspections found 90% of all defects, for a 9% reduction in average project cost
Johnson estimates that rework accounts for 44% of development cost
Finding defects, finding them early, and reducing rework can reduce the overall cost of a project
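As a purely illustrative reconciliation of these figures, assume the savings come from avoided rework. With rework at 44% of cost and 90% of defects caught early, and with two assumed numbers that are not in the source (an inspection cost of 15% of project cost and a 60% defect-driven share of rework), the net saving lands near Fagan's reported 9%:

```python
# Illustrative cost model: the 0.15 and 0.60 factors are assumptions
# chosen for illustration, not figures from Fagan or Johnson.
rework_share = 0.44           # Johnson: rework ~44% of development cost
defects_caught_early = 0.90   # Fagan: inspections found ~90% of defects
inspection_cost_share = 0.15  # assumed cost of running inspections

# Assume avoided rework scales with the fraction of defects caught early,
# discounted because not all rework is defect-driven (assumed factor).
rework_avoided = rework_share * defects_caught_early * 0.60
net_saving = rework_avoided - inspection_cost_share
print(f"{net_saving:.0%}")  # roughly 9% net cost reduction
```

The point is not the specific factors but the shape of the argument: when rework dominates cost, even an expensive inspection process can pay for itself.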
Cost of Defects
What is the annual cost of software defects in the US?
$59 billion
Estimated that $22 billion could be avoided by introducing a best-practice defect detection infrastructure
Source: NIST, The Economic Impact of Inadequate Infrastructure for Software Testing, May 2002
Cost of Defects
Gilb project with a jet manufacturer
Initial analysis estimated that 41,000 hours of effort would be lost through faulty requirements
The manufacturer concurred because:
10 people on the project, each working 2,000 hours/year
The project is already one year late (20,000 hours)
The project is estimated to take one more year (another 20,000 hours)
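The manufacturer's figures can be checked with simple arithmetic; this sketch only restates the numbers above (the ~41,000-hour total comes from the original analysis, not this calculation):

```python
# Back-of-the-envelope check of the Gilb estimate.
people = 10
hours_per_person_per_year = 2_000
team_hours_per_year = people * hours_per_person_per_year  # 20,000 hours/year

already_late = 1 * team_hours_per_year  # one year late: 20,000 hours
still_to_go = 1 * team_hours_per_year   # one more year: another 20,000 hours

total_lost = already_late + still_to_go
print(total_lost)  # 40,000 hours, in line with the ~41,000-hour estimate
```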
The average two-hour inspection exposed four major and fourteen minor faults
Savings were estimated at $25,000 per inspection
Additional studies showed that the number of faults detected decreases exponentially by phase
Software Inspections
Why are software inspections not widely used?
Lack of time
Not seen as a priority
Not seen as value added (when measured by LOC)
Lack of understanding of formalized techniques
Improper tools used to collect data
Lack of training of participants
Pits the programmer against reviewers
The reviewers are swamped with information.
Most reviewers are not familiar with the product design goals.
There are no clear individual responsibilities.
Reviewers can avoid potential embarrassment by saying nothing.
The review is a large meeting; detailed discussions are difficult.
The presence of managers silences criticism.
Fagan's Contributions
"Design and code inspections to reduce errors in program development" (1976)
A systematic and efficient approach to improving programming quality
Continuous improvement: reduce initial errors and follow up with additional improvements
The beginnings of formalized software inspections
Planning, Overview, Preparation, Examination, Rework, Follow-up
Can steps be skipped or combined?
How many person-hours are typically involved?
Planning: form the team, assign roles
Overview: inform the team about the product (optional)
Preparation: independent review of materials
Examination: the inspection meeting
Rework: the author verifies defects and corrects them
Follow-up: the moderator checks and verifies corrections
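The six phases form an ordered workflow in which only the overview may be skipped. This sketch models that ordering; the phase names come from the slides, but the validation logic is an illustration, not part of Fagan's method:

```python
# Fagan's six phases as an ordered workflow (illustrative sketch).
PHASES = ["Planning", "Overview", "Preparation", "Examination", "Rework", "Follow-up"]
OPTIONAL = {"Overview"}  # the overview phase is optional

def is_valid_sequence(executed):
    """Check that executed phases appear in order, skipping only optional ones."""
    it = iter(executed)
    current = next(it, None)
    for phase in PHASES:
        if phase == current:
            current = next(it, None)
        elif phase not in OPTIONAL:
            return False  # a mandatory phase was skipped
    return current is None  # every executed phase was consumed, in order

print(is_valid_sequence(["Planning", "Preparation", "Examination", "Rework", "Follow-up"]))  # True
print(is_valid_sequence(["Planning", "Examination"]))  # False: Preparation skipped
```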
Fagan recommends a team of four people
Moderator: the key person; manages the team and provides leadership
Readers, reviewers, and authors:
Designer: the programmer responsible for producing the program design
Coder/Implementer: translates the design into code
Tester: writes and executes test cases
Active Design Reviews
Reviewers may be overloaded during the preparation phase
Reviewers lack familiarity with the goals
Large team meetings can have drawbacks
Several brief reviews rather than one large review
Each review focuses on a certain part of the project
This approach was used for the design of a military flight navigation system
Bisant and Lyle (1989)
One author, one reviewer (eliminates the moderator)
Ad hoc preparation
Noted immediate benefits in program quality and productivity
May be more useful in small organizations or on small projects
N-fold Inspection
Martin and Tsai (1990)
Rationale:
A single team finds only a fraction of defects
Different teams do not duplicate efforts
Follows the Fagan inspection steps
N teams inspect in parallel
Results from the teams are merged
After merging results, only one team continues on
Team size: 3-4 people (author, moderator, reviewers)
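The merge step can be sketched in a few lines; the function name and the deduplication key below are assumptions for illustration, not details from Martin and Tsai:

```python
# Illustrative N-fold merge: each team reports defects, and the merged
# list keeps one copy of each (location, description) pair.

def merge_team_results(team_reports):
    """team_reports: list of lists of (location, description) tuples."""
    merged = {}
    for report in team_reports:
        for defect in report:
            merged.setdefault(defect, None)  # keep first sighting only
    return list(merged)

team_a = [("module1.c:10", "off-by-one"), ("spec section 2", "ambiguous requirement")]
team_b = [("module1.c:10", "off-by-one"), ("module2.c:55", "unchecked return")]

merged = merge_team_results([team_a, team_b])
print(len(merged))  # 3 unique defects from 4 reports
```

This also illustrates the rationale: the overlap between teams (one shared defect here) is small relative to what each team finds alone.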
Phased Inspection
Knight and Myers (1993)
Combines aspects of active design reviews, Fagan, and N-fold inspections
Mini-inspections, or phases, with specific goals
Builds on the finding that most defects are found in preparation for the meeting (90/10)
Is synergy as important to finding defects as others have stated?
Collection occurs after preparation
Rework follows
Gilb Inspections
Gilb and Graham (1993)
Similar to Fagan inspections
Adds a process brainstorming meeting immediately following the inspection meeting
Other Inspections
Some researchers interpret Fagan's work as a combination of all three
It does present many of the elements associated with FTR
FTR may be seen as a variant of Fagan inspections (Johnson & Tjahjono, 1998)
Process: phases and procedures
Roles: Author, Moderator, Reader, Reviewer, Recorder
Objectives: defect removal, requirements elicitation, etc.
Measurements: forms, consistent data collection, etc.
FTR Process
How much to review
Review pacing
When to review
Pre-meeting preparation
Meeting pace
Tied into meeting time (hours)
Should be manageable
Break into chunks if needed
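Tying review size to meeting time can be sketched as simple arithmetic; the 150 LOC/hour rate and two-hour meeting below are assumed figures for illustration, not recommendations from the slides:

```python
# Hedged sketch: sizing review chunks from meeting length and an
# assumed review rate.

def chunk_size(meeting_hours, loc_per_hour=150):
    """LOC that fits in one meeting at the given review rate."""
    return meeting_hours * loc_per_hour

def split_into_chunks(total_loc, meeting_hours=2, loc_per_hour=150):
    """Break a work product into meeting-sized chunks of LOC."""
    size = chunk_size(meeting_hours, loc_per_hour)
    full, remainder = divmod(total_loc, size)
    return [size] * full + ([remainder] if remainder else [])

print(split_into_chunks(1000))  # [300, 300, 300, 100]
```

A 1,000-line module thus becomes four manageable reviews rather than one overwhelming meeting.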
Review Pacing
When to Review?
How much work should be completed before the review?
Set out the review schedule during project planning
Again, break the work into manageable chunks
Prioritize based on the impact of a code module on the overall project
Pre-Meeting Preparation
Materials to be given to reviewers
Time expectations prior to the meeting
Understand the roles of participants
Training for team members on their various roles
Expected end product
Select the correct participants for each role
Understand team review psychology
Choose the correct team size
Author
Moderator
Reader
Reviewer
Recorder (optional?)
Who should not be involved, and why?
Team Participants
Team Psychology
Team Size
What is the impact of large, complex projects? How to work with globally distributed teams?
FTR Objectives
Review meetings can take place at various stages of the project lifecycle Understand the purpose of the review
FTR Measurements
Documentation
Forms used to facilitate the process
Documenting the meeting
Use of standards
How is documentation used by:
Sample Forms
Architecture design
Detailed design
Code inspection
Functional design
Software requirements
Inspection Metrics
How to gather and classify defects?
How to collect them?
What to do with collected metrics?
What metrics are important?
Defects per reviewer?
Inspection rate?
Estimated defects remaining?
Tools for collecting metrics
Move beyond spreadsheets and word processors
Primary barriers to use:
Cost
Quality
Utility
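The metrics above are straightforward to compute once data is collected. The sketch below uses made-up numbers; for "defects remaining" it uses the capture-recapture (Lincoln-Petersen) estimator, which is one standard approach from the inspection literature, not necessarily the one the slides intend:

```python
# Illustrative inspection metric calculations (all inputs are invented).

def defects_per_reviewer(defects_found, reviewers):
    return defects_found / reviewers

def inspection_rate(loc_reviewed, meeting_hours):
    return loc_reviewed / meeting_hours  # LOC per hour

def capture_recapture_estimate(found_a, found_b, found_by_both):
    """Estimate latent defects from two reviewers' overlapping findings.

    Lincoln-Petersen: total ~= (A * B) / overlap.
    """
    total = (found_a * found_b) / found_by_both
    unique_found = found_a + found_b - found_by_both
    return total - unique_found  # estimated defects still undetected

print(defects_per_reviewer(18, 3))            # 6.0
print(inspection_rate(400, 2))                # 200.0 LOC/hour
print(capture_recapture_estimate(12, 10, 8))  # ~1.0 defect remaining
```

Large overlap between reviewers suggests few defects remain; small overlap suggests many were missed.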
Impact of reviews on the programmer
Post-meeting activities
Review challenges
Survey of reviews and comparisons
Future of FTR
Should reviews be used as a measure of performance at appraisal time?
Can reviews help improve developers' commitment to their work?
Will being reviewed make a developer a better reviewer when the roles are reversed?
Do reviews improve teamwork?
Post-Meeting Activities
Defect correction
How to ensure that identified defects are corrected?
What metrics or communication tools are needed?
Feedback to team members
Additional phases of reviews
Follow-up
Review Challenges
Distributed, global teams
Large teams
Complex projects
Virtual vs. face-to-face meetings
Survey of Reviews
Source: Ciolkowski, M., Laitenberger, O., & Biffl, S. (2003). Software reviews: the state of the practice. IEEE Software, 20(6), 46-51.
Common obstacles
Time pressures
Cost
Lack of training (most train by participation)
The synergism among the review team that can lead to the discovery of defects not found by any of the participants working individually
Meetings are perceived as higher quality What about false positives and duplicates?
The need for face-to-face meetings has never been questioned
Meetings are expensive!
Simultaneous attendance of all participants
Preparation
Readiness of the work product under review
High-quality moderation
Team personalities
Results
Defect detection effectiveness was not significantly different between the groups
Cost was lower for nominal groups than for real groups (the average time to find a defect was higher for real groups)
Nominal groups generated more issues, but had more false positives and more duplication
60% don't prepare at all; only 50% use a checklist; less than 10% use advanced reading techniques
Repeatable success tends to come from well-defined techniques
Reported successes (NASA, Motorola, IBM):
95% defect detection rates before testing
50% overall cost reduction
50% reduction in delivery time
Future of FTR
1. Provide tighter integration between FTR and the development method
2. Minimize meetings and maximize asynchronicity in FTR
3. Shift the focus from defect removal to improved developer quality
4. Build organizational knowledge bases on review
5. Outsource review and in-source review knowledge
6. Investigate computer-mediated review technology
7. Break the boundaries on review group size
Discussion
Testing is commonly outsourced, but what about reviews?
What are the implications of outsourcing one over the other?
What if code production is outsourced? What do you review, and how?
What is the relationship between reviews and testing?
Discussion
Relationship between inspections and testing
Do anonymous review tools impact the quality of the review process?
How often to review? When to re-review?
How to estimate the number of defects expected?
Participants don't understand the review process
Reviewers critique the producer, not the product
Reviews are not planned
Review meetings drift into problem solving
Reviewers are not prepared
The wrong people participate
Reviewers focus on style, not substance
Source: www.processimpact.com
Observations
1985-1995: a fair amount of interest and research
Terminology changes, and interest appears to wane after 2000
Many sites are obsolete or have not been updated
Very few surveys report quantifiable results regarding reviews, cost, and quality improvements
Those using quality methods tend to be enthusiastic; others have not yet joined in
Questions
References
Full references handout provided in class
Interesting Websites
Gilb: Extreme Inspection: http://www.result-planning.com/Inspection
Collaboration Tools:
http://www.resultplanning.com/Site+Content+Overview
http://www.sdtcorp.com/pdf/ReviewPro.pdf