Short note on Review Guidelines
- Review the product, not the producer.
- Set an agenda and maintain it.
- Limit debate and rebuttal. When an issue is raised, rather than spending time
debating it, record the issue for further discussion off-line.
- Enunciate (Identify) problem areas, but don't attempt to solve every
problem noted.
- A review is not a problem-solving session. The solution of a problem can often
be accomplished by the producer alone or with the help of only one other
individual. Problem solving should be postponed until after the review meeting.
- Take written notes.
- Limit the number of participants and insist upon advance preparation.
- Develop a checklist for each product that is likely to be reviewed.
- Allocate resources and schedule time for FTRs.
- Conduct meaningful training for all reviewers.
- Review your early reviews.
Short note on Formal Technical Reviews (FTR)
- Formal technical review (FTR) is a software quality control activity performed by software engineers (and others).
- The objectives of an FTR are:
- (1) To uncover errors in function, logic, or implementation for any representation of the software;
- (2) To verify that the software under review meets its requirements;
- (3) To ensure that the software has been represented according to predefined standards;
- (4) To achieve software that is developed in a uniform manner;
- (5) To make projects more manageable. In addition, the FTR serves as a training ground, enabling junior engineers to observe different approaches to software analysis, design, and implementation.
- The FTR is actually a class of reviews that includes walkthroughs and inspections.
The Review Meeting:
- Every review meeting should abide by the following constraints:
- Between three and five people (typically) should be involved in the review.
- Advance preparation should occur but should require no more than two hours of work for each person.
- The duration of the review meeting should be less than two hours.
- Given these constraints, it should be obvious that an FTR focuses on a specific (and small) part of the overall software.
- For example, rather than attempting to review an entire design, walkthroughs are conducted for each component or small group of components.
Review Summary Report (answers three questions)
- What was reviewed?
- Who reviewed it?
- What were the findings and conclusions?
The Players in a Review Meeting
- Producer—the individual who has developed the work product
- Informs the project leader that the work product is complete and that a review is required.
- Review leader—evaluates the product for readiness, generates copies of product materials, and distributes them to two or three reviewers for advance preparation.
- Reviewer(s)—expected to spend between one and two hours reviewing the product, making notes, and otherwise becoming familiar with the work.
- Recorder— a reviewer who records (in writing) all important issues raised during the review.
Short note on Informal Reviews
- Informal reviews include
- A simple desk check of a software engineering work product with a colleague,
- A casual meeting (involving more than two people) for the purpose of reviewing a work product,
- The review-oriented aspects of pair programming.
- A simple desk check or a casual meeting conducted with a colleague is a review.
- However, because there is no advance planning or preparation, no agenda or meeting structure, and no follow-up on the errors that are uncovered, the effectiveness of such reviews is considerably lower than that of more formal approaches.
- Even so, a simple desk check can and does uncover errors that might otherwise propagate further into the software process.
Review Metrics & their use
- Introduction: Technical reviews are one of many actions that are
required as part of good software engineering practice.
- Each action requires dedicated human effort.
- The following review metrics can be collected for each review that
is conducted:
- Preparation effort, Ep—the effort (in person-hours) required to
review a work product prior to the actual review meeting.
- Assessment effort, Ea—the effort (in person-hours) expended during the
actual review.
- Rework effort, Er—the effort (in person-hours) dedicated to the correction
of those errors uncovered during the review.
- Work product size, WPS—a measure of the size of the work product
that has been reviewed (e.g., the number of UML models, or the
number of document pages, or the number of lines of code)
- Minor errors found, Errminor—the number of errors found that can be
categorized as minor (requiring less than some pre-specified effort
to correct)
- Major errors found, Errmajor— the number of errors found that can
be categorized as major (requiring more than some pre-specified
effort to correct)
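- In practice, these measures are combined into derived values such as total review effort and error density. The short Python sketch below shows one way to do this; the sample numbers are illustrative assumptions, not data from an actual review.

    # Combining the review metrics defined above (illustrative values only).
    Ep = 6.0         # preparation effort, person-hours
    Ea = 4.0         # assessment effort, person-hours
    Er = 5.0         # rework effort, person-hours
    WPS = 40         # work product size, e.g., document pages
    Err_minor = 12   # minor errors found
    Err_major = 3    # major errors found

    E_review = Ep + Ea + Er           # total effort spent on the review
    Err_tot = Err_minor + Err_major   # total errors found
    error_density = Err_tot / WPS     # errors found per unit of work product size

    print(f"Total review effort: {E_review} person-hours")
    print(f"Total errors found:  {Err_tot}")
    print(f"Error density:       {error_density:.2f} errors per page")

- Tracked over many reviews, error density can give a rough estimate of how many errors to expect in a work product of a given size, which helps when planning review effort.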
What Are Reviews?
- Introduction: Software reviews are a “filter” for the software process.
- Reviews are applied at various points during software engineering and serve to
uncover errors and defects that can then be removed.
- Software reviews “purify” software engineering work products, including
requirements and design models, code, and testing data.
- Technical reviews (TRs), also called peer reviews, are the most effective
mechanism for finding mistakes early in the software process.
- Six steps are employed: planning, preparation, structuring the meeting,
noting errors, making corrections, and verifying the corrections.
What Do We Look For?
- Errors and defects
- Error — A quality problem found before the software is released
to end users
- Defect — A quality problem found only after the software has
been released to end-users
- The primary objective of technical reviews is to find errors during
the process so that they do not become defects after release of the
software.
- The obvious benefit of technical reviews is the early discovery of
errors so that they do not propagate to the next step in the
software process.
Defect Amplification (Extension / Increase)
- A defect amplification model can be used to illustrate the
generation and detection of errors during the design and code
generation actions of a software process.
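- A minimal sketch (Python) of such a model for a single development step is shown below. Errors enter the step from the previous one, some are amplified by further work, new errors are generated, and a review at the end of the step removes a percentage of what is present; the specific numbers and the simple linear form are illustrative assumptions.

    # Defect amplification for one development step (illustrative sketch).
    def errors_passed_on(errors_in, amplified_fraction, amplification_factor,
                         new_errors, detection_efficiency):
        # Errors entering the step: some pass through unchanged, others are
        # amplified because later work builds on the flawed portion.
        passed_through = errors_in * (1 - amplified_fraction)
        amplified = errors_in * amplified_fraction * amplification_factor
        total_before_review = passed_through + amplified + new_errors
        # A review at the end of the step catches a fraction of the errors;
        # the rest are passed on to the next step.
        return total_before_review * (1 - detection_efficiency)

    # Example: 10 errors enter the design step, 20% of them are amplified 2x,
    # 25 new errors are generated, and the design review catches 50%.
    remaining = errors_passed_on(errors_in=10, amplified_fraction=0.2,
                                 amplification_factor=2, new_errors=25,
                                 detection_efficiency=0.5)
    print(f"Errors passed to the next step: {remaining:.1f}")   # 18.5

- Chaining such steps shows why errors caught late are so costly: each undetected error can be amplified at every subsequent step before it is finally found.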