SWE-Week 04
Chapter 16
Reviews – A Recommended Approach
© 2020 McGraw Hill. All rights reserved. Authorized only for instructor use in the classroom.
No reproduction or further distribution permitted without the prior written consent of McGraw Hill.
Reviews
What are they?
• A meeting conducted by technical people.
• A technical assessment of a work product created during the
software engineering process.
• A software quality assurance mechanism.
Cost Impact of Software Defects
• Error—a quality problem found before the software is released to end users.
• Defect—a quality problem found only after the software has been released to end users.
Review Metrics 1
• Preparation effort, Ep. The effort (in person-hours) required to review a work product before the review meeting.
• Assessment effort, Ea. The effort (in person-hours) expended during the actual review.
• Rework effort, Er. The effort (in person-hours) dedicated to correcting the errors uncovered during the review.
• Work product size, WPS. A measure of the size of the work product reviewed (for example, the number of models or document pages).
• Minor errors found, Errminor. The number of errors found that require less than some prespecified effort to correct.
• Major errors found, Errmajor. The number of errors found that require more than some prespecified effort to correct.
Review Metrics 2
• Total errors found, Errtot. Represents the sum of the errors found:
Errtot = Errminor + Errmajor
• Error density. Represents the errors found per unit of work product reviewed (total errors divided by work product size):
Error density = Errtot ÷ WPS
(A short sketch applying both definitions follows below.)
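To make the two formulas concrete, here is a minimal Python sketch; the function name and argument names are illustrative assumptions, not anything defined in the chapter.

def review_metrics(err_minor, err_major, work_product_size):
    """Apply the definitions above: total errors and error density."""
    err_total = err_minor + err_major              # Errtot = Errminor + Errmajor
    error_density = err_total / work_product_size  # Error density = Errtot / WPS
    return err_total, error_density

# Example: 5 minor and 2 major errors found in a 10-page requirements model.
total_errors, density = review_metrics(err_minor=5, err_major=2, work_product_size=10)
print(total_errors, density)  # 7 errors, 0.7 errors per page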
Metrics Example
• The average defect density for a requirements model is 0.68 errors per page, and a new requirements model is 40 pages long; the resulting estimate is worked out below.
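Applying the error-density relationship from the previous slide gives a rough estimate of how many errors the review team should expect to find in the new model:

Expected errors ≈ 0.68 errors/page × 40 pages ≈ 27 errors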
Effort Expended With and Without Reviews
Informal Reviews
The benefit is immediate discovery of errors and better work
product quality.
Formal Technical Reviews
The objectives of an FTR (walkthrough or inspection) are:
1. To uncover errors in function, logic, or implementation for any
representation of the software.
2. To verify that the software under review meets its requirements.
3. To ensure that the software has been represented according to
predefined standards.
4. To achieve software that is developed in a uniform manner.
5. To make projects more manageable.
Review Meeting
• Between three and five people (typically) should be involved in
the review.
• Advance preparation should occur but should require no more
than two hours of work for each person.
• The duration of the review meeting should be less than two
hours.
• Focus is on a work product (for example: a portion of a
requirements model, a detailed component design, source code
for a component).
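As an illustrative sketch only, the numeric recommendations above can be restated as a simple check; the function name, parameters, and messages are assumptions chosen for illustration, not part of the chapter.

def check_review_plan(participants, prep_hours_per_person, meeting_hours):
    """Flag departures from the recommended review-meeting constraints."""
    warnings = []
    if not 3 <= participants <= 5:
        warnings.append("Typically three to five people should be involved.")
    if prep_hours_per_person > 2:
        warnings.append("Advance preparation should require no more than two hours per person.")
    if meeting_hours >= 2:
        warnings.append("The review meeting should last less than two hours.")
    return warnings

# Example: a planned review with 6 participants and a 3-hour meeting.
for w in check_review_plan(participants=6, prep_hours_per_person=1.5, meeting_hours=3):
    print(w)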
Review Players
• Producer—the individual who has developed the work product.
• Review leader—evaluates the product for readiness, generates copies of the product materials, and distributes them to the reviewers.
• Reviewers (typically two or three)—prepare in advance and evaluate the product during the meeting.
• Recorder—a reviewer who records (in writing) all important issues raised during the review.
Review Outcome
At the end of the review, all attendees of the FTR must decide
whether to:
1. Accept the product without further modification.
2. Reject the product temporarily due to severe errors (once the errors are corrected, another review must be performed).
3. Accept the product provisionally (minor errors have been
encountered and must be corrected, but no additional review
will be required).
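Purely as an illustration, the three possible outcomes can be summarized with a small enumeration; the names below are shorthand chosen here, not terminology from the chapter.

from enum import Enum, auto

class ReviewDecision(Enum):
    """Possible decisions at the end of a formal technical review."""
    ACCEPT = auto()                # no further modification required
    REJECT = auto()                # severe errors; correct them, then hold another review
    ACCEPT_PROVISIONALLY = auto()  # minor errors; correct them, no additional review needed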
Review Reporting and Record Keeping
• During the FTR, the recorder records all issues raised and
summarizes these in a review issues list to serve as an action list
for the producer.
• A formal technical review summary report is created that
answers three questions:
1. What was reviewed?
2. Who reviewed it?
3. What were the findings and conclusions?
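As one possible way to capture this record keeping in code, here is a small sketch; the class and field names are illustrative assumptions chosen to mirror the three questions and the issues list above, not anything prescribed by the chapter.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ReviewSummaryReport:
    """Answers the three questions a review summary report addresses."""
    what_was_reviewed: str                      # the work product under review
    who_reviewed_it: List[str]                  # participants in the FTR
    findings_and_conclusions: str               # outcome and overall conclusions
    issues_list: List[str] = field(default_factory=list)  # action list for the producer

# Example record for a requirements-model review (illustrative values only).
report = ReviewSummaryReport(
    what_was_reviewed="Requirements model, sections 1-3",
    who_reviewed_it=["review leader", "recorder", "producer"],
    findings_and_conclusions="Accepted provisionally; minor errors to be corrected.",
    issues_list=["Ambiguous wording in requirement 2.4", "Missing precondition in use case 7"],
)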
Review Guidelines
• Review the product, not the producer.
• Set an agenda and maintain it.
• Limit debate and rebuttal.
• Take written notes.
• Limit the number of participants and insist upon advance preparation.
• Develop a checklist for each product that is likely to be reviewed.
• Allocate resources and schedule time for FTRs.
• Conduct meaningful training for all reviewers.
• Review your early reviews.
Postmortem Evaluations
• A postmortem evaluation (PME) is a mechanism to determine
what went right and what went wrong with the software
engineering process and practices applied to a specific project.
Agile Reviews
• Conducted daily, typically for about 15 minutes.
Because learning changes everything. ®
www.mheducation.com
© 2020 McGraw Hill. All rights reserved. Authorized only for instructor use in the classroom.
No reproduction or further distribution permitted without the prior written consent of McGraw Hill.