SWE-Week 04

The document discusses the importance of reviews in software engineering, defining them as technical assessments aimed at ensuring quality and identifying defects early in the process. It outlines various metrics for evaluating review effectiveness, the roles of participants in review meetings, and guidelines for conducting formal and informal reviews. Additionally, it emphasizes the significance of postmortem evaluations and agile reviews in improving software development practices.


Because learning changes everything.

Chapter 16
Reviews – A Recommended
Approach

Part Three – Quality and Security

© 2020 McGraw Hill. All rights reserved. Authorized only for instructor use in the classroom.
No reproduction or further distribution permitted without the prior written consent of McGraw Hill.
Reviews
What are they?
• A meeting conducted by technical people.
• A technical assessment of a work product created during the
software engineering process.
• A software quality assurance mechanism.

What they are not!
• A project summary or progress assessment.
• A mechanism for political or personal reprisal!

© McGraw Hill 2
Cost Impact of Software Defects
• Error—a quality problem found before the software is released to end users.
• Defect—a quality problem found only after the software has been released to end users.
• We make this distinction because errors and defects have very different economic, business, and human impacts.
• Review activities matter because the sooner a problem is found, the cheaper it is to fix.

Review Metrics 1

• Preparation effort, Ep — the effort (in person-hours) required to review a work product prior to the actual review meeting.
• Assessment effort, Ea — the effort (in person-hours) expended during the actual review.
• Rework effort, Er — the effort (in person-hours) dedicated to the correction of those errors uncovered during the review.
• Work product size, WPS — a measure of the size of the work product that has been reviewed (for example, the number of UML models, the number of document pages, or the number of lines of code).
• Minor errors found, Errminor — the number of errors found that can be categorized as minor (requiring less than some pre-specified effort to correct).
• Major errors found, Errmajor — the number of errors found that can be categorized as major (requiring more than some pre-specified effort to correct).

Review Metrics 2

• Total errors found, Errtot — the sum of the errors found:
Errtot = Errminor + Errmajor

• Error density — the errors found per unit of work product reviewed:
Error density = Errtot ÷ WPS
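The two formulas above can be expressed directly in code. This is a minimal sketch; the error counts in the example call are hypothetical, not from the chapter:

```python
def total_errors(err_minor: int, err_major: int) -> int:
    """Errtot = Errminor + Errmajor."""
    return err_minor + err_major

def error_density(err_tot: int, wps: float) -> float:
    """Errors found per unit of work product: Errtot / WPS."""
    return err_tot / wps

# Hypothetical review: 22 minor + 5 major errors in a 40-page document.
err_tot = total_errors(22, 5)        # 27
print(error_density(err_tot, 40))    # 0.675 errors per page
```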

Metrics Example
• Suppose the average error density for requirements models is 0.68 errors per page, and a new requirements model is 40 pages long.
• A rough estimate suggests that your software team will find about 27 errors during the review of the document.
• If you find only 9 errors, you've done an extremely good job in developing the requirements model.
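The arithmetic behind the estimate can be checked in a couple of lines, using the values from the example above:

```python
avg_error_density = 0.68   # historical average, errors per page
pages = 40                 # size of the new requirements model

expected_errors = avg_error_density * pages   # 27.2
print(round(expected_errors))                 # prints 27
```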

Effort Expended With and Without
Reviews

Informal Reviews
The benefit is immediate discovery of errors and better work product quality.

Informal reviews include:
• A simple desk check of a software engineering work product with a colleague.
• A casual meeting (involving more than two people) for the purpose of reviewing a work product.
• The review-oriented aspects of pair programming, which encourage continuous review as work is created.

Formal Technical Reviews
The objectives of an FTR (walkthrough or inspection) are:
1. To uncover errors in function, logic, or implementation for any
representation of the software.
2. To verify that the software under review meets its requirements.
3. To ensure that the software has been represented according to
predefined standards.
4. To achieve software that is developed in a uniform manner.
5. To make projects more manageable.

Review Meeting
• Between three and five people (typically) should be involved in
the review.
• Advance preparation should occur but should require no more
than two hours of work for each person.
• The duration of the review meeting should be less than two
hours.
• Focus is on a work product (for example: a portion of a
requirements model, a detailed component design, source code
for a component).

Review Players
• Producer—the individual who has developed the work product.
• Review leader—evaluates the product for readiness, generates copies of product materials and distributes them to two or three reviewers for advance preparation, and facilitates the meeting discussion.
• Reviewer(s)—expected to spend between one and two hours reviewing the product, making notes, and otherwise becoming familiar with the work.
• Recorder—a reviewer who records (in writing) all important issues raised during the review.

Review Outcome
At the end of the review, all attendees of the FTR must decide
whether to:
1. Accept the product without further modification.
2. Reject the product due to severe errors (once corrected, another review must be performed).
3. Accept the product provisionally (minor errors have been encountered and must be corrected, but no additional review will be required).
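The three outcomes, and the re-review rule they imply, can be sketched as a small enumeration. The names here are illustrative, not taken from the chapter:

```python
from enum import Enum

class ReviewOutcome(Enum):
    ACCEPT = "accept without further modification"
    ACCEPT_PROVISIONALLY = "minor errors; correct them, no re-review needed"
    REJECT = "severe errors; correct them, then review again"

def needs_another_review(outcome: ReviewOutcome) -> bool:
    # Only a rejected product comes back for a second FTR.
    return outcome is ReviewOutcome.REJECT
```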

Review Reporting and Record Keeping
• During the FTR, the recorder records all issues raised and
summarizes these in a review issues list to serve as an action list
for the producer.
• A formal technical review summary report is created that
answers three questions:
1. What was reviewed?
2. Who reviewed it?
3. What were the findings and conclusions?

• You should establish a follow-up procedure to ensure that items on the issues list have been properly corrected.
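The summary report and the issues list it carries can be modeled as a simple data structure. This is a sketch for illustration; the class and field names are assumptions, not part of the chapter:

```python
from dataclasses import dataclass, field

@dataclass
class ReviewSummaryReport:
    """FTR summary: what was reviewed, who reviewed it, and what was found."""
    work_product: str                                   # 1. What was reviewed?
    reviewers: list[str]                                # 2. Who reviewed it?
    findings: list[str] = field(default_factory=list)   # 3. Findings and conclusions
    issues: list[dict] = field(default_factory=list)    # issues list = producer's action list

    def open_issues(self) -> list[dict]:
        """Follow-up check: issues not yet marked as corrected."""
        return [i for i in self.issues if not i.get("corrected", False)]
```

A follow-up procedure then amounts to rechecking `open_issues()` until it is empty.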

Review Guidelines
• Review the product, not the producer.
• Set an agenda and maintain it.
• Limit debate and rebuttal.
• Take written notes.
• Limit the number of participants and insist upon advance preparation.
• Develop a checklist for each product that is likely to be reviewed.
• Allocate resources and schedule time for FTRs.
• Conduct meaningful training for all reviewers.
• Review your early reviews.

Postmortem Evaluations
• A postmortem evaluation (PME) is a mechanism to determine what went right and what went wrong with the software engineering process and practices applied to a specific project.
• A PME is attended by members of the software team and stakeholders who examine the entire software project, focusing on excellences (achievements and positive experiences) and challenges (problems and negative experiences).
• The intent is to extract lessons learned from the challenges and excellences and to suggest improvements to process and practice moving forward.

Agile Reviews

• During the sprint planning meeting, user stories are reviewed and ordered according to priority.
• The daily Scrum meeting (a short, typically 15-minute stand-up) is an informal way to ensure that team members are all working on the same priorities and to catch any defects that may cause the sprint to fail.
• The sprint review meeting is often conducted using guidelines like the formal technical review discussed in this chapter.
