Summary

This document describes a method for measuring exploratory testing sessions. Key points:
- Exploratory testing is unscripted and improvisational, so a method is needed to understand what is happening.
- Testing is done in timed sessions with a clear mission (charter). Sessions are reviewed, and results and obstacles are discussed in a debriefing.
- A session sheet records bugs, coverage areas, time spent on different activities, and other metrics in a scannable format.
- Breaking work down into test design, bug investigation, and setup helps diagnose productivity over time. Coverage areas also indicate whether the right areas are being tested.
- Data from past sessions can be used to estimate a test cycle.

A method to measure exploratory testing

Jon Bach
Quardev Laboratories, LLC
[email protected]
www.quardev.com

Exploratory testing (AKA ad hoc testing) relies on tester intuition. It is unscripted, unrehearsed, and improvisational.

I, as test manager, need to understand what's happening, so I can direct the work and explain it to my clients.

Test in Sessions
1) Time Box
2) Reviewable Result
3) Debriefing

Charter: A clear mission for the session
- Suggests what should be tested, how it should be tested, and what problems to look for
- Not meant to be a detailed plan
- General charters may be necessary at first:
  "Analyze the Insert Picture function"
- Specific charters provide better focus, but take more effort to design:
  "Test clip art insertion. Focus on stress and flow techniques, and make sure to insert into a variety of documents. We're concerned about resource leaks or anything else that might degrade performance over time."

Time Box: Focused test effort of fixed duration
- Short: 60 minutes (+/- 15)
- Normal: 90 minutes (+/- 15)
- Long: 120 minutes (+/- 15)
- Brief enough for accurate reporting
- Brief enough to allow flexible scheduling
- Brief enough to allow course correction
- Long enough to get solid testing done
- Long enough for efficient debriefings
- Beware of overly precise timing

Debriefing: Measurement begins with observation
- The manager reviews the session sheet to assure that they understand it and that it follows the protocol
- The tester answers any questions
- Session metrics are checked
- The charter may be adjusted
- The session may be extended
- New sessions may be chartered
- Coaching happens
- Agenda: PROOF (Past, Results, Obstacles, Outlook, Feelings)

Reviewable Result: A scannable session sheet
- Charter (#AREAS)
- Start Time
- Tester Name(s)
- Task Breakdown (#DURATION, #TEST DESIGN AND EXECUTION, #BUG INVESTIGATION AND REPORTING, #SESSION SETUP, #CHARTER / OPPORTUNITY)
- Data Files
- Test Notes
- Bugs (#BUG)
- Issues (#ISSUE)

Example sheet jsb-000530-A:

CHARTER
-----------------------------------------------
Analyze MapMaker's View menu functionality and report on areas of potential risk.
#AREAS
OS | Windows 2000
Menu | View
Strategy | Function Testing
Strategy | Functional Analysis

START
-----------------------------------------------
5/30/00 03:20 pm

TESTER
-----------------------------------------------
Jonathan Bach

TASK BREAKDOWN
-----------------------------------------------
#DURATION short
#TEST DESIGN AND EXECUTION 65
#BUG INVESTIGATION AND REPORTING 25
#SESSION SETUP 20
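The tagged, all-caps layout is what makes a session sheet scannable by a tool. Below is a minimal Python sketch of such a parser -- my illustration, not the "Scan tool" referenced in the deck -- that pulls tagged sections out of a sheet shaped like the example above:

```python
import re

# A trimmed-down sheet in the tagged format shown above (example data).
SHEET = """\
CHARTER
Analyze MapMaker's View menu functionality and report on areas of potential risk.
#AREAS
OS | Windows 2000
Menu | View
TASK BREAKDOWN
#DURATION short
#TEST DESIGN AND EXECUTION 65
#BUG INVESTIGATION AND REPORTING 25
#SESSION SETUP 20
"""

def parse_sheet(text):
    """Return a dict mapping section name -> list of body lines/values."""
    sections = {}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        # '#TAG' lines, optionally followed by a lowercase/numeric value.
        m = re.match(r"#([A-Z][A-Z /]*?)(?:\s+([a-z0-9]\S*))?$", line)
        if m:
            current = m.group(1).strip()
            sections[current] = [m.group(2)] if m.group(2) else []
        elif line.isupper():          # plain ALL-CAPS section headers
            current = line
            sections[current] = []
        elif current:                 # body text belongs to current section
            sections[current].append(line)
    return sections

sheet = parse_sheet(SHEET)
print(sheet["DURATION"])       # ['short']
print(sheet["SESSION SETUP"])  # ['20']
```

With every sheet parsed into the same dictionary shape, the breakdown and coverage metrics in the rest of the deck become simple aggregations over sessions.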

The Breakdown Metrics: Testing is like looking for worms
- Test Design and Execution
- Bug Investigation and Reporting
- Session Setup

Reporting the TBS Breakdown
- Test, Bug, and Setup are orthogonal categories.
- Estimate the percentage of charter work that fell into each category. A guess is okay, but follow the protocol; nearest 5% or 10% is good enough.
- If activities are done simultaneously, report the highest-precedence activity. Precedence goes in order: T, B, then S. All we really want is to track interruptions to testing.
- Don't include Opportunity Testing in the estimate.
- All test work fits here, somewhere.
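The precedence rule can be made concrete. This Python sketch uses a hypothetical per-minute activity log (the protocol itself only asks testers for an estimate), resolves overlapping activities by T over B over S, and rounds each share to the nearest 5%:

```python
PRECEDENCE = ["T", "B", "S"]  # Test > Bug > Setup

def tbs_breakdown(minutes):
    """minutes: list of sets of activity codes active during each minute."""
    counts = {"T": 0, "B": 0, "S": 0}
    for active in minutes:
        for code in PRECEDENCE:   # first match wins: highest precedence
            if code in active:
                counts[code] += 1
                break
    total = sum(counts.values()) or 1
    # Report to the nearest 5%, as the protocol allows.
    return {c: 5 * round(100 * n / total / 5) for c, n in counts.items()}

# 90-minute session: 50 min pure testing, 10 min testing while also
# logging a bug (counts as T), 20 min bug reporting, 10 min setup.
log = [{"T"}] * 50 + [{"T", "B"}] * 10 + [{"B"}] * 20 + [{"S"}] * 10
print(tbs_breakdown(log))  # {'T': 65, 'B': 20, 'S': 10}
```

Note the ten overlapping minutes land under T, which is exactly the point of the precedence order: interruptions only register when they displace testing outright.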

Activity Hierarchy

all work
  non-session (inferred)
  session
    on charter
      test
      bug
      setup
    opportunity

Work Breakdown: Diagnosing the productivity
- Do these proportions make sense?
- How do they change over time?
- Is the reporting protocol being followed?

Example proportions: Non-Session 61%, Test 28%, Setup 6%, Bug 4%, Opportunity 1%
[Chart: stacked session minutes per week, 0-300, from 5/26 through 8/18]

Coverage: Specifying coverage areas
- These are text labels listed in the Charter section of the session sheet (e.g. "insert picture")
- Coverage areas can include anything: areas of the product, test configuration, test strategies, system configuration parameters
- Use the debriefings to check the validity of the specified coverage areas.

Coverage: Are we testing the right stuff?
- Is this a risk-based test strategy?
- Is it a lop-sided set of coverage areas?
- Is it distorted reporting?

[Chart: Distribution of On-Charter Testing Across Areas, 0-120 per area]
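A distribution chart like the one described above can be produced by tallying the #AREAS labels across session sheets. A small sketch with invented data:

```python
from collections import Counter

# Hypothetical parsed sessions; "areas" holds the #AREAS labels
# from each session sheet.
sessions = [
    {"areas": ["Menu | View", "Strategy | Function Testing"]},
    {"areas": ["Menu | View", "OS | Windows 2000"]},
    {"areas": ["Menu | Insert", "Strategy | Stress"]},
]

# Count how many on-charter sessions touched each coverage area.
tally = Counter(area for s in sessions for area in s["areas"])
for area, n in tally.most_common():
    print(f"{area}: {n}")
# Menu | View: 2  (top of the list; the rest appear once each)
```

A lop-sided tally here prompts exactly the debriefing questions on the slide: is the skew deliberate risk-based focus, or distorted reporting?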

Using the Data to Estimate a Test Cycle
1. How many perfect sessions (100% on-charter testing) does it take to do a cycle? (let's say 40)
2. How many sessions can the team (of 4 testers) do per day? (let's say 3 per day, per tester = 12)
3. How productive are the sessions? (let's say 66% is on-charter test design and execution)
4. Estimate: 40 / (12 * .66) = 5 days
5. We base the estimate on the data we've collected. When any conditions or assumptions behind this estimate change, we will update the estimate.
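The estimate above is simple arithmetic; a sketch using the slide's own example numbers (the figures are the worked example, not constants of the method):

```python
def cycle_days(perfect_sessions, testers, sessions_per_tester_per_day,
               on_charter_productivity):
    """Days to finish a cycle, given measured session productivity."""
    daily_sessions = testers * sessions_per_tester_per_day
    return perfect_sessions / (daily_sessions * on_charter_productivity)

# 40 perfect sessions needed; 4 testers doing 3 sessions/day;
# 66% of session time is on-charter test design and execution.
days = cycle_days(40, 4, 3, 0.66)
print(round(days, 1))  # 5.1
```

When the measured productivity or staffing changes, rerunning the same formula updates the estimate, which is the point of step 5.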

More info
- http://www.qasig.org/articles
- Scan tool -- TBA
- James Lyndsay -- http://www.workroomproductions.com/papers/STAREast_AiSBT_slides.pdf
- More about Exploratory Testing: http://www.satisfice.com/articles.shtml
