The Agile Guide to Business

Analysis and Planning


This page intentionally left blank
The Agile Guide to Business Analysis and Planning
From Strategic Plan to Continuous Value Delivery

Howard Podeswa

Boston • Columbus • New York • San Francisco • Amsterdam • Cape Town
Dubai • London • Madrid • Milan • Munich • Paris • Montreal • Toronto • Delhi • Mexico City
São Paulo • Sydney • Hong Kong • Seoul • Singapore • Taipei • Tokyo
Many of the designations used by manufacturers and sellers to distinguish their products are claimed as
trademarks. Where those designations appear in this book, and the publisher was aware of a trademark
claim, the designations have been printed with initial capital letters or in all capitals.

The author and publisher have taken care in the preparation of this book, but make no expressed or
implied warranty of any kind and assume no responsibility for errors or omissions. No liability is
assumed for incidental or consequential damages in connection with or arising out of the use of the
information or programs contained herein.

For information about buying this title in bulk quantities, or for special sales opportunities (which may
include electronic versions; custom cover designs; and content particular to your business, training
goals, marketing focus, or branding interests), please contact our corporate sales department at
[email protected] or (800) 382-3419.

For government sales inquiries, please contact [email protected].

For questions about sales outside the U.S., please contact [email protected].

Visit us on the Web: informit.com/aw

Library of Congress Control Number: 2020952174

Copyright © 2021 Pearson Education, Inc.

All rights reserved. This publication is protected by copyright, and permission must be obtained from
the publisher prior to any prohibited reproduction, storage in a retrieval system, or transmission in any
form or by any means, electronic, mechanical, photocopying, recording, or likewise. For information
regarding permissions, request forms and the appropriate contacts within the Pearson Education Global
Rights & Permissions Department, please visit www.pearson.com/permissions/.

Cover image: Kornn / Shutterstock


Lightbulb icon: Irina Adamovich / Shutterstock

ISBN-13: 978-0-13-419112-6
ISBN-10: 0-13-419112-9

The book is dedicated to my parents: my late father, Yidel Podeswa,
a professional artist whose creative talent and life force have been
an everlasting inspiration to me, and my mother, Ruth Podeswa,
who, through her encouragement and example, instilled in me the
confidence to take on new challenges.
Contents

Foreword . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xxvii
Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xxxi
About the Author . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xlvii

Chapter 1 The Art of Agile Analysis and Planning . . . . . . . . . . . . . . . . . . . . . . . . . 1


1.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 On Art and Agile Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.3 I Work for a Mainstream Company! What’s This Got to Do with Me? . . . . 5
1.4 Story 1: It’s Not My Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
1.4.1 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.5 Story 2: The Cantankerous Customer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.5.1 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
1.6 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
1.7 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11

Chapter 2 Agile Analysis and Planning: The Value Proposition . . . . . . . . . . . . . . 13


2.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.2 What Is Agile Analysis and Planning? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.3 Who Is a Business Analyst? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.4 Why Agile Analysis and Planning? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.5 The Parallel Histories of Agile and Business Analysis . . . . . . . . . . . . . . . . . 16
2.5.1 A Brief History of Business Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . 17
2.5.2 A Brief History of Agile Development . . . . . . . . . . . . . . . . . . . . . . . . . 17
2.6 Two Diagnoses for the Same Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
2.7 The Business Analysis Diagnosis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
2.8 The Business Analysis Track Record . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
2.9 The Agile Diagnosis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
2.10 The Agile Track Record . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
2.11 Why Agile Teams Should Include an Effective BA Competency . . . . . . . . . 24
2.12 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
2.13 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25


Chapter 3 Fundamentals of Agile Analysis and Planning . . . . . . . . . . . . . . . . . . . 27


3.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
3.2 What the Agile Manifesto Means for Business Analysis . . . . . . . . . . . . . . . 28
3.2.1 Agile Manifesto . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
3.2.2 The Impact of the First Value on Analysis . . . . . . . . . . . . . . . . . . . . . . 28
3.2.3 The Impact of the Second Value on Analysis . . . . . . . . . . . . . . . . . . . . 28
3.2.4 The Impact of the Third Value on Analysis . . . . . . . . . . . . . . . . . . . . . 29
3.2.5 The Impact of the Fourth Value on Analysis . . . . . . . . . . . . . . . . . . . . 29
3.3 What the Twelve Principles Mean for Business Analysis . . . . . . . . . . . . . . . 29
3.4 Practices, Standards, and Frameworks . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
3.4.1 Business Analysis Standards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
3.4.2 Requirements-Related Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
3.4.3 Agile Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
3.4.4 Agile Frameworks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
3.5 Overview of Agile Roles and the Business Analyst . . . . . . . . . . . . . . . . . . . 58
3.5.1 The Product Owner’s BA Responsibilities . . . . . . . . . . . . . . . . . . . . . . 59
3.5.2 The Agile Team Analyst . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
3.5.3 The ScrumMaster’s BA Responsibilities . . . . . . . . . . . . . . . . . . . . . . . . 60
3.5.4 Proxy User . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
3.5.5 BA Responsibilities of the Product Champion (Director) . . . . . . . . . . . 61
3.5.6 Coach . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
3.5.7 When Are Dedicated Business Analysts Advised? . . . . . . . . . . . . . . . . 61
3.5.8 Business Analysts Provide Requirements Leadership . . . . . . . . . . . . . . 62
3.5.9 The Distinction between Business Analysts and Business Systems Analysts . . . . . . . . . . 63
3.6 Soft Skills of the Agile Business Analyst . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
3.6.1 Making the Unconscious Conscious . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
3.6.2 Curiosity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
3.6.3 Agent of Change. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
3.6.4 Political Intelligence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
3.6.5 Works Well with Difficult People . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
3.6.6 Negotiation Skills . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
3.6.7 Facilitation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
3.6.8 Adaptability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
3.6.9 Not Afraid to Ask Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
3.6.10 Sense of Humor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
3.7 13 Key Practices of Agile Analysis and How They Differ from Waterfall . . 65
3.7.1 A Competency, Not a Role . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
C ix

3.7.2 A Facilitator, Not a Messenger . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65


3.7.3 Changes to Requirements Are Welcomed . . . . . . . . . . . . . . . . . . . . . . . 66
3.7.4 Collaboration with Developers vs. Contractual Relationship . . . . . . . . 66
3.7.5 Just-In-Time Requirements Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
3.7.6 Conversation versus Documentation . . . . . . . . . . . . . . . . . . . . . . . . . . 66
3.7.7 Specification by Example: Acceptance Test–Driven Development . . . . 66
3.7.8 Small Requirements Units . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
3.7.9 Vertical Slices of Functionality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
3.7.10 Lightweight Tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
3.7.11 Business Analyst and Business Stakeholder Engagement across the Complete Development Lifecycle . . . . . . . . . . 67
3.7.12 Mix of BA Classic and Agile BA Tools . . . . . . . . . . . . . . . . . . . . . . . . 67
3.7.13 Meet Them Where They Are . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
3.8 Agile Business Analysis Rules of Thumb . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
3.9 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
3.10 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68

Chapter 4 Analysis and Planning Activities across the Agile Development Lifecycle . . . . . . . . . . 69

4.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
4.2 Overview of the Agile Analysis and Planning Map . . . . . . . . . . . . . . . . . . . 72
4.3 The Zones . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
4.4 The Lanes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
4.5 A Story in Three Acts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
4.6 Act 1: The Short Lane . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
4.6.1 Initial Preparation and Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
4.6.2 Seeding the Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
4.6.3 Daily Activities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
4.6.4 Feature Closeout: Prepare for GA . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
4.6.5 Quarterly Inception, Iteration Inception . . . . . . . . . . . . . . . . . . . . . . . 78
4.6.6 Iteration Closeout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
4.6.7 Quarterly Closeout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
4.7 Act 2: The Long Lane . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
4.8 Act 3: The Grand Lane . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
4.8.1 Scale the Organization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
4.8.2 Scaled Quarterly Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
4.8.3 Scaled Iteration Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
4.8.4 Daily Planning and Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80

4.8.5 Iteration Closeout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80


4.8.6 Quarterly Closeout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
4.9 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
4.10 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81

Chapter 5 Preparing the Organization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83


5.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
5.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
5.3 What Is Initiation and Planning? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
5.4 How Long Should You Spend Up Front on Initiation and Planning? . . . . . . 87
5.4.1 The Greater the Anticipated Risks, the Greater the Need for Upfront Planning . . . . . . . . . . 87
5.4.2 What’s Past Is Prologue . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
5.5 The Purpose Alignment Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
5.5.1 Differentiating Quadrant (Top Right) . . . . . . . . . . . . . . . . . . . . . . . . . 89
5.5.2 Parity Quadrant (Bottom Right) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
5.5.3 Partner Quadrant (Top Left) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
5.5.4 Who Cares? Quadrant (Bottom Left) . . . . . . . . . . . . . . . . . . . . . . . . . . 90
5.6 Preparing the Infrastructure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
5.6.1 Transitioning from Manual to Automated Testing . . . . . . . . . . . . . . . . 91
5.6.2 Timing the Automation of the Build and Distribution Processes . . . . . 93
5.7 Organizing Development Teams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
5.7.1 Guidelines for Forming Agile Teams . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
5.7.2 Organize around Value . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
5.7.3 Feature Teams versus Generic Teams . . . . . . . . . . . . . . . . . . . . . . . . . . 96
5.7.4 The Extended Team . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
5.7.5 Why Organizing by Competency Is Bad for the Business . . . . . . . . . . . 98
5.8 Managing Stakeholder Expectations about Agile Development . . . . . . . . . . 99
5.8.1 The Negative Expectation That Requirements Delayed Are Requirements Denied . . . . . . . . . . 99
5.8.2 Productivity Expectations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
5.9 Preparing the Customer–Developer Relationship . . . . . . . . . . . . . . . . . . . 101
5.9.1 Customer’s Bill of Rights and Responsibilities . . . . . . . . . . . . . . . . . . 101
5.9.2 Developers’ Bill of Rights and Responsibilities . . . . . . . . . . . . . . . . . . 102
5.10 Agile Financial Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102
5.10.1 Measuring Success . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
5.10.2 Discovery-Driven Financial Planning . . . . . . . . . . . . . . . . . . . . . . . . 103
5.11 Preparing the Marketing and Distribution Teams . . . . . . . . . . . . . . . . . . 103
5.12 Preparing Channels and Supply Chains . . . . . . . . . . . . . . . . . . . . . . . . . . 104
C xi

5.13 Preparing Governance and Compliance . . . . . . . . . . . . . . . . . . . . . . . . . . 104


5.13.1 Challenge Compliance Assumptions . . . . . . . . . . . . . . . . . . . . . . . . . 105
5.13.2 Do Compliance After Process Design . . . . . . . . . . . . . . . . . . . . . . . . 105
5.13.3 Focus on Goals, Not Means . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
5.13.4 One-Time Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106
5.14 Preparing for Increased Demand on Resources . . . . . . . . . . . . . . . . . . . . 106
5.15 Preparing an Enterprise for Agile Development . . . . . . . . . . . . . . . . . . . . 107
5.15.1 Agile Fluency Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
5.15.2 Transitioning the Team . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
5.15.3 Transition Activities at the Enterprise Level . . . . . . . . . . . . . . . . . . . 109
5.15.4 Transition Timeline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
5.15.5 Communications Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
5.15.6 Agile Enterprise Transition Team . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
5.16 Determine Organizational Readiness . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
5.16.1 Organizational Readiness Checklist . . . . . . . . . . . . . . . . . . . . . . . . . 113
5.17 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
5.18 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 114

Chapter 6 Preparing the Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115


6.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
6.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
6.3 Process Preparation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
6.4 Tailoring the Agile Practice to the Context . . . . . . . . . . . . . . . . . . . . . . . . 118
6.4.1 Costs of Agile Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
6.4.2 Benefits of Agile Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119
6.4.3 Finding the Best Trade-Off of Costs and Benefits . . . . . . . . . . . . . . . 119
6.4.4 Determining the Framework . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
6.5 Tuning the Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
6.5.1 Business Analysis Information Artifacts and Events . . . . . . . . . . . . . 122
6.5.2 Checklist of Agile BA Information Artifacts . . . . . . . . . . . . . . . . . . . 122
6.5.3 Defining Requirements Types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
6.5.4 Tuning the Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124
6.5.5 Determining Requirements Granularity Levels . . . . . . . . . . . . . . . . . 127
6.5.6 Tracing Requirements and Other Configuration Items . . . . . . . . . . . 129
6.5.7 Setting Process Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 134
6.6 Optimizing the Process Using Value Stream Mapping . . . . . . . . . . . . . . . . 145
6.7 Determining Process Readiness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
6.8 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 146
6.9 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 146

Chapter 7 Visioning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147


7.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150
7.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150
7.3 Overview of Product Visioning and Epic Preparation . . . . . . . . . . . . . . . . 150
7.3.1 An Example of Product Visioning and Why It’s Important . . . . . . . . 151
7.3.2 Visioning Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 152
7.3.3 Initial Stakeholder Identification . . . . . . . . . . . . . . . . . . . . . . . . . . . . 152
7.3.4 Facilitation Tips . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 152
7.4 Root-Cause Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 152
7.4.1 Five Whys . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
7.4.2 Cause–Effect Diagrams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 157
7.4.3 Cause–Effect Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 161
7.5 Specifying a Product or Epic . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166
7.6 The Problem or Opportunity Statement . . . . . . . . . . . . . . . . . . . . . . . . . . 167
7.7 The Product Portrait . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
7.7.1 The Product Portrait Template . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 170
7.8 Crafting the Product and Epic Vision Statements . . . . . . . . . . . . . . . . . . . 172
7.8.1 The Product Vision Statement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172
7.8.2 The Epic Vision Statement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172
7.8.3 Properties of Well-Crafted Product and Epic Vision Statements . . . . . 172
7.8.4 Vision versus Mission Statements . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
7.9 Stakeholder Analysis and Engagement . . . . . . . . . . . . . . . . . . . . . . . . . . . . 175
7.9.1 Identify and Analyze Stakeholders . . . . . . . . . . . . . . . . . . . . . . . . . . . 176
7.9.2 Plan Stakeholder Collaboration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 176
7.9.3 Plan Stakeholder Communication . . . . . . . . . . . . . . . . . . . . . . . . . . . . 178
7.9.4 Facilitate and Conduct Ongoing Engagement and Analysis . . . . . . . . 179
7.10 Analyzing Goals and Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 182
7.10.1 Use Circumstance-Based Market Segmentation as a Basis for Goals and Objectives . . . . . . . . . . 182
7.10.2 Representing Goals and Objectives within the Story Paradigm . . . . 183
7.11 Analyze Leap of Faith Hypotheses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 185
7.11.1 What Is a Lean Startup? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 185
7.11.2 What Are Leap of Faith Hypotheses? . . . . . . . . . . . . . . . . . . . . . . . . 185
7.11.3 Value Hypotheses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 186
7.11.4 Growth Hypotheses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 186
7.11.5 Specifying Metrics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 187
7.11.6 Hypotheses in Discovery-Driven Planning . . . . . . . . . . . . . . . . . . . . 189
7.11.7 Assumption Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 189
7.11.8 Using a Milestone Planning Chart to Plan Assumption Testing . . . . 190
C xiii

7.12 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 192


7.13 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 192

Chapter 8 Seeding the Backlog—Discovering and Grading Features . . . . . . . . . 193


8.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193
8.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
8.3 Overview: Seeding the Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
8.3.1 Definitions: Epics and Stories . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
8.3.2 How Many Features Should You Seed Up Front? . . . . . . . . . . . . . . . . 196
8.3.3 Whom to Invite to Backlog Seeding . . . . . . . . . . . . . . . . . . . . . . . . . . 197
8.4 Circumstance-Based Market Segmentation for Feature Discovery . . . . . . 198
8.5 Other Ways to Discover Initial Features . . . . . . . . . . . . . . . . . . . . . . . . . . 198
8.6 Feature Independence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 199
8.7 Using the Role-Feature-Reason Template to Represent Epics and Features . . . . . . . . . . 199
8.8 Specifying Emergent Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
8.9 Physical Representation of Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
8.10 Feature Attributes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201
8.11 Determining Customer and User Value with Kano Analysis . . . . . . . . . . 202
8.11.1 Select the Target Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 202
8.11.2 Select the Customers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
8.11.3 Create the Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
8.11.4 Create Prototypes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
8.11.5 Test the Questionnaire Internally . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
8.11.6 Conduct the Survey . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
8.11.7 Grade the Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
8.11.8 Interpreting the Kano Grades . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 206
8.11.9 Satisfaction versus Fulfillment Graph . . . . . . . . . . . . . . . . . . . . . . . . 207
8.11.10 The Natural Decay of Delight (and Its Opposite) . . . . . . . . . . . . . . 208
8.11.11 Continuous Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 208
8.12 Sequencing Epics and Features in the Backlog . . . . . . . . . . . . . . . . . . . . . 212
8.12.1 Determining Cost of Delay. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 212
8.12.2 Determining WSJF . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
8.12.3 Prioritization Tips . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
8.13 Writing Feature Acceptance Criteria . . . . . . . . . . . . . . . . . . . . . . . . . . . . 215
8.14 Analyzing Nonfunctional Requirements and Constraints . . . . . . . . . . . . 216
8.14.1 Do NFRs Go in the Backlog? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 217
8.14.2 NFRs and Constraints Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . 217

8.15 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 220


8.16 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 220

Chapter 9 Long-Term Agile Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 221


9.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 221
9.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 224
9.3 Overview of Long-Term Planning, Epic Planning, and MVP . . . . . . . . . . . 224
9.4 The Full-Potential Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 225
9.4.1 Phase 1: Define Bold Targets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 225
9.4.2 Phase 2: Create a Detailed Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226
9.4.3 Phase 3: Deliver Quick Wins . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226
9.4.4 The Business Analyst’s Contribution to a Successful Full-Potential Plan . . . . . . . . . . 227
9.5 Using MVPs to Validate the Assumptions behind the Plan . . . . . . . . . . . . 228
9.5.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228
9.5.2 What Is an MVP?. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
9.5.3 The MVP Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
9.6 Capabilities for Effective MVP Implementation . . . . . . . . . . . . . . . . . . . . . 231
9.6.1 Technical Capabilities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
9.6.2 Deployment and Delivery Approach . . . . . . . . . . . . . . . . . . . . . . . . . 232
9.6.3 Deployment Options and Potential Issues . . . . . . . . . . . . . . . . . . . . . 234
9.7 Overview of the Product Roadmap . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 240
9.8 Planning the Interim Periods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241
9.8.1 Specify the Interim Timeline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 242
9.8.2 Craft Interim Goals and Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . 242
9.8.3 Specify Assumptions and Metrics . . . . . . . . . . . . . . . . . . . . . . . . . . . 243
9.8.4 Specify Events and Milestones . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 243
9.8.5 Specify Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 243
9.9 Using the Product Roadmap for Shorter Planning Horizons . . . . . . . . . . . 248
9.10 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 248
9.11 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 249

Chapter 10 Quarterly and Feature Preparation . . . . . . . . . . . . . . . . . . . . . . . . . . 251


10.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
10.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 254
10.3 Overview of Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 254
10.3.1 Examples of Feature-Length Change Initiatives . . . . . . . . . . . . . . . . 254
10.4 Benefits of Feature Preparation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 256
10.5 Feature Preparation Activities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 256
C xv

10.6 Timing of Feature Preparation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257


10.7 Assessing Readiness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 258
10.7.1 Using the Feature Definition of Ready (Feature DoR). . . . . . . . . . . . 258
10.8 Accounting for Preparation Work: Tasks and Spikes . . . . . . . . . . . . . . . . 258
10.9 Specifying Features and Their Acceptance Criteria . . . . . . . . . . . . . . . . . 259
10.9.1 Specifying Epic Acceptance Criteria . . . . . . . . . . . . . . . . . . . . . . . . . 260
10.9.2 Specifying Feature Acceptance Criteria . . . . . . . . . . . . . . . . . . . . . . 261
10.9.3 The Analyst Contribution. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 261
10.9.4 Analyze AC During Triad Meetings . . . . . . . . . . . . . . . . . . . . . . . . . 262
10.9.5 Specifying AC in the BDD Gherkin Syntax . . . . . . . . . . . . . . . . . . . 262
10.9.6 Specifying UAT for End-to-End Workflows . . . . . . . . . . . . . . . . . . . 263
10.10 Context Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 263
10.11 Stakeholder Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 264
10.12 Persona Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 264
10.12.1 History of Personas . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 265
10.12.2 Persona Examples. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 266
10.12.3 Creating Personas. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 267
10.12.4 Documenting Personas . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 268
10.12.5 Working with Personas. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 269
10.13 Overview of Journey, Process, and Value Stream Maps . . . . . . . . . . . . . 272
10.14 Journey Mapping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 272
10.14.1 Overview of the Customer Journey Map . . . . . . . . . . . . . . . . . . . . 273
10.14.2 Customer Journey Map: Mortgage Example . . . . . . . . . . . . . . . . . 273
10.14.3 Components of a Journey Map. . . . . . . . . . . . . . . . . . . . . . . . . . . . 274
10.14.4 Using the Journey Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 278
10.14.5 More on Journey Maps. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 283
10.15 Value Stream Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 283
10.15.1 Developing a Value Stream Map. . . . . . . . . . . . . . . . . . . . . . . . . . . 284
10.16 Business Process Modeling. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 285
10.16.1 Bring Process Participants Together . . . . . . . . . . . . . . . . . . . . . . . . 285
10.16.2 What Situations Call for Process Modeling?. . . . . . . . . . . . . . . . . . 286
10.16.3 Screenshots Do Not a Process Model Make . . . . . . . . . . . . . . . . . . 286
10.16.4 Do Just Enough Analysis for Your Purposes. . . . . . . . . . . . . . . . . . 287
10.16.5 Models with and without Swimlanes . . . . . . . . . . . . . . . . . . . . . . . 287
10.16.6 BPMN . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 287
10.17 Use-Case Modeling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 298
10.17.1 Use-Case Modeling Example: Claims . . . . . . . . . . . . . . . . . . . . . . . 298
10.17.2 Use-Case Modeling Elements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 299

10.18 User-Role Modeling Workshops . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 300


10.18.1 Agenda . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 300
10.19 Review the Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 307
10.19.1 Context Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 307
10.19.2 UML Communication Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . 308
10.19.3 Data Flow Diagrams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 308
10.19.4 Architecture (Block) Diagrams . . . . . . . . . . . . . . . . . . . . . . . . . . . . 310
10.20 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 312
10.21 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 313

Chapter 11 Quarterly and Feature Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315


11.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315
11.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 318
11.3 Overview of Quarterly Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 318
11.4 Overview of Flow-Based Feature Planning . . . . . . . . . . . . . . . . . . . . . . . 318
11.5 When Is Planning at This Level Advised and Not Advised? . . . . . . . . . . . 319
11.6 When to Use Quarterly Planning versus Flow-Based Feature Planning . . . . . . . . . . 319
11.7 How to Conduct Quarterly Planning with Agility . . . . . . . . . . . . . . . . . . 320
11.7.1 Create a Culture of Change . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 321
11.7.2 Use Data-Informed Decisioning . . . . . . . . . . . . . . . . . . . . . . . . . . . . 321
11.7.3 Specify Outcomes, Not Outputs . . . . . . . . . . . . . . . . . . . . . . . . . . . . 321
11.7.4 View the Plan as a Hypothesis, Not a Contract . . . . . . . . . . . . . . . . 321
11.8 XP’s Planning Game Guidelines . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 322
11.8.1 Overview of the Planning Game . . . . . . . . . . . . . . . . . . . . . . . . . . . . 322
11.8.2 Overview of Roles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 323
11.8.3 Overview of Planning Principles . . . . . . . . . . . . . . . . . . . . . . . . . . . 323
11.9 Quarterly Planning: Timing Considerations . . . . . . . . . . . . . . . . . . . . . . 325
11.10 Preparing for the Planning Event . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 325
11.10.1 Verify Entry Conditions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 325
11.10.2 Prepare Invitation List . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 326
11.10.3 Determine the Planning Horizon . . . . . . . . . . . . . . . . . . . . . . . . . . 326
11.10.4 Prepare Inputs and Deliverables . . . . . . . . . . . . . . . . . . . . . . . . . . . 326
11.10.5 Refine Features and Acceptance Criteria Incrementally . . . . . . . . . 327
11.11 Planning Topics (Agenda) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 328
11.11.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 329
11.11.2 Exploration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 332
11.11.3 Commitment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 341
11.11.4 Planning Retrospective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 348
C xvii

11.12 Reviewing the Quarterly Plan, Once the Quarter Is Underway . . . . . . . 351
11.12.1 Start of an Iteration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 351
11.12.2 Velocity Corrections . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 351
11.12.3 New Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 352
11.12.4 The Plan Becomes Obsolete . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 352
11.13 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 352
11.14 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 352
Chapter 12 MVPs and Story Maps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 353
12.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 353
12.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 356
12.3 MVPs and Story Mapping: How the Tools Complement Each Other . . . 356
12.4 MVP Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 356
12.4.1 What Is an MVP?. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 357
12.4.2 MVP Case Study: Trint . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 357
12.4.3 Venues for MVP Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 358
12.4.4 MVP Types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 359
12.4.5 MVP’s Iterative Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 363
12.4.6 The Pivot . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 363
12.4.7 Incrementally Scaling the MVP . . . . . . . . . . . . . . . . . . . . . . . . . . . . 364
12.4.8 Using MVPs to Establish the MMP . . . . . . . . . . . . . . . . . . . . . . . . . 365
12.5 Story Mapping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 366
12.5.1 Jeff Patton’s Story Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 366
12.5.2 Benefits of a Story Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 367
12.5.3 The Anatomy of a Story Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 368
12.5.4 Dependency Relationships on the Map . . . . . . . . . . . . . . . . . . . . . . 370
12.5.5 Story Map Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 370
12.5.6 Tips for Writing Stories on the Map . . . . . . . . . . . . . . . . . . . . . . . . 372
12.5.7 Constructing the Backbone . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 372
12.5.8 Constructing the Ribs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 379
12.5.9 Other Forms of Story Maps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 386
12.6 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 388
12.7 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 388

Chapter 13 Story Preparation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 391


13.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 391
13.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 394
13.3 Overview of Story Preparation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 394

13.4 Story Fundamentals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 394


13.4.1 What Is a Story? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 395
13.4.2 Alternative Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 395
13.4.3 Size Taxonomy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 395
13.4.4 What’s in a Name? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 396
13.4.5 User Story Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 397
13.5 The Three Cs of Stories . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 397
13.5.1 Card . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 397
13.5.2 Conversation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 398
13.5.3 Confirmation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 398
13.6 Who Is Responsible for User Stories?. . . . . . . . . . . . . . . . . . . . . . . . . . . . 398
13.6.1 Who Writes Stories? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 398
13.6.2 The Analyst Value Added . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 399
13.6.3 The Triad . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 400
13.7 Physical versus Electronic Stories . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 403
13.8 Specifying Values for Story Attributes . . . . . . . . . . . . . . . . . . . . . . . . . . . 404
13.9 Writing the Story Description . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 404
13.9.1 When to Use a Story Template (and When Not To) . . . . . . . . . . . . . 404
13.9.2 Role-Feature-Reason (Connextra) Template . . . . . . . . . . . . . . . . . . . 405
13.10 Specifying Story Acceptance Criteria . . . . . . . . . . . . . . . . . . . . . . . . . . . 407
13.10.1 Examples of Story Acceptance Criteria . . . . . . . . . . . . . . . . . . . . . 408
13.10.2 Who Writes Acceptance Criteria? . . . . . . . . . . . . . . . . . . . . . . . . . 408
13.10.3 When to Create and Update Acceptance Criteria . . . . . . . . . . . . . . 409
13.10.4 Specification by Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 409
13.10.5 How Extensive Should the Acceptance Criteria Be? . . . . . . . . . . . . 411
13.10.6 How Many Acceptance Criteria per Story? . . . . . . . . . . . . . . . . . . 411
13.10.7 Characteristics of Well-Formed Acceptance Criteria . . . . . . . . . . . 411
13.10.8 Emergent Acceptance Criteria . . . . . . . . . . . . . . . . . . . . . . . . . . . . 413
13.10.9 Using the Behavior-Driven Development (BDD) Gherkin Format . . . 413
13.10.10 Who Tests Acceptance Criteria and When? . . . . . . . . . . . . . . . . . 414
13.11 Stories That Aren’t User Stories . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 414
13.11.1 What Is a Spike or Enabler Story?. . . . . . . . . . . . . . . . . . . . . . . . . . 415
13.11.2 Functional Spike . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 416
13.11.3 Technical Spike . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 418
13.11.4 Compliance Story . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 420
13.11.5 Bug-Repair Stories . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 420
13.12 Guidelines for Writing High-Quality Stories . . . . . . . . . . . . . . . . . . . . . 420
13.12.1 INVEST . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 421
13.12.2 INVEST IN CRUD . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 421
C xix

13.13 Patterns for Splitting Stories . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 422


13.13.1 How to Use the Patterns . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 422
13.13.2 Tie-Breakers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 422
13.13.3 The Patterns . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 423
13.14 Analyzing Business Rules and AC with Decision Tables . . . . . . . . . . . . 433
13.14.1 Behavioral Business Rules . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 434
13.14.2 Decision Table Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 435
13.14.3 Benefits of a Decision Table . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 436
13.14.4 How to Elicit Rules Using the Table . . . . . . . . . . . . . . . . . . . . . . . . 436
13.15 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 440
13.16 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 440

Chapter 14 Iteration and Story Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 441


14.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 441
14.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 444
14.3 Overview of Iteration and Story Planning . . . . . . . . . . . . . . . . . . . . . . . . 444
14.4 Attendees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 445
14.5 Duration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 445
14.6 Inputs for Iteration Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 445
14.7 Deliverables of Iteration Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 446
14.7.1 The Iteration Goal and Iteration Backlog . . . . . . . . . . . . . . . . . . . . . 446
14.7.2 The Developer Task Board . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 446
14.7.3 The Increment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 446
14.8 Planning Rules . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 447
14.9 Part 1: Forecast What Will Be Accomplished . . . . . . . . . . . . . . . . . . . . . 447
14.9.1 Update . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 448
14.9.2 Forecast Capacity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 448
14.9.3 Review Ready and Done Definitions . . . . . . . . . . . . . . . . . . . . . . . . 449
14.9.4 Craft the Iteration Goal . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 449
14.9.5 Discuss Stories . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 450
14.9.6 Forecast the Stories That Will Be Delivered . . . . . . . . . . . . . . . . . . . 450
14.10 Part 2: Plan the Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 451
14.10.1 Should You Invite the PO to Part 2? . . . . . . . . . . . . . . . . . . . . . . . . 451
14.10.2 Overview of Part 2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 451
14.10.3 Part 2 Steps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 452
14.11 Setting Up the Kanban Board . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 458
14.11.1 Columns on the Kanban Board . . . . . . . . . . . . . . . . . . . . . . . . . . . . 459
14.12 Scaling Iteration Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 462

14.13 Feature Preview Meeting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 462


14.13.1 Feature Preview Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 462
14.13.2 Timing Considerations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 462
14.13.3 Why Two Iterations Ahead? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 463
14.14 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 463
14.15 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 463

Chapter 15 Rolling Analysis and Preparation—Day-to-Day Activities . . . . . . . . 465


15.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 465
15.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 468
15.3 Overview of Rolling Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 468
15.3.1 A Day in the Life of the Agile Analyst . . . . . . . . . . . . . . . . . . . . . . . 468
15.3.2 Overview of Analysis Tasks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 470
15.4 Updating Task Progress . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 470
15.5 Triad Guideline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 470
15.6 Actions That May Be Taken against a Developer Task . . . . . . . . . . . . . . 471
15.7 Monitoring Progress . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 471
15.7.1 The Daily Standup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 471
15.7.2 Follow-Up Meeting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 474
15.7.3 Updating the Developer Task Board . . . . . . . . . . . . . . . . . . . . . . . . . 475
15.7.4 Updating the Kanban Board . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 476
15.7.5 Monitoring Progress with a Daily Burndown Chart . . . . . . . . . . . . . 479
15.7.6 Burnup Charts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 486
15.7.7 What Should You Use: Burndown or Burnup Charts? . . . . . . . . . . . 486
15.7.8 Cumulative Flow Diagrams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 487
15.8 Story Testing and Inspection (Analyze-Code-Build-Test) . . . . . . . . . . . . . 490
15.8.1 Overview of the Analyze-Code-Build-Test Cycle . . . . . . . . . . . . . . . 490
15.8.2 Validating Value . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 491
15.9 Managing Scope Change during the Iteration . . . . . . . . . . . . . . . . . . . . . 495
15.9.1 When Progress Is Lower or Higher than Expected . . . . . . . . . . . . . . 495
15.9.2 When the PO Wants to Add Stories After the Iteration Begins . . . . . 495
15.10 Updating Business Analysis Documentation . . . . . . . . . . . . . . . . . . . . . 496
15.10.1 Persisting Stories . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 496
15.10.2 Feature Documentation: Organize by Features, Not Stories . . . . . . 497
15.10.3 Updating the Use-Case Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . 497
15.10.4 Other Analysis Documentation . . . . . . . . . . . . . . . . . . . . . . . . . . . 506
15.10.5 Tracing Analysis Artifacts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 506
15.11 Ongoing Analysis of Upcoming Epics, Features, and Stories . . . . . . . . . 509
15.11.1 How Long Should You Spend on Preparation? . . . . . . . . . . . . . . . . 509
C xxi

15.11.2 Overview of Rolling Preparatory Analysis . . . . . . . . . . . . . . . . . . . 509


15.11.3 Feature Preparation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 510
15.11.4 Story Preparation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 510
15.11.5 Pruning and Ordering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 512
15.12 Accounting for Progress at the End of the Iteration . . . . . . . . . . . . . . . . 513
15.12.1 Accounting for Stories That Are Not Done . . . . . . . . . . . . . . . . . . 513
15.12.2 Accounting for Progress When an Iteration Is Canceled . . . . . . . . 513
15.13 The Iteration Review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 514
15.13.1 Inputs and Deliverable . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 514
15.13.2 Topics/Agenda . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 515
15.13.3 Iteration Review—Artifacts for Forecasting and Tracking Progress . . . . . . . . . . 516
15.14 The Iteration Retrospective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 517
15.14.1 Timing Considerations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 517
15.14.2 Attendees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 518
15.14.3 Inputs and Deliverables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 518
15.14.4 Topics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 518
15.14.5 Iteration Retrospective Games . . . . . . . . . . . . . . . . . . . . . . . . . . . . 520
15.15 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 524
15.16 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 525

Chapter 16 Releasing the Product . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 527


16.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 527
16.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 530
16.3 Getting Stories to Done . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 530
16.4 Releasing to the Market: Timing Considerations . . . . . . . . . . . . . . . . . . 530
16.4.1 Should You Reserve a Hardening Iteration for Prerelease
Activities? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 531
16.5 Staging the Release . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 532
16.5.1 Pre-Alpha . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 533
16.5.2 Alpha Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 533
16.5.3 Beta Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 533
16.5.4 General Availability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 535
16.6 Quarterly (Release) Retrospective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 539
16.6.1 Facilitation Guidelines . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 539
16.6.2 Preparing the Timeline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 542
16.6.3 Walkthrough of a Quarterly Retrospective . . . . . . . . . . . . . . . . . . . 543
16.7 Pivot-or-Persevere Meeting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 544
16.7.1 Data-Informed—Not Data-Driven . . . . . . . . . . . . . . . . . . . . . . . . . . 545
16.7.2 Timing Considerations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 545


16.7.3 Attendees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 545
16.7.4 Walkthrough of a Pivot-or-Persevere Meeting . . . . . . . . . . . . . . . . . 545
16.8 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 547
16.9 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 548

Chapter 17 Scaling Agility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 549


17.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 552
17.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 552
17.3 Why Do We Need a Scaled Agile Approach? . . . . . . . . . . . . . . . . . . . . . . 552
17.3.1 Why Scaled Agile Teams Are Interdependent . . . . . . . . . . . . . . . . . . 553
17.3.2 Product Complexity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 554
17.3.3 Shared Components . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 554
17.4 Planning: Choosing an Approach That Supports Inter-team
Collaboration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 554
17.4.1 Review of the Two Approaches . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 555
17.4.2 Which Approach Should You Use at the Frontend? . . . . . . . . . . . . . 555
17.4.3 Overview of the Analyst Contribution to Scaled Planning and
Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 557
17.5 Continuous Delivery: Delivering Software Continuously, Safely, and
Sustainably at Scale . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 558
17.5.1 Overview of Automation in the Test-Build-Deploy Steps . . . . . . . . . 558
17.5.2 DevOps, CI/CD . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 559
17.5.3 Test-Driven Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 562
17.5.4 ATDD and BDD . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 563
17.6 Scaled Agile Culture: Creating a Culture That Supports Innovation
at Scale . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 564
17.6.1 Effective Agile Leadership . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 564
17.6.2 Prioritize Quality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 566
17.6.3 Remove Silos; Foster Collaboration . . . . . . . . . . . . . . . . . . . . . . . . . 566
17.6.4 Foster a Culture of Rapid Learning . . . . . . . . . . . . . . . . . . . . . . . . . 566
17.7 Scaling the Backlog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 566
17.7.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 567
17.7.2 One Top-Level Product . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 568
17.7.3 Multiple Subproducts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 568
17.7.4 One Product-Level PO . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 568
17.7.5 One Backlog at the Whole-Product Level . . . . . . . . . . . . . . . . . . . . . 568
17.7.6 Multiple Team Backlogs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 569
C xxiii

17.7.7 Feature Teams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 569


17.7.8 Component Teams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 569
17.7.9 One Definition of Done (DoD) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 569
17.8 Scaling the Agile Organization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 570
17.8.1 Scaling by Subproduct and Product Area: MyChatBot
Case Study . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 570
17.8.2 Scaling the PO Role . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 571
17.8.3 Portfolio and Program Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . 572
17.8.4 Forming the Feature Teams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 575
17.8.5 The Extended Team . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 576
17.8.6 Component Teams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 577
17.8.7 Competency Groups . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 577
17.8.8 The Product Owner Council . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 579
17.8.9 User Task Force . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 581
17.8.10 Release Management Team . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 581
17.9 Scaling the Agile Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 581
17.9.1 Scaled Agile Frameworks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 582
17.9.2 Overview of Scaled Activities and Events . . . . . . . . . . . . . . . . . . . . . 583
17.9.3 Initial Preparation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 585
17.9.4 Scaled Quarterly and Feature Planning . . . . . . . . . . . . . . . . . . . . . . 586
17.9.5 Scaled Iteration (Sprint) Planning Meetings . . . . . . . . . . . . . . . . . . . 595
17.9.6 Big Room Iteration Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 598
17.9.7 Feature Preview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 599
17.9.8 Integration Meetings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 599
17.9.9 Daily Standup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 600
17.9.10 Scrum of Scrums (SoS) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 600
17.9.11 Product Owner Council Meeting . . . . . . . . . . . . . . . . . . . . . . . . . . 601
17.9.12 Scaled (Quarterly) Feature Preparation (Multiple Teams) . . . . . . . . 602
17.9.13 Team-Level Story Preparation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 605
17.9.14 User Task Force Meetings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 606
17.9.15 Scaled Iteration Review or Feature Review . . . . . . . . . . . . . . . . . . . 606
17.9.16 Scaled Iteration Retrospective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 607
17.9.17 Scaled Quarterly/Feature Retrospective . . . . . . . . . . . . . . . . . . . . . 610
17.9.18 Open Space . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 611
17.9.19 Triad . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 614
17.10 Agile Requirements Management Software Tools . . . . . . . . . . . . . . . . . 615
17.10.1 Requirements Management Tool Checklist . . . . . . . . . . . . . . . . . . . 615
17.10.2 Overview of Agile Requirements Management Tools . . . . . . . . . . . 615
17.11 Lightweight Tools for Supporting Inter-team Collaboration . . . . . . . . . 615


17.11.1 Team Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 616
17.11.2 Visualization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 616
17.11.3 “Just Talk” . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 616
17.11.4 Scouts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 616
17.11.5 Roamers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 616
17.11.6 Shared Team Members . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 617
17.11.7 Implement Work Items Sequentially, Not Concurrently . . . . . . . . . 617
17.11.8 Enforce a Definition of Ready . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 617
17.12 Potential Issues and Challenges in Scaling Agility . . . . . . . . . . . . . . . . . 617
17.12.1 Guidelines for Non-colocated Teams . . . . . . . . . . . . . . . . . . . . . . . 617
17.12.2 Guidelines for Working with Waterfall Teams . . . . . . . . . . . . . . . . 619
17.12.3 Inability to Deploy Frequently and Reliably . . . . . . . . . . . . . . . . . . 620
17.12.4 Recurring Integration Errors and Dependency Issues . . . . . . . . . . . 620
17.12.5 Conflicting Priorities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 620
17.12.6 Insufficient Business Resources . . . . . . . . . . . . . . . . . . . . . . . . . . . . 621
17.13 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 622
17.14 What’s Next? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 622

Chapter 18 Achieving Enterprise Agility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 623


18.1 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 623
18.2 This Chapter on the Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 626
18.3 Overview of Enterprise Agility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 627
18.3.1 Definition of an Agile Enterprise . . . . . . . . . . . . . . . . . . . . . . . . . . . 627
18.3.2 Why It Matters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 627
18.3.3 The Business Analysis Contribution . . . . . . . . . . . . . . . . . . . . . . . . . 628
18.3.4 Drivers for Enterprise Agility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 628
18.3.5 Agility in Heavily Regulated Sectors . . . . . . . . . . . . . . . . . . . . . . . . 629
18.4 Foundational Practices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 629
18.4.1 Lean Startup/MVP . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 630
18.4.2 Full-Potential Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 630
18.4.3 Circumstance-Based Market Segmentation . . . . . . . . . . . . . . . . . . . 630
18.4.4 Disruptive Innovation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 631
18.5 Overview of the Agile Process for Developing Innovative Products . . . . . 631
18.6 Agile Corporate Culture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 632
18.6.1 Definition of Corporate Culture . . . . . . . . . . . . . . . . . . . . . . . . . . . . 633
18.6.2 Definition of Agile Corporate Culture . . . . . . . . . . . . . . . . . . . . . . . 633
18.7 Overview of Principles and Practices for an Agile Corporate Culture . . . . 634
C xxv

18.8 Three Principles for Applying Agile Practices . . . . . . . . . . . . . . . . . . . . . 635


18.8.1 Tailor the Approach to the Circumstance. . . . . . . . . . . . . . . . . . . . . 635
18.8.2 Protect Islands of Innovation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 644
18.8.3 Invest Aggressively in Enterprise Agility . . . . . . . . . . . . . . . . . . . . . 648
18.9 The Thirteen Practices for an Agile Corporate Culture . . . . . . . . . . . . . . 650
18.9.1 Iterative Experimentation (Fail Fast) . . . . . . . . . . . . . . . . . . . . . . . . . 650
18.9.2 Embrace Change . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 652
18.9.3 Acceleration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 653
18.9.4 Empathy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 655
18.9.5 Responsible Procrastination (Last Responsible Moment) . . . . . . . . . 659
18.9.6 Distributed Authority . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 659
18.9.7 Let Those Who Do the Work Estimate the Effort . . . . . . . . . . . . . . . 663
18.9.8 Collaboration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 663
18.9.9 Commit to Outcomes, Not Outputs . . . . . . . . . . . . . . . . . . . . . . . . . 666
18.9.10 Transparency . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 666
18.9.11 Bust Silos . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 667
18.9.12 Data-Informed Innovation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 672
18.9.13 Monitor Adjacent and Low-End Markets . . . . . . . . . . . . . . . . . . . . 673
18.10 Agile Financial Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 675
18.10.1 Real Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 675
18.10.2 Discovery-Driven Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 675
18.11 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 676

Appendix A Additional Resources and Checklists . . . . . . . . . . . . . . . . . . . . . . . 677


A.1 Mapping of Book Chapters to IIBA and PMI Guides . . . . . . . . . . . . . . . . 677
A.2 Rules of Thumb in Agile Analysis and Planning . . . . . . . . . . . . . . . . . . . . 682
A.3 Facilitation Tips . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 684
A.4 Visioning Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 686
A.5 Stakeholder Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 687
A.6 NFRs and Constraints Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 689
A.7 Readiness Checklist for Quarterly Planning . . . . . . . . . . . . . . . . . . . . . . . 690
A.7.1 Analysis Readiness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 690
A.7.2 Logistics Readiness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 690
A.8 Checklist of Invitees for Quarterly Planning . . . . . . . . . . . . . . . . . . . . . . . 692
A.9 Checklist of Quarterly and Feature Planning Inputs . . . . . . . . . . . . . . . . . 693
A.10 Checklist of Quarterly and Feature Planning Deliverables . . . . . . . . . . . 694
A.11 Checklist of Quarterly (Release) Retrospective Questions . . . . . . . . . . . . 695
A.11.1 DevOps and Supporting Practices Perspective . . . . . . . . . . . . . . . . . 695
A.11.2 Technology Perspective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 695
A.11.3 Productivity Perspective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 695


A.11.4 Quality Assurance (Testing) Perspective . . . . . . . . . . . . . . . . . . . . . . 696
A.11.5 Program/Portfolio Perspective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 697
A.11.6 Marketplace Perspective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 697
A.12 Checklist of Invitees for Scaled Quarterly and Feature Planning . . . . . . 698
A.13 Overview of Agile Requirements Management Tools . . . . . . . . . . . . . . . 699
A.13.1 JIRA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 699
A.13.2 Blueprint . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 699
A.13.3 JAMA Software . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 699
A.13.4 Other Requirements Management and Collaboration Tools . . . . . . 699

Appendix B Discovery-Driven Planning Case Study: BestBots . . . . . . . . . . . . . . 701


B.1 Background: BestBots Case Study . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 701
B.2 Initial Market Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 702
B.2.1 Market Estimates (Past and Future) . . . . . . . . . . . . . . . . . . . . . . . . . . 702
B.2.2 Compound Annual Growth Rate . . . . . . . . . . . . . . . . . . . . . . . . . . . 702
B.2.3 Spreadsheet Fix . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 703
B.3 Determine Constraints (Required Outcomes) . . . . . . . . . . . . . . . . . . . . . . 703
B.3.1 Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 704
B.4 Create Draft of Reverse Income Statement . . . . . . . . . . . . . . . . . . . . . . . . 705
B.4.1 Conclusions from the Reverse Income Statement Draft . . . . . . . . . . . 706
B.5 Create Pro Forma Operations Specifications . . . . . . . . . . . . . . . . . . . . . . . 706
B.6 Create Assumptions Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 708
B.7 Revise Reverse Income Statement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 709
B.8 Create Milestone Planning Chart . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 710

Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 713

Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 715
Foreword

There are three things Howard and I have in common: our passion for business analysis,
our enthusiasm for painting, and our love of good food and conversation.
Several years ago, I worked at one of the largest banks in Canada as the center of
excellence (CoE) lead for requirements management/business analysis. I held the responsi-
bility for advancing the requirements management capabilities for the organization’s IT &
Operations unit, including the training curriculum for business analysis. It is there that
I met Howard as we collaborated, mapping the bank’s business analysis competencies
in the development of a new training curriculum for the bank’s business analysts. I was
immediately impressed by Howard’s ability to understand what I was trying to achieve.
He understood well the role of the business analyst and the knowledge and experience
business analysts must have to be efficient in their position. His recommendations to
augment the curriculum’s quality were to the point, and his willingness to collaborate and
adjust his course offerings to fit my needs was essential to me.
We subsequently met several times, through formal business meetings, discussing how
his courses were performing for us. These were also excellent opportunities to discuss
how we could collaborate to advance the training curriculum further. I eventually moved
to a different position. Howard and I stayed in contact. We met regularly on a casual
basis, catching up, and often ran into each other at industry conferences where Howard
presented.
We collaborated through the International Institute of Business Analysis (IIBA). I
served in various capacities for fifteen years, initially as a volunteer in multiple roles,
including chair of the board of directors. I also led the association as interim president and
CEO in 2013–2014. I covered various roles and functions afterward, including director of
business and corporate development, where I established multiple strategic alliances with
other professional associations.
I established a formal relationship with the Agile Alliance. I negotiated with them a
collaboration to develop the second edition of the Agile Extension to the BABOK Guide,
v3.0, which successfully launched in August 2017. It is an excellent publication. The book
tells you what you need to know about agile analysis—it gives you the lay of the land, if you
will; it describes the concepts and techniques practitioners should know. Howard has mapped
them all out for you in this publication, plus many others. However, in my opinion, the
real value this book provides, and the reason I don’t hesitate to tell you to invest in it, is
the way Howard interlaces dozens of scenario-based examples, tools, and techniques, using a
running case study. Furthermore, Howard describes them all across the product development
lifecycle and shows how they apply to the most common agile industry frameworks.
Over the last twenty-plus years in business analysis covering various functions, I saw
firsthand how difficult it has been for many organizations to transition from a waterfall
or some form of iterative development approach to agile. To my chagrin, I saw many
organizations debating whether the role of the business analyst still had a place in an agile
environment. I witnessed how challenging it has been for many seasoned business analysts
to upskill their agile competencies to continue to bring value to their organization. There
has been much progress since then, and business analysts have emerged as essential con-
tributors to agile initiatives. Today, organizations with a high level of maturity in prod-
uct development understand the critical importance that business analysts bring to their
agile practices. But for many other organizations, there are still significant challenges as
organizations try to fit bits and parcels of two or three agile frameworks to meet their
internal processes and ways to manage projects. And this is where the real value of this
book comes in. Howard has laid out more than 175 tools and techniques, examples, and
guidelines that product owners and business analysis practitioners can readily apply.
Howard’s involvement with IIBA is also important to note, as an overall supporter of
the association, as a contributor to the review of BABOK v3: A Guide to the Business
Analysis Body of Knowledge, and sometimes as a gadfly, provoking the organization
toward continual improvement. During my tenure at IIBA, I had the privilege of cochairing
IIBA’s official annual Building Better Capability global conference over the course of
five years. I had the opportunity to see Howard present in person. The subjects of his
presentations were always pertinent, and the delivery always professional, valuable, and
enthusiastically well received.
Throughout the years, the relationship I had with Howard evolved into a friendship,
as we shared similar passions and interests. I have the highest respect and admiration for
Howard. As this book demonstrates, he is the consummate business analysis professional
and a recognized leader in the field. He is also an accomplished artist, having exhibited
his work in several galleries throughout the world. And he certainly knows how to pick
the perfect restaurant for a great meal and conversation.
Howard is a pioneer in the field of business analysis. His first book, UML for the IT
Business Analyst: A Practical Guide to Object-Oriented Requirements Gathering, was
published in 2005 as business analysis fully emerged as a profession. His second book,
The Business Analyst’s Handbook, published in 2009, has become a business analysis
staple for both seasoned and aspiring business analysts throughout the world. The Agile
Guide to Business Analysis and Planning represents a culmination of his vast experience
in both agile and business analysis.
What is unique about this book is how Howard treats the subject, and also how he
presents himself. The book has a personal feel to it. It’s rather uncommon for a business
publication to include several pages dedicated to the author’s particular interest, in this
case, Howard’s passion for painting. But by doing so, Howard connects with the reader
on a more personal level, demonstrating how his artistic capabilities add to the richness
of his persona and how creativity can be a catalyst for problem solving and innovation,
which Howard describes throughout the book.
With his vast experience in the field, Howard demonstrates how business analysis
and agile practitioners can apply fundamental business analysis practices and techniques
across the most widely used agile frameworks—including Scrum, Kanban, SAFe, DevOps,
XP, lean software development, lean startup, and continuous delivery (CD)—and across
all the product development lifecycle activities.
F xxix

Whether you are new to agile practices or a seasoned business analyst transitioning from
traditional business analysis to agile analysis, you will learn which tools to use and when to
use them. Howard provides step-by-step guidance for performing your analysis work across
the entire product development lifecycle, advice and guidance you can use immediately to be
more confident and productive from day one on an agile project. Product owners will gain
confidence in interacting with agile teams as they carry out the high-level agile planning
analysis activities. Furthermore, they will be able to leverage Howard’s guidance to man-
age stakeholder expectations and keep them involved and engaged throughout the product
development process.
I don’t pretend to be an expert in agile analysis and planning. I know enough about it to
understand how valuable this book is for anyone involved in an agile initiative. I have seen
the challenges many practitioners are facing when embarking on a new agile initiative.
This book will become a staple reference that both product owners and business analysis
practitioners should have by their side.
I am grateful that Howard asked me to write this foreword and thankful for the trust
he placed in me by letting me help wherever I could to bring this publication to fruition. I know
you will enjoy reading it and get great value from it.
Happy reading.
—Alain Arseneault
Former IIBA Acting President & CEO, and
President & CEO of TheBAExecutive™
Preface

“The green reed which bends in the wind is stronger than the mighty oak which breaks in a storm.”
—Confucius

This book aims to help enterprises become nimbler and more effective in responding to a
rapidly changing environment by assisting them in establishing a reliable agile analysis
and planning competency. Agile analysis and planning is defined in this book as an orga-
nizational competency concerned with the examination of a business or any aspect of it
(including culture, organizational structure, processes, and products) to learn what needs
to change and when in order to achieve a desired outcome, in a context that places a high
premium on adaptability, resilience, and continuous innovation and value delivery. Key
activities within the competency include analyzing who the product is for (the stakehold-
ers), defining their requirements, determining when the capabilities will be delivered, and
estimating costs and resources.

Why I Wrote This Book


In my many years of consulting with IT organizations, I’ve seen practitioners of agile analy-
sis and planning struggle to find a hands-on book that provides guidance they could readily
use on the job. Current books on the subject lay down the framework for the competency.
International Institute of Business Analysis (IIBA)’s Agile Extension to the BABOK Guide,
published in association with the Agile Alliance, provides a foundation that describes, in
broad terms, how to apply techniques and principles at different planning horizons. Project
Management Institute (PMI)’s Agile Practice Guide provides a valuable overview, from the
perspective of project leaders and project teams. There are also essential books that provide
detailed guidance on specific aspects of the discipline, such as Humble’s excellent books on
DevOps, Cohn’s books on user stories, and books devoted to specific frameworks, such as
The Scrum Guide. I saw a gap in the market, though, for something built on the foundation
of those books but that goes further. I realized there were hardly any publications that con-
nected the dots across these essential techniques while providing guidance specific enough
for the practitioner to adapt and apply them on the job. I wrote this book to fill that gap. It
offers actionable advice backed up by specific examples that illustrate how to use and adapt
agile practices in different scenarios.


The guidance in the book is supported by more than 175 tools, techniques, examples,
diagrams, templates, checklists, and other job aids, making it an essential tool kit for any
business analysis practitioner or product owner. It synthesizes the analysis and planning
guidance of the most widely used agile frameworks and distills the lessons I’ve learned
from the last twenty to thirty years working with agile teams. Over time, I’ve made my
share of mistakes—failing, trying again, and failing better (to paraphrase Samuel Beck-
ett). Along the way, I’ve learned what works and what doesn’t. This book incorporates the
lessons learned from those mistakes so that you don’t have to learn them the hard way.
The guidance you’ll find in this book draws from the collective wisdom of those I’ve
worked with over the years: my colleagues and clients at REI Co-op, Covance, LabCorp,
US Food and Drug Administration (FDA), Intact Insurance, TD Bank, BMO Bank of
Montreal, Rogers Corp, TELUS, Canada Mortgage and Housing, True Innovation Inc., and
many others. I am grateful to them for trusting me to work with them and sharing their
lessons learned with me so that I could pay it forward and share them with you.
Agile analysis and planning focuses on improving communication with customers and
users so that the business can anticipate and respond effectively to changes in customers’
habits and behaviors even under extreme uncertainty. At no time in my memory has
this felt as important as today. As I complete this book, a pandemic is raging across the
globe, and the world is facing a long-overdue reckoning with the consequences of racial
and economic disparity. Everything at this moment seems uncertain, from the profound
to the mundane. What will society be like at the end of these changes? Will we come
together or be further divided? Will the shift from real-world engagement toward online
life be permanent? Will remote work become the norm? What about distance learning and
online shopping? It’s a time of great challenge but also an opportunity for reinvention. It
is my wish that this book will help you and the organizations you work for navigate these
changes, adapt, and even thrive in these incredibly uncertain times—and in the “new
normal” that is to follow.

State-of-the-Art Guidance across Agile Frameworks


This is my third book on business analysis. My earlier books, UML for the IT Busi-
ness Analyst (2005, 2009) and The Business Analyst’s Handbook (2008), described how
to carry out the business analysis function within an iterative-development lifecycle. It’s
been very gratifying to witness the international success enjoyed by those books, includ-
ing a Spanish and Portuguese edition and a second release of the UML book. If you
liked those books, I am confident you will enjoy this new publication as well. Much has
changed, though, since my first publication. This book returns to similar ground but with
a refreshed perspective on today’s most successful and widely used agile and analysis
frameworks and practices. These include:

• DevOps
• SAFe
• Kanban
P xxxiii

• Scrum
• Lean software development
• Lean startup and minimum viable product (MVP)
• User stories, Extreme Programming (XP)
• Continuous integration/continuous delivery (CI/CD)
• Test-driven development (TDD), acceptance test–driven development (ATDD), and
behavior-driven development (BDD)
• Full-potential plan
• Discovery-driven planning
• Circumstance-based market segmentation
• Agile Fluency model

In addition, the book is aligned with the following professional certification guides:

• PMI: Agile Practice Guide


• IIBA: Agile Extension to the BABOK Guide v2
• PMI: Business Analysis Practice Guide
• IIBA: BABOK v3: A Guide to the Business Analysis Body of Knowledge

What Makes This Book Unique?


Unlike many other guides, this book contains everything you need in one place to practice
effective agile analysis and planning:

• Detailed guidance: It’s a practical manual that tells you what to do and shows you
how to do it.
• Integration with business analysis: Most books on agile analysis focus solely on agile
techniques, overshadowing the use of valuable business analysis techniques such as
business rules analysis and process modeling. This book shows you how to insert leg-
acy analysis techniques into an agile process to increase an agile team’s productivity.
• Broad coverage of agile approaches and frameworks: The book incorporates best
practices from today’s most widely used agile frameworks, including lean, SAFe,
Kanban, and Scrum, enabling you to be effective in any agile environment.
• Experience-based guidance: This book is based on years of experience working with
companies and teams on improving agile analysis and planning in their organiza-
tions, learning what works and when. It’s informed by today’s most effective agile
frameworks but is beholden to none.
• Context-based just-in-time learning: The book presents you with techniques and
guidelines in the context in which you’ll be using them across the development life-
cycle. You learn what you need to know and when you need to apply it.
• Extensive job aids: The book includes more than 175 valuable job aids to increase
your understanding and effectiveness. These include:
– Concrete examples and templates that you can use to create analysis and plan-
ning artifacts, such as the product vision statement, product roadmaps, story
maps, epics, features, spikes, stories, and acceptance criteria
– Sample diagrams and diagram legends
– Meeting agendas and other facilitation aids
– Checklists
• Contiguous end-to-end case study: An end-to-end case study runs through the
book, enabling you to see exactly how the steps and artifacts feed into each other
over the course of an agile development lifecycle.

Furthermore, the book provides clear evidence of the value of business analysis in an
agile organization—demonstrating how traditional business analysis combined with agile
analysis and planning techniques can produce higher-performing agile teams.

Why Agile Analysis and Planning Is Important for the


Enterprise
We know that organizations that adopt an agile approach experience significant benefits.
For example, their projects are 37 percent faster to market than the industry average
(QSM),1 and their productivity increases by 16 percent.2 But we also know that an agile
organization can dramatically improve its success rates by enhancing its level of com-
petency in analysis and planning.3 The “Business Analysis Benchmark”4 showed that
project success rates for agile organizations more than doubled from 42 percent at the
lowest maturity level (level 1) for the competency to 91 percent at the highest maturity
level (level 4). Moreover, it found that even modest increases in maturity levels could have
a significant impact. For example, a half-step increase from level 2 to 2.5 led to a rise in
success rates from 62 percent to 74 percent for agile organizations. (More on this research
is presented in Chapter 2.)

1. Quantitative Software Management Associates (QSMA), “The Agile Impact Report. Proven
Performance Metrics from the Agile Enterprise,” QSMA for Rally Software Development Corp.,
2009, 1.
2. QSMA, “Agile Impact Report,” 1.
3. The report correlated success to the maturity level of the requirements process, roughly
equivalent to what I refer to as analysis and planning in the book. The report looked at the impact
of maturity level on success rates for different development approaches, including agile.
4. Keith Ellis, “Business Analysis Benchmark—The Impact of Business Requirements on the
Success of Technology Projects,” IAG Consulting, 2009.
P xxxv

Problems that can be addressed by having effective agile analysis and planning capabil-
ities in your organization include the following:

• Added costs for rework because requirements were not sufficiently understood up front
• Delays due to poor team planning and coordination
• Reduced team productivity because work is not being well prioritized across the
product
• Poorly managed stakeholder expectations
• Underresourced, overworked product owners
• Challenges scaling agile development because cultural issues within the organiza-
tion are not appropriately addressed

Today, agile analysis and planning is recognized as an effective approach for address-
ing these issues and more. Organizations that already have business analysts experienced
in traditional business analysis are upskilling them with agile competencies and embracing
them as valuable contributors. At the same time, startup technology companies that began
their agile journey without a strong business analysis competency are now adding it to their
organizations. As they mature, they’re finding that the skillset is becoming more relevant
to them because of the increased levels of complexity in the business domains they address
and in their products’ underlying architecture.
The benefits to the business of establishing an effective agile analysis and planning
competency include the following:

• Enhanced ability to anticipate customer need: Agile analysts use a wealth of tech-
niques to gain a deep understanding of the customer. Root-cause analysis and
circumstance-based market segmentation identify the underlying needs of custom-
ers and the root causes of the problems they are experiencing. Kano analysis helps
the business forecast the capabilities customers would embrace. MVP testing reveals
which proposed features are most valuable to customers and validates hypotheses in
order to direct development resources.
• Improved ability to manage change: Agile analysis increases the ability of teams to
sense and respond to change and make the appropriate adjustments along the way.
• Ability to plan effectively: The competency enables an organization to plan effectively
for the short term and long term, whether under conditions of extreme uncertainty
or when conditions are well known.
• Reduced time to market: Time to market is reduced because agile analysis focuses
development effort on a minimal set of high-value features that are further evalu-
ated and enhanced over time.
• Data-informed decisions: Agile analysis and planning practices enhance the ability
to make data-informed decisions by using the lean startup MVP process, A/B test-
ing, and actionable metrics.
• Reduced rework and delays: Agile analysis reduces rework and unnecessary delays
because the right amount of analysis is performed at the right time.
• Improved team productivity: Productivity improves because the team is always
working on items of the highest value across the product.
• Improved stakeholder engagement: Stakeholders are more engaged due to an incre-
mental, rolling analysis process that involves them throughout the lifecycle.
• Product owner support: With a well-developed agile analysis competency, product
owners are provided with the support they need to be effective in their jobs. Agile
analysis and planning practitioners take on requirements and day-to-day commu-
nication with the team so that product owners can focus on the outward-facing
aspects of the role.
• Ability to leverage the business analysis (BA) experience: By upskilling their exist-
ing business analysts and incorporating them into agile organizations, companies
can leverage the experience of seasoned business analysts to improve team perfor-
mance on agile initiatives.

Who Should Read This Book


The intended readers for this book can be broadly grouped as follows:

• Business analysis practitioners and product owners


• IT directors and leaders of centers of excellence (CoEs) in business analysis, agile
practice, and DevOps
• Educators

The benefits for each type of reader are as follows.

Business Analysis Practitioners and Product Owners


The primary reader for this book is the working professional—a person responsible for
analysis activities, planning activities, or both, in an agile software development organization.
The job titles of those who perform this work vary widely among organizations, as does the
distribution of responsibilities between those titles. They include business analysts, team
analysts, product owners, proxy product owners, and product managers. This book is for
anyone responsible for this work in an agile organization—regardless of job title.
If you are a product owner, you can use the knowledge in this book to learn how to

• Organize and coordinate agile teams for peak effectiveness.


• Analyze the market for the product.
• Develop a compelling product vision statement.
P xxxvii

• Plan and estimate requirements implementation at all planning horizons.


• Plan MVPs to test hypotheses for the product and make data-informed decisions.
• Prioritize epics and features across the product.

If you’re a business analyst, you can use this book to communicate the product vision
to the team and help them translate that vision down to smaller requirements units and
specifications (e.g., features, stories, and their acceptance criteria). Within these pages,
you’ll also find detailed guidance on maintaining the product backlog, tracking the prog-
ress of stories, story preparation, and estimation. Senior business analysts will learn how
to prepare and tailor the agile analysis process for their situation—including setting up
the product backlog, gaining consensus on the definition of ready, setting Kanban work-
in-progress limits, and determining capacity.
If you’re responsible for analysis and planning at any level in your organization, the
information in this book will provide you with the confidence and skills to work effec-
tively within any of the popular agile frameworks and practices in use today. If you’re an
entry-level business analyst or team analyst, you’ll appreciate the chapter on fundamen-
tals, the detailed guidance on feature and story preparation, and the wealth of job aids
in the book. If you’re a product or higher-level business analyst, you’ll benefit from the
book’s strategic guidance dealing with culture, stakeholder analysis, business objectives,
strategic planning, and scaling considerations.

IT Directors and Leaders of CoEs in Business Analysis and Agile Practice


IT directors and CoE leaders in business analysis and agile practice can leverage the infor-
mation contained in this book to

• Develop and customize an agile analysis and planning framework that’s right for
the organization.
• Build a library of CoE resources for analysts and planners using the book’s tem-
plates, checklists, and examples.
• Craft a strong value proposition to communicate the benefits of agile analysis and
planning competency in the organization.

Educators: College or Corporate Trainers or Learning Directors


If you’re an educator, you can use this book as a basis for building a curriculum in agile
analysis and agile development that incorporates today’s most popular proven concepts,
tools, and techniques. Each chapter provides clearly defined objectives and summaries
and leverages a running case study with sample solutions that you can use for group
workshops.
If you are interested in using the book to build a training curriculum, please contact
me for additional content and services, including PowerPoint presentations, eLearning
offerings, and in-house training. Send email inquiries to [email protected] or check online
at https://www.nobleinc.ca.

How This Book Works


Think of this book as the voice of a coach in your ear as you walk through the agile analy-
sis and planning process. Each chapter guides you through the activities performed at that
point in the agile development cycle. The steps are illustrated with a running case study so
that you can see how analysis and planning artifacts evolve over the course of development and
how they connect to each other. Additional examples are provided so you can see how to
apply the techniques to other situations.
I should note that the sequencing of analysis activities in the chapters is only a rough
guide because agile analysis and planning is not a sequential process. You rarely complete
a planning or analysis activity in one step; more typically, you perform some of it up front
and the rest of it in a rolling fashion. Moreover, activities are often carried out concur-
rently. For the most part, the chapters are sequenced based on the order in which activities
are first performed.

How to Read the Book


There are two ways to read this book:

1. The traditional way, front to back. That’s what I’d advise if you’re new to agile or
business analysis.
2. By skipping to the parts that are most important to you. You may prefer to read the
book this way if you have some agile experience and want to fill in your knowledge
gaps. In that case, I’d recommend you
– First scan Chapter 3 to fill gaps you may have in fundamental concepts.
– Next, read Chapter 4 to gain a bird’s-eye view of the agile analysis and planning
activities covered in this book.
– Then go to the chapters that deal with the activities that interest you. Each chap-
ter is self-contained, dealing with one or more analysis or planning activities.
When it refers to a topic that was introduced earlier in the book, I’ve included a
cross-reference in case you’re reading the book in a nonsequential manner.

Overview of Chapters
The following is a brief description of each chapter:
P xxxix

Chapter 1, The Art of Agile Analysis and Planning
Presents a brief, personalized look at the art of agile analysis and planning based on lessons learned from my life both as an artist and as an analyst. It explains why I believe the agile approach is conducive to the creative process.

Chapter 2, Agile Analysis and Planning: The Value Proposition
Presents the value proposition for developing an effective competency in agile analysis and planning in an organization.

Chapter 3, Fundamentals of Agile Analysis and Planning
Explains the principles, frameworks, concepts, and practices that underlie the agile analysis and planning competency and the rest of this book, such as lean, Kanban, Scrum, DevOps, and user stories.

Chapter 4, Analysis and Planning Activities across the Agile Development Lifecycle
Provides an overview of planning and analysis activities across the agile product development lifecycle. Three scenarios are covered: short-term initiatives with planning horizons up to three months, long-term initiatives up to five years, and scaled agile initiatives. The Agile Analysis and Planning Map in this chapter provides a bird’s-eye view of the process. This map is referenced in later chapters so that you can see where you are in the development process as you progress through the book.

Chapter 5, Preparing the Organization
Explains how to prepare an organization for agile software development, including guidance on forming effective agile teams, managing stakeholders’ expectations, and guidelines for governance, finance, and marketing groups. (Please note that guidelines specific to scaled organizations are covered in Chapter 17.)

Chapter 6, Preparing the Process
Describes how to prepare the agile analysis and planning process. Senior analysts and CoE leads will learn how to customize the right agile framework and practices for their situation and how to fine-tune process parameters like work-in-progress limits and the definition of ready to optimize team productivity.

Chapter 7, Visioning
Covers early analysis activities to envision a new product or significant enhancement. Product owners can use the information in this chapter to craft effective product and epic vision statements and specify objectives. Analysts will learn to communicate the product vision to the team and continue the visioning process through root-cause and stakeholder analysis. The chapter also covers the specification of “leap of faith” hypotheses in preparation for MVP planning.

Chapter 8, Seeding the Backlog—Discovering and Grading Features
Focuses on the discovery and specification of the initial items in the product backlog. Analysts and product owners should read this chapter to learn how to prioritize and specify features and nonfunctional requirements for the product or release backlog. Prioritization tools covered in this chapter include Kano analysis, cost of delay, and weighted shortest job first (WSJF).

Chapter 9, Long-Term Agile Planning
Explains how to perform long-term planning for horizons of six months to five years. Product owners and business analysts can use the information in this chapter to create a long-term product roadmap, specify goals, objectives, assumptions, and metrics for the planning period. The chapter explains the full-potential plan—an approach for planning transformative change over a three- to five-year period. It describes the agile approach to planning using MVPs to test assumptions and determine what to include in the product. The chapter also explores deployment strategies and options for the long-term implementation plan, including guidelines for when to use narrow and deep versus wide and shallow approaches.

Chapter 10, Quarterly and Feature Preparation
Describes how to prepare upcoming features. When the team is using a Kanban approach, this preparation occurs on a rolling basis. When a timeboxed planning approach is used, it occurs before quarterly planning for the group of features lined up for the quarter. This chapter applies to both approaches. The chapter includes both agile and legacy tools, including the feature definition of ready, ATDD, specification of feature acceptance criteria using BDD, value stream mapping, journey mapping, and process modeling.

Chapter 11, Quarterly and Feature Planning
Describes how to plan an upcoming feature or quarter. The chapter applies to teams that use timeboxed planning approaches (in which case all features for the quarter are planned together) and those that use a single-item flow-based approach (in which case a single feature is planned). The chapter begins with guidance on when to use which approach. It explains how to plan and estimate features using methods and approaches such as the Planning Game, Planning Poker, Delphi estimation, story points, ideal developer days, as well as the no-estimating approach.

Chapter 12, MVPs and Story Maps
Demonstrates how to use MVPs and story maps to plan the delivery of learning and value within short timeframes. MVPs are minimal versions of the product that enable the product owner to test hypotheses and make data-informed decisions about development investment and resource allocation. Story maps are visual representations of the plan that indicate the operational and implementation sequencing of stories.

Chapter 13, Story Preparation
Covers the analysis of stories before implementation. This preparatory work occurs on a rolling basis if the team is using Kanban. It is performed before iteration planning when a timeboxed approach such as Scrum is used. This chapter covers both contexts. Tools covered include the INVEST story-writing guidelines, patterns for splitting stories, and the specification of story acceptance criteria using BDD and the Gherkin syntax.

Chapter 14, Iteration and Story Planning
Covers planning for a short-term horizon of one week to one month. The chapter explains how to determine team capacity and how to forecast which stories will be done. Planning tools covered in this chapter include the iteration backlog, developer task board, and Kanban board.

Chapter 15, Rolling Analysis and Preparation—Day-to-Day Activities
Describes day-to-day rolling analysis and planning activities. The chapter includes guidance on ongoing story and feature preparation, the daily stand-up, updating the developer task board, burndown chart, cumulative flow diagrams, and more.

Chapter 16, Releasing the Product
Covers the final preparations for general availability (GA), also known as production release. The chapter includes guidance on operational preparations, value validation, alpha testing, and beta testing. It also examines the pros and cons of using a hardening iteration before GA.

Chapter 17, Scaling Agility
Describes the analysis and planning challenges faced by large agile organizations. It provides actionable guidance for scaling the agile organization, the process, and the product backlog. This chapter explains and incorporates best practices for scaled agile development, including DevOps, CI/CD, ATDD, BDD, and SAFe.

Chapter 18, Achieving Enterprise Agility
Explores agile analysis, planning, and product development from the enterprise perspective—beyond the IT context that has been the main focus of the rest of this book. The chapter includes thirteen practices for optimizing an enterprise’s responsiveness to change.

Appendixes
Provide a collection of useful tools for the agile analyst and planner, including checklists, templates, and agendas for easy reference on the job or during training. Also included is a detailed case study illustrating discovery-driven planning—the financial planning counterpart to the data-driven development approach described in the rest of this book.

Repeating Book Features


This book contains several repeating features to make it easier to find what you need.
They are identified with icons as follows:

Checklist: Useful lists for the practitioner (e.g., a checklist of stakeholders)

Example: A concrete example of an artifact


Template: A template for creating an artifact (text or diagram)

Tips and Guidelines: Useful tips, guidelines, and formulae for the practitioner

Cross-reference: Cross-reference to another book section, where you can learn more about
a topic

Introducing the BLInK Case Study


This book follows one case study through the product development lifecycle, from vision-
ing to continuous value delivery. The case study is included so that you can immediately
see how to apply the techniques and to connect them over the course of product develop-
ment. (If you’re not a fan of case studies, you can skip or quickly scan those sections. I
won’t be offended, and you won’t miss any new concepts.)
Many people learn best by doing. I am one of them. If that describes you, I urge you to
actively work through the case study sections yourself, comparing your deliverables with
those I’ve provided in the book. It’s perfectly okay for your deliverables to be different from
those in the book or for you to come up with different results. The outputs will depend on
P xliii

the conversation you have (or imagine having) with stakeholders and how you choose to
document them. What’s important is that you can justify any decisions you’ve made.
The example I’ve chosen for this book revolves around a fictionalized insurance com-
pany called Better Living (BL) Inc. As the case study opens, BL is looking to develop
a usage-based insurance (UBI) product that uses data from Internet of Things devices
to personalize health insurance costs and benefits. The product is to be named BLInK
— Better Living through Insurance Knowledge.
One reason I chose this case study is that it’s current: as I started work on this book,
I was working with an insurance client on a similar product. But the main reason I chose
it is that it involves the analysis of an innovative product within a mainstream business—
just the type of initiative where one is most likely to find an agile business analyst. As the
case study opens in Chapter 7, the product is in its early visioning phase. Throughout the rest
of the book, we follow the agile analysis and planning of this product through to imple-
mentation and delivery.

Certification Information
This book is mapped to the following professional certification guides:

• BABOK v3: A Guide to the Business Analysis Body of Knowledge


• Agile Extension to the BABOK Guide v2
• The PMI Guide to Business Analysis
• The Agile Practice Guide

For a detailed mapping of chapters to the guides, please see Appendix A.2.

Register your copy of The Agile Guide to Business Analysis and Planning on the
InformIT site for convenient access to electronic templates, updates, and/or correc-
tions as they become available. To start the registration process, go to informit.com/
register and log in or create an account. Enter the product ISBN (9780134191126)
and click Submit. Look on the Registered Products tab for an Access Bonus Content
link next to this product and follow that link to access any available bonus materi-
als. If you would like to be notified of exclusive offers on new editions and updates,
please check the box to receive email from us.

Thanks
No person gets anywhere on their own; we all do it with the help and mentoring of others.
First and foremost, I want to thank the many colleagues and mentors who have generously
shared their knowledge throughout my career. A special thanks to Alain Arseneault, with
whom I worked closely at BMO Financial Group and in many other contexts. He has been
enormously instrumental in the development and success of business analysis internation-
ally through his pioneering work developing the bank’s competency and later through
his involvement with IIBA in multiple capacities, including acting CEO. Alain has been
incredibly generous with support and guidance over the years, and he has gone beyond-
the-beyond with his assistance on this book. I can’t thank him enough.
Often, transformative change is the result of a change agent—an individual with vision
and a strategy for executing it. I’ve met these talented individuals in many organizations,
and they’ve often wielded influence far beyond their formal titles, largely as a result of
the respect in which they are held by their peers. In this regard, I want to thank Abhijeet
Mukherjee, with whom I worked at UST Global to raise the maturity level of business
analysis across the corporation. Thanks, too, to Saurabh Ranjan, who was UST’s COO
at the time and a champion and primary sponsor for Global BA and Strategic Consulting
CoE-related programs and initiatives. I also want to thank three other leaders of change
in their organizations—Trenton Allen at REI Co-op; Andre Franklin at Covance; and
Dana Mitchell, agile practice lead for agile transformation at TD Bank Securities—for
trusting me to work with their teams and for sharing their insights about agile analysis
and planning practices.
A big shoutout as well to the early agile adopters, clients who saw the promise of itera-
tive, incremental development right from the beginning and were true pioneers in business
domains that were not particularly open to agile development and analysis at the time.
Foremost among these was John Beattie, former VP at TELUS—an agile visionary and
someone with whom I had the immense pleasure of working. I’d also like to thank Tim
Lloyd from True Innovation for his helpful encouragement and collaboration over many
years.
Special thanks to Karl Wiegers, whose early writing on requirements spurred my interest
in business analysis, for sharing his experience and guidance as a writer and analyst. He is
a living example of the principle of paying it forward. Thanks also to Christopher Edwards
for his valuable input and detailed notes on the last chapters. Without all of these people,
and many others too numerous to name, this book would not exist in its current form.
Thanks also to my technical editors, Ron Healy, for the care he took to consider the
guidance in this book against his own experience, and to Clifford Berg, who encouraged
me to expand the coverage of DevOps practices and challenge my own assumptions, and
helped me find the most useful guidance to highlight in several of the book’s key chapters.
Both editors gave me precisely what I was looking for—a hard time—and the book is
much better for their efforts. Thanks also to Tracy Brown, my development editor, for her
support and guidance. A huge shoutout to Haze Humbert, executive editor at Pearson, for
cajoling, encouraging, and generally kicking my ass to get this book done, and to everyone
else on the Pearson team, including Rachel Paul, Menka Mehta, Julie Nahil, and Carol
P xlv

Lallier. Thanks, as well, to Christopher Guzikowski, my first editor at Pearson during the
early days of the book, for believing in the book and supporting it when it counted most.
This book is especially indebted to the almost weekly telephone calls about its themes
over the four years of its making with a lead developer at Hootsuite, one of Canada’s
most innovative agile companies. His input and insights are so interwoven into this book
that he is very much a collaborator. It is an added pleasure that he is also my son, Yasha
Podeswa.
About the Author

Howard Podeswa is an established author, professional artist, and sought-after speaker at


international conferences. His paintings have been shown in numerous exhibitions across
Canada and internationally, including the United States, Italy, and South Africa. His work
is held in numerous private and public collections.
Podeswa’s career in software development began when an academic background in
nuclear physics led to a job working on a nuclear-accident simulation program for Atomic
Energy of Canada Ltd. Since then, he has been enthralled by software development and
often found himself on the cusp of change as a developer of innovative systems in trans-
portation, laboratory automation, and communications. From the 1990s onward, he has
been helping large organizations transition their planning, analysis, and requirements
engineering (RE) processes to agile practices across a broad range of sectors, including
telecommunications, banking, government services, insurance, and healthcare.
He plays a leading role in the industry as a designer of agile and business analysis
(BA) training programs for companies and higher education institutions, including Bos-
ton University Corporate Education Center and Humber College; as a reviewer of the
BA profession’s standard books of best practices (BABOK [IIBA] and Business Analysis
for Practitioners—A Practice Guide [PMI]); and as an author whose books have become
staples in many BA libraries: The Business Analyst’s Handbook and UML for the IT
Business Analyst.
Podeswa, through his role as director for Noble Inc., has provided agile and BA train-
ing programs and consulting services to clients across the globe in the private and public
sectors. Companies that have benefitted from his services include the International Organi-
zation for Standardization (ISO), Moody’s, the Mayo Clinic, TELUS, UST Global, BMO, TD
Bank, Intact Insurance, Labcorp, the US Food and Drug Administration (FDA), Canada
Mortgage and Housing Corporation (CMHC), Bell Nexia, and Thomson Reuters.


10.2 This Chapter on the Map


As indicated in Figure 10.1, we’ll be examining the following items in the Seeding the
Backlog zone: epic preparation, feature preparation, acceptance test–driven development
(ATDD) / behavior-driven development (BDD), persona analysis, journey mapping, value
stream mapping, and process mapping.

10.3 Overview of Features


Since we’re about to focus on features, let’s quickly review some fundamental concepts
about them.
A feature is a product-level work item that can be completed by one or more teams
within one quarter or release cycle. The feature may be expressed in the Connextra for-
mat—for example, “As a member, I want to receive messages and notifications so that I
can respond to issues that require my immediate attention.”
A feature is bigger than a story but smaller than an epic. The relationships can be sum-
marized as follows:

Epic > Feature > Story

Features often begin as epics. As we learned earlier, in Chapter 7, “Visioning,” an epic


is a product-level work item that may require multiple teams over multiple quarters and
may span product areas, business areas, and value streams. An example of an epic is the
introduction of home delivery across a product line to increase sales revenues by 20 per-
cent. Chapter 7 explains how to prepare an epic by articulating the epic vision and leap of
faith hypotheses. It also explores the MVP process for determining the minimum market-
able features (MMFs)—the high-value features to develop. The next step is to prepare the
upcoming features. This chapter focuses on that preparation.

10.3.1 Examples of Feature-Length Change Initiatives


As discussed in previous chapters, you decompose large work items into stories—small
work items that deliver value but require no more than a few days’ work—in order to
shorten the feedback cycle and smooth the flow of work. If that’s the case, why not dis-
pense with epics and features and treat all requirements as stories? You can, if the team
is exclusively tasked with small enhancements and bug fixes. Frequently, though, teams
are asked to work on items that exceed the maximum size for a user story. A larger con-
tainer—an epic or feature—is required to encapsulate the high-level functionality and
objectives that it will deliver. Epics and features also include acceptance criteria (AC) that
describe the product’s behavior when stories are strung together in an end-to-end work-
flow. Examples of work items larger than a story include the following:

10.3.1.1 Deliver a New or Improved Value Stream or Process


A work item to create a new process or value stream—or reengineer an existing one—
typically exceeds the maximum size of a user story and must be managed as a feature.
Feature preparation activities may include value stream mapping and modeling of the
current and future processes.

10.3.1.2 Nontrivial Change to a Mature Product


When a product is young, it’s relatively easy to add a new capability because there aren’t
too many existing ones that the new capability might affect. However, as the product
matures and accumulates a broader range of capabilities and components, it becomes
harder to add or change a capability because it can affect so many existing parts. As a
result, the change request must be classified as a feature.
Consider Customer Engagement One (CEO), the app being developed by our example
company, CEO Inc.1 Suppose the first version of the product allows customer support
agents to view messages from two sources—each with its own format and rules. If the
product owner (PO) wants to add a third message source, such as email, doing so affects
only one function—viewing. This requirement is achievable within a few days, so you
manage it as a user story.
Now suppose CEO has grown into a mature product with features to ingest, view,
triage, tag, respond to, assign, and resolve messages. It’s much more difficult to add a new
message source because all of the existing features have to be adapted. A change of this
type now takes weeks to implement and involves multiple teams. Consequently, you treat
it as a feature (or epic, if it spans quarters), not a user story.

10.3.1.3 Implementing a Use Case


A use case is a usage of the product or system, typically sized to deliver a goal a user can
accomplish through a single interaction with the product. Examples of use cases include
the following:

• Submit a college application.


• Open an account.
• Place an order.

Each use case represents all the ways the interaction can play out, including successful
and unsuccessful scenarios that the solution must support. The effort to implement all
the scenarios of a use case typically exceeds the maximum story size. Consequently, you
manage the use case as a feature and each scenario or set of related scenarios as a story.
For example, you represent the Place an order use case as a feature. The user stories for
the feature include the following, expressed in an informal format:

• Place an order (basic flow: no options).

1. This example was adapted from one provided by Yasha Podeswa in a conversation with the
author, August 2019.

• Place an express order.


• Place a backorder.

10.4 Benefits of Feature Preparation


Sometimes I have to convince teams that feature preparation is not only allowed in agile
development but should be encouraged and included in the plan. By preparing features
before quarterly planning sessions begin, you facilitate improved capacity planning: devel-
opers can provide better estimates because they have a clear understanding of what’s
being requested. Furthermore, by preparing features before their implementation, you
enable hyperproductive teams. 2 Developers can begin work on the solution without hav-
ing to wait for key information or technical preparations. Collaborating teams can work
in parallel with confidence because the feature’s acceptance criteria (AC) and process
models specify how the pieces must fit together when assembled. If integration errors
show up, they’re caught quickly because the feature AC are also used as the basis for spec-
ifying and executing automated high-level integration tests.

10.5 Feature Preparation Activities


This chapter focuses on preparation, while the next chapter focuses on planning. There is
no strict line between the two, but in general, planning is about commitment—determining
what features and goals will be delivered and gaining the commitment of collaborating
teams to do the work. Preparation is the work to make an item ready for planning and
implementation.
The outcome of feature preparation is a ready feature—one that is suitable for quar-
terly planning and able to be implemented without undue delay or rework. For example,
a ready feature is prioritized and can be accomplished in three months or less by one or
more teams.
Feature preparation activities include analysis and technical preparation. The analysis
activities may include the items summarized in the following checklist.

Checklist of Feature Preparation Analysis Activities


❑ Specification of features and AC
❑ Context analysis
❑ Stakeholder analysis
❑ Persona analysis

2. Jeff Sutherland, “Scrum: What Does It Mean to Be Ready-Ready?” (OpenViewVenture, 2011),


https://www.youtube.com/watch?time_continue=3&v=XkhJDbaW0j0

❑ Journey mapping
❑ Value stream mapping
❑ Process modeling
❑ Use-case modeling
❑ User-role modeling workshops
❑ Initial splitting into stories
This chapter covers all of the items in the preceding list except for the last. The decom-
position of features into stories (aka story splitting) is covered in Chapter 13, “Story
Preparation.”
To be clear, you don’t perform all of the preparatory activities in the preceding check-
list for every feature. The chapter provides guidance on activities to consider doing—but
only do what’s necessary for the situation.

Guidelines for splitting features into user stories are provided in Chapter 13, section 13.13.
Additional guidelines for preparing features on a scaled initiative can be found in Chap-
ter 17, section 17.9.12.

Technical preparation involves the drafting of a solution design, creation and testing
of proofs of concept and prototypes, and readying the architectural runway—a task that
includes the specification of service communication protocols, identification of compo-
nents, and creation of infrastructure. While this book focuses on analysis issues, we do
review some of the models used in technical preparation that you should be familiar with
as an analyst. These include the following:

• Context diagrams
• Communication diagrams
• Data-flow diagrams
• Block diagrams

10.6 Timing of Feature Preparation


When do you begin the preparation of features? The lean guideline is to wait until the last
responsible moment (LRM)—the point at which any further delay would result in unaccept-
able costs. How you apply this principle depends on the planning approach you’re using.
In a Kanban system, you prepare each feature as it approaches the top of the backlog,
with a lead time of about six weeks for large features and two to four weeks for smaller
ones.
If the teams are using the alternative planning approach—timeboxing—you prepare
the group of features lined up for the upcoming quarter starting about halfway (six weeks)

into the prior quarter. Some organizations prepare these features in a reserved iteration
(e.g., SAFe’s Innovation and Planning [IP] Iteration),3 but this is generally not advised.
We look at arguments for and against reserved iterations (aka hardening iterations) in
Chapter 17.

10.7 Assessing Readiness


Use the checklist in Appendix A.7 to assess whether or not teams are ready for quarterly
planning. Conditions in the checklist include that a vision, roadmap, and impacted users
have been specified and that sufficient features (about ten to twenty) are ready.

10.7.1 Using the Feature Definition of Ready (Feature DoR)


Use the feature definition of ready (DoR) to determine if a feature is ready to be included
in the quarterly plan or (in Kanban) to advance on to development.
The following are examples of the feature DoR conditions we saw in Chapter 6, “Pre-
paring the Process.”

• The feature is right-sized: The feature is small enough to be implemented within a


quarter by one or more teams.
• The feature has no (or minimal) dependencies on other features.
• The feature is valuable.
• All teams are committed.
• The feature is estimable: The feature is understood well enough to be estimated.

For more on the feature DoR, see Chapter 6, section 6.5.7.6.

10.8 Accounting for Preparation Work: Tasks and Spikes


Once you’ve flagged the need for preparatory analysis, how do you account for that work
in your plans? If the analysis will be performed during the iteration in which it’s flagged,
represent it as a developer task. A developer task is a work item carried out by an individ-
ual team member. (The term developer is misleading: analysis, design, testing, and coding
are all treated as developer tasks.) Developer tasks are posted on a developer task board.

We look at developer tasks and developer task boards in Chapter 15, sections 15.4, 15.6,
15.7.3, and 15.7.5.

3. Richard Kastner and Dean Leffingwell, SAFe 5.0 Distilled: Achieving Business Agility with the
Scaled Agile Framework (Boston: Addison-Wesley, 2020), 262.

If you plan to defer the analysis work to a future iteration, you’ll have to add it to the
product backlog. However, you can’t represent it as a user story because it doesn’t result
in working code. Instead, you manage the analysis as a functional spike, also known as
an enabler story. We’ll look at functional spikes in Chapter 13. Figure 10.2 is an example
of one.

e.g.
Functional Spike [5]:
As an analyst, I want to investigate pricing rules so that the story to order a product
may be enabled.
Acceptance Criteria:
1. A set of input conditions affecting pricing
2. Business rules, verified by customer, specifying how a product is to be priced on the
basis of input conditions

Figure 10.2 Example of a functional spike

Figure 10.2 illustrates the functional spike to investigate pricing rules. The value that
it delivers is expressed in the “so that” clause: the spike enables a future story to order a
product. The spike is assigned five story points, indicating the estimate and time limit for
the analysis.
Once you’ve identified the analysis activities required to prepare the feature, the next
step is to perform them. The following sections provide guidelines for performing feature
AC specification, persona analysis, journey and value stream mapping, and process and
use-case modeling.

10.9 Specifying Features and Their Acceptance Criteria


Meet with business representatives, developers, and testers (sometimes called “the Triad”)
to describe the feature in a way that clearly communicates the requirement. Chapter 8,
“Seeding the Backlog—Discovering and Grading Features,” section 8.7, provides guide-
lines for specifying features using the Role-Feature-Reason (Connextra) template. Coach
stakeholders and the team to use the template, but don’t force its use where the resulting
wording is unnatural and impedes understanding.
Then, specify feature AC. AC play a central role in agile analysis: they serve as require-
ments and as the basis for user acceptance testing (UAT). For the first release of the fea-
ture, specify just enough AC to define an MMF—the minimum functionality required to
deliver value that the customer would view as significant.
As an analyst, you support feature AC specification. You support ATDD
by ensuring AC are specified before work on the feature begins so that they can serve as
specifications by example. The AC tell the developers how much functionality must be

delivered for the item to be releasable—providing them with the information they need to
estimate the feature for capacity planning. The AC also serve as test scenarios to validate
the solution. These scenarios describe how the product must behave when user stories are
strung together in a larger workflow or value stream. A common approach is to specify
the AC in a feature file in the Gherkin syntax so they can be interpreted by a test automa-
tion tool such as Cucumber.
AC and estimates are so intertwined that you should encourage stakeholders to dis-
cuss them at the same time with developers and QA professionals so trade-offs can be
explored. This is the principle behind the Triad approach, discussed in Chapter 13.

For more on the Triad, see Chapter 13, section 13.6.3.

10.9.1 Specifying Epic Acceptance Criteria


Specify epic AC that communicate, at a high level, the minimum requirements for comple-
tion. In Chapter 7, we saw the following epic example. Its AC expresses the epic’s business
objective, “legacy system can be retired.”

e.g. Epic: Modernize customer loyalty program.


Acceptance Criteria: Implementation of this epic means that the legacy system
can be retired.

The following AC examples specify minimum capabilities for an epic.

Epic: As a planner, I want to introduce dropship capability to increase top-line


sales without the inventory ownership expense.
Acceptance Criteria:
Provide the ability to identify dropship-eligible product.
Enable financial reporting (sales $/units, sell-through %, inventory ownership) for
all dropship-eligible products.
Identify when dropship-eligible product is no longer available for sale.
Provide the ability to execute a clearance (markdown) price change for drop-
ship-eligible product.

Epic: Implement payment platform.


Acceptance Criteria: Completing this epic allows multiple payment types to be
used interchangeably.

10.9.2 Specifying Feature Acceptance Criteria


Like epic AC, feature AC do not have to cover all possible scenarios. Instead, begin by
specifying an MMF that includes only the minimum level of functionality needed for the
feature to be seen as valuable by customers.
Following is an example of a feature belonging to the epic we saw earlier: “As a planner, I
want to introduce dropship capability to increase top-line sales without the inventory own-
ership expense.” Its AC are specified in brief descriptive text, also known as scenario titles.

Feature: Enable dropship product identification in assortment planning.


Acceptance Criteria:
Scenario: Specify a dropshipped product. (success)
Scenario: Specify a product ineligible for dropshipping. (failure)
Scenario: Search for dropshipped products satisfying search attributes.

Following is an example we saw in Chapter 8.

Feature:
As an incident manager, I want to manage incidents from a single interface so that
I can view and prioritize issues across all sources.
Acceptance Criteria:
I can view and manage scheduling delays.
I can view and manage nonemergency incidents.
I can filter/sort/rank all incidents by defined attributes.

10.9.3 The Analyst Contribution


As an agile analyst, you support ATDD by facilitating Triad conversations between
stakeholders, QA, and developers about AC and by specifying AC, as discussed earlier.
However, you should review and adjust your contribution over time based on experience.
Options for your involvement in feature AC include the following:4

• You own the feature files—or the team as a whole owns them.
• You write the AC, scenario titles, and Gherkin given/when/then specifications—or
you write AC and scenario titles, and QA professionals write the given/when/then
specs.

4. Ian Tidmarsh, “BDD—An Introduction to Feature Files,” Modern Analyst,


https://www.modernanalyst.com/Resources/Articles/tabid/115/ID/3871/BDD-An-introduction-
to-feature-files.aspx

10.9.4 Analyze AC During Triad Meetings


Analyze AC for epics and features incrementally, through collaborative sessions with busi-
ness stakeholders (representing the customer), testers, and developers—the Triad.
Before committing a feature to development, facilitate Triad discussions to specify high-
level AC in the language of the business. The AC and conversations clarify the require-
ments to stakeholders, testers, and developers. Continue to meet with the Triad to refine
the AC with more specific test scenarios.

See Chapter 13, section 13.6.3, for more on the Triad.

This chapter focuses on feature preparation, but you also need to prepare stories and
their AC. Story preparation and AC are discussed in Chapter 13.

10.9.5 Specifying AC in the BDD Gherkin Syntax


The Gherkin syntax is widely used because it can be easily interpreted by stakeholders,
testers, and test automation tools. Typically, you begin by writing story AC informally;
then, as the story approaches development, you specify test scenarios in Gherkin feature
files. Gherkin includes keywords such as given, when, and then to identify standardized
aspects of test scenarios.

Gherkin Template
Scenario: <<scenario title>>
Given <<precondition>>
When <<trigger>>
Then <<postcondition>>

For example, you create the following feature to introduce dropship capabilities.

Feature: Introduce Dropship Capability


As a planner, I want to introduce dropship capability for the company to increase
top-line sales without the inventory ownership expense.
Acceptance Criteria

* Provide the ability to identify dropship-eligible product.


* Provide the ability to execute a clearance (markdown) price change for drop-
ship-eligible product.
* Enable financial reporting (sales $/units, sell-thru %, inventory ownership)
for all dropship-eligible products.
* Identify when dropship-eligible product is no longer available for sale.
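
As the feature approaches development, each scenario title can be elaborated into given/
when/then steps. The following sketch expands the first two scenarios of the dropship
identification feature from section 10.9.2; the step wording and product names are illus-
trative assumptions rather than part of the case study, but the structure follows the Gher-
kin template above and could be executed by a tool such as Cucumber.

Scenario: Specify a dropshipped product
Given the product “Trail Jacket” is in the assortment plan
And the product is eligible for dropshipping
When the planner flags the product for dropship
Then the product is identified as dropship-eligible in the assortment plan
And the product is included in dropship financial reporting

Scenario: Specify a product ineligible for dropshipping
Given the product “Canvas Tent” is in the assortment plan
And the product is not eligible for dropshipping
When the planner flags the product for dropship
Then the request is rejected
And the product remains excluded from dropship financial reporting

Whether the analyst or a QA professional writes these detailed steps is a Triad decision
that can be revisited over time, as noted in section 10.9.3.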

• Indicate operational workflow on a story map backbone.


• Indicate how feature implementation will be sequenced in the story map ribs.

12.2 This Chapter on the Map


As shown in Figure 12.1, the chapter examines story mapping and MVP in the Quarterly
Inception/Feature Inception zone.

12.3 MVPs and Story Mapping: How the Tools


Complement Each Other
The primary objective of quarterly/feature planning (the subject of the last chapter) is to
develop a plan indicating how goals and capabilities will be delivered over the planning
horizon. That much is true for both agile and traditional planning. What makes an agile
plan different is that its goals—especially at the start of new product development—are
often learning goals, validated through MVPs, experimental versions of the product or
feature designed to test hypotheses and deliver learning. The learning that is derived from
this process is fed back into the agile plan—impacting subsequent goals and features that
will be delivered.
MVPs and quick wins often require workarounds for steps that have not yet been
implemented. Story maps provide a convenient way to view an end-to-end workflow
at each time interval so that stakeholders and the team can visualize gaps where work-
arounds are required. Beyond their use for MVP planning, story maps are useful tools for
planning features so that workflows are supported and meaningful value is delivered to
the customer on a regular basis (e.g., at least every iteration or one- to two-week period).
Both tools are covered in this chapter. We begin with MVP planning.

12.4 MVP Planning


When a product is a new-market innovation, you can’t prioritize features reliably upfront
because customers themselves often won’t know what they want until they see it. The lean
startup approach, 2 introduced earlier in this book, addresses this problem by running
experiments on customers—short-circuiting “the ramp by killing things that don’t make
sense fast and doubling down on the ones that do.”3

2. Eric Ries, The Lean Startup (New York: Random House, 2011).
3. Brad Smith (CEO, Intuit), as quoted in Ries, The Lean Startup, 35.

12.4.1 What Is an MVP?


A minimum viable product (MVP) is a low-cost, experimental version of the product or
feature used to test hypotheses and determine if it’s worth fully investing in it. According to
Eric Ries, the inventor of lean startup, an MVP is “that version of the product that enables
a full turn of the Build-Measure-Learn loop with a minimum of effort and the least amount
of development.”4 MVP is not (as often thought) the first version of the product released to
the market. It’s a version meant for learning—a means to test hypotheses and to determine
the minimum set of features to include in a market-ready product. The minimal releasable
version of the product is referred to as the minimum marketable product (MMP).

12.4.2 MVP Case Study: Trint


You only really understand why MVPs are so crucial to the success of innovative product
development when you see a real example of the process. That was the case as I followed
the story of Trint, a company founded by Emmy-winning reporter, foreign and war corre-
spondent (and good friend) Jeffrey Kofman. Like many late-stage entrepreneurs, Kofman
set out to solve a problem he understood intimately because it had bothered him through-
out much of his previous professional life: every time Kofman had to transcribe an inter-
view by hitting PLAY, STOP, TRANSCRIBE, and REWIND, he couldn’t understand why
he was still using a process that had remained virtually unchanged since the 1960s and
1970s. Why wasn’t artificial intelligence (AI) being used to automate the speech-to-text
transcription? He knew the reason: journalists can’t risk inaccuracies. Since AI makes
mistakes, journalists wouldn’t use an AI-based product unless there was a way to verify
the content. The real problem, then, was how to leverage automated speech-to-text in
order to get to 100 percent accuracy.
Kofman knew that if he could solve that problem, he would have a winning product.
Furthermore, he knew that if his team could solve it for journalists—whom he knew to
be unforgiving—they could solve it for anybody. He concluded, therefore, that the most
important leap of faith hypothesis for the product was that the company could find a way
for users to correct errors in place in order to deliver transcripts that could be verified and
trusted. As Kofman saw it, his team needed to create a layer on top of AI (the automated
speech-to-text component) so that the AI part would do the heavy lifting of transcription,
allowing the user to focus on quicker tasks: search, verify, and correct. He believed that by
using this approach, he could reduce the time to perform a task that would normally take
hours to complete down to minutes or even seconds. From earlier chapters of this book,
you’ll recognize Kofman’s steps as the beginning of the MVP process: the articulation of
the problem, vision, and leap of faith hypotheses for the product.
To create the MVP, Kofman gathered a team of developers with experience in audio-to-
text alignment using manually entered text. He challenged them to hack together an MVP
version that would automatically transcribe speech to text and allow a user to edit it.
The company’s first MVP was built in just three months. Kofman decided to use some
of his limited seed funding to invest in user lab testing. He brought in a group of journalists

4. Ries, 77.

for the testing day. Interestingly (as is often the case), the first MVP was “wrong.” While
the journalists liked the concept, they struggled to use the product, finding it annoying
to switch back and forth between editing and playback modes. (The original design used
the space bar as a toggle between modes and as the text space character during editing,
confusing users.) As Kofman told me, “Good innovative products should solve workflow
problems; this was creating new ones.” And so, using feedback from the MVP, he asked
the developers to build a new user experience with a better workflow.
MVP isn’t just about one test; it’s a process. Fifteen months into the project, in early
2016, the company developed a more refined version of the MVP. Kofman was ready to
prove his hypothesis that there was a strong market for the product. At this point, the
product provided much of the core functionality needed by users, such as the ability to
search for text to locate key portions of an interview. However, it still lacked key compo-
nents required to make it fully ready for the market. For example, there were no mecha-
nisms for payments or pricing.
Through his extensive network of journalistic colleagues, Kofman let it be known
that they would be opening up the product for free usage during one week of beta test-
ing. When the testing began, things proceeded normally until an influential journalist at
National Public Radio sent out a highly enthusiastic tweet, causing usage to soar. At ten
thousand users, the system crashed. It took the company two days to get back online, but
the test proved beyond a doubt that there was a market for the product.
Today, Kofman views that one day of MVP lab testing as perhaps the most important
action taken by the company in its early days because it caused developers to change
direction before spending a lot of time and money on a failed solution. The lesson, as
Kofman tells it, is this: “You have to test your ideas out on real people”—the people who
will actually use your product.
In previous chapters, we examined how to identify the leap of faith hypotheses that
must be tested and validated for the product to be viable. Now, we focus on the next step:
planning the MVPs that will test those hypotheses.

12.4.3 Venues for MVP Experiments


Since an MVP is only a test version, one of the first things to consider is where to run the
test and who the MVP’s testers will be. Let’s explore some options.

12.4.3.1 Testing in a Lab


A user testing lab may be internal or independently operated by a third party. Testing
labs provide the safest venue for testing, making them appropriate for testing in highly
regulated mainstream business sectors, such as banking or insurance, where there is min-
imal tolerance for errors. Because the lab setting provides an opportunity to gain deep
insight into users’ experience of the product, it’s also an ideal venue for MVP testing at the
beginning of innovative product development when it’s critical to understand customer
motivations and the ways they use the product.
The testers should be real users. However, in cases where the requirements are stable,
proxies may be used (e.g., product managers with a strong familiarity with the market).

Include testers familiar with regulations governing the product, such as legal and compli-
ance professionals, to identify potential regulatory issues.

12.4.3.2 Testing MVPs Directly in the Market


The most reliable feedback comes from testing the MVP in the marketplace with a targeted
group of real customers. Consider this option for new-market disruptions, where first
adopters are often willing to overlook missing features for novelty. This option is also
advised for low-end disruptions, where customers are willing to accept reduced quality in
return for a lower price or greater convenience.

12.4.3.3 Dark Launch


Another way to limit negative impacts during MVP feature testing is to dark-launch it—to
stealthily make it available to a small group of selected users before broadening its release.
If the feature is not well received initially, it can be pulled back before it impacts the prod-
uct’s reputation; if customers like it, it is developed fully, incorporated in the product, and
supported.
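
If you are asked to specify a dark launch as an analyst, the cohort behavior can be stated
explicitly as acceptance criteria. The sketch below is a hypothetical illustration in the
Gherkin style used elsewhere in this book; the cohort and the gating mechanism (for exam-
ple, a feature toggle) are assumptions left to the technical design, not details from the
case study.

Scenario: Cohort member sees the dark-launched feature
Given the user belongs to the selected dark-launch cohort
When the user opens the product
Then the new feature is available to the user

Scenario: User outside the cohort sees no change
Given the user does not belong to the dark-launch cohort
When the user opens the product
Then the product behaves exactly as it did before the dark launch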

12.4.3.4 Beta Testing


A beta version is an “almost-ready-for-prime-time” version—one that is mostly complete
but may still be missing features planned for the market-ready version. Beta testing is
real-world testing of a beta version by a wide range of customers performing real tasks.
Its purpose is to uncover bugs and issues, such as usability, scalability, and performance
issues, before wide release.
Feedback and analytics from beta testing are used as inputs to fix remaining glitches
and address user complaints before releasing the product or change to the market. Split
testing may also be performed at this time—whereby one cohort of users is exposed to the
beta version while a control group is not.

For more on split testing, see Chapter 7, section 7.11.5.2.

Beta testing is not just for MVPs; it should be a final testing step after internal alpha
testing for all new features and major changes before they are widely released.

For more on beta testing, see Chapter 16, section 16.5.3.

12.4.4 MVP Types


When planning an MVP, the objective is to hack together a version of the product or
feature that delivers the desired learning goals as quickly and inexpensively as possible.
The following are strategies for achieving that. One MVP might incorporate any number
of these strategies.

• Differentiator MVP
• Smoke-and-Mirrors MVP

• Walking Skeleton
• Value Stream Skeleton
• Concierge MVP
• Operational MVP
• Preorders MVP

These MVPs are described in the following sections.

12.4.4.1 Differentiator MVP


At the start of new product development, the most common strategy is to develop a
low-cost version that focuses on the product’s differentiators. This was the approach we
saw taken earlier by Trint. Using existing components, the company was able to piece
together an MVP demonstrating the differentiating features of its product (speech-to-text
auto-transcription plus editing) and validating its value in just three months.
Another example is Google Docs, which began as Writely. Writely was an experi-
ment by Sam Schillace to see what kind of editor could be created by combining AJAX’s
( JavaScript in the browser) content-editable functionality with word-processing technol-
ogy. 5 Early versions focused on the product’s key differentiators—its speed, convenience,
and collaborative capabilities—while leaving out many other word-processing features,
such as rich formatting and pagination. The hypothesis was that users would be excited
enough about the differentiators to ignore the lack of richness in other areas. Interestingly,
real-time collaboration on documents—which became a differentiating feature—was not
seen as a primary one at the time; it was included because it seemed like the most natural
way to solve the problem of documents worked on by multiple people.
The first version of the original product was pulled together quickly, using the browser
for most of the editing capabilities and JavaScript to merge the local user’s changes with
those of other users. The client-side JavaScript amounted only to about ten pages of code.6
Over time, the company added more word-processing features when it became apparent
that they were essential to users and in order to open up new markets. Just one year after
Writely was introduced, it was acquired by Google. Within the first month of its adoption,
about 90 percent of Google was using it.

12.4.4.2 Smoke-and-Mirrors MVP (or Swivel Chair)


A Smoke-and-Mirrors MVP approach provides the user with an experience that is a close
facsimile of the real thing but is, in fact, an illusion—like the one created by the magician
pulling strings behind the curtain in the movie The Wizard of Oz.

5. Ellis Hamburger, “Google Docs Began as a Hacked-Together Experiment, Says Creator,”


The Verge, July 3, 2013, https://www.theverge.com/2013/7/3/4484000/sam-schillace-interview-
google-docs-creator-box
6. Hamburger, “Google Docs.”

One of my clients, a cable company, used this approach to provide an MVP frontend
for customers to configure their own plans. The site operated in a sandbox, disconnected
from operational systems. Behind the scenes, an internal support agent viewed the inputs
and swivel-chaired to an existing internal system to process the request. The customer
was unaware of the subterfuge. The MVP allowed the company to test the hypothesis
that customers would want to customize their own plans before investing in developing
the capability.

12.4.4.3 Walking Skeleton


A Walking Skeleton, or spanning application, validates technical (architectural) hypothe-
ses by implementing a low-cost end-to-end scenario—a thin vertical slice that cuts through
the architectural layers of the proposed solution. If the Walking Skeleton is successful, the
business will invest in building the real product according to the proposed solution. If it
is unsuccessful, the technical team goes back to the drawing board and pivots to a new
technical hypothesis.
For example, in the Customer Engagement One (CEO) case study, the organization
plans an end-to-end scenario for ingesting text messages from a social-network appli-
cation, saving the messages using the proposed database solution, retrieving them, and
viewing them as a list. Another example is Trint, whose first MVP incorporated the end-
to-end scenario from speech to text to editing in order to validate the architectural design
for the product.
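
To make the thin vertical slice concrete, the CEO example above could be captured as a
single end-to-end acceptance scenario. The sketch below is illustrative only (the message
text and step wording are invented), expressed in the Gherkin style covered in Chapter 10,
so that the Walking Skeleton has an executable definition of success.

Scenario: Ingest, store, and display one social-network message end to end
Given the product is connected to a single social-network message source
When a customer posts the message “My order has not arrived”
Then the message is saved using the proposed database solution
And the message appears in the support agent’s message list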

12.4.4.4 Value Stream Skeleton


A Value Stream Skeleton implements a thin scenario that spans an operational value
stream—an end-to-end workflow that ends with value delivery. It’s similar to a technical
Walking Skeleton except that it validates market instead of technical hypotheses. It cov-
ers an end-to-end business flow but does not necessarily use the proposed architectural
solution.
The intuitive sequence for delivering features is according to the order in which they’re
used. For example, you might begin by delivering a feature to add new products to the
product line for an online store and follow with features to receive inventory, place an
order and fulfill an order. Not only does this sequence minimize dependency issues, but
it also enables users to perform valuable work while waiting for the rest of the system to
be delivered. I usually took this approach in my early programming days. The problem
with it, though, is that it results in a long lag until an end customer receives value (e.g., a
fulfilled order). In a business environment where there is a strong advantage in being fast
to market, that kind of lag is unacceptable. Another problem is that it can delay the time
until a company can begin receiving revenue from customers.
A Value Stream Skeleton avoids these problems by delivering quick wins that imple-
ment thin versions of the end-to-end value stream, often with reduced functionality.
The first version of a Value Stream Skeleton focuses on the value stream’s endpoints—
the entry point where the customer makes a request and the endpoint where the customer
receives value. Workarounds are often used for the missing steps. For example, the first
MVP for an online store allows a customer to purchase a few select products. The product

descriptions and prices are hardcoded into the interface instead of being pulled from a
database. This lowers development costs. The products are offered only in a single geo-
graphic region—simplifying the business rules and delivery mechanisms that the MVP
implements. Despite the thinness of the MVP, it provides learning value to the business
and real value to an end customer, who can already order and receive the products with
this early version. As the business grows, the MVP evolves to handle more products and
a broader geographical region.
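
The same end-to-end thinking can be expressed as an acceptance scenario for the skele-
ton’s first release. The sketch below is illustrative only; the single launch region, the
hardcoded catalog, and the product name are assumptions used to show how thin the slice
can be while still ending in delivered customer value.

Scenario: Customer in the launch region orders a hardcoded product and receives it
Given the store offers only the hardcoded launch catalog
And the customer’s delivery address is in the launch region
When the customer places an order for “Starter Kit”
Then the order is confirmed to the customer
And the product is delivered to the customer’s address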

12.4.4.5 Concierge MVP


The Concierge MVP7 is based on the idea that it’s better to build for the few than the
many. Early versions are aimed at a small submarket that is very enthusiastic about the
product, and the learning gained from the experience is used to scale the product. One
example of a Concierge MVP is Food on the Table,8 an Austin, Texas, company that
began with a customer base of one parent. The company met with the parent once a week
in a café to learn the parent’s needs and take orders. The orders were filled manually. The
process was repeated for a few other customers until the company learned enough to build
the product.
As the example illustrates, you begin the Concierge MVP approach by selecting a sin-
gle, real customer. The first customer can be found through market research, using ana-
lytics to determine the desired customer profile and inviting a customer who fits the pro-
file to act as an MVP tester. Alternatively, you can select the first customer from among
individuals who have previously indicated an interest in the product. This customer is
given the “concierge treatment”—served by a high-ranking executive (e.g., vice president
of product development) who works very closely with the customer, adding and adjusting
features as more is learned.
At this stage, internal processes are often mostly manual. A company might spend a
few weeks working with the first customer in this way, learning what that person does
and does not want, and then select the next customer. The process is repeated until the
necessary learning has been obtained and manual operations are no longer viable—at
which point the product is built and deployed.

12.4.4.6 Operational MVP


An MVP isn’t always created to validate software hypotheses and features; it can also be
used to test operational hypotheses and changes. In a real-life example (which I’ll keep
anonymous to protect the company), a company created an MVP to test the impact of a
price hike on sales. The MVP displayed the higher price to a select group of customers, but
behind the scenes, the customers were still being charged the regular, lower price. Once
the learning objective was achieved, customers received an email notifying them that they
had been part of a test group and that no extra charges were actually applied.

7. Eric Ries, The Lean Startup (New York: Random House, 2011), 180.
8. Eric Ries in Lee Clifford and Julie Schlosse, “Testing Your Product the Lean Startup Way,”
Inc., July 17, 2012, https://www.inc.com/lee-clifford-julie-schlosser/lean-startup-eric-ries-testing-
your-product.html

12.4.4.7 Preorders MVP


The most reliable and cost-effective way to test a value hypothesis that customers will
pay for an innovative product is to offer a means to order it before it’s actually ready. The
MVP can be something as simple as a promotional video or demonstration prototype. It
may employ a stripped-down ordering process, such as order by email attachment, order
by phone, or an online ordering site with hardcoded options. An MVP of this type might
not require any stories—or it might need a few small stories (e.g., to set up a simple fron-
tend for placing orders).
My own company, Noble Inc., used this approach when we were considering develop-
ing a product to provide a 360-degree evaluation of the business analysis practice in an
organization. For the MVP, we developed a facsimile of the product and demonstrated it
to our clients in an attempt to generate presales. What we learned was that there wasn’t
enough interest to justify building the real thing. Despite the failure of the test, I consider
it money well spent. Imagine if we had learned it only after a large investment!
Dropbox’s version of this MVP strategy played out much better. Dropbox posted a
video of its product,9 illustrating its main features. The video received enthusiastic and
voluminous feedback from potential customers—making the case for the product and
generating important suggestions about features and potential issues that were incorpo-
rated into the first marketed version.

12.4.5 MVP’s Iterative Process


You don’t just create an MVP and test it once. The MVP process is iterative. Its steps are
as follows:

1. Establish an MVP to test hypotheses.


Specify an MVP to test one or more leap of faith hypotheses (e.g., using any of the
MVP types discussed in the prior section).
2. Tune the engine.
Make incremental adjustments to fine-tune the product on the basis of feedback
from customers as they use the product.
3. Decision point: persevere or pivot.
After tuning for a while, decide whether to persevere with the business model or
pivot to a different hypothesis.

12.4.6 The Pivot


A pivot is a switch to a different hypothesis based on a failure of the original premise. A
company may decide to pivot near the start of a product’s development due to the MVP
process described previously. Alternatively, the pivot may occur at any time in a product’s
life if it becomes apparent there is no market for the product, and the product should be

9. Drew Houston, “Dropbox Original MVP Explainer Video,” 2007, https://www.youtube.com/


watch?time_continue=12&v=iAnJjXriIcw

reoriented toward a new market or usage.10 An example of a pivot to an established prod-


uct is Ryanair, once Europe’s largest airline (based on passenger numbers).11 Back in 1987,
when the company realized it was failing financially, it pivoted to a low-end, disruptive rev-
enue model based on the hypothesis that customers would be willing to pay for meals and
other perks in return for cheap fares. The hypothesis was borne out when customers flocked
to the airline.12 More recently, in response to Brexit, the company has again pivoted—this
time away from the United Kingdom to a business model based on growth outside of it.13

12.4.6.1 Constructive Failures


A pivot represents a failed premise, but, as the Ryanair example shows, the failure can
often be constructive. In fact, many of today’s successful companies are a result of such
failures. For example, Flickr resulted from the failure of a previous offering—Game
Neverending.14 When the original product failed, the company pivoted by turning it into
a successful photo-sharing app, leveraging the lessons it had learned about the value of
community and the social features it had developed for the game (such as tagging and
sharing). Groupon is another example. Conceived initially as an idealistic platform for
social change, it then pivoted to become a platform for those seeking a bargain.

12.4.7 Incrementally Scaling the MVP


An effective way to develop a product is to start with a manual MVP and automate and
scale it incrementally as the product grows. This approach was used by Zappos, an online
shoe store.
Here’s how the process played out, as described by the company’s founder: “My Dad
told me . . . I think the one you should focus on is the shoe thing. . . . So, I said okay, . . .
went to a couple of stores, took some pictures of the shoes, made a website, put them
up and told the shoe store, if I sell anything, I’ll come here and pay full price. They said
okay, knock yourself out. So, I did that, made a couple of sales.”15 In 1999, the company

10. Clif Gilley, “Do You Have to Build an MVP to Pivot?” [blog post], Quora, December 16,
2013, https://www.quora.com/Do-you-have-to-build-a-MVP-to-pivot
11. Thanks to my editor, Ron Healy, for informing me of this example.
12. Geoff Daigle, “Case Studies from Amazon, Yahoo, and Ryanair Reveal How Growth Teams
Should Use Data + Feedback,” Thinkgrowth.org, August 21, 2017, https://thinkgrowth.org/case-
studies-from-amazon-yahoo-and-ryanair-reveal-how-growth-teams-should-use-data-feedback-
d7b410a005f8
13. Alistair Smout and Kate Holton, “UPDATE 2—As Brexit Bites, Ryanair to Pivot Growth
Away from UK for Next 2 Years,” Reuters, April 6, 2017, https://www.reuters.com/article/britain-
eu-ryanair-hldgs/update-2-as-brexit-bites-ryanair-to-pivot-growth-away-from-uk-for-next-2-
years-idUSL5N1HE1YQ
14. Reid Hoffman, “The Big Pivot—with Slack’s Stewart Butterfield,” Masters of Scale with Reid
Hoffman [podcast], November 14, 2017. https://player.fm/series/masters-of-scale-with-reid-
hoffman/the-big-pivot-wslacks-stewart-butterfield
15. Jay Yarow, “The Zappos Founder Just Told Us All Kinds of Crazy Stories—Here’s the Surprisingly
Candid Interview,” Business Insider, November 28, 2011, https://www.businessinsider.com/
nick-swinmurn-zappos-rnkd-2011-11?op=1

signed on a dozen brands—all men’s brown comfort shoes. As they added more respected
brands, such as Doc Martens, the company and market grew and, in tandem, Zappos
automated and scaled its business systems and processes.

12.4.8 Using MVPs to Establish the MMP


Using the MVP process, a company can quickly and inexpensively validate through exper-
imentation which features will make the most difference. These features are referred to
as the minimum marketable features (MMFs). An MMF is the smallest version of a fea-
ture (the least functionality) that would be viewed as valuable by customers if released
to the market. MMFs may deliver value in various ways, such as through competitive
differentiation, revenue generation, or cost savings. Collectively, the MMFs define the
minimum marketable product (MMP)—the “product with the smallest feature set that
still addresses the user needs and creates the right user experience.”16

BLINK CASE STUDY PART 20

Create an MVP
Background
You convene stakeholders and developers to specify the BLInK hypotheses that
will be tested during the first quarter and plan the MVPs that will be used to validate
them.

The Ask
The deliverables of the workshop will be
• Deliverable 1: Hypothesis—Leap of faith hypothesis (or hypotheses) critical to
the business case for the product
• Deliverable 2: MVP—High-level description of the MVP that will be used to
test the hypothesis

Inputs
• Chapter 7, Case Study Part 8, Deliverable 1: Assumptions Checklist

What Transpires
The group discusses assumptions that are most critical to the product’s business
case. They agree that the most urgent leap of faith hypothesis is that reluctance to
share data can be overcome when a benefit is shown immediately (A7). Business
stakeholders and developers brainstorm ways to test the hypothesis quickly and
inexpensively.

16. Roman Pichler, “The Minimum Viable Product and the Minimum Marketable Product,”
October 9, 2013, https://www.romanpichler.com/blog/minimum-viable-product-and-minimal-
marketable-product
552 C 17 S A

scaled iteration retrospective. The chapter provides guidelines for selecting software tools
to support collaboration among teams. It also offers lightweight solutions, such as using
roamers and scouts.
The chapter concludes with guidance for addressing potential problems and challenges
when scaling an agile organization, such as coordinating with waterfall teams.

17.1 Objectives
This chapter will help you

• Understand how DevOps, CI, CD, and ATDD enable frequent, reliable delivery of
value to the end user.
• Understand how to structure a scaled development organization into portfolios,
programs, product areas, feature teams, and component teams.
• Know when to use timeboxed and when to use flow-based planning approaches.
• Conduct scaled agile events, such as scaled quarterly and iteration planning meetings.
• Conduct rolling analysis (feature and story preparation) on a scaled agile initiative.

17.2 This Chapter on the Map


As indicated in Figure 17.1, the chapter focuses on the Grand Lane of the planning and
analysis map, cutting across all activity zones from Initiation and Planning to Quarterly
Closeout.

17.3 Why Do We Need a Scaled Agile Approach?


It’s common, in agile circles, to hear that a scaled agile organization should be composed
of self-sufficient, independent teams.1,2 If agile teams were, in fact, totally independent
at scale, there would be no need for scaled agile frameworks (or this chapter); you would
simply follow team-level agile practices and multiply them across the organization with-
out any additional processes or roles. (As we’ll see, this is roughly the approach of

1. For example, the Scrum Guide declares that “members have all the skills necessary to create
value each Sprint” and are “self-managing.” Ken Schwaber and Jeff Sutherland, “The Scrum
Team,” in The Scrum Guide: The Definitive Guide to Scrum—The Rules of the Game, 2020, 5,
https://www.scrumguides.org
2. As another example, Ron Jeffries writes, “Much of the work of any company can be done
by single cross-functional teams.” See Ron Jeffries, “Issues with SAFe,” April 2, 2014,
http://ronjeffries.com/xprog/articles/issues-with-safe
17.3 W D W N  S A A? 553

the Large Scale Scrum [LeSS] framework.) Yet, in practice, dependencies among teams
are the norm, not the exception, in scaled agile organizations. These persistent depen-
dencies aren’t a bug. They’re a feature of a well-scaled organization, and it is neither
possible nor desirable to eliminate them. Because agile teams in scaled organizations
are interdependent—not independent—we need effective solutions for coordinating and
integrating their work at scale.
First, we examine why teams are interdependent in a scaled agile organization. Then,
we look at the following strategies for addressing that interdependence:

• Planning: Choosing an agile planning approach that supports inter-team collaboration
• Continuous Delivery: Integrating, testing, and delivering software continuously,
safely, and sustainably at scale (DevOps/CI/CD)
• Scaled Agile Culture: Creating a culture that supports innovation at scale
• Scaling the Backlog: How to structure the product backlog in a scaled agile
environment
• Scaling the Organization: How to structure a scaled agile organization
• Scaling the Process: Scaling the agile process to promote collaboration across teams
• Scaling Tools: Tools and techniques for supporting scaled agile development and
team coordination
• Potential Issues in Scaling Agility: How to address challenges scaling agility, such as
non-colocated teams and coordination with waterfall developers

17.3.1 Why Scaled Agile Teams Are Interdependent


Scaled agile teams tend to be dependent on each other because of the interconnectedness
of a product’s features, technical complexity, and shared components. Let’s explore these
issues.

17.3.1.1 Interconnected Features


Consider a mobile phone and the subproducts—or high-level features—it encompasses,
such as a camera, photo-editing, messaging, and social-network capabilities. In a scaled
agile organization, each of these subproducts is maintained by a feature team or team of
teams.
The user can use each subproduct on its own, but the product’s full value lies in how all
its subproducts work with each other. For example, customers can access photo-editing
and messaging directly from the camera—enabling them to shoot, edit, and send images
seamlessly. Because subproducts are designed to work together this way, rather than as
standalones, they will inevitably have dependencies on each other—and so will the teams
that develop and maintain them.
The same applies when the product is not a physical object but a software system. Con-
sider Z-News, a fictional, digital news service. Z-News’s teams are organized by business
554 C 17 S A

areas (e.g., an order-processing team, a service-delivery team, a billings team). Now suppose
that stakeholders have requested a new subscription service to deliver personalized news
hourly to readers. This single request will require numerous teams working in concert with
each other. The order-processing team will add the capability to order the new subscription,
the service-delivery team will implement the delivery of customized news each hour, and the
billings team will implement the monthly subscription charges for the new service. Across
the value stream—from the subscription order to service delivery—each team relies on data
produced by other teams. For example, the order-processing team captures subscription
details, such as topics and sources, and the service-delivery team uses that information to
determine what news items to deliver. Because the teams are interdependent, they need
to coordinate their plans at the frontend of the development cycle, collaborate throughout
development, and integrate and test their work continually as stories are done. How they do
that effectively is the subject of this chapter.

17.3.1.2 Product Complexity


Another reason for team dependencies is that the competencies required to implement a
feature for a complex product are usually too numerous to be accommodated in a small
agile team of no more than ten members. Expertise is typically needed in UI design and
coding, cloud services, the deployment framework, automated testing, the application
stack, the software stack (infrastructure), open-source tools, database management, and
business domain knowledge. Since a small team usually can’t cover all these competen-
cies, the competencies are typically distributed among a group of interdependent teams.

17.3.1.3 Shared Components


Another reason that team dependencies can’t, and shouldn’t, be eliminated is that multi-
ple teams often share software components and are dependent on the team that manages
them. As we’ll explore later in this chapter, if we let feature teams change a component as
they see fit, the result will be inconsistency in design and quality across the component.
To ensure this doesn’t happen, a component team takes primary responsibility for it.
However, component teams introduce dependencies—because if a feature team requires a
change to a component, it’s dependent on the component team to implement it. Similarly,
if the component team changes a component, the feature teams that depend on it are
potentially impacted.

17.4 Planning: Choosing an Approach That Supports Inter-team Collaboration

There are two necessary but distinct coordination issues to address in a scaled organi-
zation: What approach will the organization use to plan work across multiple teams,
and how will it time the integration and delivery of software across multiple teams? In
17.4 P: C  A T S I- C 555

answering those questions, it’s essential to realize that the solutions to the two prob-
lems are not necessarily the same. In fact, it’s usually best to use a mixed approach—a
timeboxed or hybrid approach to plan large features at the frontend and a flow-based
approach at the backend to continuously implement, integrate, and deliver improvements
to the customer. We addressed the issue of flow-based versus timeboxed approaches ear-
lier in this book. Let’s revisit it now with a focus on scaled agile organizations.

17.4.1 Review of the Two Approaches


In a flow-based approach, each work item moves from step to step in the development
lifecycle at its own pace, provided that work-in-progress (WIP) limits at each step are not
exceeded. The aim is to achieve a continuous flow of each item without bottlenecks—
from initiation through delivery. This is the approach used by the Kanban framework.
In contrast, with timeboxed planning, teams commit to all of the work items for a
specified period (the timebox) at the start of the period. Two common timeboxes are the
quarter and the iteration. A quarter refers to three months, but (as noted elsewhere) I use
the term in this book as a shorthand for a release cycle, a SAFe program increment (PI),
or any period of two to six months. An iteration is a shorter timebox, typically one or
two weeks. Frameworks that incorporate iterations include Scrum, Extreme Program-
ming (XP), LeSS, and SAFe. In Scrum, this period is referred to as a sprint. The maximum
duration of a sprint is one month.
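
To make the flow-based mechanics concrete, here is a minimal sketch in Python. It is
illustrative only: it is not part of any framework or tool, and the step names, WIP limits,
and work items are assumptions chosen for the example. It shows the single rule that governs
flow: an item may be pulled into a step only while that step’s WIP limit has not been reached.

class KanbanBoard:
    """A board whose steps (columns) each enforce a work-in-progress (WIP) limit."""

    def __init__(self, wip_limits):
        # wip_limits maps each step name to the maximum number of items allowed in it
        self.wip_limits = wip_limits
        self.columns = {step: [] for step in wip_limits}

    def can_pull(self, step):
        # An item may enter a step only while that step's WIP limit has not been reached
        return len(self.columns[step]) < self.wip_limits[step]

    def pull(self, item, step):
        if not self.can_pull(step):
            raise RuntimeError(f"WIP limit reached for '{step}'; finish items before starting more")
        # Remove the item from whichever step currently holds it (if any), then add it to the new step
        for items in self.columns.values():
            if item in items:
                items.remove(item)
        self.columns[step].append(item)

# Illustrative steps, limits, and work items (assumptions, not recommendations)
board = KanbanBoard({"Analysis": 2, "Development": 3, "Testing": 2})
board.pull("Improve content filtering", "Analysis")
board.pull("Tune subscription funnel", "Analysis")
print(board.can_pull("Analysis"))  # False: the limit of 2 is reached, so no new item is started

A real board tracks more than this (owners, item age, classes of service), but the pull rule
above is the heart of flow-based planning: work is started only when capacity frees up, never
because a plan says it is due.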

17.4.2 Which Approach Should You Use at the Frontend?


As a general guideline, feature teams benefit most from a mixed planning approach at
the frontend, using flow-based (Kanban-style) planning for customer-driven features and
quarterly (timeboxed) planning for large, strategic initiatives.

17.4.2.1 When to Use a Flow-Based Approach to Accept Requirements into Development

The flow-based portion of the budget enables teams to respond quickly to learning, rather
than waiting a quarter or more to apply newly gained knowledge. This part of the budget
should be set aside for small efforts that can be handled by a single team with minimal
help from others. For example, the team might be exploring options to improve the con-
version rate of browsers to subscribers or looking at different ways for a user to filter or
sort content. To do so, they try out different options with customers and adapt them based
on customer feedback. Since customers’ responses drive each inspect-and-adapt cycle,
there is no sense in trying to predict and prioritize their preferences too far in advance.
Consequently, a flow-based approach is advised.

17.4.2.2 The Pitfalls of Relying Solely on a Flow-Based Approach


However, many organizations with which I work have discovered that when they rely
solely on flow-based planning, the product becomes fractured because the approach
570 C 17 S A

17.8 Scaling the Agile Organization


As noted earlier in this chapter, an organization developing a complex product will inev-
itably require multiple interdependent teams in order to cover all the necessary compe-
tencies for all of its subproducts and components. For example, the top-level product for
a large company might easily include more than twenty subproducts. Each of these, in
turn, might be delivered over multiple channels (e.g., Web, mobile), each of which requires
specialized technical competencies. For a company such as SAP (a vendor of enterprise
resource planning software), this can require, in total, more than two thousand agile
teams.23 In this section, we explore how to structure agile organizations of that size.

17.8.1 Scaling by Subproduct and Product Area: MyChatBot Case Study


The solution is to structure the organization by subproducts, also known as product
areas. Let’s look at a fictional example, MyChatBot. MyChatBot is an innovative com-
pany and product based on the hypothesis that customers will want to use chatbots for
common customer-engagement tasks in order to increase sales and customer outreach at
minimal cost. The company has identified ten primary high-level tasks customers would
use MyChatBot for, including Sales, Marketing, Customer Support and Engagement,
and Analytics. In circumstance-based market segmentation, these are identified as the jobs
customers hire the product to do.

See Chapter 8, section 8.4, for more on circumstance-based market segmentation.

Figure 17.4 depicts how the MyChatBot organization is structured into levels of sub-
products. For illustration purposes, I’ve included only four of its subproducts.
As indicated in Figure 17.4, MyChatBot is the top-level product. Below are its subproducts—
one for each primary usage of the product. Four of these usages are highlighted: Sales,
Marketing, Customer Support and Engagement, and Analytics.
Each of these subproducts has numerous sub-subproducts, referred to as product areas.
For example, the Customer Support and Engagement subproduct includes a product area
for each of the following sub-subproducts:

• Collaboration Tool Automation: To facilitate the collaboration of support staff
• Ingest Content: To load Chatbot messages originating on social media and elsewhere
• User Efficiency: To optimize the efficiency of customer-support users

Each product area is divided into feature sets—groups of related product features.
For example, Collaboration Tool Automation has one team for each of the following
feature sets: tagging, triaging, and assigning messages using automation. In a larger orga-
nization, there might be multiple teams devoted to each feature set.

23. Darrell K. Rigby, Jeff Sutherland, and Andy Noble, “Change Management: Agile at Scale,”
Harvard Business Review (May–June 2018), https://hbr.org/2018/05/agile-at-scale

Figure 17.4 MyChatBot organization (hierarchy: the MyChatBot product, or platform; its
subproducts, including Sales, Marketing, Customer Support and Engagement, and Analytics;
product areas such as Collaboration Tool Automation, Ingest Content, and User Efficiency;
feature sets/feature teams such as Tag Message, Triage Message, and Assign Message; and
shared resources, including competency groups, component teams, and other groups such as
business SMEs)

In addition to the feature teams, Figure 17.4 indicates component teams dedicated to
commonly used components. For example, MyChatBot might have a component team
dedicated to an API that manages outgoing messages to third-party products, such as
social networks. Figure 17.4 also indicates competency groups—associations that supply
the teams with members, shared resources, and support within a particular area of exper-
tise, such as UX design.
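
For readers who capture this structure in a backlog or planning tool, the following minimal
sketch shows one way the MyChatBot hierarchy could be represented as nested data, from the
product down to its feature sets and shared resources. It is illustrative only; the nesting
and key names are assumptions, not a schema required by any particular tool.

# Illustrative only: one possible representation of the Figure 17.4 hierarchy as data
my_chatbot = {
    "product": "MyChatBot",
    "subproducts": {
        "Customer Support and Engagement": {
            "product_areas": {
                "Collaboration Tool Automation": {
                    # one feature team (or more, in a larger organization) per feature set
                    "feature_sets": ["Tag Message", "Triage Message", "Assign Message"]
                },
                "Ingest Content": {"feature_sets": []},
                "User Efficiency": {"feature_sets": []},
            }
        },
        # Sales, Marketing, Analytics, and the remaining subproducts follow the same pattern
    },
    "shared_resources": ["Competency groups", "Component teams", "Business SMEs"],
}

# Example: list the feature sets (and hence the feature teams) under one product area
areas = my_chatbot["subproducts"]["Customer Support and Engagement"]["product_areas"]
print(areas["Collaboration Tool Automation"]["feature_sets"])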

17.8.2 Scaling the PO Role


As mentioned earlier in this chapter, high-performing organizations require leadership at
every level. A product-level PO is responsible for the whole product, while area POs are
assigned at the intermediate subproduct levels above the individual teams. Each of
572 C 17 S A

these teams is led by a team PO or proxy PO. We’ve discussed the product-level PO. Let’s
examine the other roles.

17.8.2.1 Area POs


An area PO should be assigned to each subproduct or sub-subproduct down to the level
above the team level. (At the team level, a team PO or proxy PO is assigned, as described
shortly.) Each area PO is responsible for a subproduct—a high-level use case, or job, custom-
ers hire the product to do. The role may be filled by a portfolio manager, program manager,
product manager, or SAFe Release Train Engineer (RTE). Area POs have ultimate respon-
sibility for prioritization decisions in their area—though (as noted earlier) other stakehold-
ers are typically required for signoffs and approvals, and local decision-making should be
devolved to lower-level POs. An area PO may also act as a PO for one of the lower levels.

17.8.2.2 Team POs


Each team is led by a team PO or proxy PO (described in the next section). The PO’s
outward-facing activities include speaking with business executives to understand stra-
tegic objectives, interacting with salespeople and customers, attending trade shows, con-
ducting surveys to understand the market, and talking to data analysts to understand how
people are using the product. Inward-facing duties involve close day-to-day interactions
with the team—requiring about ten hours or more per week.
The full complement of PO-related responsibilities is often too much for a single
person, so the work is typically distributed among roles. If there is a team-level PO, the team
PO focuses on outward-facing activities, while the team analyst focuses on inward-facing
responsibilities. If the team is led by a proxy PO, the area PO focuses outward, and the
proxy PO takes on inward-facing tasks.

17.8.2.3 Proxy PO and Business Analyst


It’s hard enough for a PO to find sufficient time to work day-to-day with one team while
fulfilling external-facing responsibilities. In practice, a PO is often required to support
more than one team because of a scarcity of resources. An effective solution in this case is
to use a proxy PO or business analyst at the team level to take on some of the PO’s respon-
sibilities. The proxy PO or business analyst works full time with the team to answer
detailed questions about the requirements and communicate higher-level goals to the team
so that the PO can focus on external responsibilities.
Formally, this can play out in several ways. An area PO may be assigned to preside over
a group of teams, with proxy POs at the team level. Alternatively, a team-level PO may
be shared by a few teams, with team analysts taking on inward-facing responsibilities at
the team level.

17.8.3 Portfolio and Program Structure


Another way to structure a scaled organization is by portfolios and programs. This struc-
ture is especially well-suited to initiatives that span departments or entire products. Fig-
ure 17.5 depicts the organizational structure for XComm, a fictional company loosely
based on a real telecommunications company.
Figure 17.5 Portfolio and program organizational structure (XComm is divided into products
and services divisions; its portfolios include Prepaid Mobile, Postpaid Mobile, TV, Internet,
Website, Call Center, and Network Upgrades; its programs include Improve Customer Set-Top Box
Experience, Video On Demand (VOD), Improve Call-Center Experience, and Customer Analytics;
and each program contains feature teams such as Search/Browse VOD Titles, Upsell VOD
Subscriptions, and Account Management)


574 C 17 S A

As depicted in Figure 17.5, the organization is divided into products and services. The
products division focuses on initiatives to improve the products XComm sells to its cus-
tomers (e.g., mobile and Internet products). The services side focuses on quality improve-
ments to its support services (e.g., call center improvements and network upgrades).

17.8.3.1 Portfolio Level


A portfolio is a broad initiative that may span departments, business areas, products, and
systems. Figure 17.5 indicates that the products division contains prepaid mobile, TV, and
Internet portfolios, each representing a line of business.
The portfolio is the largest organizing unit in SAFe, responsible for strategy and invest-
ment. Lean portfolio management (LPM) practices should be used. The focus of LPM is
on providing resources to long-lived teams of teams24 so that they can realize strategic
objectives and achieve desired outcomes. This contrasts with the traditional practice of
funding one-time projects with specified outputs. LPM includes the lean startup practices
covered in this book, such as MVP, pivot or persevere, lean techniques for eliminating
waste, and cultural practices such as servant-leadership (discussed in this chapter).

Portfolio Epics
In SAFe, long-lived initiatives at the portfolio level are classified as portfolio epics. A
portfolio epic can span multiple teams of teams—referred to in SAFe as Agile Release
Trains (ARTs). The following format may be used to specify the hypothesis statement for
a portfolio epic:25

Epic description: For [customers] who [perform some activity], the [solution] is a [what]
that [delivers this value]. Unlike [competition/existing solution or non-existing solution],
our solution [does something better].

Business outcomes (measurable benefits)

• <benefit 1>
• <benefit 2>

Leading indicators

• <indicator 1>
• <indicator 2>
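
To illustrate the format, here is a hypothetical hypothesis statement for a portfolio epic,
using the fictional Z-News personalized-news subscription introduced earlier in this chapter.
Every specific in it (the customer description, outcomes, and indicators) is invented for the
example.

Epic description: For busy readers who want to follow specific topics without scanning
multiple sources, the Personalized News Subscription is an hourly, curated digest that
delivers only the items matching each subscriber’s chosen topics and sources. Unlike
general-purpose news feeds, our solution filters and schedules delivery around each
subscriber’s stated preferences.

Business outcomes (measurable benefits)

• Increased subscription revenue per active reader
• Reduced churn among existing Z-News subscribers

Leading indicators

• Percentage of trial users who order the personalized subscription
• Open rate of the hourly digest during the pilot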

24. John May, “Lean Portfolio Management: How to Build a Better Enterprise by Being More
Lean,” Atlassian, https://www.atlassian.com/agile/agile-at-scale/lean-portfolio-management
25. Richard Knaster and Dean Leffingwell, SAFe 5.0 Distilled: Achieving Business Agility with
the Scaled Agile Framework (Boston: Addison-Wesley, 2020), 154.
632 C 18 A E A

2. Identify the opportunities: Ask customers what needs aren’t being met well today.
What services and products are too costly, too inconvenient, or too inaccessible?
What difficulties are customers experiencing that they don’t even think of as prob-
lems because there is currently no alternative? Which of these problems can the
company solve through innovation?
3. Separate customers by needs: As Theodore Levitt, a professor at Harvard Busi-
ness School, has said, “People don’t want to buy a quarter-inch drill. They want
a quarter-inch hole.”13 In other words, it’s not the tool or product that counts to
the customer; it’s the outcome. Divide customers into groups by their needs (also
referred to as jobs)—a problem they want to solve or a need they have that is not
being met—not by demographics, product, or market size. Then seek to understand
and address the needs of each group.
4. Determine the vision: Articulate the vision for the product or improvement. If it’s a
disruptive innovation, specify a vision for performing a job in a way that meets or
outperforms expectations in the target group’s critical areas of concern (e.g., cost,
convenience)—even though initial versions might underperform in areas they care
less about.
5. Identify the leap of faith hypotheses: Identify the leap of faith hypotheses that must
be true for the business model to be successful.
6. Conduct MVP testing: Test the leap of faith hypotheses through rounds of MVP
experiments with real customers, making adjustments based on feedback and met-
rics. Use leading indicators to forecast the likely outcome.
7. Pivot or persevere: Use feedback from MVP testing to determine whether to commit to
the vision (persevere) or make a radical change in direction (pivot).
8. Continuously improve: Use the results of MVP testing to identify the Minimum
Marketable Product (MMP)—the smallest version of the product that would be
viable in the market. Use an iterative process to implement the MMP and continu-
ously improve the product. Use data, frequent feedback, and MVP testing to inform
decisions.
9. Accelerate: If the innovation is disruptive, accelerate rapidly to capture the market
before incumbents and competition can respond.

18.6 Agile Corporate Culture


Successful innovation is not just about having a good idea—or even the right processes.
It’s about culture. Everyone involved in developing a product deemed “innovative” in

13. As quoted in Christensen and Raynor, The Innovator’s Solution: Creating and Sustaining
Successful Growth, chapter 3.
18.6 A C C 633

their industries—especially if it’s a disruptive innovation—must share an organizational
culture that embraces, supports, and encourages innovation. Failing to do so can result in
disappointing failure.
Let’s begin by defining corporate culture; then we’ll look at what it means for that
culture to be agile.

18.6.1 Definition of Corporate Culture


Culture is the sum total of beliefs and ideas that guide behavior. Adam Grant defines it as
“repeated patterns of behavior that reveal norms and values.”14 Perhaps the most succinct
way to explain culture is that it’s “what people do when no one’s watching.”15
Corporate culture is “the beliefs and ideas that a company has and the way in which
they affect how it does business and how its employees behave.”16

18.6.2 Definition of Agile Corporate Culture


An agile corporate culture is a set of behaviors and ideas that guide an organization and
its employees in ways that optimize the organization’s ability to anticipate and respond to
change. Agile cultures embed collaboration, empowered decision-making, and cognitive
empathy in the organization—elements we explore further in this chapter.
Jeremy Gutsche defines the following prerequisites for an innovative culture:

• Urgency: A necessary condition for reinvention and innovation is that people have a
sense of urgency about the need for change.
• Perspective: When the organization’s perspective is based on past accomplishments,
the result can be complacency and a loss of urgency. An agile organization’s per-
spective is not focused on the past or exclusively on the present; it’s oriented toward
future needs and trends.
• Experimental Failure: The enterprise must value and nurture a culture of experi-
mentation. People should expect failure to occur—as a natural and necessary part
of innovation.
• Customer Obsession: The company must be obsessed with understanding its cus-
tomers and creating an emotional, cultural connection with them.
• Intentional Destruction: The organization understands that existing hierarchies
must be destroyed as a necessary precondition for reinvention, and it supports that
process.

14. Adam Grant, “The Science of Leadership” [podcast], Stay Tuned with Preet, December 27,
2018.
15. Grant, “The Science of Leadership.”
16. “Corporate culture,” Cambridge Dictionary, http://dictionary.cambridge.org/dictionary/
english/corporate-culture
634 C 18 A E A

These elements underlie the guidance in the following sections. For more on Jeremy’s
model, I urge readers to explore The Innovation Handbook17 and Exploiting Chaos.18

18.7 Overview of Principles and Practices for an Agile Corporate Culture

Many existing agile and agile-adjacent frameworks and practices touch on agile corporate
culture, even if they don’t always call it out in those terms. These include lean thinking,
Six Sigma, lean startup, the GE Beliefs,19 DevOps, and the Agile Manifesto, as well as lessons
learned from transitioning companies.20 The following synthesizes this guidance into a set
of principles and practices for an agile culture.
The three principles for applying agile practices are as follows:

• Tailor the approach to the circumstance.
• Protect islands of innovation.
• Invest aggressively in enterprise agility.

The thirteen practices for an agile corporate culture are as follows:

• Iterative experimentation (fail fast)
• Embrace change
• Acceleration
• Empathy
• Responsible procrastination
• Distributed authority
• Let those who do the work estimate the effort
• Collaboration

17. Jeremy Gutsche, Create the Future + the Innovation Handbook: Tactics for Disruptive
Thinking (New York: Fast Company, 2020).
18. Jeremy Gutsche, Exploiting Chaos: 150 Ways to Spark Innovation in Times of Change
(New York: Gotham Books, 2009). Available as an ebook at http://cdn.trendhunterstatic.com/
EXPLOITING-CHAOS-by-Jeremy-Gutsche-TrendHunter.pdf
19. Jeffrey Immelt, “Letter to Shareholders,” in GE 2014 Annual Report, 10–11,
https://www.annualreports.com/HostedData/AnnualReportArchive/g/NYSE_GE_2014.pdf
20. See, for example, Steve Blank, “Corporate Acquisitions of Startups—Why Do They Fail?”
Forbes, April 22, 2014, https://www.forbes.com/sites/steveblank/2014/04/22/corporate-
acquisitions-of-startups-why-do-they-fail. Also see Peter Nowak, “Video Streaming in Canada,”
September 27, 2016, http://www.alphabeatic.com/video-streaming
18.8 T P  A A P 635

• Commit to outcomes, not outputs
• Transparency
• Bust silos
• Data-informed innovation
• Monitor adjacent and low-end markets

18.8 Three Principles for Applying Agile Practices


Let’s begin with the principles for applying the practices.

18.8.1 Tailor the Approach to the Circumstance


The core meaning of agility is adaptability, and nowhere is this attribute more apt than
for the agile approach itself. As in Fight Club (the novel and film by the same name), the
first rule of an agile corporate culture is that there is no agile culture—or no single one
for all situations. The agile practices that an organization adopts must be tailored to fit
the mission of the enterprise and the values that matter most to it—and those practices
should evolve as the mission changes over time. For example, Apple’s original mission was
“To make a contribution to the world by making tools for the mind that advance human-
kind.”21 A corporate culture tailored to this mission would embrace most, if not all, of
the agile practices discussed in this chapter, such as fail fast. In contrast, Apple’s mission
today is the more prosaic and product-focused statement that “Apple designs Macs, the
best personal computers in the world, along with OS X, iLife, iWork, and professional
software. Apple leads the digital music revolution with its iPods and iTunes online store.
Apple has reinvented the mobile phone with its revolutionary iPhone and App Store, and is
defining the future of mobile media and computing devices with iPad.”22 A corporate cul-
ture aligned with the new mission’s emphasis on past and current products and successes
would lean more toward predictability and reliability and less toward experimentation
and transformational change than one aligned with the first. It’s not a question of what’s
right—but what’s right for the organization at that time.
Culture can also vary within an organization. Suppose an established enterprise has
created a new business unit to develop an innovative service. Even while the rest of the
enterprise adopts a culture that supports predictability, the new business unit would be
wise to adopt a highly agile culture that promotes learning and embraces change due to
the novelty of the product.

21. “How Apple’s Current Mission Differs from Steve Jobs’ Ideals,” Investopedia, June 22, 2019,
https://www.investopedia.com/ask/answers/042315/what-apples-current-mission-statement-and-
how-does-it-differ-steve-jobs-original-ideals.asp
22. “How Apple’s Current Mission Differs.”
Index

A
A/B (split) testing Accuracy
actionable metrics and, 187 of estimates, 335
staging the release, 539 of risk forecasts, 546–547
value validation with, 491–494 of transparency, 667
AC. see Acceptance criteria (AC) Actionable metrics, 187–188
Acceleration Actions
agile culture embraces, 653–655 against developer tasks/stories, 471
innovative development and, 632 eliciting business rules with decision tables, 437,
sustaining, 110 438
Acceptance, of risk, 346 as journey map component, 277, 282
Acceptance criteria (AC) Activities
committing to feature's, 343 BPMN private process model, 293, 295
decision tables analyzing, 433–440 BPMN public process model, 288
feature change initiatives and, 254 postponing until last responsible moment (LRM),
feature documentation by specifying, 497 659
feature preparation and, 216–217, 256 scaled, 583–585
for functional spikes, 417 story map backbone, 374
refining incrementally, 327–328 Activity card, story map, 368–369
as requirements-related term, 40 Actor card, story map, 368–369
rules of thumb for, 683 Actors, constructing story map backbone, 374–375
specifying features and their, 259–263 Adaptability
Acceptance criteria (AC), story balancing scope commitment and, 343
avoid too many, 411, 429–430 of business analysts, 65
confirmation of, 398, 407 as core meaning of agility, 635
examples of, 397, 408 of high value in agile analysis, 14
extensiveness of, 411 Additional resources and checklists. see Resources and
as specification by example, 409–410 checklists, additional
specification of, 407–414 Agenda. see Topics/agenda
testing, 8 Agile Alliance, 18
of well-formed, 411–412 Agile analysis
when to create and update, 409 key practices versus waterfall, 65–68
writers of, 408–409 parallel histories of BA and, 16–17
Acceptance test-driven development (ATDD) rules of thumb, 68
agile analysis vs. waterfall, 66 Agile analysis and planning
and BDD, 563–564 art of. see Art of Agile analysis and planning
defined, 56 definition, 13
history of, 18 fundamentals. see Fundamentals of agile analysis
preventing last-minute integration issues, 9 and planning
specifying feature AC, 259, 261 Agile Analysis and Planning Map
Acceptance testing. see User acceptance testing (UAT) activities across development life cycle, 70–71
Accountability, vision with, 565 enterprise agility, 624–626


Grand Lane, 79–81 Alpha testing


introduction to, 72 beta testing after internal, 359
iteration and story planning, 442–443 pre-alpha stage, 533
lanes, 73–74 product release, 533
Long Lane, 79 Analysis and Planning activities across development
long-term agile planning, 222–223 life cycle
MVPs and story maps, 354–355 map. see Agile Analysis and Planning Map,
overview of, 72–74 understanding
preparing the organization, 84–85 mapping to IIBA and PMI guides, 678
preparing the process, 115 objectives, 69
quarterly and feature planning, 316–317 rules of thumb, 682–683
quarterly and feature preparation, 252–254 summary of, 81
releasing product, 528–529 Analysis documentation, product release, 538
seeding product backlog, 194–196 Analyze-code-build-test cycle, 490–494
Short Lane, 74–78 Anonymity, Delphi estimation, 338
story in three Acts, 74 Appendix A. see Resources and checklists, additional
story preparation, 392–393 Appendix B. see Discovery-driven planning, BestBots
summary of, 81 case study
visioning activities, 147–149 Apple iPhone, as disruptive innovation, 640
zones, 72–73 Approval, agenda plan, 348, 351
Agile analysis and planning, value proposition Architect, product owner council, 580
agile development history, 17–18 Architectural runway, preparing for scaling, 585
agile diagnosis, 22 definition, 510
agile track record, 22–23 Architecture
business analysis diagnosis, 18–19 collaboration-enhancing, 664
business analysis history, 16–17 cost-benefit calculation, 120–121
business analysis track record, 19–21 scaled feature preparation, 602–603
defining business analyst, 14–15 Architecture (block) diagrams, 310–312
mapping to IIBA and PMI guides, 678 Architecture review
objectives, 13 architecture (block) diagrams, 310–312
reasons for using, 15–16 context diagrams, 307–308
summary of, 25 data flow diagrams (DFDs), 308–310
two diagnoses for same problem, 18 UML communication diagrams, 308
understanding, 13–14 Area POs, defined, 572
why agile should include BA competency, 24–25 Art of Agile analysis and planning
Agile corporate culture. see Corporate culture Cantankerous Customer story, 10–11
Agile enterprise transition team, 112 example of, 1–5
Agile Extension to the BABOK Guide, 18 It's Not My Problem story, 8–9
Agile financial planning. see Financial planning for mainstream businesses, 5–7
The Agile Manifesto, 17–18 mapping to IIBA and PMI guides, 678
Agile Manifesto, 28–31 objectives, 1
The Agile Practice Guide, 31 summary of, 11
Agile Practice Guide, 18 Artifacts
Agile Release Trains (ARTs) iteration review for forecasting/tracking, 516–517
job-based organization structure, 668–669 tracing analysis, 506–508
quarterly planning event and, 318 updating BA documentation, 496
Scrum of Scrums meetings for, 600 in use-case model, 499–502
teams in SAFe organized into, 57, 582 ARTs. see Agile Release Trains (ARTs)
Agile requirements management tools, 699–700 Asana, requirements management tool, 699
Agile Software Development with Scrum (Schwaber Aspirational, vision statement as, 173
and Beedle), 7 Assumptions
Airbnb, iterative experimentation, 650 checklist, 708–709
Align cycles, waterfall teams, 619 defined, 36–37
I 717

interim goals to test, 383 mapping of book chapters to, 677–681


leap of faith hypotheses as, 189–191 requirement types checklist, 123
reviewing in planning agenda, 329 Backbone, constructing story map
specifying for interim periods, 243 activities, 374
validating using MVPs, 228–230 actors, 374–375
Assumptions/hypotheses card, story map, 368–369 case study, 374–379
ATDD. see Acceptance test-driven development (ATDD) inputs, 372–373
Attendees narrative, 372
daily standup, 473 overview of, 370
iteration planning, 445 scope, 373–374
iteration retrospective, 518 user tasks, 374
iteration review, 514 Backend systems, cost-benefit calculation, 120
pivot-or-persevere meetings, 545 Backlog, iteration
product backlog refinement, 512 defined, 446
quarterly planning, 692 forecast goal and scope, 447–451
quarterly/feature planning event, 326 forecasting stories that will be delivered, 450–451
scaled quarterly and feature planning, 587, 698 Backlog, quarterly, 327
Attractive feature, Kano grades, 206–207 Backlog refinement
Attributes as preparation in this book. see Preparation
defining PBI, 125–126 rules of thumb, 683
values for story, 404 Scrum, 47
Automation Backward (upward) traceability, 130, 506
computers for repetitive work in DevOps, 561 Bad news, safe spaces for, 653
continuous testing in DevOps, 562 Barriers, removing to accelerate change, 110
DevOps provisioning, 562 Basic (normal, or happy-day scenario) flow, 497
MVP case study for, 357–358 Basic grade features, Kano, 206
in test-build-deploy, 558–559 Batch size, DevOps and small, 561
testing without, 90–91 BDD. see Behavior-driven development (BDD)
timing build and distribution process, 93 Beck, Kent, 322–325, 336
transitioning from manual testing to, 91–93 Behavior, modeled by leaders, 566
value stream mapping for process, 284 Behavioral business rules, 37, 433–438
Autonomy Behavior-driven development (BDD)
benefits of, 660 as agile framework, 56
for branding disruptive products, 646–647 ATDD and, 563–564
extending to innovative business units, 645–646 feature acceptance criteria, 262–263
for internal business units, 647 preventing last-minute integration issues, 9
structuring new business as legally separate entity, story acceptance criteria, 408, 413–414
647 Behavior-trended funnel, split testing, 494
Availability Best practices, in estimation, 333–335
defined, 35 BestBots. see Discovery-driven planning, BestBots
determining initial capacity, 136 case study
forecasting iteration capacity, 448–449 Beta testing
as quality of transparency, 667 closed (private), 534
AWS (Cloud) competency group, 578 deployment to customers, 235–236
experimentation in mainstream via closed, 652
B of MVPs and all changes, 359
BABOK v3: A Guide to the Business Analysis Body open (public), 534
of Knowledge (BABOK Guide)[IIBA] product release, 533–534
BA information artifacts and events, 122 Big Bets, full potential plan for, 226–227, 630
BA practices, 17 Big room iteration planning
BA standards, 31 conversation options during, 132
business rules, 37 facilitation skills of agile business analyst, 64
context analysis, 263–264 scaled iteration planning, 80
knowledge areas, 31–32 scaling agile process, 598–599

Bill of rights, customer’s and programmer’s Bug fixes


cantankerous customer and, 10 bug-repair stories for, 420
customer’s, 101–102 deployment of, 236
developer’s, 102 quarterly and feature planning estimates for, 341
XP's Release Planning Game, 323–325 releasing immediately to market, 531
Blanchard, Ken, 565 Build
BLInK case studies analyze-code-build-test cycle, 492
architecture diagrams map, 311–312 and distribution processes, 93
BPMN private process model, 296–298 Build-Measure-Learn loop, 228–232
BPMN public process model, 291–292 Build-Test-Learn cycle, 232
cause-effect analysis, 159–161 Bulleted outline level, requirements granularity, 128
cause-effect tree, 163–166 Bulletin board, Open Space events, 612
defined, 147 Burndown charts
Five Whys, 154–157 burnup charts versus, 486–487
improved outcomes, 166 diagnosing productivity with signatures, 482–486
introduction to, xlii–xliii forecasting with, 330–331, 472
iteration implementation planning, 456–458 main elements, 479–480
Kano analysis, 209–212 monitoring progress, 479–486
long-term agile planning-product roadmap, optional elements, 480–481
243–248 quarterly/release, 516–517
MVP creation, 365–366 tracking developer tasks, 480–483
NFRs/constraints analysis, 217–220 Burndown signatures
personas, 270–271 healthy signature, 483–484
pivot-or-persevere meetings, 546–547 overestimating signature, 485–486
prioritizing features/specifying NFRs, 217–220 overview of, 482–486
problem statement, 168–169 underestimating signature, 484–485
product vision statement, 174–175 Burnup charts, 486–487
program board creation, 592–595 Business
release roadmap creation, 348–350 empathy in, 658
Sailboat game, 522–523 inviting representative to Triad meetings, 401
seeding product backlog, 200–201 prioritizes stories, 324
specifying stories for MVP, 384–386 Business analysis (BA)
stakeholder identification, 179–181 certification, 17
story map backbone creation, 375–379 checklist of information artifacts, 122–123
story map completion, 438–440 competency, and Scrum, 45
traceability, 507–508 contributing to agile enterprise, 628
use-case model creation, 501–502 definition, 17
user-role modeling workshop, 304–306 diagnosis, 18–19
Blueprint Storyteller, 699 do not skip in agile development, 8–9
Bold targets, full potential plan, 225–226, 630 history of, 16–17
Bottlenecks, cumulative flow diagrams, 489 impact of Agile Manifesto on, 28–29
Boundaries, cumulative flow diagram and, 488 overview of, 6
Boundary event, BPMN private process model, 295 parallel histories of agile and, 16–17
BPMN. see Business Process Model and Notation practiced in agile context, 6–7
(BPMN) reasons for agile analysis, 15–16
Brainstorming, 511 rules of thumb, 682–683
Branding, autonomous, 646–647 ScrumMaster and, 46
Breakout, Open Space events, 614 track record, 19–21
Breathing, Open Space events, 612 updating use-case model documentation. see Use-
A Brief History painting series (Podeswa), 3–4 case model, updating BA documentation
Briefly described level, requirements granularity, 128 what twelve principles mean for, 29–31
BSA (Business systems analysts), 63 why agile should include competency of, 24–25
Buckets, creating personas via, 267 Business analysis (BA) documentation, updating
Budget (capacity), planning agenda, 330–331 BLInK use-case model, 501–502
I 719

feature, 497 Business value


persisting stories, 496 business analysts maximize, 7
tracing analysis artifacts, 506–508 constructing story map ribs, 381
use-case model, 497–501 crafting iteration goal for, 449
use-case specifications, 503–506 of story, 395
Business Analysis Benchmark, 24–25 testing feature leap of faith hypotheses for, 493
Business Analysis Practice Guide (PMI)
BA domains, 32 C
business analysis standards, 31 CA Technologies with Rally Software, 700
Business analyst Cadence, setting process parameters, 134
agile analysis vs. waterfall, 67 Caliber, requirements management tool, 700
business systems analyst (BSA) versus, 63 Canadian Imperial Bank of Commerce (CIBC), 6
contribution to feature AC, 261 Cancelled iterations, progress in, 513
data-informed innovation role, 672 Cantankerous Customer story, 10–11
defining, 14–15 Capabilities. see Minimum viable product (MVP),
maximizing business value, 7 capabilities
propagating change, 10–11 Capacity (velocity)
proxy POs and, 572 adjusting after implementation begins, 137
roles and functions, 58–63 defined, 35
scaled planning, implementation and, 557–558 determining, 135–136
soft skills, 63–65 forecasting iteration, 448–449
successful full-potential plan, 227–228 for multiple teams, 137
supporting PO, 450 Cards
Triad meetings and, 402–404 developer task, 454
writing acceptance criteria, 408–409 physical versus electronic stories and, 403–404
Business case, reviewing in planning agenda, 329 on story maps, 368–369
Business data strategist, 672 Three Cs of stories and, 397–398
Business entity, agile unit as separate, 647 Case studies
Business goals, 32–33, 181–184 BestBots. see Discovery-driven planning, BestBots
Business model disruptions, 642–643 case study
Business objectives communities of practice (guilds), 669–670
analyzing, 182–184 throughout this book. see BLInK case studies
BA track record for, 20 Catch event, BPMN private process model, 295
defined, 33 Cause-effect diagrams, root-cause analysis, 157–161
interim goals to achieve, 383 Cause-effect tree, root-cause analysis, 161–166
translate product/epic vision statement into, CD. see Continuous delivery (CD)
181–184 CDs, as sustaining innovation, 637
Business Process Model and Notation (BPMN) CEO (Customer Engagement One) case study
private process model, 293–298 agile analysis and planning, 15–16
public process model, 288–292 feature does not deliver sufficient value, 234–237
reasons to select, 287–288 nontrivial change to mature product, 255
and UML, 57–58 persona examples, 266
Business process modeling, 285–298 story in three Acts, 74–79
Business requirements wide and shallow development approach, 239
BABOK v3, 123 Certification
defined, 33 in business analysis (BA), 17
Business resources, insufficient, 621 in Business Data Analytics, 672
Business rules Certified Professional for Requirements Engineering
analyzing with decision tables, 433–438 (CPRE), 17, 31, 69–71
behavioral and definitional, 37, 434 Change
defined, 37 Agile corporate culture embraces, 652–653
Business Rules pattern, splitting stories, 424 as agile development benefit, 119
Business systems analysts (BSA), 63 Agile Manifesto on, 28–29
Business use-case model, 498 articulating vision of, 652

best trade-off of costs and benefits, 119–121 Closed (private) beta testing, 534, 652
DevOps continuous delivery and, 559–561 Cloud (AWS) competency group, 578
DevOps lightweight management of, 560 CMS (Configuration management system), 131–133
feature initiatives of, 254–255 Coach
indicating on daily burndown chart, 481 BA responsibilities of, 61
for mature product, 255 leader as, 564
specifying interim timeline, 242 leader provides vision as, 660
in waterfall vs. agile, 14, 66 Cockburn, Alistair, 331
welcoming, 29 Code-test-learn, split testing with funnel metrics, 493
Change agents, business analysts as, 64 Cognitive empathy (perspective taking), 656, 658
Change culture, 320–321 Colbert, Steven, 7
Change management Collaboration
communications plan, 111–112 agile analysis vs. waterfall, 66
for organizations with no agile experience, 110–111 agile corporate culture practice of, 663
preparing enterprise for agile development, 107 Delphi estimation, 339–340
transitioning activities at enterprise level, 109–111 DevOps culture of, 559–560
Channels, preparing, 104 Dutch polder model of, 665–666
Charts, product portrait for, 170–171 internal (within enterprise), 663–665
Checklist. see also Resources and checklists, iteration review and, 516
additional outside enterprise, 665
agile BA information artifacts, 122–123 planning stakeholder, 176–178
attendees for quarterly planning, 69 Collaborative culture
attendees for scaled quarterly and feature planning, of business people and developers, 29
698 nurturing, 10–11
for general availability, 535–538 Colocation, big room iteration planning, 598
for NFRs and constraints, 689 Columns
quarterly and feature planning deliverables, 694 determining Kanban states, 144
quarterly and feature planning inputs, 693 Kanban board, 459–462
quarterly release retrospective, 541 Commitment
quarterly release retrospective questions, 695–697 agile planning, 352
readiness. see Readiness checklist balancing adaptability versus scope, 342
requirements management tool, 615 for committed vs. targeted features, 343
stakeholder, 176, 687–688 to goals and objectives over features, 329
visioning readiness, 152 to iteration goal, 449
Choreography, successful full-potential plan via, 228 to outcomes, not output, 666
Christensen, Clayton, 631, 638–639 to planning implementation, 455–456
CI. see Continuous integration (CI) to quarterly and feature planning agenda, 348–350
CIBC (Canadian Imperial Bank of Commerce), 6 to quarterly goals and objectives, 341
Circle, Open Space events, 612 to scope forecast, 341–342
Circles and Soup game, 520, 523–524, 544, 609 to sprint planning meeting, 597
Circumstance-based market segmentation stories to avoid overcommitment, 324
as basis for goals/objectives, 182–184 why quarterly plan is sometimes a promise,
enterprise agility practice, 630 321–322
for feature discovery, 193 Commitment phase, XP Quarterly/Release Planning
incorporating empathy, 657–658 Game, 322
other ways to discover initial features, 198–199 Communication
overview of, 193 business analysis diagnosis and, 19
in Short Lane analysis and planning, 75–77 of non-colocated teams, 618
as source of information for personas, 267 Communications plan, 111–112, 178–179
CJA (Customer journey analytics), 658–659 Communities of practice (CoPs), or guilds, 668–682
Claims Compassionate empathy (empathic concern), 657
BPMN private process model, 293–294 Competency
BPMN public process model, 288–289 agile analysis vs. waterfall, 65
use-case model, 298–299 forming guilds around, 669–671
I 721

groups, 577–578 regulatory, 236


organize teams around value, not, 95–96, 98 seeding product backlog, 216–217
Complete, well-formed AC as, 412 Construction phase, RUP lifecycle, 49
Complex UI pattern, splitting stories, 426 Constructive failures, 364
Complexity Context
point estimates measuring, 335 diagrams, 307–308
team dependencies due to product, 554 feature preparation via, 263–264
Compliance specifying for personas, 269
defined, 104 tailoring agile practice to, 118–121
preparing, 104–106 Continuous analysis, Kano, 208
story, 420 Continuous delivery (CD)
value stream mapping for, 284 ATDD and BDD, 563–564
Component teams automation in test-build-deploy steps, 558–559
overview of, 569–570 cadence of, 43
scaling agile organization, 571 CD and CI, 561–562
supporting extended team, 577 DevOps practices, 559–562
Comprehensive of high value in agile analysis, 14
configuration management in DevOps as, 561 history of agile development, 18
product backlog as, 125 how it works, 233–234
transparency must be, 667 MVP, 233–234
Computers, for repetitive work in DevOps, 561 overview of, 558
Comstock, Beth, 652 quarterly release retrospective, 540
Concierge MVPs, 362 in rolling analysis, 469
Concise, well-formed AC as, 412 of valuable software, 29–30
Condition-response table, business rules, 437 Continuous integration (CI)
Conditions, specifying business rules, 436–437 delivery cadence and, 43
Configuration management database (CMDB), DevOps practice of, 561
131–132 how it works, 233–234
Configuration management, DevOps comprehensive, innovative development and, 632
561 MVP deployment, 233–234
Configuration management system (CMS), 131–133 quarterly release retrospective, 540
Confirmation in rolling analysis, 469
specifying story AC. see Acceptance criteria (AC), and trunk-based development, DevOps, 562
story Contract, avoid viewing quarterly plan as, 321
Three Cs of stories, 398 Convergence, definition of, 639
Conflict, story point estimates reduce, 336–337 Conversation
Confluence, requirements management tool, 700 agile analysis vs. waterfall, 66
Connextra template managing dependencies, 132
defined, 40 Three Cs of stories, 398
describing/estimate features with, 332 user story as reminder to have, 39
overview of, 193 Conversion rate metrics, 492–494
representing epics/features, 193 Cooper, Alan, 265
specifying features, 259 CoPs (communities of practice), or guilds, 668–682
specifying products or epics, 166–167 Core values, Agile Manifesto, 28–29
summarizing root-cause analysis, 166 Corporate culture
tuning user stories, 139, 141 achieving enterprise agility, 632–633
when to write/not write story with, 404–405 definition of, 633
writing stories with, 395 Corporate culture, Agile practices
writing story description with, 405–407 acceleration, 653–655
Consensus-based decision making, 662 bust silos, 667–672
Constraints collaboration, 663–666
checklist for NFRs or, 689 commit to outcomes, not output, 666
defined, 37 data-informed innovation, 672–673
in discovery-driven planning case study, 703–705 distributed authority, 659–663

embrace change, 652–653 Customers


empathy, 655–659 balancing user features and technical debt, 341
iterative experimentation, 650–652 beta testing by end, 533–534
let those who do the work estimate the effort, 663 concerns about deployment frequency, 237
monitor adjacent and low-end markets, 673–675 corporate culture obsession with, 633
overview, 634–635 deployment to beta, 235–236
responsible procrastination (last responsible fostering cognitive empathy with feedback from, 658
moment), 659 identifying opportunities via, 632
transparency, 666–667 journey map for improved experience of, 278–279
Corporate culture, Agile principles needs of, 667
invest aggressively in enterprise agility, 648–650 provide proposed features to, 330
overview, 634–635 selecting in Kano analysis, 203
protect islands of innovation, 644–647 successful full-potential plan via, 227
tailor approach to circumstance, 635–643 Customer’s and Programmer’s Bill of Rights, 323–325
Cost of delay
determining, 126–127 D
Lean software development and, 53 DAD, hybrid framework, 583
responsible procrastination and, 659 Daily Activities zone
sequencing epics and features in backlog, 212 defined, 73
Costs rolling analysis in. see Rolling analysis and
of agile development, 118 preparation (day-to-day activities)
agile development reduces, 119 Short Lane analysis and planning in, 77–78
assessing for backlog items, 193 Daily burndown charts. see Burndown charts
finding best trade-off of benefits and, 119–121 Daily planning and analysis, 80
impact of BA on, 20–21 Daily Scrum. see Daily standup
operational, 215 Daily standup
COVID-19 pandemic, as a trigger for business model attendees, 473
disruptions, 642–643 day in the life of agile analysis, 468–469
CPRE (Certified Professional for Requirements defined, 47–48
Engineering), 17, 31, 69–71 facilitation tips, 472
Crest Whitestrips, disruptive innovation, 636, 638, 640 forecasting, 474
Cretaceous-Tertiary (K-T), extinction event, 638–639 monitoring progress, 471–474
Critical events, on daily burndown chart, 481 objectives, 472
Cross-functional teams, organize around value, 668 overview of, 471
Cross-stream (horizontal) traceability, 130 scaling agile process, 600
Cross-team dependencies, 600 self-organization, 472
CRUD acronym, writing high-quality user stories, 421 status updates to team, 473–474
Cultivating Communities of Practice (McDermott), timing considerations, 472
672 Dark launch, testing MVPs, 359
Culture. see also Corporate culture Data analyst, 672
defined, 633 Data Complexity pattern, splitting stories, 425
fostering rapid learning, 566 Data engineer, 672
nurturing collaborative, 10–11 Data flow diagrams (DFDs), 308–310
organizing development teams, 94 Data integrity, defined, 35
Cumulative flow diagrams, 487–490 Data scientist, 672
Curiosity of business analyst, 64 Data-informed decision-making, 321, 545
Customer Data-informed innovation, 672–673
bill of rights, 101–102 Day to day activities, 524
collaboration, Agile Manifesto, 28–29 DDD (Domain-driven design), 57–58
journey map, 273–278 Debriefing, Open Space events, 614
story of cantankerous, 10–11 Decision tables, business rules/AC, 433–438
Customer Engagement One (CEO) case study, 15–16 Decisions
Customer journey analytics (CJA), 658–659 delaying in Agile planning, 42
Customer-developer relationship, 101–102 distributed authority approach to, 659–663
I 723

Dutch polder model of making, 666–667 story preparation, 511


postponing until last responsible moment (LRM), threatening delivery of upcoming feature(s), 605
659 Deployment
Decomposing stories, 136 to beta customers, 235
Defects, as waste, 52 of bug fixes and minor changes, 236
Definition of done (DoD) customer concerns on frequency of, 237
examples of, 138 deferred, due to breaking a flow, 236–237
reviewing in iteration planning, 449 deferred, due to technical limitations, 236
reviewing in sprint planning meeting, 597 deferred versus immediate, 234–237
in rolling analysis, 469 delivery versus, 233
in Scrum, 46 environment types, 231–232
specify only one across whole product, 569 impact of regulatory constraints, 236
tuning, 137–138 inability to achieve frequent and reliable, 620
Definition of ready (DoR) of major features and enhancements, 236
example of, 138–142 MVP delivery approach and, 232–234
feature, 142–144, 258 options and potential issues of, 234
inter-team collaboration, 617 to sacrificial product, 235
reviewing in iteration planning, 449 Design group, scaling agile organization, 577
reviewing in sprint planning meeting, 597 Developer
in Scrum, 46 basing estimate on capable, 333
team-level story preparation, 606 has final say on estimates, 325
tuning story, 138 in product owner council, 580
Definitional business rules, 37, 434 relationship with customer, 101–102
Delayed requirements, managing stakeholder role in Triad meetings, 402–403
expectations, 99 Developer task board
Delighters forecasting using, 474
documenting personas for, 269 iteration implementation with, 453–454
Kano grades, 206–207 as iteration planning deliverable, 446
Deliverables updating, 475
decision table, 434 Developer task cards, iteration planning, 454
iteration planning, 446–447 Developer tasks
iteration retrospective, 518 daily burndown chart, 479–480
iteration review, 514–515 feature preparation, 258–259
quarterly and feature planning checklist for, 694 identifying iteration development, 452–453
quarterly release retrospective, 539 measuring progress on burnup chart, 486
quarterly/feature planning event for, 326–327 signatures on burndown charts, 483–486
scaled quarterly and feature planning, 588 sprint planning, 47
sprint planning meeting, 596 tracking, versus stories on burndown, 482
Triad meetings on user stories, 401 Development
Delivery agile diagnosis and, 22
agile fluency model, 108 BA competency and, 24–25
cadence, 43, 54 empathy in product improvement and, 658
MVP deployment and, 232–234 history of, 17–18
Delphi estimation, 338–340 infrastructure spike (or story), 419
Demo, iteration review, 515–516 investment in technology to accelerate product,
Dependencies 648–649
analyst role in preventing, 557 journey map for investment in, 278
indicated on program board, 589 managers in product owner council, 580
managing/identifying, 344–347 managing stakeholder expectations about, 99–101
prioritizing story to prevent, 381 MVP environment for, 231–232
recurring issues with, 620 preparing infrastructure, 90–93
resolving waterfall, 619 role, XP Quarterly/Release Planning Game, 323
SoS meetings addressing cross-team, 600 story estimation by, 324
story map relationship, 370 track record, 22–23
of value stream map, 284–285 Discussed, well-formed stories as, 421
value streams, 283 Disruptive innovation
Development teams adapting culture for, 644
attending backlog seeding, 197 adapting to sustaining innovations versus, 644
business can lead technical/engineering, 668 business model disruptions, 642–643
collaboration between customers and, 10–11 cost-benefit calculation, 121
extended, 97 determining enterprise agility, 636–644
feature vs. generic teams, 96–97 does not have to be of low quality, 641
forming cross-functional teams around value, 668 as enterprise agility principle, 631
organizing around value, 93–96 as evolutionary leap, 638–639
overview of, 93 litmus test for identifying disruptions, 643–644
pre-alpha testing by, 533 low-end disruptions, 641–642
transitioning to agile development, 108–109 mainstream disruptions, 642
DevOps new-market disruptions, 642
as agile framework, 56 protecting islands of, 644–647
benefits of, 560 Uber and, 640–641
collaborative culture, 559 understanding, 637–638
defined, 559 updates to Christensen's model of, 639–640
delivery cadence and, 43 Dissatisfiers, Kano grades, 206
determining traceability, 130 Distributed authority
history of, 18 agile corporate culture practice of, 659
Microfocus ALM Octane tool integration with, be like the octopus, 661
699 benefits of, 660
MVP deployment, 232 elements that must be present for, 660–661
practices, 560–561 holacratic approach to, 662–663
preparing testing infrastructure, 91–93 localized and individualized, 661–662
quarterly release retrospective, 540 Distribution team, preparing, 103–104
quarterly release retrospective checklist, 695 Documentation
resources on, 562 agile analysis vs. waterfall, 66
scaling agile organization, 578 analysis, 538
DFDs (data flow diagrams), 308–310 feature, 497
Diagnostics, with burndown signatures, 482–486 focus on compliance goals, not means, 105–106
Differentiating quadrant, purpose alignment model, increasing for non-colocated teams, 619
88–89 of information on personas, 268–269
Differentiator MVPs, 360 tracing analysis artifacts, 506–508
Difficult people, business analysts work with, 64 updating BA. see Business analysis (BA)
Digital camera, as low-end disruption, 641 documentation, updating
Discovery-driven financial planning DoD. see Definition of done (DoD)
agile financial planning, 103 Domain-driven design (DDD), 57–58
hypotheses in, 189 DoR. see Definition of ready (DoR)
overview of, 675–676 Downstream (forward) traceability, 129–130, 507
Discovery-driven planning, BestBots case study Downward traceability, 507
background, 701–702 Drivers, for agile organization, 628–629
create assumptions checklist, 708–709 Dropbox, Preorders MVP, 363
create milestone planning chart, 710–711 Duration, iteration planning, 445
create pro forma operations specifications, Dutch polder model, collaboration, 665–666
706–708 Dynamic, product backlog as, 125
determine constraints (required outcomes),
703–705 E
draft reverse income statement, 705–706 Effort, measured by point estimates, 335
initial market analysis, 702–703 Elaboration phase, RUP lifecycle, 49
overview of, 701 Electronic stories, versus physical, 403–404
revise reverse income statement, 709–710 Elements, daily burndown chart, 479–480
I 725

Emergent AC, specifying, 413 long-term planning requirements and, 224


Emergent features nontrivial change to mature products as, 255
agile vision statement leaves room for, 173 ongoing analysis of upcoming, 509–512
specifying, 200 overview of, 38–39
Emotional empathy, 656–657 planning long-term. see Product roadmap
Emotional marketing, 657 portfolio, 574–575
Empathic concern (compassionate empathy), 657 program, 575
Empathy represent user capabilities as, 183
agile corporate culture practice of, 655–656 requirements-related terminology, 38–39
benefits of, 656 Role-feature-Reason template representing, 199
cognitive, 656 rules of thumb for estimating/splitting, 682–683
compassionate, 657 scaling backlog, 566–567
defined, 655 sequencing in backlog, 212–215
emotional, 656–657 specifying acceptance criteria, 260–261
practical tools, 657–659 taxonomy of story size, 395–396
Enabler story. see Spikes, SAFe visioning. see Visioning
End customers, beta testing by, 533–534 Estimates
End event, BPMN private process model, 293 development creates story, 324
End user, user story must deliver to, 395 development has final say on, 325
End-to-end process, agile development, 8–9 done by those who do the work, 663
End-to-end UAT, 263, 286 feature preparation walkthrough, 605
Engineering and component group, 577 feature prioritization using broad, 214
Engineering teams, business can lead product, 668 setting standards for, 134–135
Enhancements for splitting stories and epics, 682–683
actionable metrics for, 188 story, 40
deployment of, 236 for time to complete developer tasks, 454–455
Enterprise, achieving agility using for functional spikes, 418
agile financial planning, 675–676 for well-formed stories, 421
culture. see Corporate culture Estimates, quarterly and feature planning agenda
foundational practices, 629–631 bug fixes, 341
innovative product development process, 631–632 Delphi estimation, 338–340
introduction, 623 describe and estimate features, 332
on the map, 624–626 iteration planning, 341
mapping to IIBA and PMI guides, 681 no-estimating approach, 338
objectives, 623, 626 objectives of, 332
overview of, 627–629 spikes, 340
summary of, 676 technical stories and nonfunctional requirements,
Enterprise, preparing for agile development 341
agile enterprise transition team, 112 time spent on, 332
agile fluency model, 107–108 units and methods for, 334–338
communications plan, 111–112 work included in, 333
overview of, 107 Evans, Eric, 57–58
transition activities, 109–111 Events
transition team, 108–109 developing collaborative culture by holding, 664
transition timeline, 111 overview of scaled, 583–585
Enterprise and strategy analysis, as BA focus, 17 quarterly release retrospective, 542–543
Entry conditions, quarterly/feature planning, 325–326 specifying for interim periods, 243
Environment types, deployment, 231–232 Evolutionary leap, disruption as, 638–639
Epic vision statement, 172–173, 181–184 Excitement feature, Kano grades, 206–207
Epics Exclusive gateway, BPMN private process model, 293
definition of, 196 Executive support, for organizations with no agile
to features and stories from, 38 experience, 110
features often begin as, 254 Expected features, Kano grades, 206
Experimental failure, agile corporate culture values, Features
633 acceptance criteria, 40, 215–216
Exploration phase, XP Quarterly/Release Planning during alpha testing, 533
Game, 322 attributes of, 201–202
Extended team, 97 during beta testing, 534
External view of process, BPMN public process deferred vs. immediate deployment of, 234–237
model, 288–292 defining independent, 199
Extreme environmental stressor, evolutionary definition of, 196
disruption from, 638–639 deployment of major, 236
Extreme Programming Explained (Beck), 48, 322–325 discovering via circumstance-based market
Extreme Programming (XP) segmentation, 198
as agile framework, 48–49 documenting, 497
history of agile development, 17 from epics to, 38
requirements units as stories in, 38 grading in Kano analysis, 204–207
term “story” originates in, 395 how many to seed up front, 196–197
timeboxed planning in, 121, 555 initial preparation for scaled initiative, 585
lifecycle, 40–41
F narrow/deep versus wide/shallow approach to,
Facilitation 237–240
additional tips, 684–685 ongoing analysis of upcoming, 509–512
consensus-based decision making, 662 other ways to discover, 198–199
daily standup tips for, 472 overview of, 39
quarterly release retrospective, 539–542 PBI attributes, 201–202
scaled quarterly and feature planning, 589 physical representation of, 200
skills of business analysts, 64 planning. see Quarterly and feature planning
stakeholder engagement and analysis, 179–181 preparation activities, 510
tips for cause-effect diagrams, 157–158 preparation of. see Quarterly and feature preparation
tips for stakeholder events, 152 preview meeting, 462–463, 599
using product portrait as visioning tool, 170–171 prioritizing in long-term planning, 224–225
Facilitator, agile analysis vs. waterfall, 65 prioritizing to complete product portrait, 217–220
Fail fast (iterative experimentation) practice, Agile refining incrementally, 327–328
corporate culture, 650–652 Role-feature-Reason template representing, 199
Failure, iterative experimentation and, 651–652 scaled (quarterly) preparation of, 602–605
Feature card, story map, 368–369 scaled quarterly retrospective and, 609–611
Feature Closeout, Short Lane analysis and planning, 78 scaling backlog, 566–567
Feature definition of ready (DoR) sequencing in backlog, 212–215
definition of, 9 specifying emergent, 200
preparing for planning event, 326 specifying for interim periods, 242–243
quarterly and feature planning preconditions, 586 stakeholder productivity expectations of, 100
quarterly and feature preparation using, 258 story maps plan coherent set of, 367
readiness checklist for quarterly planning, 690 target, in Kano analysis, 202
scaled planning and implementation, 557 taxonomy of story size and, 395–396
tuning, 142–144 team dependencies due to interconnected, 553–554
for upcoming requirements item, 9 use cases or user tasks sized as, 382
Feature demo, 606 user capabilities as, 183
Feature ready, rolling analysis, 469 as waste, 52
Feature set, 318, 570–571 Feedback
Feature teams alpha testing for, 533
defined, 575 from beta testing, 359
with extended team, 576 in Delphi estimation, 339, 340
forming, 575–576 fostering cognitive empathy with customer, 658
versus generic teams, 96–97 Lean software development, 53
scaling backlog, 569 successful full-potential plan via, 227–228
structure of, 97 voice of the customer as, 658
I 727

Feelings, as journey map component, 278, 282 measuring past velocity for, 331
Fibonacci sequence, in story estimation, 40, 337–338 quarterly plan as, 321
Field research, circumstance-based market stories that will be delivered, 450–451
segmentation, 630–631 updates, 474
Final review, iterations, 516 using burndown charts for, 482
Financial planning without estimating, 338
achieving enterprise agility, 675–676 Foresight, hindsight as best, 333
data-informed, 672–673 Forward (downstream) traceability, 129–130, 507
preparing organization, 102–103 Foundational practices, enterprise agility, 629–631
Fishbone diagrams, root-cause analysis, 157–161 14th Annual State of Agile Report, 22
Five W questions Frameworks, agile
problem or opportunity statement, 167–168 ATDD. see Acceptance test-driven development
product portrait, 170 (ATDD)
Five Whys method, root-cause analysis, 153–157, BDD. see Behavior-driven development (BDD)
161–162 determining, 121
Flickr DAD, 583
as constructive failure, 364, 651 DevOps, 56
as disruptive innovation, 638 Domain-driven design (DDD), 57–58
empathy in development of, 658 Kanban, 44
Flow-based feature planning Lean software development, 51–55
overview of, 318 Lean startup, 55
quarterly planning versus, 315, 319–320 Lean Thinking, 50–51
Flow-based planning LeSS, 583
continuous implementation/delivery via, 558 NEXUS, 583
determining requirements granularity levels, 127 overview of, 43
feature review via, 607 Rational Unified Process (RUP), 49
frameworks supporting, 121 SAFe. see Scaled Agile Framework (SAFe)
iteration implementation, 451 scaled, 582–583
Kanban board columns for, 459–462 Scrum, 44–48
Kanban using, 42 TDD. see Test-driven development (TDD)
rolling analysis using, 469 UML and BPMN, 57–58
rolling preparatory analysis using, 509 use cases, 49–50
story planning via, 444 XP. see Extreme Programming (XP)
story preparation via, 394 Franklin, Andrea, 105
timeboxed planning versus, 121, 555 Frequency
using for frontend, 555–556 constructing story map ribs, 381
Flows, use-case of POC meeting, 601–602
tracing analysis artifacts, 506–508 for pruning and ordering meetings, 512
updating specifications, 505–506 Frontend, flow-based approach to, 555–556
updating use-case model, 497 FRs (functional requirements), 34, 380–381
Fluency model, agile, 107–109 Full-potential plan
Focusing, agile fluency model, 108 business analyst contribution, 227–228
“Focusing on threes,” embracing change, 653 create detailed plan, 226
Follow-up meeting, monitoring progress, 474 deliver quick wins, 226–227
Forecasting enterprise agility practice, 630
accomplishments in sprint planning meeting, long-term planning, 225
595–597 MVP implementation. see Minimum viable product
all developer tasks in backlog, 480 (MVP), capabilities
commitment to scope, 342 MVPs validate assumptions, 228–230
delivery of feature, 330–331 product roadmap for. see Product roadmap
feature/story delivery time via story points, 335 set bold targets, 225–226
goal and scope of iteration, 447–451 Full-time membership, development team, 95
iteration review, artifacts for, 516 Fully described level, requirements granularity, 129
Functional requirements (FRs), 34, 380–381 commitment to, 329–330, 341
Functional spikes crafting common iteration, 597
feature preparation, 258–259 crafting for planning agenda, 329–330
naming standards example, 395–396 crafting interim, 242, 383
overview of, 416–417 crafting iteration, 449–450
uncertainty pattern, 427–428 daily standup supports shared team, 472
Fundamentals of agile analysis and planning feature prioritization supports strategic, 214
agile frameworks, 43–58 focus on compliance, 105–106
Agile Manifesto, 28–29 forecasting iteration, 447–451
agile planning, 42–43 identifying persona, 268–269
agile roles and BA, 58–63 iteration planning, 446
key practices in agile vs. waterfall, 65–68 as journey map component, 282
mapping to IIBA and PMI guides, 678 Multiple User Goals pattern for user stories, 426–427
objectives, 27 planning agenda using outcome-based, 329–330
requirements-related terminology, 32–41 Goals and objectives analysis, visioning, 147
rules of thumb, 68 Goldratt, Eliyahu M., 161
soft skills of agile business analysts, 63–66 Google Docs, 360, 700
standards, 31–32 Governance, 104–106
summary of, 68 Grades, in Kano analysis, 204–207
twelve principles, 29–31 Grand Lane
Funnel metrics, split testing outcomes, 492–494 Agile Analysis and Planning Map, 79–81
Future, embracing change in, 653 defined, 73
scaling agility focus on. see Scaling agility
G Grant, Adam, 633
Game Neverending Granularity levels, 125, 127–129
empathy when developing Flickr from, 657–658 Gravity of past success, sustaining, 645
Flickr born out of failure of, 638, 651 Greenleaf, Robert K., 564–565
Games, iteration retrospective, 520–524 Groupon, as constructive failure, 364
Gateway, in BPMN public process model, 288 Growth
Gating, avoiding DOR, 142 accelerating operational capacity for, 654
General availability (GA) stage hypotheses, 186
analysis documentation, 538 investment in technology for, 649–650
general availability checklist, 535–538 Guesstimate stories delivered, for initial capacity, 136
monitoring, 538 Guiding coalition, accelerating change, 109
product is accessible in, 535 Guilds, as communities of practice, 668–682
rules of thumb, 683 Gutsche, Jeremy, 633
value validation, 539
Generic versus feature teams, 96–97 H
Gherkin feature files H&M, purpose brand for, 646
continuous development and, 563–564 Happy-day scenario flow, 497
feature documentation with, 497 Hardening (stabilizing) iterations, 531–532
specifying automated UAT, 93 Healthy signature, burndown chart, 483–484
specifying end-to-end UAT, 263 Heaven painting (Podeswa), 3
specifying feature AC, 56, 260 Heavily regulated sectors, agile in, 629
Gherkin syntax, 408 Hell painting (Podeswa), 3–4
Given-When-Then template, BDD Hesse, Hermann, 565
feature AC examples using, 262–263 High-level functionality, via interim goals, 383
feature documentation, 497 High-level use cases, product-development value,
seeding product backlog, 76 668–669
using, 413 High-quality stories, writing guidelines, 420–421
syntax, 408 High-risk (fixed-price solution), targeting agility level
Goals for, 119–120
BPMN private process model, 294–295 Hindsight, as best foresight, 333
business, 32–33, 181–184 Holacratic approach, distributed authority, 662–663
I 729

Hoote Suite, 556 Information


Horizontal (cross-stream) traceability, 130, 507 checklist of BA artifacts, 122–123
“How,” well-formed AC does not describe, 412 conveying via face-to-face conversation, 29
Humor, business analysts and, 65 corporate culture commitment to transparency of,
Hybrid approach 666–667
supported by most organizations, 121 hoarding via silos, 667
updating use-case model, 498 Information radiators, 52, 54–55
Hypotheses Information security (Infosec) group, 578
assumptions/hypotheses card, story map, 368–369 Informative workspaces, XP, 49
leap of faith. see Leap of faith hypotheses Infrastructure
quarterly plan, 321 investing in technology, 649–650
value, 186, 188 preparing development/testing, 90–93
In-house infrastructure, 649
I Initial capacity (velocity), 135–137
IBM Doors Next tool, 700 Initial market analysis, discovery-driven planning,
Ideal developer days (IDDs), estimation, 40, 335–336 702–703
IIBA (International Institute for Business Analysis), Initial preparation, scaled initiatives, 585–586
14–15 Initiation and Planning zone
IIBA guides, mapping of book chapters to, 677–681 defined, 72
Impact, problem or opportunity statement, 167–169 how long to spend upfront on, 87–88
Impact and influence matrix, stakeholder long-term agile planning, 222–223
communication, 178–179 Prepare the Organization in. see Organizational
Impediments preparation
accelerate growth by removing, 655 Prepare the Process in. see Process preparation
story preparation, 511 scaling agility in. see Scaling agility
Implementation Short Lane analysis and planning in, 74–75
of developer tasks, 471 understanding, and resources for, 86–87
sprint planning meeting, 597–598 visioning tools. see Visioning
Implementation, iteration and story planning Initiatives
case study, 456–458 Big Bet, 226
identifying developer tasks, 452–456 long-term planning, 224
inviting PO, 451 The Inmates Are Running the Asylum (Cooper),
overview of, 451–452 265
steps, 452 Innovation
Implementation sequence, story maps, 366–367, corporate culture practice of data-informed,
379–380 672–673
Improvement plan, iteration retrospective, 519 defining, 637
In Search of Excellence (Peters), 88 disruptive. see Disruptive innovation
Inception phase, RUP lifecycle, 49 protect islands of, 644–647
Increment sustaining, 637
defined, 45 types of, 637
as iteration planning deliverable, 446–447 Innovation and planning (IP) iteration, SAFe, 582
Incremental process, refining features and AC as, Innovation Games, agile collaboration tool, 700
327–328 Innovative development
Incremental scaling, MVPs, 364–365 agile process for, 631–632
Incumbent businesses, 654 approach to, 5
Independent for mainstream businesses, 5–7
features, 199 MVP case study, 357–358
well-formed stories as, 421 process for, 631–632
Indifferent features, Kano grades, 207 The Innovator’s Dilemma (Christensen, Raynor, and
Individual McDonald), 631, 644
as core value of Agile Manifesto, 28 The Innovator’s Solution (Christensen, Raynor, and
decision-making authority, 661–662 McDonald), 631, 644
well-formed stories as, 421 Input artifacts, planning agenda, 331
Inputs Lean software development, 53
iteration planning, 445 MVP process of, 363
iteration retrospective, 518 starting, 351
iteration review, 514–515 Iteration and story planning
Open Space events, 612 attendees, 445
preparing quarterly/feature planning event for, 326 duration, 445
quarterly and feature planning checklist, 693 feature preview meeting, 462–463
scaled feature preparation, 604 forecast goal and scope, 447–451
scaled quarterly and feature planning, 588 implementation planning, 451–458
sprint planning meeting, 596 introduction to, 441
story map backbone, 373–374 iteration planning deliverables, 446–447
to Triad meetings on user stories, 401 iteration planning inputs, 445
Inspect-and-adapt event, iteration reviews, 514 iteration planning parts, 444
Inspect-and-adapt tool, daily standup, 471–474 Kanban board setup, 458–462
Institution-as-servant principle, 565 looking two iterations ahead, 463
Integration on the map, 442–443
meetings, scaling agile, 599 mapping to IIBA and PMI guides, 680
recurring issues with, 620 objectives, 441, 444
SoS meetings address issues of, 600 overview of, 444–445
Integration Capabilities pattern, splitting stories, 428 planning rules, 447
Integrity, Lean software development principle, 54 rules, 447
Intentional destruction, in agile corporate culture, 633 scaling iteration planning, 462
Interaction, core value of Agile Manifesto, 28 story planning overview, 444
Interconnected features, and team dependencies, summary of, 463
553–554 Iteration backlog, 446
Interim goal card, story map, 368–369 defined (as sprint backlog), 47
Interim goals, Timeline view, 383–387 Iteration Closeout zone
Interim periods, product roadmap for planning, Grand Lane analysis and planning in, 80
241–244, 246–247 rolling analysis in. see Rolling analysis and
Intermediate event, BPMN private process model, 293 preparation (day-to-day activities)
Internal collaboration, culture of, 663–665 Short Lane analysis and planning in, 78
International Institute for Business Analysis (IIBA), Iteration demo, 606. see Iteration review
14–15 Iteration goal, 446
International Requirement Engineering Board (IREB), Iteration goal card, story map, 368–369
31 Iteration Inception zone
Internet of Things (IoT) development, 5, 286 defined, 73
Interoperability Short Lane analysis and planning in, 78
alpha testing for, 533 story planning. see Iteration and story planning
defined, 35 Iteration planning
Inter-team collaboration quarterly and feature planning estimation, 341
analyst role in, 557–558 sprint planning as, 47
choosing approach to, 554–558 Iteration retrospective
DevOps culture for, 559–560 attendees, 518
lightweight tools supporting, 615–617 games, 520–524
of scaled agile teams, 553–554 inputs and deliverables, 518
Introduce Dropship Capability, UAT for end-to-end overview of, 517
workflows, 263 timing considerations, 517
INVEST guidelines, crafting user stories, 421 topics/agenda, 518–520
Investment in enterprise agility, 648–650 Iteration review
Invitees. see Attendees artifacts for forecasting/tracking progress, 516–517
IoT (Internet of Things) development, 5, 286 bring waterfall teams to, 620
Ishikawa diagrams, root-cause analysis, 157–161 inputs and deliverables, 514–515
Iteration overview of, 514
accounting for progress at end of, 513 topics/agenda, 515–516
I 731

Iterative experimentation (fail fast) practice, Agile create questions, 203


corporate culture, 650–652 determining customer value of feature, 212
Iterative-incremental development, 6, 17 grade features, 204–206
It's Not My Problem story, 8–9 grade interpretation, 206–207
natural decay of delight (and its opposite), 208
J process overview, 202
Jacobsen, Ivar, 49 satisfaction versus fulfillment graph, 207–208
JAMA software tool, 699 select customers, 203
Jeffries, Ron, 582 select target features, 202
JIRA tool, 699 test questionnaire internally, 204
Jobs Karl Lagerfeld Pour H&M, 646
circumstance-based market segmentation identifies, KAs (knowledge areas), BABOK Guide, 31–32
630–631 Kasparov, Garry, 645
duration of, 126–127 Key practices, agile vs. waterfall, 65–68
organization based on, 668–669 Knowledge areas (KAs), BABOK Guide, 31–32
titles for, 664 Kofman, Jeffrey, 357–358
Journey mapping Kotter, John, 109–110, 172–173
case study, 279–282 K-T (Cretaceous-Tertiary), extinction event, 638–639
components, 274–278
customer journey map, 273–274 L
defined, 272 Lab, MVP testing in, 358–359
empathy in business operations and, 658 Lanes, Agile Analysis and Planning Map
feature preparation via. see Journey mapping Grand Lane, 79–81
more on, 283 graphics of, 69–71
personas guiding, 269 Long Lane, 79
using, 272–273, 278–279 Short Lane, 74–78
A Journey to the East (Hesse), 565 summary of, 81
“juicy bits” first (user/business value), story map ribs, understanding, 72–73
381 Lanes, BPMN private process model, 293
“Just Talk,” inter-team collaboration via, 616 Lanes, process modeling and swimlanes, 287
Just-in-time requirements analysis, 66 Large Initial Effort pattern, splitting stories, 424
Large Scale Scrum is Scrum, scaled agile framework
K principle, 582
Kanban Large Scale Scrum (LeSS) framework
as agile framework, 44 “Just Talk” guideline, 616
board setup for iteration planning, 458–462 scale agile approach of, 553
cumulative flow diagrams, 487–490 timeboxed planning in, 555
customer-generated requests, 15–16 Last responsible moment (LRM)
feature planning. see Flow-based feature planning agile corporate culture practice of, 659
flow-based planning. see Flow-based planning agile financial planning using Real Options for,
as origin of agile development, 17 675
requirements granularity levels, 127 Lean software development, 53
review practices, 347–348 timing of feature preparation and, 257–258
timing of feature preparation, 257 Latent requirement, Kano grades, 206–207
tuning workflow parameters, 143–144 Law of Two Feet, Open Space events, 613
work items, 38 Lawrence, Richard, 422
Kanban board Lawrence patterns, 394. see also Splitting stories,
setting up, 454–462, 474 patterns
updating, 476–479 Leader as Coach, 564
Kano analysis Leader Who Serves, 565
case study, 209–212 Leadership
conduct survey, 204 effective agile, 564–566
continuous analysis, 208 empowering others to make their own decisions,
create prototypes, 204 660
Lean pull mechanism, forecast stories to be delivered, Lifecycle
450–451 across states of Kanban board, 476–479
Lean software development Agile analysis and planning. see Agile Analysis and
as agile framework, 51 Planning Map
history of, 628 feature, 40–41
information radiators, 54–55 Rational Unified Process (RUP), 49
principles of, 54 Lightweight tools
seven wastes, 51–52 agile analysis vs. waterfall, 67
tools of, 52–54 for inter-team collaboration, 615–617
Lean Software Development: An Agile Toolkit Lists, roles, and responsibilities table, stakeholder
(Poppendieck and Poppendieck), 628, 675 identification, 176
The Lean Startup, 628 Loblaw, autonomous branding of, 646
Lean startup Localized decision-making, 660–662
actionable metrics in, 186 Long Lane, 73, 79
as agile framework, 55 Longevity, product vision statement design, 173
and MVP, 630 Long-term agile planning
MVP planning in, 356 capabilities for effective MVP implementation,
understanding, 185 231–240
Lean Thinking, 50–51 epic planning, MVP, and overview, 224–225
Leap of faith hypotheses full potential plan, 225–228
analysis of, 185 on the map, 222–223
assumption analysis, 190–191 mapping to IIBA and PMI guides, 679
assumption checklist, 189–190 objectives, 221
crafting iteration goal for learning value, 449 overview of, 221
in discovery-driven planning, 189 planning interim periods, 241–248
growth hypotheses, 186 product roadmap, 240–241, 243–248
identifying in innovative development, 632 summary of, 248
lean startup, 55 validating assumptions using MVPs, 228–230
lean startup and, 185, 630 Loose coupling, DevOps, 560
metrics, 187–188 Low-end disruptions
milestone planning chart, 190 creating purpose brand for, 646
MVP approach begins with, 228–229 litmus test identifies, 643
MVP case study, 357–358 overview of, 641–642
MVP process, 230 Low-fidelity interface story maps, 387–388
pivot-or-persevere meeting to validate, 544–547 Low-level integration tests, automated, 92
understanding, 185–186 LRM. see Last responsible moment (LRM)
validating by creating MVP, 630
value hypotheses, 186 M
visioning process, 147 M&As (mergers and acquisitions), 228, 652
Learn step, MVP process, 230 The Machine That Changed the World: The Story of
Learning Lean Production (Womack, Roos, and Jones), 628
failure as opportunity for, 651 Mainstream businesses
fostering culture of rapid, 566 adopting agile approach, 5–7
making world your classroom, 653 resist experimentation, 651–652
MVP is meant for, 357 Scrum popular with, 44
planning agenda goals for, 329 as source of information for personas, 267
Learning value Mainstream disruptions, 642, 643
constructing story map ribs, 380 Maintainability, defined, 35
crafting iteration goal for, 449 Manifesto for Agile Software Development, 627
of story, 395 Manual tests, 91–93
Leffingwell, Dean, 56–57 Mapping
Legend, cause-effect tree, 161–162 agile analysis and planning. see Agile Analysis and
Lens, as journey map component, 276, 281 Planning Map
LeSS. see Large Scale Scrum (LeSS) framework of book chapters to IIBA/ PMI guides, 677–681
I 733

goals to requirements, 36 provides PO with proposed features, 330


MVPs and. see Minimum viable product (MVP) specifying metrics, 187
and story maps split testing, 491
Market testing for innovative products, 632
accelerating time to, 653 understanding, 229
checklist for quarterly release retrospective, 697 validating long-term plan assumptions, 228–229
differentiation, purpose alignment model, 88–90 Minimum viable product (MVP) and story maps
practice of monitoring adjacent/low-end, 673–675 backbone. see Backbone, constructing story map
prioritizing technical risk versus, 214–215 complementing each other, 356
testing MVPs directly in, 359 on the map, 354–355
timing release of product to, 530–532 MVP planning. see Minimum viable product
Marketing (MVP) planning
emotional, 657 objectives, 353, 356
preparing team for, 103–104 overview of, 353
Marketplace, Open Space events, 612 story maps. see Story maps
McDonald, Raynor summary of, 388
disruptive innovation and, 631 Minimum viable product (MVP), capabilities
updates to disruptive model, 639–640 deferred vs. immediate deployment, 234–237
Measure step, MVP process, 230 deployment and delivery approach, 232–234
Mechanisms, determining traceability, 130–131 narrow/deep versus wide/shallow, 237–240
Merge processes, process modeling to, 286 overview of, 231
Mergers and acquisitions (M&As), 228, 652 technical capabilities, 231–232
Message flow, BPMN, 288, 293 Minimum viable product (MVP) planning. see also
Methods, estimation, 334–338 Story maps
Metrics case study, creating MVP, 365–366
actionable, 187–188 case study, Trint, 357–358
goals and objectives, 182–184 establishing MMP, 365
lean startup, 55 incrementally scaling, 364–365
MVP process determining, 230 iterative process, 363
quarterly release retrospective, 541 MVP, defined, 357
specifying for interim periods, 243 the pivot, 363–364
split testing using funnel, 492–494 summary of, 388
validating leap of faith hypotheses, 187–188 types of MVP, 359–363
value stream maps, 283 venues for experiments, 358–359
Microfocus ALM Octane tool, 699 Mining timeline, quarterly release retrospective, 543
Milestone planning chart, 190, 710–711 Mission criticality, purpose alignment model, 88–90
Milestones Mission statements, product vision statements vs.,
defined, 37 173–174
quarterly release retrospective, 542–543 Mitchell, Dana, 333
specifying for interim periods, 243 Mitigate risk, 346
Minimal quarterly plan, 344 MMFs (minimum marketable features), 224, 365
Minimum marketable features (MMFs), 224, 365 Moments of truth
Minimum marketable product (MMP) innovative development, 631–632
impact of agile on productivity, 23 as journey map component, 278, 282
stakeholder productivity expectations and, 100 Monitoring progress
using MVPs to establish, 365 burndown versus burnup charts, 486–487
Minimum viable product (MVP) burnup charts, 486
begins with leap of faith hypotheses, 185–186 cumulative flow diagrams, 487–490
full potential plan, 228, 630 daily burndown chart, 479–486
hypotheses in discovery-driven financial daily standup, 471–474
planning, 189 follow-up meeting, 474
in lean startup, 55, 630 updating developer task board, 475
overview of, 224–225 updating Kanban board, 476–479
process of, 229–230 Monitoring system, product release and, 538
Motion, reducing through information radiators, 52 O
Multifunctional flowchart diagram, 287 Objectives
Multiple Devices, Platforms pattern, 428 commitment to planning agenda, 329–330
Multiple teams commitment to quarterly, 341
backlogs, 569 crafting interim, 242
scaled (quarterly) feature preparation, 602–605 crafting planning agenda, 329
scaled iteration retrospective, 607–608 of estimation, 332
scaled iteration retrospective follow-up, 608 Objectory process, RUP, 49
Multiple Use-Case Scenarios pattern, splitting stories, Obsolete quarterly plans, 352
425 Octopus, distributed autonomous authority and, 661
Multiple User Goals pattern, splitting stories, Ohno, Taiichi, 50
426–427 Omotenashi, Kano grades, 206–207
Multiple User Roles pattern, splitting stories, 428–429 One-dimensional features, Kano grades, 206
Must-haves, Kano grades, 206 One-time experiments, governance changes as, 106
Mutations, yielding outsize results, 639 Open (public) beta testing, product release, 534
MVP. see Minimum viable product (MVP) Open Space events, 611–614
MyChatBot example, 570–571, 668–669 Operational
capacity, accelerating growth, 654
N cost, 215
Naming standards, stories, 395–396 mission statement as, 173
Narrative, story map backbone, 372 MVPs, 362
Narrow and deep strategy, long-term feature sequence, story maps, 366–367
implementation, 238–240 value, of cross-functional teams, 668
Natural decay of delight (and its opposite), Kano value streams, 283
analysis, 208 Operations infrastructure spike (or story), 419
Needs, innovative development and, 631–632 Opportunities (pain points, moments of truth)
Negotiable, well-formed stories as, 421 innovative development, 631–632
Negotiation as journey map component, 278, 282
as skill of business analyst, 64 Optimization
of time estimates for developer tasks, 455 agile fluency model, 108
Netflix, 640, 648 BA track record for, 20
Newell curve (cumulative flow diagrams), 487–490 process modeling, 286
New-market disruptions, 642, 643 value stream mapping for process, 284
Nexus, scaled agile framework, 582 Options thinking, Lean software development, 53
NFRs. see Nonfunctional requirements (NFRs) Ordering, product backlog refinement, 509, 512–513
Nickolaisen, Niel, 88–90 Organization. see Scaling agile organization
Noble Inc., 363 Organizational preparation
No-estimating approach agile financial planning, 102–103
forecasting stories, 450–451 channels and supply chains, 104
quarterly and feature planning, 338 customer-developer relationship, 101–102
Non-colocated teams, scaling agility for, 617–619 determine organizational readiness, 112–113
Nonfunctional requirements (NFRs) of enterprise for agile development, 107–112
completing product portrait, 217–220 governance and compliance, 104–106
constraints checklist and, 689 increased demand on resources, 106
constructing story map ribs, 380–381 Initiation and planning, 86–87
defined, 34 managing stakeholder expectations, 99–101
implementation pattern for user stories, 426 on the map, 84–85
operations infrastructure spike for, 419 mapping to IIBA and PMI guides, 678
quarterly and feature planning, 341 marketing and distribution teams, 103–104
seeding product backlog, 216–217 objectives, 83
types of, 35 organizing development teams, 93–98
Noninnovative development, 5 overview of, 83
Nonsolutionized, well-formed stories as, 421 preparing infrastructure, 90–93
Normal scenario flow, 497
I 735

purpose alignment model, 88–90 Physical form, of product backlog items, 125
summary of, 113 Physical representation, of features, 200
time spent upfront on initiation and planning, Physical stories, versus electronic, 403–404
87–88 PI (program increment), SAFe, 57, 582
Organizational readiness, determining, 112–113 Pierre Cardin, purpose brand quality, 645–646
Otis Elevator Company, 647 Pivot step, MVP process, 230
Outcomes Pivot-or-persevere meeting, 544–547, 632
agile corporate culture commitment to, 666 Planning
planning agenda goals for, 329–330 agile financial, 102–103
quarterly planning, 321 art of. see Art of Agile analysis and planning
Outputs, commit to outcomes not, 666 do not use story template when actively, 404–405
Outside enterprise, collaborative relationships, 665 flow-based. see Flow-based planning
Outsize results, mutations yielding, 639 fundamentals. see Fundamentals of agile analysis
Outsourced infrastructure, technology investment in, and planning
649 Initiation and Planning zone. see Initiation and
Overestimating signature, burndown charts, 485–486 Planning zone
Overextension, remedies to team, 455 iteration and story. see Iteration and story planning
MVPs. see Minimum viable product (MVP)
P planning
Pain points preparation versus, 256
documenting personas for, 269 principles of, 323–325
innovative development, 631–632 quarterly and feature. see Quarterly and feature
as journey map component, 278, 282 planning
Pair programming, 17 timeboxed. see Timeboxed planning
Parity quadrant, purpose alignment model, 88–90 use story template at end of, 405
Partially done work, as waste, 51 value proposition. see Agile analysis and planning,
Participants, feature preparation, 603 value proposition
Partner quadrant, purpose alignment model, 88, 90 when to use flow-based vs. timeboxed approach,
Patterns. see Splitting stories, patterns 555–557
Patton, Jeff, 366–367 Planning Game rules, 447, 455
PBIs. see Product backlog items (PBIs) The Planning Game, XP, 322–325
PC, as new-market disruption, 642 Planning Poker, 333, 338–340
Performance features, Kano grades, 206 PMI (Project Management Institute), mapping of book
Persevere step, MVP process, 230 chapters to, 677–681
Persistent documentation PMI guides, mapping book chapters to, 677–681
tracing analysis artifacts, 506–508 PMI Professional in Business Analysis (PMI-PBA), 17,
updating BA for stories, 496–506 31
use cases for, 50 PO. see Product owner (PO)
Persisting stories, 496 POC. see Product owner council (POC)
Personas Podeswa, Howard, 2–4
analysis of, 264–265 Podeswa, Yasha, xlv, 556
case study, 270–271 Podeswa, Yidel, 2
creating, 267–268 Point estimates, complexity versus effort, 335
documenting, 268–269 Political intelligence, of business analysts, 64
examples of, 266–267 Pools, BPMN private process model, 293
fostering empathy using personalized marketing Pools, BPMN public process model, 288
data, 658 Portfolio
history of, 265 checklist for quarterly release retrospective, 697
if it feels too contrived, try a real user, 265 structure, scaling agile organization, 574–575
as journey map component, 276 Postconditions, scaled feature preparation, 604
story maps based on, 386–387 Practices, agile corporate culture, 634–635
working with, 269 Pre-alpha stage, product release, 533
Perspective, agile corporate culture, 633 Preconditions, scaled quarterly and feature planning,
Perspective taking (cognitive empathy), 656, 658 586
Preorders MVP, 363 selecting BPMN. see Business Process Model and
Preparation Notation (BPMN)
backlog refinement as, 47 Process preparation
organizational. see Organizational preparation BA information artifacts and events, 122–123
planning versus, 256 defining requirements types, 123–124
process. see Process preparation determining process readiness, 145–146
quarterly and feature. see Quarterly and feature determining requirements granularity levels,
preparation 127–129
for quarterly/feature planning event, 325–328 on the map, 115–117
story. see Story preparation mapping to IIBA and PMI guides, 679
Principles, agile practices objectives, 115
invest aggressively in enterprise agility, 648–650 overview of, 122
overview of, 634 setting parameters, 134–136
protect islands of innovation, 644–647 summary of, 146
tailor approach to circumstance, 635–643 tailoring agile practice to context, 118–121
Principles, Open Space event, 613 tracing requirements/other configuration items,
Priorities 129–133
commitment to revise, 341 tuning the backlog, 124–127
conflicting, 620–621 understanding, 118
Prioritizing features value stream mapping optimizing, 145
feature preparation walkthrough, 604–605 Product
managing stakeholder expectations, 99 champion, 61, 580
quarterly and feature planning agenda, 332 distribution, 643
as right of customer, not developer, 324 empathy in developing/improving, 658
sequencing epics and features in backlog, group, 577
212–215 journey map for development investment, 278
using personas to determine, 269 Lean development optimizes whole, 54
Private (closed) beta testing, 534, 652 Product area
Private process model, BPMN, 293–298 job-based organization structure, 668–669
Pro forma operations, discovery-driven planning, scaling agile organization, 570–571
706–708 Product backlog items (PBIs)
Problem or opportunity statement, 167–169 attributes, 125–126
Problem-solving backlog refinement (preparation), 47
integration meeting, 599 cost of delay, 126
POC meeting, 602 definition of done (DoD), 45
quarterly retrospective, 540 determining WSJF, 126–127
sprint planning meeting, 597 matching with teams, 597
Process physical form of, 125
agile innovative development, 631–632 quarterly and feature planning estimation, 340
analysis, 8–9 readiness, 46
compliance after design of, 105 requirements granularity levels, 127–129
extra, as waste, 52 as requirements units, 38
feature change initiatives as new, 254–255 in rolling analysis, 469
improvement tasks, 518 scaling, 566–569
improving with journey maps improving, 279 Scrum and, 45
innovative product development, 631–632 seeding. see Seeding the backlog
scaling. see Scaling agile process setting up, 124–125
setting parameters, 134 specifying values for story attributes, 404
Process modeling sprint planning meeting, 597
business, 285–287 as story, 395
discovering initial features, 198 story preparation, 394
feature preparation via, 285–287 traceability, 130–133
product portrait for, 170–171 transparency, 46
I 737

Product backlog refinement SAFe, 57


as essential agile activity, 9 structure, scaling agile organization, 572–575
preparation for, 47 Program board, 588–589, 592–595
Scrum, 509 Program increment (PI), SAFe, 57, 582
Product owner council (POC) Progress
BA responsibilities of product champion, 61 accounting for at end of iteration, 513
composition of, 580 check, POC meeting, 602
frequency and timing, 601–602 monitoring team. see Monitoring progress
overview of, 579 Project Management Institute (PMI), mapping of book
scaling agile process, 601 chapters to, 677–681
waterfall approach to, 602 Projects, not used by high-level organizations, 575
Product owner (PO) Promise, quarterly plan sometimes is a, 321
analyst acting as proxy, 557–558 Proof of concept, technical research spike to create,
attends iteration retrospective, 518 419
and BA, 45 Prototypes, Kano analysis and, 204
BA responsibilities of, 59–60 Provisioning, automated DevOps, 562
as daily standup attendee, 473 Proxy PO, 572, 580
Grand Lane analysis and planning, 79–80 Pruning, product backlog refinement, 509, 512–513
insufficient business resources and, 621 Public process model, BPMN, 288–292
iteration goal proposal by, 449–450 Pull systems, Lean software development, 53
managing change during iteration, 495 Purpose alignment model, 88–90, 213
planning iteration implementation, 451 Purpose brand, low-end disruptions, 646–647
product-level, 568
questions for analyst to ask at Triad event, 402 Q
responsibility for user stories, 398–403 QA subgroup, scaling agile organization, 578
role in scaling agile organization, 571–572 Quadrants, purpose alignment model, 88–90
writing acceptance criteria, 408–409 Qualifiers, adding to spike's AC to avoid waterfall,
Product portrait, 169–171, 217–220 417
Product release. see Releasing the product Quality
Product roadmap agile prioritizing, 566
constructing implementation plan, 343–345 DevOps practice of built-in, 561
creating, 243–248 everyone is responsible for, 561
long-term planning with, 240–241 hardening iterations for, 532
planning agenda for long-term, 329 requirements, 34
planning interim periods, 241–243 of transparency, 667
planning shorter horizons, 248 Quality assurance (QA)
Product vision statement acceptance test–driven development and, 56
case study, 174–175 as extended team member, 97
crafting, 172–173 preparing first stories in backlog, 77–78
defined, 32 quarterly release retrospective and, 78, 696
mission statement vs., 173–174 Quarterly (release) roadmap
translate into goals and objectives, 181–184 case study, creating, 348–350
Product visioning. see Visioning with dependencies, 346–347
Production environment, MVP, 232 implementing, 344
Production process, business model disruption in, 642 on the map, 316
Productivity Quarterly and feature planning
diagnosing with cumulative flow diagrams, 489–490 checklist of deliverables, 694
impact of agile on, 23 checklist of inputs, 693
managing stakeholder expectations, 100–101 commitment, 341–348
quarterly release retrospective checklist, 695–696 flow-based feature planning, 318
Product-level PO, 568, 571–572 on the map, 316–317
Program mapping to IIBA and PMI guides, 680
epics, 575 objectives, 315, 317
quarterly release retrospective checklist, 697 overview of, 315
preparing for planning event, 325–328 product release. see Releasing the product
quarterly planning overview, 318 scaling agility. see Scaling agility
quarterly planning timing, 325 Short Lane analysis and planning, 78
quarterly planning versus flow-based feature, Quarterly feature retrospective, scaled, 609–611
319–320 Quarterly Inception/Feature Inception zone
quarterly planning, when advised/not advised, 319 defined, 69
quarterly planning, with agility, 320–322 MVPs/story maps. see Minimum viable product
retrospective, 348–350 (MVP) and story maps
reviewing once underway, 351–352 overview of, 72–73
summary of, 352 quarterly and feature planning, 318
timeboxing pros and cons in, 556–557 Short Lane analysis and planning, 78
topics (agenda), 328–341 Quarterly planning
XP's planning game guidelines, 322–325 attendee checklist, 692
Quarterly and feature planning, scaled flow-based feature planning versus, 319–320
checklist of attendees, 587, 698 readiness checklist, 690–691
creating program board, case study, 592–595 rules of thumb, 682
facilitation guidelines, 589 scaled, 80
inputs and deliverables, 588 Quarterly release retrospective
objectives, 586 checklist of questions, 695–697
overview of, 587 guidelines, 539–542
preconditions for, 586–587 overview of, 539
program board, 588–589 preparing timeline, 542–543
timing considerations, 586 recommendations, 544
topics/agenda, 589–592 scaled, 609–611
Quarterly and feature preparation walkthrough, 543–544
activities, 256–257 Quarterly/release burndown chart, 516–517
architecture review, 307–312 Quarterly/Release Planning Game, XP, 322–325
assessing readiness, 258 Questionable features, Kano grades, 207
benefits of feature preparation, 256 Questionnaire, Kano analysis, 204
BPMN. see Business Process Model and Notation Questions
(BPMN) anyone can ask, 325
business process modeling, 285–287 business analysts not afraid to ask, 65
context analysis, 263–264 in Kano analysis, 202–203
developer tasks and functional spikes, 258–259 quarterly release retrospective checklist, 695–697
feature definition of ready (DoR), 258 Quick wins, non-colocated teams, 618
journey mapping. see Journey mapping Quiz, spotting story-splitting patterns, 431–433
map of, 252–253
mapping to IIBA and PMI guides, 680 R
objectives, 251 R&D (Research and Development), for disruptive
overview of, 251 services, 648
overview of features, 254–256 Rapid learning culture, 566
persona analysis, 264–271 Rational Team Concert (RTC) tool, 700
process modeling, 285–287 Rational Unified Process (RUP)
in rolling analysis, 469 as agile framework, 49
specification of feature acceptance criteria, 259–263 history of agile development, 17, 18
stakeholder analysis, 264 risk prioritization and, 214–215
timing of activities, 257–258 use-case model, 298
use-case modeling, 298–299 Raynor, Michael, 631, 639–640
user-role modeling workshops, 300–306 RC (release candidate) stage, product release, 532
value stream mapping, 283–285 Readiness
Quarterly backlog, 327 assessment, features, 258
Quarterly Closeout (Epic, Feature Closeout) zone determining process, 145–146
defined, 73 quarterly planning checklist for, 690–691
Grand Lane analysis and planning, 81 Scrum, 46
I 739

Readiness checklist staging, 532–539


features, 258 summary of, 547–548
product visioning, 152 to users/user representatives prior to, 531
quarterly planning, 690–691 Reliability, defined, 35
visioning, 686 Requirements
Real Options, agile financial planning, 675 alpha testing for gaps in, 533
Real-time estimation BAs for business, 21
case for, 336 business analysts provide leadership for, 62–63
versus real time/IDD estimates, 335–336 communications plan for change in, 112
story points vs., 10 costs of agile development and, 118
Real-world outcomes, goal and objective metrics, defining types of, 123–124
182–183 determining granularity levels for, 127–129
Real-world testing, beta testing as, 359 early BA focus on, 17
Recommendations, quarterly release retrospective, functional versus non-functional, 380–381
543 Kano grades for variable, 206
Recoverability, defined, 35 management tools, 132, 615, 699–700
Recovery actions, developer tasks, 471 managing long-term planning, 224
Reengineering managing stakeholder expectations about delayed,
process modeling for process, 286 99
value stream mapping for process, 284 persisting, 496
Refactoring software, costs of, 118 tracing, 129–133
Refinement uncertainty pattern and, 427–428
of estimate over time, 334 welcoming change, 29
of estimation units for scope, 334–338 why agile should include BA competency, 24–25
of feature and acceptance criteria, 327–328 Requirements-related terminology
product backlog. see Product backlog refinement acceptance criteria, 40
Regions assumptions, 36–37
cumulative flow diagram, 488 business goal, 32–33
story map, 370 business objective, 33
Regulated sectors, agile in heavily, 629 business requirements, 33
Reinforced, well-formed stories as, 421 business rule, 37
Reinertsen, Donald, 651 constraint, 37
Relationship epics, 38–39
dependencies, story maps, 370 from epics to features and stories, 38
persisting between artifacts, 506–508 feature lifecycle, 40–41
Release candidate (RC) stage, 532 features, 39
Release date, in planning agenda, 330–331 functional requirements (FRs), 34
Release management importance and non-importance of, 36
defined, 581 milestone, 37
team, 581 nonfunctional requirements (NFRs), 34–35
Release Planning Game, XP, 322–325 product vision statement, 32
Release to manufacturing/market (RTM) stage, 534 requirements, defined, 33
Releasing the product requirements units, 38, 66
getting stories to done, 530 solution requirements, 34
on the map, 528–529 stakeholder requirements, 34
mapping to IIBA and PMI guides, 681 stories, 39–40
to market, deferring, 531 story estimation, 40
to market, hardening iterations, 531–532 themes, 40
to market, timing considerations, 530–531 trace goals to requirements, 36
objectives, 527 transition requirements, 35
overview of, 527 user requirements, 34
pivot-or-persevere meeting for, 544–547 Research and Development (R&D), for disruptive
quarterly release retrospective, 539–543 services, 648
quarterly retrospective walkthrough for, 543–544 Research users, and personas, 267
Resources, increased demand for, 106 Risks
Resources and checklists, additional full-potential plan for internal, 228
agile requirements management tools, 699–700 gravity of past success and, 645
checklist of invitees for quarterly planning, 692 interim goals for, 383
checklist of invitees for scaled quarterly and feature managing dependencies and, 345–347
planning, 698 market prioritization versus technical, 214–215
checklist of quarterly and feature planning upfront planning dependent on, 87
deliverables, 694 Roadmap. see Product roadmap
checklist of quarterly and feature planning inputs, Roamers, inter-team collaboration, 616–617
693 Role-Feature-Reason template. see Connextra
checklist of quarterly release retrospective template
questions, 695–697 Roles
facilitation tips, 684–685 backlog seeding, 197
mapping of book chapters to IIBA/ PMI guides, scaling PO, 571–572
677–681 splitting story with multiple user, 428–429
NFRs and constraints checklist, 689 user-role models. see User-role modeling workshop
readiness checklist for quarterly planning, 690–691 XP Quarterly/Release Planning Game, 323
rules of thumb in agile analysis and planning, Roles, business analyst and agile
682–683 agile analysis vs. waterfall, 65
stakeholder checklist, 687–688 agile team analyst, 60
visioning readiness checklist, 686 business systems analyst (BSA), 63
Responsibility, for user stories, 398–403 coach, 61
Responsible procrastination, 659 dedicated business analysts, 61–62
Retrospective overview of, 58–59
planning, 347 product champion (director), 61
quarterly release. see Quarterly release product owner (PO), 59–60
retrospective proxy user, 60–61
in rolling analysis, 469 requirements leadership, 62–63
scaled iteration, 607–609 Rolling analysis and preparation (day-to-day
scaled quarterly/feature, 609–611 activities), monitoring progress
Reveal, Delphi estimation, 339 burndown versus burnup charts, 486–487
Revenue generation (business value), story map ribs, burnup charts, 486
381 cumulative flow diagrams, 487–490
Revenue streams, business model disruption in, 643 daily burndown chart, 479–486
Reverse features, Kano grades, 207 daily standup, 471–474
Reverse income statement, case study, 705–706, follow-up meeting, 474
709–710 monitoring progress. see Monitoring progress
Ribs, constructing story map updating developer task board, 475
dependencies, 381 updating Kanban board, 476–479
frequency, 381 Rolling analysis and preparation (day-to-day
implementation sequence, 379–380 activities)
“juicy bits” first, 381 accounting for progress at end, 513
learning value, 380 actions against developer tasks, 471
overview of, 370, 379 analysis of upcoming epics, features, and stories,
revenue generation, 381 509–512
technological risk, 380–381 analysis tasks, 470
timeline view, 383–384 day in the life of agile analysis, 468–469
user task view, 382 introduction to, 465
WSJF and cost of delay, 380 iteration retrospective, 517–524
Ries, Eric, 54–55, 357 iteration review, 514–517
Risk Reduction and Opportunity Enablement Value managing scope change in iteration, 495
(RR&OE) on the map, 466–467
constructing story map ribs, 379–381 mapping to IIBA and PMI guides, 681
cost of delay, 212–213 objectives, 465, 468
I 741

other analysis documentation, 506–508 and this book, 57


overview of, 468 timeboxed planning, 42–43, 121, 555
story testing and inspection, 491–494 timing of feature preparation, 257
summary of, 524 Scaled iteration
Triad meetings, 470 retrospective, 607–609
updating documentation. see Business analysis (BA) review, 606–607
documentation, updating scaling agile process, 595
updating task progress, 470 sprint planning meetings, 595–597
Rolling lookahead meeting, 462–463 Scaled quarterly and feature planning. see Quarterly
Root-cause analysis and feature planning, scaled
cause-effect diagrams, 157–161 Scaled quarterly and feature retrospective, 609–611
cause-effect tree, 161–166 Scaled quarterly planning, 80
choosing right tool, 162 Scaling agile organization
Five Whys method, 153–157 competency groups, 577–578
at a glance, 153 component teams, 577
overview of approach, 152–153 extended teams, 576
problem or opportunity statement, 167–169 forming feature teams, 575
in visioning process, 147 organizing teams, 79–80
Rose, Willy, 6 overview of, 570
RR&OE (Risk Reduction and Opportunity PO role in, 571–572
Enablement Value) portfolio and program structure, 572–575
constructing story map ribs, 379–381 product owner council (POC), 579–580
cost of delay, 212–213 release management team, 581
RTC (Rational Team Concert) tool, 700 by subproduct and product area, 570–571
RTM (release to manufacturing/market) stage, 534 user task force, 581
Rules, iteration planning, 447 Scaling agile process
Rules of thumb big room iteration planning, 598–599
agile analysis and planning, 682–683 daily standup, 600
agile business analysis, 68 feature preview, 599
RUP. see Rational Unified Process (RUP) initial preparation, 585–586
Ryanair, as example of pivot to established product, integration meetings, 599
364 Open Space event, 611–614
overview of, 581
S plan implementation (team level), 597–598
Sacrificial product, deploying to, 235 product owner council (POC) meeting, 601–602
SAFe. see Scaled Agile Framework (SAFe) scaled (quarterly) feature preparation, 602–605
Safe spaces, for bad news, 653 scaled activities and events, 583–585
Sailboat (or speedboat) game scaled agile frameworks, 582–583
iteration retrospectives, 520–523 scaled iteration (or feature) review, 606–607
quarterly release retrospective, 544 scaled iteration retrospective, 607–609
virtual iteration retrospective, 609 scaled iteration (sprint) planning meetings,
Satisfaction versus fulfillment graph, Kano analysis, 595–598
207–208 scaled quarterly and feature planning, 586–595
Satisfiers, Kano grades, 206 scaled quarterly/feature retrospective, 609–611
Scalability, 35, 649 Scrum of Scrums (SOS) meetings, 600–601
Scaled (quarterly) feature preparation, 602–605 team-level story preparation, 605–606
Scaled Agile Framework (SAFe) Triad meetings, 614
as agile framework, 56–57 user task force meetings, 606
history of agile development, 18 Scaling agility
overview of, 582 culture, 564–566
PO Sync in, 601 interdependency of scaled agile teams, 553–554
process analysis via, 9 inter-team collaboration, light-weight tools,
Scrum of Scrums meetings in, 600–601 615–617
terms, 57 inter-team collaboration, planning, 554–558
introduction, 549 Seeding the backlog
iteration planning, 462 analyzing NFRs/constraints, 216–220
on the map, 550–551 attendees, 197
mapping to IIBA and PMI guides, 681 circumstance-based market segmentation for,
MVPs incrementally, 364–365 75–77, 198
overview of, 552 epics and stories, 196
potential issues and challenges, 617–621 feature attributes, 201–202
product backlog, 566–569 feature independence, 199
reasons for, 552–553 features to seed upfront, 196–197
requirements management software tools, 615 Kano analysis, 202–212
summary of, 622 on the map, 194–196
Scaling agility, continuous delivery (CD) mapping to IIBA and PMI guides, 679
ATDD and BDD, 563–564 objectives, 193
automation in test-build-deploy steps, 558–559 other ways to discover initial features, 198–199
CD and CI, 561–562 overview of, 193
DevOps practices, 559–562 physical representation of features, 200–201
overview of, 558 sequencing epics and features in backlog, 212–215
Schwaber, Ken, 44 specifying emergent features, 200
Scope summary of, 220
commitment to forecasting, 341–342 template for epics and features, 199
estimation units refine, 334 writing feature acceptance criteria, 215–216
flow-based feature planning and, 318 Seeding the Backlog zone
forecasting iteration, 447–451 activities in, 196–197
iteration, 446 defined, 72
as journey map component, 274 seeding the backlog, 76
managing change during iteration, 495 Seeing waste, Lean software development, 52
planning iteration implementation, 451–452 Segmentation, circumstance-based market, 630–631
Scope line, daily burndown chart, 480 Self-organizing
Scouts, 616, 619 daily standup as, 472
Scrum defined, 95
as agile framework, 44–45 twelve principles for BAs, 30
BA competency and, 45 Self-reflection, twelve principles for BAs, 30
backlog refinement, 47 Self-sufficiency, development team, 94–95
daily standup (or scrum), 47–48 Send event, BPMN private process model, 295
definition of done (DoD), 46 Senior development manager, POC, 580
history of agile development, 17 Senior product manager, POC, 580
iterations known as sprints in, 444 Sequence flows, BPMN, 288, 293
product backlog items (PBIs), 38, 45 Sequencing, epics and features in backlog, 212
product backlog refinement, 509–512 Servant leadership, agile, 564–565
product owner (PO) and BA, 45 Servant Leadership (Greenleaf), 564–565
readiness, 46 Service delivery, business model disruption in, 643
ScrumMaster and BA, 46 Service-level requirements (SLRs), 34
sprint, 45 Seven wastes, Lean software development, 51–52
sprint (iteration) planning, 47 70/20/10 rule, change, 653
sprint goal, 47 Shared
sprint review and retrospective, 48 components, team dependencies, 554
timeboxed planning in, 42–43, 121, 555 team members, 617
transparency, 46 well-formed AC should be, 412
using timeboxed planning. see Timeboxed planning Shift left, DevOps, 560
The Scrum Guide, 9, 44 Shillace, Sam, 360
Shingo, Shigeo, 50
Scrum of Scrums (SoS) meetings, 600–601 Short Lane, 73, 74–78
ScrumMaster, 46, 60 Shorter horizons, product roadmap for, 248
Security, defined, 35 Short-term wins, accelerating change via, 110
I 743

Showstopper errors, alpha testing for, 533 Split (A/B) testing


Sign up, developer task, 454 actionable metrics, 187
Signoffs, iteration review, 514 staging release, 539
Silent estimation, Delphi, 339 value validation, 491–494
Silos, busting Splitting stories, patterns
business can lead technical teams, 668 Business Rules pattern, 424
collaborative culture and, 559, 566 Complex UI pattern, 426
communities of practice (guilds) for, 669–672 Data Complexity pattern, 425
cross-functional teams organized around value, how to use, 422
668 Integration Capabilities pattern, 428
everyone works for the business, 667 Large Initial Effort pattern, 424
job-based organization, 668–669 Multiple Devices, Platforms pattern, 428
overview of, 667 Multiple Use-Case Scenarios pattern, 425
Simplicity, as Lean thinking principle, 30, 51 Multiple User Goals pattern, 426–427
Single source of truth, product backlog as, 124–125 Multiple User Roles pattern, 428–429
Site visits, for non-colocated teams, 618 NFR Implementation pattern, 426
Size overview of, 422
determining item, in Kanban, 143 quiz, 431–433
development team should be small, 95 tie-breaker rules, 422–423
DevOps practice of small batch, 561 Too Many Acceptance Criteria pattern, 429–430
diagnosing stories that are too big, 490 Uncertainty pattern, 427
splitting stories into patterns. see Splitting stories, Workflow Steps pattern, 423–424
patterns Splitting stories, rules of thumb for estimating,
taxonomy of story, 395–396 682–683
varying product backlog, 125 Spoken needs, Kano grades, 206
well-formed stories have small, 421 Spreadsheets, traceability, 132
SLRs (service-level requirements), 34 Sprint
SMART, 330 backlog, 47
SMEs (subject matter experts), attending Triad goals, 47, 383–387
meetings, 402–403 planning. see Iteration planning
Smoke tests, analyze-code-build-test cycle, 492 review, 48, 606
Smoke-and-Mirrors MVPs (or Swivel Chairs), Sprints, Scrum, 44, 45, 444
360–361 Stabilization (or IP) iteration, quarterly planning, 325
Soft skills, agile business analysts, 63–66 Stabilizing (hardening) iterations, 531–532
Software Stages, product release
core value of Agile Manifesto, 28–29 alpha testing, 533
costs of refactoring, 118 analysis documentation, 538
delivering frequently, 29–30 beta testing, 533–534
Solitude, developing collaborative culture in, 665 closed (private) beta testing, 534
Solution requirements, 34, 123 general availability, 535–539
Spanning application, 361, 383–385 general availability checklist, 535–538
Specification as journey map component, 276–277, 281–282
by example, 66, 409–410 monitoring, 538
updating use-case, 503–506 open (public) beta testing, 534
Spike card, story map, 368–369 overview of, 532–533
Spikes pre-alpha, 533
functional. see Functional spikes release candidate (RC), 534
technical, 418–419 release to manufacturing/market (RTM), 534
Spikes, SAFe value validation, 539
feature preparation, 258–259 Staging environment, MVP, 231–232
process analysis, 9 Stakeholder
quarterly and feature planning estimates, 340 agile analysis vs. waterfall, 67
story preparation, 415–416 agile impacts satisfaction of, 23
attending backlog seeding, 197 preparing for scaled initiative, 585
cause-effect diagram tips for, 157–158 requirements-related terminology for, 39–40
checklist, 687–688 scaling backlog, 567
feature preparation, 264 testing and inspection, 490–494
identifying, 152 themes, 40
requirements, 34, 123 tracking developer tasks on burndown versus, 482
visioning of, 147, 150 tuning, 138–142
visioning process of, 147 updating BA documentation, 496–506
Stakeholder analysis and engagement updating developer task board, 475
collaboration plan, 176–178 updating Kanban board, 476–479
communication plan, 178–179 use case vs. user, 50
identify via checklist, 176 user story, 39–40
list, roles, and responsibilities table, 176 writing, 323
ongoing engagement and analysis, 179–181 as XP contribution to agile, 17
overview of, 175–181 XP functional units as, 38
Standard operating procedures, process modeling, 286 Stormboard, brainstorming tool, 700
Standards, business analysis, 30–31 Story maps
Standish Group, 18–20, 22–23 anatomy of, 368–370
Start event, BPMN, 288, 293 benefits of, 367
Startups, must accelerate growth, 654 case study, creating backbone, 375–379
State-transition diagram, updating Kanban board, case study, stories for MVP, 384–386
476–479 completing, 438–440
Statistical group response, Delphi estimation, 339 constructing ribs, 379–384
Status updates, daily standup, 473–474 defined, 366
Steering phase, XP Quarterly/Release Planning Game, dependency relationships, 370–375
322 Jeff Patton's, 366–367
Steps outlined level, requirements granularity, 128 journey maps build, 279
Stories MVPs and, 353–356
acceptance criteria, 40 MVPs complement, 356
acceptance template, 40 other forms of, 386–388
actions against developer, 471 summary of, 388
avoiding gating, 142 Story planning. see Iteration and story planning
business prioritizing, 324 Story point estimation
communicating via, 7 case for, 336–337
continuous basis activities to get product done, 530 as estimation unit, 335
daily burndown chart, 481–482 Fibonacci sequence for, 337–338
definition of, 39, 196, 395 measuring complexity versus effort via, 335
development estimates, 324 versus real time/IDDs estimates, 335–336
from epics to features to, 38 Story points
estimating, 40 estimating functional spikes via, 418
estimating/splitting, 682–683 estimating other kinds of stories, 340–341
features are bigger than, 254 real-time estimates vs., 10
goals and objectives are represented in, 183–184 story estimation using, 40
grouping into themes, 40 Story preparation
incomplete, 513 alternative terminology, 395
Kanban board setup for iteration planning, analyzing business rules/AC, 433–438
458–462 case study, complete story map, 438–440
long-term planning requirements, 224 definition of story, 395
mapping. see Minimum viable product (MVP) and introduction to, 391
story maps map of, 392–393
measuring progress on burnup chart, 486 mapping to IIBA and PMI guides, 680
ongoing analysis of upcoming, 509–512 naming standards, 396–397
persisting requirements, 496 objectives, 391, 394
I 745

overview of, 394 feature preparation, 258–259


physical versus electronic stories, 403–404 identifying developer, 452–456
product backlog refinement, 510–512 planning iteration implementation, 451–452
responsibility for user stories, 398–403 product backlog refinement, 509
size taxonomy, 395–396 updating progress of, 470
splitting. see Splitting stories, patterns TDD. see Test-driven development (TDD)
stories that are not user stories, 414–420 Team PO, 572, 580
story acceptance criteria, 407–414 Teams
summary of, 440 cumulative flow diagrams, 608
team-level, 605–606 feature review, 607
Three Cs of stories, 397–398 interdependency of scaled agile, 553–554
user story examples, 397 inter-team collaboration via, 616
values for story attributes, 404 matching present backlog items with, 597
writing high-quality stories, 420–421 role of agile analyst, 60
writing story description, 404–407 in SAFe, 582
Story telling, Delphi estimation, 339 scaling agility for non-colocated, 617–619
Strategic initiatives, accelerating change, 109 self-organizing, 30
Strengthening, agile fluency model, 108 story preparation, 605–606
Subconscious requirement, Kano grades, 206–207 working with waterfall, 619–620
Subject matter experts (SMEs), attending Triad Technical benefits, of story, 395
meetings, 402–403 Technical capabilities, MVP implementation, 231–232
Submit claim Technical constraints, hardening iterations, 531
developer task board and, 453, 475 Technical debt
tracing analysis artifacts, 506–508 balancing user features and, 341
updating use-case specifications, 503–504, 506 defined, 118
Subproducts, 568, 570–571 prioritize new development, and payment of, 215
Success, agile financial planning and, 103 technical debt-payment spike, 419
Supplementary requirements, 34 Technical limitations, deferred deployment due to, 236
Supply chains, preparing, 104 Technical research spike (or story), 419
Survey, Kano analysis, 204 Technical risk, 214–215
Sustain acceleration, for change, 110 Technical spikes (or stories)
Sustaining innovation, 637, 644 business benefits, 324
Sutherland, Jeff, 44 quarterly and feature planning estimates, 341
Swarm, developer tasks, 471 types of, 418–419
Swimlanes, process modeling with/without, 287 Technical teams, 668
Swimlane-workflow, 287 Technological risk, constructing story map ribs, 380
Swivel Chairs (Smoke-and-Mirrors MVPs), 360–361 Technology
Symbol set, BPMN standard, 287, 295 invest aggressively in enterprise agility, 648–650
Systems analysts, many BAs were, 18 quarterly release retrospective, 695
uncertainty pattern regarding, 427–428
T Template
Tables, product portrait, 170–171 BDD. see Behavior-driven development (BDD)
Targeted features, versus committed, 343 Given-When-Then. see Behavior-driven
Targets development (BDD)
full potential plan defines bold, 225–226, 630 information persona, 268–269
selecting features in Kano analysis, 202 journey map, 275
trade-off of costs and benefits, 119–121 minimal quarterly plan, 344
visioning as essential, 151 product portrait, 170–171
Task switching, as waste, 52 product roadmap, 241
Tasks quarterly roadmap, 344
actions against developer, 471 Role-Feature-Reason. see Connextra template
analysis, 470 story map, 369
developer task board in iteration planning, 446 user story, 39–40
Tentative acceptance, feature preparation, 605 setting WIP limits, 144
Test pyramid, 92–93 starting iteration, 351
Test-build-deploy steps, automation in, 558–559 Timeline
Test-driven development (TDD) constructing story map ribs, 383–384
as agile framework, 56 interim, 242
ATDD and BDD, 563–564 quarterly release retrospective, 542–543
continuous development and, 562–563 updating in quarterly release, 543
history of agile development, 18 Timing
Testing. see also Test-driven development (TDD) daily standup considerations, 472
alpha, 359, 533 of feature preparation, 257–258, 510
beta. see Beta testing of feature preview meeting, 462–463
as continuous and automated, 562 functional spikes and, 416
developer task, 471 iteration retrospective, 517
focus on compliance goals, 105 Open Space events, 612
inviting representative to Triad meetings, 401 pivot-or-persevere meetings, 545
Lean software development, 53–54 POC meeting, 601–602
MVP, 358–359 as quality of transparency, 667
of planning assumptions, 190 quarterly planning, 325
preparing infrastructure for, 90–93 releasing product to market, 530–532
questions for analyst to ask at Triad event, 403 scaled feature preparation, 603
in rolling analysis, 469 scaled quarterly and feature planning, 586
split (or A/B), 187–188 story preparation, 511–512
story inspection and, 490–494 of Triad meetings as story develops, 400
well-formed AC and, 412 Titles, generic job, 664
well-formed stories and, 421 Too Many Acceptance Criteria pattern, splitting
Themes, 40, 183 stories, 429–430
Thoughts, as journey map component, 278, 282 Tools
Three Cs of stories, 397–398 agile analysis vs. waterfall, 67
Time agile requirements management, 699–700
estimates for developer tasks, 454–455 Lean software development, 52–54
spent on estimation, 332 lightweight, 67, 615–617
spent on product backlog refinement, 509 traceability, 130–131
Time criticality, 126, 380–381 Topics/agenda
Time-and-materials contract, 120 iteration retrospective, 518–520
Time-based estimates, 335 iteration review, 515–516
Timeboxed planning Open Space events, 613–614
feature preparation timing, 257 quarterly and feature planning, 328–331
feature review via, 607 quarterly and feature planning, scaled, 589–592
flow-based planning versus, 555 sprint planning meeting, 596
flow-based vs., 121 Top-level product, only one, 568
frameworks supporting, 121 Total potential person-days, forecasting capacity,
for frontend, 556–557 448–449
history of agile development, 17 Touchpoints
increments in, 446–447 BPMN private process model, 296
iteration implementation, 451 BPMN public process model, 288, 291
iteration planning, 444 journey map, 278, 282
Kanban board columns, 459–462 Toyota Production System, lean software
overview of, 42–43 development, 628
quarterly planning for, 319–320 Tracing, 129–133, 506–508
requirements granularity levels, 127 Track record, of business analysis, 19–23
in rolling analysis, 469, 509 Transfer risk, 346
Scrum, 44 Transition
setting process cadence, 134 phase of RUP lifecycle, 49
I 747

requirements, 35, 123 task progress in rolling analysis, 470, 471


timeline, 111 timeline, in quarterly release retrospective, 543
Transparency Upfront planning, 87–88
agile corporate culture practice of, 666–667 Upward (backward) traceability, 130, 506
Scrum control through, 46 Urgency
siloing versus, 667 accelerating change, 109
Triad meetings of agile corporate culture, 633
analyst guidelines for, 402–403 Usability
analyzing AC during, 262 alpha testing for, 533
attendees, 401–402 defined, 35
backlog preparation via, 400–403 Use cases
benefits of, 400 as agile framework, 49–50
inputs and deliverables, 401 discovering initial features, 198–199
in rolling analysis, 470 implementing change initiative via, 255–256
scaling agile, 614 job-based organization as high-level, 668–669
story preparation for, 605–606 modeling, 298–299
timing considerations, 400 Multiple Use-Case Scenarios pattern, 425
Trint narrative, 50
case study, MVP, 357–358 scenario, 50
as Differentiator MVP, 360 slice, 395
as disruptive innovation, 638 tracing analysis artifacts, 506–508
as mainstream disruption, 642 versus user stories, 50
Trunk-based development, DevOps, 562 Use-case brief, 505
T-shirt sizes, as estimation units, 334 Use-case model, updating BA documentation
Twelve principles, 29–31 capturing artifacts, 499–501
12 Principles behind the Manifesto, agile, 18 case study, 501–502
Two-hands rule, daily standups, 473 hybrid approach, 498
overview of, 497
U updating use-case specifications, 503–506
UAT. see User acceptance testing (UAT) use-case-only approach, 498
Uber, as disruptor, 640–641 Use-case only approach, 498
UML Use-case slice, 50, 497, 498
and BPMN, 57–58 Use-case specifications
communication diagrams, 308 updating, 503–506
UML for the IT Business Analyst (Podeswa), 6 updating use-case model, 497
Unambiguous, well formed story as, 421 use-case only approach, 498
Uncertainty pattern, splitting stories, 427 User acceptance testing (UAT)
Underestimating signature, burndown charts, in analyze-code-build-test cycle, 490–491
484–485 as continuous and automated, 562
Undesirable effects (UDEs) do not skip BA in agile development, 8–9
BLInK improved outcomes for, 166 lifecycle across states of Kanban board, 479
cause-effect trees, 162–165 specifying feature AC, 259
problem statement, 168–169 specifying using Gherkin feature files, 263
product vision statement and, 174–175 test pyramid, 92
Unit tests, most automated tests should be, 92 validating value with, 493
Units, estimation, 334–338 User feedback, stakeholder expectations and, 100
Updates User interface design, 269, 426
acceptance criteria, 409 User involvement, BA track record for, 20
BA documentation, 496–506 User journeys, product portrait for, 170–171
daily standup status, 473–474 User proxy, 581
developer task board, 475 User requirements, 34, 123
iteration goal and scope, 448 User stories
Kanban board setup, 476–479 analyst value added to, 399–400
decision table example, 435–436 Value Stream Skeleton MVPs, 361–362
defined, 39–40, 395 Vanity metrics, 187
examples of, 396 Variable requirements, Kano grades, 206
implementing change initiative as, 255 Velocity
responsibility for, 398 capacity, in planning agenda, 331
stories that are not, 414–416 revising in quarterly plan, 351–352
templates, 39–40 Vertical slices of functionality, agile vs. waterfall, 67
Triad approach, 400–403 Virtual iteration retrospectives, 608–609
updating use-case model with, 498 Visio, creating diagrams with, 700
using personas, 269 Vision
writers of, 398–399 accelerating change, 109
User story card, story map, 368–369 articulating change, 652
User tasks, 368–369, 374, 581, 606 determining, 632
User value distributed authority equals clear span of, 660
crafting iteration goal, 449 leadership with accountability and, 565
of story, 395 reviewing in planning agenda, 329
story mapping, 381 Visioning
User-role modeling workshop crafting product or vision statement, 172–175
agenda, 300 goals and objectives analysis, 181–184
case study, 304–306 leap of faith and. see Leap of faith hypotheses
consolidate user roles, 302 on the map, 148–149
overview of, 300 mapping to IIBA and PMI guides, 679
refine user roles, 303 objectives, 148–149
user roles, 300–301 overview of, 147, 150–151
User-task view, story mapping, 382 problem or opportunity statement, 167–169
UX designers, POC, 580 product and epic, 150–152
product portrait, 169–171
V readiness checklist, 686
Value root-cause analysis, 152–166
assessing backlog items, 193 specifying product or epic, 166–167
business analyst adds to iteration goal, 450 stakeholder analysis/engagement activities,
in cost of delay, 126 175–181
delivered by iteration goal, 449 summary of, 192
focusing estimates on, 334 Visual cues, 132
hypotheses, 186, 188 Visualization, inter-team collaboration via, 616
organize cross-functional teams around, 668 VoC (voice of the customer), 658
organize teams around, 95–96 Voice of the customer (VoC), 658
specifying story attributes, 402 Void risk, 345
story is work item that delivers, 39, 395 Volunteer army, accelerating change, 109
validation, 493, 539 Vulnerabilities, alpha testing for, 533
well-formed stories deliver, 421
Value proposition. see also Agile analysis and W
planning, value proposition Waiting, as waste, 52
as decisions maximizing business value, 7 Walk the board, alternative to daily standup, 474
for organizations with no agile experience, 110 Walking Skeleton (spanning application) MVP, 361
Value stream analysis, 658 Walkthrough
Value stream, defined, 283 big room iteration planning, 598–599
Value stream mapping Open Space events, 613–614
business process models versus, 285 pivot-or-persevere meetings, 545–546
feature change initiative delivered as, 255 POC meeting, 602
feature preparation via, 283–285 quarterly release retrospective, 543–544
Lean software development, 53 scaled feature preparation, 604–605
optimizing process, 145 scaled quarterly feature retrospective, 610–611
overview of, 283–285 virtual iteration retrospective, 609
I 749

Waste Who cares? quadrant, purpose alignment model, 88, 90


acceptance criteria helps drive out, 410 Whole teams, XP, 49
estimation is, 338 Whole-product level, 568
Lean software development tools and, 52–54 Whole-team culture, 94
reducing with responsible procrastination, 659 Who-What-Why template. see Connextra template
seven wastes in Lean software development, 51–52 “Why?”
stakeholder productivity expectations and, 101 Connextra template clause, 406–407
updating BA documentation to avoid, 496 culture embracing change does not ask, 653
value stream mapping highlights/reduces, 283 decision tables, 434
Waterfall problem or opportunity statements, 167–169
agile key practices versus, 65–68 product portrait, 170
agile long-term planning versus, 224 “Why not?,” culture embracing change asks, 653
agile replacing, 22 Wide and shallow, long-term implementation of
changes in, 14 features, 238–240
failure of, 22 WIP. see Work-in-progress (WIP) limits, Kanban
step-by-step analysis and planning of, 5 Wireframes, product portrait for, 170–171
success of agile vs., 23 Work, included in estimates, 333
using qualifiers to avoid, 417 Work items
working with teams, 619–620 inter-team collaboration via sequential, 617
Weighted shortest job first (WSJF) Kanban, 37, 44, 395
constructing story map ribs, 380 Workaround card, story map, 368–369
determining for PBI, 126–127 Workflow Steps pattern, splitting stories, 423–424
for sequencing decisions, 341 Work-in-progress (WIP) limits, Kanban, 44, 144,
sequencing epics and features in backlog, 202, 213 476–479
“What?” Wow feature, 206–207, 269
Connextra template clause, 406, 407 Writely, as Differentiator MVP, 360
decision tables, 434 WSJF. see Weighted shortest job first (WSJF)
problem or opportunity statement, 167–169
product portrait, 170 X
well-formed AC describes, 412 XP (Extreme Programming)
“When?” The Planning Game guidelines, 322–325
problem or opportunity statement, 167–169 using timeboxed planning. see Timeboxed planning
product portrait, 170
“Where?,” problem or opportunity statement, Y
167–169 Y-axis units, cumulative flow diagram, 488
Whiteboarding, scaled feature preparation, 602–603
“Who?” Z
Connextra template clause, 406 Zones, Agile Analysis and Planning Map, 69–71,
problem or opportunity statement, 167–169 72–73

You might also like