T20  Metrics
5/11/17  15:00

An Agile Testing Dashboard: Metrics that Matter

Presented by: Prachi Maini
Morningstar

Brought to you by:

350 Corporate Way, Suite 400, Orange Park, FL 32073
888-268-8770 · 904-278-0524 · info@techwell.com · http://www.starwest.techwell.com/
Prachi Maini

Prachi Maini is the quality assurance manager at Morningstar. With fourteen years of global quality assurance experience, Prachi has been involved with providing technical direction, automation vision, tool selection, resource planning, budgeting, forecasting, and hiring for leading insurance, financial, healthcare, and investment organizations with complex systems and multiple dependencies. An agile enthusiast, Prachi loves talking about agile measurements and strategizing the QA role as part of agile squads. Committed to operational excellence and customer satisfaction, she is a strong test automation and continuous integration advocate who loves to act as a liaison between the technical and business teams. Reach out to Prachi via LinkedIn.
©2015 Morningstar, Inc. All rights reserved.
Prachi Maini
Manager, QA Engineering
Morningstar, Inc.
Agile Metrics that Matter
Executive Summary
The Concept
The Opportunity
The Potential
•  Define metrics that can be used by agile teams and team management
•  Reduced costs
•  Increased team satisfaction
•  Auto-generate metrics using exposed APIs provided by various tools (see the sketch below)
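A minimal sketch of auto-generating one dashboard row from a tracking tool's REST API. The endpoint, field names, and token handling are hypothetical placeholders, not any specific tool's API; substitute whatever your test-management or agile tool actually exposes.

```python
# Pull committed vs. completed points for a sprint from a (hypothetical) tool API.
import requests

BASE_URL = "https://tracker.example.com/api"   # placeholder host
TOKEN = "..."                                  # token issued by the tool

def sprint_summary(sprint_id: int) -> dict:
    """Fetch committed vs. completed points for one sprint (field names assumed)."""
    resp = requests.get(
        f"{BASE_URL}/sprints/{sprint_id}/summary",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    return {
        "sprint": data["name"],
        "committed": data["committedPoints"],
        "completed": data["completedPoints"],
    }

if __name__ == "__main__":
    for row in (sprint_summary(s) for s in range(74, 79)):
        print(f'{row["sprint"]}: {row["completed"]}/{row["committed"]} points')
```

Scheduling a script like this (cron, CI job, etc.) is what makes the dashboard "automatically generated" rather than hand-collected.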
Ø Independent
investment research
and management
firm headquartered in
Chicago
Ø Consumer report for
securities
Ø Agile Squads
(5-7dev, 1-2 QA, PO,
SM, Designer)
Ø Two week sprints
Ø Toolset
•  QAC for test case management
•  Selenium for functional
automation
•  ReadyAPI for webservices
•  Webload for Performance
ar
3
Why do we need metrics?
Ø Drive strategy and direction.
Ø Provide measurable data and trends to the
project team and management.
Ø Ensure that the project remains on track.
Ø Quantify risk and process improvements.
Ø Ensure customer satisfaction with the
deployed product.
Ø Assist with resource/budget estimation and
forecasting.
Effective Metrics
Ø Are clearly defined so the team or organization can benchmark its success.
Ø Have buy-in from management and employees.
Ø Have a clearly defined data source and collection process.
Ø Are measurable and shared.
Ø Can be automatically generated and scheduled.
Agile QA Dashboard
Ø Focus on trends rather than absolute values.
Ø Agile defines team members as anyone responsible for project execution and delivery, which includes developers, QA, and documentation.
Ø Used by teams to perform introspection on their own performance and to feed into release planning.
Ø Core agile metrics should not be used to compare different teams or to single out underperforming teams.
Project Progress
•  Burndown Chart
–  Graphical representation of work remaining vs. time.
•  Committed vs. Completed
–  Points completed by the squad as a percentage of the points committed for the sprint (see the sketch after the tables below).
•  Tech Category
–  Identifies how an agile team is spending its time. Possible values include client customization, new product development, operations, and maintenance.
Committed vs Completed    Sprint 74   Sprint 75   Sprint 76   Sprint 77   Sprint 78
Committed                       148         105          98          53         154
Completed                        74          87          75          51         125
[Chart: Committed vs. Completed by sprint, Sprints 74–78]
Tech Category             Sprint 74   Sprint 75   Sprint 76   Sprint 77   Sprint 78
Client Customization              0           0           0           6          16
New Product Development         157         171          91         144         109
Maintenance                      57          54          52          46          59
Other                            23          10           0           0           0
[Chart: Work done by Tech Category, Sprints 74–78]
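A minimal sketch of the committed-vs-completed calculation, using the figures from the Committed vs Completed table above; everything else is illustrative.

```python
# Committed vs. Completed: points completed as a percentage of points committed per sprint.
committed = {"Sprint 74": 148, "Sprint 75": 105, "Sprint 76": 98, "Sprint 77": 53, "Sprint 78": 154}
completed = {"Sprint 74": 74, "Sprint 75": 87, "Sprint 76": 75, "Sprint 77": 51, "Sprint 78": 125}

for sprint, points_committed in committed.items():
    pct = 100.0 * completed[sprint] / points_committed if points_committed else 0.0
    print(f"{sprint}: {completed[sprint]}/{points_committed} points ({pct:.0f}% completed)")
```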
What to watch for:
ü The team finishes early sprint after sprint because it is not committing enough points.
ü The team is not meeting its commitments because it is overcommitting each sprint.
ü The burndown is steep rather than gradual because work is not broken into granular units.
ü Scope is often added or changed mid-sprint.
Velocity
•  Velocity
–  Points of work completed by an agile team within a given sprint.
•  Adjusted Velocity
–  Points of work completed by an agile team, adjusted for holidays, team absences, etc.
–  Calculated from velocity and available man-days; in the capacity table below, adjusted velocity ≈ points completed × available days / net days (see the sketch after that table).
–  The running average of the last three sprints is reported.
[Chart: Adjusted Velocity and Avg Velocity (Last 3 Sprints), Sprints 74–78]
Capacity                        Sprint 74   Sprint 75   Sprint 76   Sprint 77   Sprint 78
Team Size                               8           8           8           8           8
Available Days                         80          80          80          80          80
Unavailable Days                       10          12          11           5           0
Net Days (Capacity)                    70          68          69          75          80

Velocity                        Sprint 74   Sprint 75   Sprint 76   Sprint 77   Sprint 78
Total Points Completed                 73          87          75          51         125
Adjusted Velocity                      83         102          87          54         125
Avg Velocity (Last 3 Sprints)          88          90          91          81          89
What to watch for
ü An erratic average velocity over a period of time requires revisiting the team's estimation practices.
ü Are there unforeseen challenges that were not accounted for when estimating the work?
DO NOT
ü Use velocity to compare two different teams, since the level of work estimation differs from team to team.
ü Use velocity to identify lower-performing teams.
Quality of Code
•  First Pass Rate
–  Used for measuring the amount of rework in the process.
–  Defined as the number of test cases that pass on first execution divided by the total executed:
   FPR = Passed on First Execution / Total
–  Tracked only for stories that deal with the development of new APIs or features (see the sketch after the table below).
Story        Total TCs   Pass   Fail   FPR
Navy
PHX-10112            2      1      1   0.5
PHX-10411            8      6      2   0.75
PHX-10382           15      8      7   0.8
PHX-7703            10      6      4   0.6
PHX-10336            1      1      0   1
Total               34     21     13   0.62
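A minimal sketch of the First Pass Rate calculation. The story IDs and counts echo a few rows from the table above; everything else is illustrative.

```python
# FPR = test cases that passed on first execution / total test cases executed.
def first_pass_rate(passed_first: int, total: int) -> float:
    return passed_first / total if total else 0.0

stories = {                      # story: (passed on first execution, total test cases)
    "PHX-10112": (1, 2),
    "PHX-10411": (6, 8),
    "PHX-10336": (1, 1),
}

for story, (passed, total) in stories.items():
    print(f"{story}: FPR = {first_pass_rate(passed, total):.2f}")

sprint_passed = sum(p for p, _ in stories.values())
sprint_total = sum(t for _, t in stories.values())
print(f"Sprint FPR = {first_pass_rate(sprint_passed, sprint_total):.2f}")
```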
What to watch for
Ø Lower first pass rates indicate that agile practices such as desk checks and unit testing are not being used sufficiently.
Ø A lower first pass rate could also indicate a lack of understanding of the requirements.
Ø A higher first pass rate combined with a high defect rate in production could indicate a lack of proper QA.
Bug Dashboard
–  Net Open Bugs / Created vs. Resolved
•  Gives a view of the team's flow rate: are we creating more technical debt and defects than the team can resolve? (See the sketch after the bug-count tables below.)
–  Functional vs. Regression Bugs Trend
•  Identifies defects found in new development vs. regression.
–  Defects Detected In
•  Identifies the environment in which each defect is detected (QA, Staging, UAT, Production).
•  For defects detected in an environment higher than QA, an RCA is needed.
                Sprint 74   Sprint 75   Sprint 76   Sprint 77   Sprint 78
Bugs Opened            68          36          33          41          17
Bugs Closed            30          30          16          38          15
Net Open Bugs         438         444         461         464         466
[Chart: Net Open Bugs, Sprints 74–78]
[Chart: Feature vs. Regression bugs, Sprints 74–78]

                Sprint 74   Sprint 75   Sprint 76   Sprint 77   Sprint 78
Regression             16          24          16          38          40
Feature                68          36          33          41          17
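A minimal sketch of the created-vs-resolved flow check behind the Net Open Bugs trend. The per-sprint figures come from the table above; the starting backlog of 400 is back-calculated from that table rather than stated on the slide.

```python
# Net open bugs: does the team resolve defects faster than it creates them?
opened = {"Sprint 74": 68, "Sprint 75": 36, "Sprint 76": 33, "Sprint 77": 41, "Sprint 78": 17}
closed = {"Sprint 74": 30, "Sprint 75": 30, "Sprint 76": 16, "Sprint 77": 38, "Sprint 78": 15}

net_open = 400   # assumed backlog entering Sprint 74 (inferred, not from the slide)
for sprint in opened:
    net_open += opened[sprint] - closed[sprint]
    flow = "growing" if opened[sprint] > closed[sprint] else "shrinking or flat"
    print(f"{sprint}: net open = {net_open} (backlog {flow})")
```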
            Sprint 74   Sprint 75   Sprint 76   Sprint 77   Sprint 78
QA                 25          30          28          30          32
Staging             5           4           4           4           3
UAT                 1           3           1           3           1
Prod                2           1           2           1           4
[Chart: Defects detected by environment (QA, Staging, UAT, Prod), Sprints 74–77]
                Sprint 74   Sprint 75   Sprint 76   Sprint 77   Sprint 78
Code                   15          21          21          22          19
Config                  5           4           4           6           3
Requirements            1           3           1           3           1
Hardware                2           1           2           1           8
Data                    2           9           7           4           5
[Chart: Defects by root cause (Code, Config, Requirements, Hardware, Data), Sprints 74–78]
[Chart: Defects by reason (QA oversight, Environment, Requirements, Existing Issue, Data), Sprints 74–78]
                  Sprint 74   Sprint 75   Sprint 76   Sprint 77   Sprint 78
QA oversight             15          21          21          22          19
Environment               5           4           4           6           3
Requirements              1           3           1           3           1
Existing Issue            2           1           2           1           8
Data                      4           4           5           4           6
What to watch for
Ø An increase in the regression bug count indicates the impact of code refactoring.
Ø An increased bug count in non-QA environments due to environment differences requires revisiting the environment strategy.
Ø An increased bug count in non-QA environments due to QA oversight requires revisiting the testing strategy.
Automation
•  Number of automated test cases
–  Percentage of automated test cases out of total automation candidates.
–  Percentage of automated test cases out of total test cases.
–  Can be reported separately for API, functional, and migration testing (see the sketch after the coverage table below).
•  Defects Found by Automation
–  Number of defects found via automation vs. manual testing.
V2 Functional Test Cases                Sprint 74   Sprint 75   Sprint 76   Sprint 77   Sprint 78
# of Total Test Cases                        2444        2611        2684        2782        2822
# of Automatable Test Cases                  1541        1642        1690        1755        1774
# of Test Cases Automated                    1138        1234        1267        1339        1351
% Coverage of Automatable Test Cases       73.85%      75.15%      74.97%      76.30%      76.16%
% Coverage of Total Test Cases                47%      47.26%         47%      48.13%         48%
[Chart: Test case counts and automation coverage percentages, Sprints 74–78]
[Chart: AEM Legacy Site Regression Testing: average execution, analysis, and total time (minutes), Jan–Apr 2016]
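A minimal sketch of the two coverage percentages, using the Sprint 78 figures from the table above; everything else is illustrative.

```python
# Automation coverage: automated test cases as a share of candidates and of all test cases.
def coverage(automated: int, denominator: int) -> float:
    return 100.0 * automated / denominator if denominator else 0.0

total_tcs, automatable_tcs, automated_tcs = 2822, 1774, 1351   # Sprint 78

print(f"Coverage of automatable test cases: {coverage(automated_tcs, automatable_tcs):.2f}%")  # 76.16%
print(f"Coverage of total test cases:       {coverage(automated_tcs, total_tcs):.2f}%")        # 47.87%, shown as 48% on the slide
```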
What to watch for
Ø A decrease in automation coverage could indicate that a lot of automation capacity is being spent on script maintenance.
Ø Coverage helps identify the percentage of the application that can be effectively monitored and regressed on a recurring basis.
Team Sentiments
•  Survey questions distributed to agile teams
–  Team members are anonymously asked to rate, on a scale of 1 to 10, questions pertinent to the project. Examples include, but are not limited to:
•  Understanding of the vision of the project
•  Quality of the user stories
•  Collaboration between team members
–  Responses are tabulated and shared with the team (a minimal tabulation sketch follows).
–  Trends are noted over time to identify the team's alignment with the company and project vision and general satisfaction with the project.
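A minimal sketch of tabulating the anonymous 1–10 survey responses per question; the ratings below are illustrative placeholders, not data from the slides.

```python
# Average team sentiment per question, tabulated from anonymous 1-10 ratings.
from statistics import mean

responses = {
    "Understanding of the project vision": [8, 7, 9, 6, 8],
    "Quality of the user stories":         [6, 5, 7, 6, 6],
    "Collaboration between team members":  [9, 8, 9, 7, 8],
}

for question, ratings in responses.items():
    print(f"{question}: average {mean(ratings):.1f} / 10 (n={len(ratings)})")
```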
An Agile Testing Dashboard: Metrics that Matter
Thank You!!
•  Questions
•  Prachi Maini
•  prachi.maini@morningstar.com
•  pmaini@gmail.com
•  https://www.linkedin.com/in/prachimaini
•  Phone: 630-818-6472