ROI Basics


Patricia Pulliam Phillips and Jack J. Phillips
Contents
About the Training Basics Series
Preface
• The Same, Only Better
• What's New
• What's Inside?
• Icons to Guide You: What's Inside This Chapter, Basic Rules, Noted, Think About This, Getting It Done
• Who Should Read This Book?
• What Do We Mean?
• Acknowledgments
1
The Basics

What’s Inside This Chapter


This chapter explores the fundamentals of the
ROI Methodology, a process that has become
integral to many organizations around the world.
The chapter covers three key topics:
• defining return on investment (ROI)
• following the ROI process model
• putting ROI to use.

Defining ROI

Table 1-1. Financial Measures

Return on Investment (ROI): Used to evaluate the efficiency or profitability of an investment or to compare the efficiency of a number of investments.
Calculation: Compares the annual net benefits of an investment to the cost of the investment; expressed as a percentage.
ROI (%) = (Net Benefits / Costs) x 100

Return on Equity (ROE): Measures a corporation's profitability by revealing how much profit a company generates with the money that shareholders have invested. Used for comparing the profitability of a company to that of other firms in the same industry.
Calculation: Compares annual net income to shareholder equity.
ROE = (Net Income / Shareholder Equity) x 100

Return on Assets (ROA): Indicates how profitable a company is in relation to its total assets. Measures how efficient management is at using its assets to generate earnings.
Calculation: Compares annual net income (annual earnings) to total assets; expressed as a percentage.
ROA (%) = (Net Income / Total Assets) x 100

Return on Average Equity (ROAE): Modified version of ROA referring to a company's performance over a fiscal year.
Calculation: Same as ROA except the denominator is changed from total assets to average shareholders' equity, which is computed as the sum of the equity value at the beginning and end of the year divided by two.
ROAE = Net Income / Average Shareholder Equity

Return on Capital Employed (ROCE): Indicates the efficiency and profitability of a company's capital investments. ROCE should always be higher than the rate at which the company borrows; otherwise, any increase in borrowing will reduce shareholders' earnings.
Calculation: Compares earnings before interest and tax (EBIT) to total assets minus current liabilities.
ROCE = EBIT / (Total Assets – Current Liabilities)

Present Value (PV): Current worth of a future sum of money or stream of cash flows given a specified rate of return. Important in financial calculations including net present value, bond yields, and pension obligations.
Calculation: Discounts the cash flow (C, or sum of money) by the interest rate (r) over the period of time (t).
PV = C / (1 + r)^t

Net Present Value (NPV): Measures the difference between the present value of cash inflows and the present value of cash outflows. Another way to put it: compares the present value of future benefits with the present value of the investment.
Calculation: Compares the value of a dollar today to the value of that same dollar in the future, taking into account a specified interest rate over a specified period of time.
NPV = [Sum from t = 1 to T of C_t / (1 + r)^t] – C_0

Internal Rate of Return (IRR): The rate of return that makes the net present value of all cash flows from a particular project equal to zero. Used in capital budgeting. The higher the IRR, the more desirable it is to undertake the project.
Calculation: Follows the NPV calculation as a function of the rate of return; the rate for which this function equals zero is the internal rate of return.
[Sum from t = 1 to T of C_t / (1 + r)^t] – C_0 = 0, solved for r

Payback Period (PP): Measures the length of time to recover an investment.
Calculation: Compares the cost of a project to the annual benefits or annual cash inflows.
PP = Costs / Benefits

Benefit-Cost Ratio (BCR): Used to evaluate the potential costs and benefits of a project that may be generated if the project is completed. Used to determine financial feasibility.
Calculation: Compares project annual benefits to its cost.
BCR = Benefits / Costs
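For readers who want to check the time-value formulas above numerically, here is a minimal Python sketch (ours, not part of the methodology); the cash-flow figures are hypothetical, and the IRR is found by simple bisection rather than a financial library.

def present_value(cash_flow, rate, periods):
    # PV = C / (1 + r)^t
    return cash_flow / (1 + rate) ** periods

def npv(rate, cash_flows, initial_cost):
    # NPV = sum of C_t / (1 + r)^t for t = 1..T, minus the initial investment C_0
    return sum(c / (1 + rate) ** t for t, c in enumerate(cash_flows, start=1)) - initial_cost

def irr(cash_flows, initial_cost, low=0.0, high=1.0, tol=1e-6):
    # IRR is the rate that drives NPV to zero; bisection assumes NPV is
    # positive at `low` and negative at `high`
    while high - low > tol:
        mid = (low + high) / 2
        if npv(mid, cash_flows, initial_cost) > 0:
            low = mid
        else:
            high = mid
    return (low + high) / 2

# Hypothetical example: $30,000 invested today, $12,000 returned each year for four years
flows = [12000, 12000, 12000, 12000]
print(round(present_value(12000, 0.10, 1), 2))  # worth today of $12,000 received in one year
print(round(npv(0.10, flows, 30000), 2))        # NPV at a 10 percent rate
print(round(irr(flows, 30000), 4))              # the rate at which NPV equals zero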



Noted
Periodically, someone will report a BCR of 3:1 and an ROI of 300 percent. This is not possible. ROI is based on net benefits (benefits minus costs) divided by costs, so a BCR of 3:1 translates to an ROI of 200 percent.
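The arithmetic behind this rule follows directly from the two formulas: if BCR = Benefits / Costs, then

ROI (%) = ([Benefits – Costs] / Costs) x 100 = (BCR – 1) x 100

so a BCR of 3:1 always corresponds to an ROI of 200 percent, and a BCR of 1.4:1 (as in the example that follows) corresponds to an ROI of 40 percent.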

ROI and the Levels of Evaluation


Think About This


A blended learning program reduced the number of calls escalated from the service desk to field support by an
average of 20 per month. The monthly value of this reduction was $175 per call x 20 calls per month, or $3,500 per
month. The first-year benefit of the program was $3,500 x 12, or $42,000. The fully loaded cost for designing,
developing, implementing, and evaluating the program was approximately $30,000. Below are the calculations
for the BCR, ROI, and PP. Note how, while they tell a similar story, the math and metrics are different.

BCR = Program Benefits / Program Costs


= $42,000 / $30,000
= 1.40 or 1.40:1
Translation: For every $1 invested in the program, the organization gained $1.40 in gross benefits.

ROI (%) = (Net Program Benefits / Program Costs) × 100


= ([$42,000 – $30,000] / $30,000) x 100
= ($12,000 / $30,000) x 100
= 40 percent
Translation: For every $1 invested in the program, the organization recovered its $1 investment and gained an
additional $0.40 in net benefit. Or, a 40 percent return over and beyond the investment.

PP = Program Costs / Program Benefits


= $30,000 / $42,000
= 0.71 x 12 = 8.57 months
Translation: Given an investment of $30,000 and benefits of $42,000, the organization should recover the program costs within 8.57 months. This suggests that all benefits beyond those gained in 8.57 months will be additional value.
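The same arithmetic is easy to script. The following short Python sketch (ours, not from the book's toolkit) reproduces the three calculations using the figures above.

def bcr(annual_benefits, costs):
    # gross benefits returned per dollar invested
    return annual_benefits / costs

def roi_pct(annual_benefits, costs):
    # net benefits per dollar invested, expressed as a percentage
    return (annual_benefits - costs) / costs * 100

def payback_months(annual_benefits, costs):
    # months needed for annual benefits to cover the investment
    return costs / annual_benefits * 12

benefits, costs = 42000, 30000
print(bcr(benefits, costs))                       # 1.4
print(roi_pct(benefits, costs))                   # 40.0
print(round(payback_months(benefits, costs), 2))  # 8.57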


Noted
The levels of evaluation are categories of data; timing of data collection does not necessarily define the level
to which you are evaluating. Level 1 data can be collected at the end of the program (as is typical) or in a
follow-up evaluation months after the program (not ideal).
Levels 4 and 5 data can be forecasted before a program is implemented or at the end of the program.
The true impact is determined after the program is implemented, when actual improvement in key measures
can be observed. Through analysis, this improvement is isolated to the program, accounting for other factors.
The basics of forecasting ROI are described in the appendix.


Table 1-2. Framework of Data and Key Questions

Level 0: Input
• How many people attended the program?
• Who were the people attending the program?
• How much was spent on the program?
• How many hours did it take to complete the program?

Level 1: Reaction and Planned Action
• Was the program relevant to participants' jobs and missions?
• Was the program important to participants' jobs and mission success?
• Did the program provide new information?
• Do participants intend to use what they learned?
• Would participants recommend the program to others?
• Is there room for improvement with facilitation, materials, and the learning environment?

Level 2: Learning
• Did participants acquire the knowledge and skills presented in the program?
• Do participants know how to apply what they learned?
• Are participants confident to apply what they learned?

Level 3: Application and Implementation
• How effective are participants at applying what they learned?
• How frequently are participants applying what they learned?
• If participants are applying what they learned, what is supporting them?
• If participants are not applying what they learned, why not?

Level 4: Impact
• So what if the application is successful?
• To what extent did application of learning improve the measures the program was intended to improve?
• How did the program affect output, quality, cost, time, customer satisfaction, employee satisfaction, and other measures?
• How do you know it was the program that improved the measures?

Level 5: Return on Investment
• Do the monetary benefits of the improvement in impact measures outweigh the cost of the program?

The Chain of Impact

Think About This


In 2010, ROI Institute and ATD partnered on a study to determine what measures are most compelling to senior leaders. Impact data ranked first, ROI ranked second, and awards ranked third.


Figure 1-1. Chain of Impact

Input

Reaction

Learning

Application

Isolate the Effects of the Program

Impact

ROI

Intangible Benefits


The Evaluation Puzzle

Figure 1-2. Evaluation Puzzle

Source: Phillips (2017).

Process Model


Noted
The ROI Methodology was originally developed in 1973 by Jack J. Phillips. Jack, at the time, was an electrical
engineer at Lockheed Aircraft (now Lockheed Martin) in Marietta, Georgia, who taught test pilots the electrical
and avionics systems on the C-5A Galaxy. He was also charged with managing a co-operative education program
designed as part of Lockheed's engineer recruiting strategy. His senior leader told him that in order to continue
funding the co-operative education program, Jack needed to demonstrate the return on Lockheed's investment
(ROI). The senior leader was not looking for an intangible measure of value, but the actual ROI.
ROI and cost-benefit analysis had been around for decades, if not centuries. But neither had been applied
to this type of program. Jack did his research and ran across a concept referred to as the four steps to training
evaluation, developed by an industrial-organizational psychologist named Raymond Katzell. Don Kirkpatrick wrote
about these steps and cited Katzell in his 1956 article titled "How to Start an Objective Evaluation of Your Training
Programs." Because the concept had not been operationalized, nor did it include a financial metric describing
the ROI, Jack added the economic theory of cost-benefit analysis to the four-step concept and created a model
and standards to ensure that reliable data, including the ROI, could be reported to his senior leadership team.
Jack's 1983 Handbook of Training Evaluation and Measurement Methods put the five-level evaluation framework
and the ROI process model on the map. As he moved up in organizations to serve as head of learning and
development, senior executive VP of human resources, and president of a regional bank, he had his learning
and talent development and HR teams apply this approach to major programs.
Then, in 1994, his book, Measuring Return on Investment, Volume 1, published by the American Society for
Training & Development (ASTD), now the Association for Talent Development (ATD), became the first book of
case studies describing how organizations were using the five-level framework and his process to evaluate talent
development programs.
Over the years, Jack Phillips, Patti Phillips, and their team at ROI Institute have authored more than 100
books describing the use of the ROI Methodology. The application of the process extends well beyond talent
development and human resources. From humanitarian programs to chaplaincy, and even ombudsmanship,
Jack's original work has grown to be the most documented and applied approach to demonstrating value for
money for all types of programs and projects.


Think About This


Is your organization large with autonomous divisions? Many organizations pursuing ROI fit this description. If
competition exists between divisions, it can lead to divisions purposefully approaching evaluation (and many
other things) differently. If each division approaches evaluation, including ROI, using different methodologies
and different standards, doesn’t it stand to reason that you won’t be able to compare the results? Whether it
is the approach presented in this book or something else, find a method, develop it, use the standards that
support it, and apply it consistently.

Operating Standards and Philosophy


Applications and Practice

Implementation

The ROI Process Model

Plan the Evaluation


Figure 1-3. ROI Methodology Process Model

The model is organized into four phases, each tied to the levels of evaluation:

Plan the Evaluation (Level 0: Input)
• Start With Why: Align Programs With the Business
• Make It Feasible: Select the Right Solution
• Expect Success: Plan for Results

Collect Data (Level 1: Reaction and Planned Action; Level 2: Learning; Level 3: Application and Implementation; Level 4: Impact)
• Make It Matter: Design for Input, Reaction, and Learning
• Make It Stick: Design for Application and Impact

Analyze Data (Level 5: ROI; Intangibles)
• Make It Credible: Isolate the Effects of the Program
• Make It Credible: Convert Data to Monetary Value
• Make It Credible: Capture Costs of Program
• Make It Credible: Identify Intangible Measures
• Make It Credible: Calculate Return on Investment

Optimize Results
• Tell the Story: Communicate Results to Key Stakeholders
• Optimize Results: Use Black Box Thinking to Increase Funding

Collect Data


Analyze Data

Optimize Results


Figure 1-4. Evaluation Leads to Allocation

Evaluation (Measure) → Optimization (Improve) → Allocation (Fund)

Putting ROI to Use

Justify Spending

New Programs


Existing Programs

Improve the Talent Development Process

Noted
Many people fear a negative ROI; however, more can be learned through evaluation projects that achieve a negative ROI than those achieving a high, positive ROI.

Set Priorities


Eliminate Unsuccessful Programs

Think About This


Imagine that talent development staff, participants, and participant supervisors know a vendor-supplied
customer service program provides little value to the organization. Participants provide evidence of this
with their comments on the end-of-course questionnaire. Unfortunately, senior leaders ignore the Level 1
data, asking for stronger evidence that the program is ineffective.
With this edict, the evaluation team sets the course for implementing a comprehensive evaluation.
The evaluation results show that in the first year, the program achieved a –85 percent ROI; the second-year
forecast shows a slightly less negative ROI of –54 percent. Leaders immediately agree to drop the program.
Sometimes you need to speak the language of business to get your point across.

Reinvent the Talent Development Function

Basic Rule 1
Not every program should be evaluated to impact and ROI. ROI is reserved for those programs that are
expensive, have a broad reach, drive business impact, have the attention of senior managers, or are highly
visible in the organization. However, when evaluation does go to impact and ROI, results should be reported
at the lower levels to ensure that the complete story is told.


Gain Support

Key Executives and Administrators

Managers and Supervisors


Employees

Getting It Done

Exercise 1-1. Talent Development Program Assessment

Instructions: For each of the following statements, circle the response that best matches the talent development
function at your organization.

1. The direction of the talent development function at your organization:


a. shifts with requests, problems, and changes as they occur
b. is determined by talent development and adjusted as needed
c. is based on a mission and a strategic plan for the function

2. The primary mode of operation of the talent development function is to:


a. respond to requests by managers and other employees to deliver training services
b. help management react to crisis situations and reach solutions through training services
c. implement many talent development programs in collaboration with management to prevent problems and
crisis situations

3. The goals of the talent development function are:


a. set by the talent development staff based on perceived demand for programs
b. developed consistent with talent development plans and goals
c. developed to integrate with operating goals and strategic plans of the organization

4. Most new programs are initiated:


a. by request of top management
b. when a program appears to be successful in another organization
c. after a needs analysis has indicated that the program is needed



5. When a major organizational change is made, you:
a. decide only which programs are needed, not which skills are needed
b. occasionally assess what new skills and knowledge are needed
c. systematically evaluate what skills and knowledge are needed

6. To define talent development plans:


a. management is asked to choose talent development programs from a list of canned, existing courses
b. employees are asked about their talent development needs
c. talent development needs are systematically derived from a thorough analysis of performance problems

7. When determining the timing of training and the target audiences, you:
a. have lengthy, nonspecific talent development training courses for large audiences
b. tie specific talent development training needs to specific individuals and groups
c. deliver talent development training almost immediately before its use, and it is given only to those people who
need it

8. The responsibility for results from talent development:


a. rests primarily with the talent development staff to ensure that the programs are successful
b. is the responsibility of the talent development staff and line managers, who jointly ensure that results are
obtained
c. is a shared responsibility of the talent development staff, participants, and managers all working together to
ensure success

9. Systematic, objective evaluation, designed to ensure that participants are performing appropriately
on the job, is:
a. never accomplished; the only evaluations are during the program and they focus on how much the participants
enjoyed the program
b. occasionally accomplished; participants are asked if the training was effective on the job
c. frequently and systematically pursued; performance is evaluated after training is completed

10. New programs are developed:


a. internally, using a staff of instructional designers and specialists
b. by vendors; you usually purchase programs modified to meet the organization’s needs
c. in the most economical and practical way to meet deadlines and cost objectives, using internal staff and vendors

11. Costs for training and talent development are accumulated:


a. on a total aggregate basis only
b. on a program-by-program basis
c. by specific process components, such as development and delivery, in addition to a specific program

12. Management involvement in the talent development process is:


a. very low with only occasional input
b. moderate, usually by request, or on an as-needed basis
c. deliberately planned for all major talent development activities, to ensure a partnership arrangement

13. To ensure that talent development is transferred into performance on the job, you:
a. encourage participants to apply what they have learned and report results
b. ask managers to support and reinforce training and report results
c. use a variety of training transfer strategies appropriate for each situation

14. The talent development staff’s interaction with line management is:
a. rare; you almost never discuss issues with them
b. occasional; during activities, such as needs analysis or program coordination
c. regular; to build relationships, as well as to develop and deliver programs


15. Talent development’s role in major change efforts is to:


a. conduct training to support the project, as required
b. provide administrative support for the program, including training
c. initiate the program, coordinate the overall effort, and measure its progress—in addition to providing training

16. Most managers view the talent development function as:


a. a questionable function that wastes too much time of employees
b. a necessary function that probably cannot be eliminated
c. an important resource that can be used to improve the organization

17. Talent development programs are:


a. activity oriented (all supervisors attend the Talent Development Workshop)
b. individual results based (the participants will reduce their error rate by at least 20 percent)
c. organizational results based (the cost of quality will decrease by 25 percent)

18. The investment in talent development is measured primarily by:


a. subjective opinions
b. observations by management and reactions from participants
c. dollar return through improved productivity, cost savings, or better quality

19. The talent development effort consists of:


a. usually one-shot, seminar-type approaches
b. a full array of courses to meet individual needs
c. a variety of talent development programs implemented to bring about change in the organization

20. New talent development programs and projects, without some formal method of evaluation, are implemented at
your organization:
a. regularly
b. seldom
c. never

21. The results of talent development programs are communicated:


a. when requested, to those who have a need to know
b. occasionally, to members of management only
c. routinely, to a variety of selected target audiences

22. Management involvement in talent development evaluation:


a. is minor, with no specific responsibilities and few requests
b. consists of informal responsibilities for evaluation, with some requests for formal training
c. is very specific. All managers have some responsibilities in evaluation

23. During a business decline at your organization, the talent development function will:
a. be the first to have its staff reduced
b. be retained at the same staffing level
c. go untouched in staff reductions and possibly beefed up

24. Budgeting for talent development is based on:


a. last year’s budget
b. whatever the training department can “sell”
c. a zero-based system

25. The principal group that must justify talent development expenditures is:
a. the talent development department
b. the human resources or administrative function
c. line management



26. Over the last two years, the talent development budget as a percentage of operating expenses has:
a. decreased
b. remained stable
c. increased

27. Top management’s involvement in the implementation of talent development programs:


a. is limited to sending invitations, extending congratulations, and passing out certificates
b. includes monitoring progress, opening and closing speeches, and presentations on the outlook of the
organization
c. includes participating in the program to see what’s covered, conducting major segments of the program, and
requiring key executives be involved

28. Line management involvement in conducting talent development programs is:


a. very minor; only talent development specialists conduct programs
b. limited to a few supervisors conducting programs in their area of expertise
c. significant; on the average, over half of the programs are conducted by key line managers

29. When an employee completes a talent development program and returns to the job, their supervisor is likely to:
a. make no reference to the program
b. ask questions about the program and encourage the use of the material
c. require use of the program material and give positive rewards when the material is used successfully

30. When an employee attends an outside seminar, upon return, they are required to:
a. do nothing
b. submit a report summarizing the program
c. evaluate the seminar, outline plans for implementing the material covered, and estimate the value of the
program

Interpreting the Talent Development Program Assessment


Score the assessment instrument as follows:
• 1 point for each (a) response
• 3 points for each (b) response
• 5 points for each (c) response

Score Range and Analysis


120–150: Outstanding environment for achieving results with talent development. Great management support. A truly
successful example of results-based talent development.
90–119: Above average in achieving results with talent development. Good management support. A solid and
methodical approach to results-based talent development.
60–89: Needs improvement in achieving desired results with talent development. Management support is
ineffective. Talent development programs do not usually focus on results.
30–59: Serious problems with the success and status of talent development. Management support is nonexistent.
Talent development programs are not producing results.
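If you administer this assessment to a group, the scoring rules above are simple to automate. This is a small illustrative Python sketch (the function name and response format are ours), assuming responses are recorded as the letters a, b, and c.

def score_assessment(responses):
    # 1 point for each (a), 3 points for each (b), 5 points for each (c)
    points = {"a": 1, "b": 3, "c": 5}
    total = sum(points[r] for r in responses)
    if total >= 120:
        band = "Outstanding environment for achieving results"
    elif total >= 90:
        band = "Above average in achieving results"
    elif total >= 60:
        band = "Needs improvement"
    else:
        band = "Serious problems with the success and status of talent development"
    return total, band

# 30 responses: twenty (c) answers and ten (b) answers
print(score_assessment(["c"] * 20 + ["b"] * 10))  # (130, 'Outstanding environment for achieving results')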

2
Plan Your Work

What’s Inside This Chapter


This chapter presents the basics in planning
your evaluation:
• aligning programs with the business
• defining program objectives
• developing the evaluation plan.

Aligning Programs With the Business

Noted
“There is nothing so useless as doing efficiently that which should not be done at all.”
—Peter Drucker, Austrian-born American management consultant, educator, and author


Figure 2-1. Alignment Model

Payoff Needs


Business Needs


Performance Needs

Table 2-1. Diagnostic Tools to Support Performance Needs Analysis

• Statistical process control • Force-field analysis • Diagnostic instruments • Engagement surveys


• Brainstorming • Mind mapping • Focus groups • Exit interviews
• Problem analysis • Affinity diagrams • Probing interviews • Exit surveys
• Cause-and-effect diagram • Simulations • Job satisfaction surveys • Nominal group technique


Noted
You can use collaborative analytics to discern opportunities to improve output, quality, and cost, as well as
employee engagement, customer experience, and other business measures. It is also useful in determining
the impact that changes in collaborative networks have on business measures. While its use is still in its infancy, it
is important that talent development professionals become familiar with the opportunities it offers. A good
place to begin this learning journey is a research piece authored by Rob Cross, Tom Davenport, and Peter
Gray, titled "Driving Business Impact Through Collaborative Analytics" (Connected Commons, April 2019).

Learning Needs


Preference Needs

Input Needs

The Alignment Model in Action


Table 2-2. Output of Alignment Process

Payoff Needs: What is the economic opportunity or problem?
• Specific dollar amount is unknown. Estimate thousands in U.S. dollars due to time wasted in meetings.

Business Needs: What are the specific business needs?
• Too many meetings (frequency of meetings per month)
• Too many people attending meetings (number of people per month)
• Meetings are too long (average duration of meetings in hours)

Performance Needs: What is happening or not happening on the job that is causing this issue?
• Meetings are not planned
• Agendas are not developed prior to the meeting
• Agendas are not being followed
• Consideration of the time and cost of unnecessary meetings is lacking
• Poor facilitation of meetings
• Follow-up action resulting from the meeting is not taking place
• Conflict that occurs during meetings is not being appropriately managed
• Proper selection of meeting participants is not occurring
• Good meeting management practices are not implemented
• Consideration of cost of meetings is not taking place

Learning Needs: What knowledge, skill, or information is needed to change what is happening or not happening on the job?
• Ability to identify the extent and cost of meetings
• Ability to identify positives, negatives, and implications of meeting issues and dynamics
• Effective meeting behaviors

Preference Needs: How best can this knowledge, skill, or information be communicated so that change on the job occurs?
• Facilitator-led workshops
• Job aids and tools
• Relevant and useful information is required

Input Needs: What is the projected investment?
• 72 supervisors and team leaders who lead meetings
• Average salary $219 per day
• Break out in three groups
• Two-day workshop for all 72 people
• Program fee for 72 people (includes facilitation and materials)
• Estimated travel and lodging
• Cost of facilities for six days (2 days x 3 offerings)
• Prorated cost of needs assessment
• Estimated cost: $125,000
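To see how an input-needs estimate like this rolls up, here is a hedged Python sketch. Only the participant-time figures (72 people, $219 per day, two days) come from the table; the other line-item amounts are hypothetical placeholders, included only to show how the pieces sum toward the $125,000 estimate.

# Participant salaries for time in the workshop (figures from the table)
participant_time_cost = 72 * 219 * 2   # 31,536

# Hypothetical amounts -- the table names these categories but not their values
assumed_other_costs = {
    "program fee (facilitation and materials)": 60000,
    "travel and lodging": 20000,
    "facilities for six days (2 days x 3 offerings)": 6000,
    "prorated needs assessment": 5000,
}

total_estimate = participant_time_cost + sum(assumed_other_costs.values())
print(participant_time_cost)  # 31536
print(total_estimate)         # 122536, in the neighborhood of the $125,000 estimate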


Think About This


Sometimes it is important to forecast the ROI for a program prior to investing in it; for example, if the program is very expensive or when deciding between different delivery mechanisms. Pre-program forecasting is also important when deciding between two programs intended to solve the same problem. The appendix includes a basic description of pre-program forecasting as well as descriptions of how to forecast ROI with data representing the other levels of evaluation.

Defining Program Objectives

Noted
Specificity drives results.
Vague and nebulous leads
to vague and nebulous.

Level 1: Reaction and Planned Action Objectives


Objective: At the end of the course, participants will perceive program content as relevant to their jobs.
Measure: 80 percent of participants rate program relevance a 4.5 out of 5 on a Likert scale.

The same objective with more precise measures:

Objective: At the end of the course, participants will perceive program content as relevant to their jobs.
Measures:
• 80 percent of participants indicate that they can immediately apply the knowledge and skills in their work, as indicated by a 4.5 rating out of 5 on a Likert scale.
• 80 percent of participants view the knowledge and skills as reflective of their day-to-day work activity, as indicated by rating this measure a 4.5 out of 5 on a Likert scale.
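When an objective is written this way, checking it against survey results is straightforward. A minimal Python sketch follows (ours), assuming the measure means that a participant's rating must be 4.5 or higher to count.

def objective_met(ratings, threshold=4.5, required_share=0.80):
    # share of participants whose rating meets or exceeds the threshold
    share = sum(1 for r in ratings if r >= threshold) / len(ratings)
    return share, share >= required_share

print(objective_met([5, 5, 4, 4.5, 5, 3, 4.5, 5, 4, 5]))  # (0.7, False): only 70% met the bar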


Think About This


Overall satisfaction is sometimes referred to as a measure of how much participants liked the program’s
snacks. Recent analysis of a comprehensive Level 1 end-of-course questionnaire showed that participants
viewed the program as less than relevant and not useful, and had little intention to use what they learned.
Scores included:
° Knowledge and skills presented are relevant to my job: 2.8 out of 5.
° Knowledge and skills presented will be useful to my work: 2.6 out of 5.
° I intend to use what I learned in this course: 2.2 out of 5.
Surprisingly, however, respondents scored the overall satisfaction measure, “I am satisfied with the
program,” as 4.6 out of 5. Hmm, it must have been the cookies!

Level 2: Learning Objectives

Basic Rule 2
When conducting a higher-
level evaluation, collect
data at lower levels.


Table 2-5. Compare Broad Objective With Implementation Measures

Objective: At the end of the course, participants will be able to use Microsoft Word.
Measures: Within a 10-minute time period, participants will be able to demonstrate to the facilitator the following applications of Microsoft Word with zero errors:
• File, save as, save as web page
• Format, including font, paragraph, background, and themes
• Insert tables, add columns and rows, and delete columns and rows

Level 3: Application and Implementation Objectives


Table 2-6. Compare Application Objective With Measurable Behaviors

Objective: Participants will use effective meeting behaviors.
Measures:
• Participants will develop a detailed agenda outlining the specific topics to be covered for 100 percent of meetings.
• Participants will establish meeting ground rules at the beginning of 100 percent of meetings.
• Participants will follow up on meeting action items within three days following 100 percent of meetings.

Level 4: Impact Objectives


Table 2-7. Compare Broad Objective With Impact Measures

Objective: Improve the quality of the X-1350.
Measures:
• Reduce the number of warranty claims on the X-1350 by 10 percent within six months after the program.
• Improve overall customer satisfaction with the quality of the X-1350 by 10 percent as indicated by a customer satisfaction survey taken six months after the program.
• Achieve top scores on product quality measures included in the industry quality survey.

Level 5: ROI Objectives


Set ROI at the Level of Other Investments

Set ROI at a Higher Standard

Set ROI at Break-Even

Set ROI Based on Client Expectations

Developing the Evaluation Plan


Purpose

Making Decisions About Programs

Noted
Decisions are made with
or without evaluation
data. By providing data,
the talent development
team can influence the
decision-making process.


Table 2-8. Decisions Made With Evaluation Data

Each decision, with the level of evaluation data that informs it:

• Talent development staff want to decide whether they should invest in skill development for facilitators. (Level 1)
• Course designers are concerned the exercises do not cover all learning objectives and need to decide which skills need additional support. (Level 2)
• Supervisors are uncertain as to whether they want to send employees to future training programs. (Levels 3 and 4)
• The clients of the talent development team are deciding if they want to invest in expanding a pilot leadership program for the entire leadership team. (Level 5)
• Senior managers are planning next year's budget and are concerned about allocating additional funding to the talent development function. (Levels 1–5, scorecard)
• The talent development staff are deciding whether they should eliminate an expensive program that is getting bad reviews from participants, but a senior executive plays golf with the training supplier. (Level 5)
• A training supplier is trying to convince the talent development team that their leadership program will effectively solve the turnover problem. (Level 5, forecast/pilot)
• Supervisors want to implement a new initiative that will change employee behavior because they believe the talent development program did not do the job. (Level 3, focus on barriers and enablers)

Improving Programs and Processes

Demonstrating Program Value


Table 2-9. Value Perspectives

Value perspective and levels of evaluation:
• Consumer value: Level 1 (Reaction and Planned Action) and Level 2 (Learning)
• System value: Level 3 (Application and Implementation)
• Economic value: Level 4 (Impact) and Level 5 (Return on Investment)


Table 2-10. Value Perspective Versus Use

Levels of evaluation (value perspective):
• Level 1: Reaction and Planned Action (Consumer)
• Level 2: Learning (Consumer)
• Level 3: Application and Implementation (System)
• Level 4: Impact (Economic)
• Level 5: Return on Investment (Economic)

Moving from Level 1 to Level 5, the value of the information ranges from least to greatest, the customer focus shifts from participants to the client, the frequency of use moves from frequent to infrequent, and the difficulty of assessment moves from easy to difficult.

Feasibility

Program Objectives

Availability of Data


Appropriateness for ROI


Noted
Not all programs are suitable for impact and ROI evaluation; but when you do evaluate to these levels, use
at least one method to isolate the effects of the program and credibly convert data to monetary value.


Table 2-11. Percentage of Programs Evaluated at Each Level

Targets: Level 1: 90–100%; Level 2: 60–90%; Level 3: 30–40%; Level 4: 10–20%; Level 5: 5–10%

Data Collection Plan

What Do You Ask?

How Do You Ask?

Whom Do You Ask?


When Do You Ask?

Who Does the Asking?

ROI Analysis Plan

Methods for Isolating the Effects of the Program


Methods for Converting Data to Monetary Value

Cost Categories

Intangible Benefits

Communication Targets for the Final Report

Other Influences and Issues During Application

Comments

Table 2-12. Completed Data Collection Plan

Program: Effective Meetings    Responsibility:    Date:

Level 1: Reaction and Planned Action
• Program objectives: Positive reaction; planned actions
• Measures of success: Average rating of at least 4.0 on a 5.0 scale on quality, usefulness, and achievement of program objectives; 100% submit planned actions
• Data collection method: End-of-course questionnaire; completed action plans
• Data sources: Participants
• Timing: End of course
• Responsibilities: Facilitator

Level 2: Learning
• Program objectives: Identify the extent and cost of meetings; identify positives, negatives, and implications of basic meeting issues and dynamics; acquisition of effective meeting behaviors
• Measures of success: Given cost guidelines, identify the cost of the last three meetings; from a list of 30 positive and negative meeting behaviors, correctly identify the implications of each behavior; demonstrate the appropriate response to 8 of 10 active role-play scenarios
• Data collection method: Meeting profile; written test; skill practice observation
• Data sources: Participants
• Timing: At the beginning of the program (pre); at the end of the program (post); during the program
• Responsibilities: Facilitator

Level 3: Application and Implementation
• Program objectives: Use of effective meeting behaviors; barriers; enablers
• Measures of success: Reported change in behavior in planning and conducting meetings; number and variety of barriers; number and variety of enablers
• Data collection method: Action plan; questionnaire (for three groups)
• Data sources: Participants
• Timing: Three months
• Responsibilities: Program owner

Level 4: Impact
• Program objectives: Time savings from fewer meetings, shorter meetings, and fewer participants (hours savings per month); variety of business results measures from more successful meetings
• Measures of success: Time savings; cost savings, output improvement, quality improvement, and project turnaround, as reported
• Data collection method: Questionnaire (for three groups)
• Data sources: Participants
• Timing: Three months
• Responsibilities: Program owner

Level 5: ROI
• Target an ROI of at least 25%

Comments:


Getting It Done

Exercise 2-1. Questions to Start Thinking About Data Collection

Program:

Evaluation Team:

Expected Date of Completion:

1. What is your purpose in conducting an ROI evaluation on this program?

2. What are the broad program objectives at each level of evaluation?


Level 1:
Level 2:
Level 3:
Level 4:
Level 5:

3. What are your measures of success for each objective?


Level 1:
Level 2:
Level 3:
Level 4:
Level 5:

4. Transfer your answers to questions 2 and 3 to the first two columns in the data collection plan (Table 2-14).

Table 2-13. Completed ROI Analysis Plan

Program: Effective Meetings    Responsibility:    Date:

Data items (usually Level 4): Time savings; miscellaneous business measures

Methods for isolating the effects of the program: Participants' estimates (for both data items)

Methods of converting data to monetary values: Hourly wage and benefits; participants' estimates (using standard values when available)

Cost categories: Prorated cost of needs assessment; program fee per participant; travel, lodging, and meals; facilities; participants' salaries plus benefits for time in workshop; evaluation cost

Intangible benefits: Improvement in individual productivity not captured elsewhere; stress reduction; improved planning and scheduling; greater participation in meetings

Communication targets for final report: Business unit president; senior managers; managers of participants; participants; training and development staff

Other influences or issues during application: Participants must see the need for providing measurement; the follow-up process will be explained to participants during the program; three groups will be measured; participants must report productivity gains due to time saved

Comments: Participants will identify specific improvements as a result of meetings being conducted more effectively

Table 2-14. Data Collection Plan (blank template)

Program:    Responsibility:    Date:

For each level (1 through 5), record the program objectives, measures of success, data collection method, data sources, timing, and responsibilities; add any comments at the bottom of the plan.
3
Collect Data

What’s Inside This Chapter


This chapter presents the basics in collecting data for
your ROI study, which includes:
• selecting the data collection method
• defining the source of data
• determining the time of data collection.

Selecting the Method

End-of-Course Data Collection Methods


Table 3-1. End-of-Course Questionnaire

Leading Change in Organizations

Thank you for participating in the Leading Change in Organizations course. This is your opportunity to provide
feedback as to how we can improve this course.

Please respond to the following questions regarding your perception of the program as well as your anticipated
use of the skills learned during the program. We also would like to know how you think the skills applied from this
course will affect business measures important to your function.

You will receive a summary of these results by June 6.

I. Your reaction to the course facilitation (1 = Strongly Disagree, 5 = Strongly Agree)

1. The instructor was knowledgeable about the subject.

2. The instructor was prepared for the class.

3. Participants were encouraged to take part in class discussions.

4. The instructor was responsive to participants’ questions.

5. The instructor’s energy and enthusiasm kept the participants


actively engaged.
6. The instructor discussed how I can apply the knowledge and skills
taught in the class.
II. Your reaction to the course content (1 = Strongly Disagree, 5 = Strongly Agree)

7. The course content is relevant to my current job.

8. The course content is important to my current job.

9. The material was organized logically.

10. The exercises and examples helped me understand the material.

11. The course content provided new information.

12. I intend to use what I learned in this course.


III. New knowledge and skills acquired in the course (1 = Strongly Disagree, 5 = Strongly Agree)

13. I learned new knowledge and skills from this course.

14. I am confident that I can effectively apply the skills learned in


the course.



IV. Your expected application of knowledge and skills (1 = Strongly Disagree, 5 = Strongly Agree)

15. I will effectively apply what I have learned in this course.

16. What percentage of your total work time requires the knowledge and skills presented in this course?
0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100%

17. On a scale of 0% (not at all) to 100% (extremely critical), how critical is applying the content of this course to
your job success?
0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100%

18. What percentage of the new knowledge and skills learned from this course do you estimate you will directly
apply to your job?
0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100%

19. What potential barriers could prevent you from applying the knowledge and skills learned from this course?

20. What potential enablers will support you in applying the knowledge and skills learned from this course?

V. How what you learned will impact the business


21. As a result of your applying the knowledge and skills learned in this course, to what extent will the following
measures be improved?

(1 = Not at All, 5 = Completely)

Productivity

Sales

Quality

Costs

Time

Job Satisfaction

Customer Satisfaction


Table 3-2. Action Plan

Action Plan

Name: _____________________________________________ Date: _________________________________________________

Course: ____________________________________________ Instructor: ____________________________________________

Planned Actions Completion Date

1. __________________________________________________________________ ______________________________________

2. __________________________________________________________________ ______________________________________

3. __________________________________________________________________ ______________________________________

4. __________________________________________________________________ ______________________________________

5. __________________________________________________________________ ______________________________________


Post-Program Data Collection Methods


Table 3-3. Follow-Up Data Collection Methods

Method Level 3 Level 4 Level 5


Survey

Questionnaire

Interviews

Focus groups

Program assignments

Action planning

Performance contracting

Program follow-up session

Performance monitoring

Monetary values

Cost data

Questionnaires


Noted
Technology enables us to ask questions in such a way that analysis has never been easier. For example,
questions about monetary value can be asked and simply calculated, so that neither the respondents nor
the talent development professional has to worry about math. Qualtrics.com is one such tool that provides
survey developers and respondents with an improved survey experience.

Interviews


Table 3-4. Sample Data Collection Instrument, Level 4

Coaching Questions
1. To what extent did coaching positively influence the following measures:

Significant Influence No Influence


5 4 3 2 1 N/A

Productivity
Sales
Quality
Costs
Efficiency
Time
Employee Satisfaction
Customer Satisfaction

2. What other measures were positively influenced by coaching?

3. Of the measures listed above, improvement in which one is most directly linked to coaching? (Check only one)
Productivity Employee Satisfaction Sales Quality
Cost Customer Satisfaction Efficiency Time

4. Please define the measure above and its unit of measurement.

5. How much did the measure identified in Questions 3 and 4 improve since you began this process?
Weekly Monthly Annually

6. What other processes, programs, or events may have contributed to this improvement?

7. Recognizing that other factors may have caused this improvement, estimate the percentage of improvement
related directly to coaching.

8. For this measure, what is the monetary value of improvement for one unit of this measure? (Although this is
difficult, please make every effort to estimate the value.)

9. Please state your basis for the estimated value of improvement you indicated above.

10. What is the annual value of improvement in the measure you selected above?

11. What confidence do you place in the estimates you have provided in the prior questions? (0 percent is no
confidence, 100 percent is complete certainty.)


Focus Groups

Noted
Collecting data using qualitative techniques such as interviews and focus groups is a noble idea, but one
that often falls short of its real potential. Two challenges present themselves: the first is transcribing interview
and focus group responses; the second is making meaning out of the data. Gig workers, machine learning,
and artificial intelligence (AI) are enabling researchers to tackle both issues with more ease than in the past,
allowing evaluators to leverage the value qualitative data have to offer.


Table 3-5. Focus Group Protocol for a Study of an Emergency Response Support Program

Focus Group Facilitator Protocol


Purpose

This focus group is intended to help us understand how knowledge and skills gained in the program have been
applied (Level 3). During the focus group you will identify effectiveness with application, frequency of application,
barriers, and enablers to application.

What to Do

1. Give yourself extra time.


2. Arrive a few minutes early to prepare the room.
3. Introduce yourself to the point of contact. Reinforce the purpose and explain the process.
4. Set up the room so that the tables or chairs are in a U-shape so participants can see each other, and you
can become part of the group.
5. Place tent cards at each seat.
6. As participants arrive, introduce yourself, give them refreshments, and chat a few minutes.
7. As you ask questions, your partner should write the answers, but not try to write every word. Listen for key
issues and quotes that are meaningful, make important points, and reinforce use of knowledge and skills.
8. When you have gathered the information you need, thank each person. Clean up, thank your point of
contact, and leave.
9. Find a place to debrief with your partner and clarify notes. Do it immediately, because you will surely forget
something.
10. When you return to your workplace, analyze the data.

What to Take

1. Directions.
2. Point of contact’s telephone numbers.
3. Tent cards. Each tent card should have a number in a corner. Participants can write their first name just so
you call them by name, but your notes will refer to the participant number.
4. Refreshments—something light, but a treat because people respond to food, and it relaxes the environment.
5. Flipchart.
6. Markers for the tent cards and the flipchart.
7. Focus group notepads.
8. An umbrella.

What to Wear

You will be in a comfortable environment, so ties and high-heels are not necessary, but do dress professionally. No
jeans and tennis shoes: business casual.


What to Say

The intent is to understand how participants are applying what they learned during training. Start on time. You do
not want to keep the participants over the allotted time.

1. Thank everyone for participating.


2. Introduce yourself and your partner. Tell them you are part of a research team conducting a study on the
program. Reinforce with them that their input is important to this study. The results of the study will be
used to improve training and other program support initiatives.
3. Share the purpose of the focus group.
4. Explain how the process will work and that their input is strictly confidential.
5. Have them put their first name on the tent card. Explain that the numbers in the corner of the tent card are
for recording purposes and that in no way will their name be recorded. Explain that after the focus group
you and your partner will compile notes; your notes will be later compiled with those of the other focus
groups. Also, tell them that their input in the focus group is supplemental to a questionnaire that they may
have already received.
6. Begin Question 1 with Participant 1.

Questions

Each person will answer each question before moving to the next question. The idea is to allow each person to hear
what the others say so that they can reflect on their responses. You want to know what each individual thinks.

1. Now that you have had a chance to apply what you learned regarding your emergency response duties, how
effectively have you been able to execute those duties?
2. What specific barriers have interfered with your ability to execute your duties?
3. What has supported your efforts?

Focus Group Note Pad

Question:

Notes Notable Quotes

Date: Page ___ of ___

Location:

Facilitator:


Action Plans

Basic Rule 3
Extreme data items and unsupported claims
should not be used in ROI calculations.

Table 3-6. Sample Action Plan for Levels 3 and 4 Data

Part I. Action Plan for the Leadership 101 Training Program SAMPLE

A Name: Medicine Gelatin Manager Instructor Signature: Follow-Up Date:

Objective: Elimination of Gel Waste Evaluation Period: June 1 to November 30

Improvement Measure: Quality Current Performance: 8,000 kg wasted monthly Target Performance: Reduce waste by 80 percent

B Specific Steps (I will do this):
1. Take a more active role in daily gelatin schedule to ensure the manufacture and processing control of gelatin quantities.
2. Inform supervisors and technicians on the value of gelatin and make them aware of waste.
3. Be proactive to gelatin issues before they become a problem.
4. Constantly monitor hours of encapsulation lines on all shifts to reduce downtime and eliminate the possibility of leftover batches.
5. Provide constant feedback to all in the department including encaps machine operators.

End Result (So that):
1. Better control of gelatin production on a daily basis. This will eliminate the making of excess gelatin that could be waste.
2. Charts and graphs with dollar values of waste will be provided to give awareness and a better understanding of the true value of waste.
3. Make gelatin for encapsulation lines and making better decisions on the amounts.
4. Eliminate the excess manufacturing of gelatin mass and the probability of leftover medicine batches.
5. Elimination of unnecessary gelatin mass waste.
Expected Intangible Benefits
C Gel mass will decrease to a minimum over time, which will contribute to great financial gains for our company (material variance) and put
dollars into the bottom line.
Part II. Action Plan for the Leadership 101 Training Program SAMPLE

D Name: Medicine Gelatin Manager Objective: Elimination of Gel Waste

Improvement Measure: Quality Current Performance: 8,000 kg wasted monthly Target Performance: Reduce waste by 80 percent

Analysis
1. What is the unit of measure? Waste reduction Does this measure reflect your performance alone? Yes No
If not, how many employees are represented in the measure? 32
2. What is the value (cost) of one unit? $3.60 per kilogram of gelatin mass

E 3. How did you arrive at this value? This is the cost of raw materials and is the value we use for waste.

4. How much did this measure change during the last month of the evaluation period compared to the average before the
training program? (monthly value) 2,000 kg monthly waste.
Please explain the basis of this change and what you or your team did to cause it. 6,000 kilograms of waste eliminated. Reduction in
machines from 19 to 12 created additional savings, but did not calculate. Gains in machine hours (efficiency) in the encaps dept. More
awareness of gel mass waste and its costs. Key contributing factors were problem solving skills, communicating with my supervisors
and technicians and their willing response, as well as my ability to manage the results.

5. What level of confidence do you place on the above information? (100% = certainty | 0% = no confidence) 70%

6. What percentage of this change was actually caused by the application of the skills from the Leadership 101 training program
(0–100%) 20%

7. If your measure is time savings, what percentage of the time saved was actually applied toward productive tasks? (0–100%) N/A

Actual Intangible Benefits

F Gelatin mass waste has been a problem for our company since startup; with low efficiency in the encapsulation department and the
mistakes made in the gel department, the waste was out of control. In the past few months efficiency has increased and the gel department
has stabilized. As a result, waste is down considerably.

Basic Rule 4
Estimates of improvements should be adjusted for the potential error of the estimate.
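Applied to the sample action plan in Table 3-6, the adjustment works like this. The Python sketch below (ours) uses the figures reported by the participant: a 6,000 kg monthly reduction in waste, $3.60 per kilogram, 20 percent of the change attributed to the program, and 70 percent confidence; annualizing by 12 follows the first-year benefit convention used elsewhere in the book.

# Figures reported on the sample action plan (Table 3-6)
monthly_change_kg = 6000   # reduction in gelatin waste per month
value_per_kg = 3.60        # cost of one kilogram of gelatin mass
isolation = 0.20           # share of the change attributed to the program
confidence = 0.70          # participant's confidence in the estimate

# Adjust the claimed improvement for isolation and estimate error, then annualize
adjusted_annual_benefit = monthly_change_kg * value_per_kg * 12 * isolation * confidence
print(round(adjusted_annual_benefit, 2))  # 36288.0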


Performance Records

Response Rates


Think About This


Collecting the right data at the right time from the right people is critical to the assessment and evaluation of
talent development programs. We asked Trish Uhl, founder of Talent Learning Analytics Leadership Forum and
recognized leader in the digitization of learning measurement, a question: How is digital transformation
influencing what data we collect, how we collect it, and from whom we collect it?
Here is what she told us:
Organizations are on a digital transformation journey, which is shifting traditional business economics
from valuing tangible things to valuing information flows as business assets. In fact, many organizations are
building proprietary data science platforms that integrate historical operational data, real-time operational
data, big data, and extant data into artificial intelligence and analytics-driven technology ecosystems. These
are engineered to explore the combined data to discover actionable insights that can be operationalized for
competitive advantage.
For learning and talent development to be part of this flow, we must radically change how we consider,
collect, and analyze data. This will require us to dramatically reposition ourselves, our products, our programs,
and our data within the larger organizational context. We can no longer operate in a linear value chain where
our output (products, programs, data) is divorced from the rest of the business. Instead, we should draw from
and contribute to the larger organizational data exchange and ecosystem. In this ecosystem of cross-functional
departments, customers, and suppliers, talent development provides value by exchanging data, sharing data,
and making data available, as well as using it to discover, deliver, and take action on analytically driven insights
that promote organizational agility and competitive differentiation. To be effective, our solutions should be
engineered with an eye toward delivering and capturing data. Our product and program design must leverage
broader enterprise data and embed bidirectional feedback loops that add and analyze data from the ecosystem.
Consider digital learning technologies, such as intelligent agents or chatbots. We can create a chatbot to
provide post-training performance support, and it can be trained to provide that support using insights from
enterprise ecosystem data. We can also add the data from conversations with the chatbot to the enterprise
ecosystem and gain further analysis from that. Dynamic connections can then be made to evaluate outcomes—to
determine whether the solution is hitting the mark. If it's not, we can optimize, redeploy, and reassess,
continuously improving the program until we achieve the desired results.
Talent development professionals must commit to our own digital transformation. We can only raise our
awareness of the organization’s broader digital strategy by becoming digitally fluent ourselves. Only then can
we have clarity and understanding of how our products, programs, data, and platforms coexist within the enter-
prise ecosystem, positioning us as meaningful contributors to the people and organizations we serve.


Think About This


Consider how you would manage the administration of a detailed, follow-up questionnaire. The following is
a data collection administration plan with three sections. The first section represents actions you can take
prior to the distribution of the questionnaire. The second section represents actions you can take during
the evaluation process. The third section represents actions you can take after the evaluation process. Think
about the things you can do that will help ensure you get a successful response rate to your data collection
efforts and add them to the list.

Before the evaluation begins, we will:


Ask our senior executive to submit a letter announcing the importance of the evaluation.

During the evaluation, we will:


Send a reminder one week after the questionnaire is administered.

After the evaluation is complete, we will:


Send all respondents a summary copy of the results.


Table 3-7. Actions to Improve Response Rates for Questionnaires

Increasing Questionnaire Response Rates

• Provide advance communication about the questionnaire.
• Clearly communicate the reason for the questionnaire.
• Indicate who will see the results of the questionnaire.
• Show how the data will be integrated with other data.
• Let participants know what actions will be taken based on the data.
• Keep the questionnaire simple and brief.
• Allow for responses to be anonymous, or at least confidential.
• Make it easy to respond; include a self-addressed, stamped envelope or return email address.
• If appropriate, let the target audience know that they are part of a carefully selected sample.
• Provide one or two follow-up reminders using a different medium.
• Get the introduction letter signed by a top executive or administrator.
• Enclose a giveaway item with the questionnaire (pen, money, and so forth).
• Provide an incentive (or chance of incentive) for quick response.
• Send a summary of results to the target audience.
• Distribute the questionnaire to a captive audience.
• Consider an alternative distribution channel, such as email.
• Have a third party collect and analyze the data.
• Communicate the time limit for submitting responses.
• Review the questionnaire at the end of the formal session.
• Allow for completion of the survey during normal work hours.
• Add emotional appeal.
• Design the questionnaire to attract attention using a professional format.
• Provide options to respond (paper, email, website).
• Use a local coordinator to help distribute and collect questionnaires.
• Frame questions so participants can respond appropriately and accurately.

Source: Phillips and Phillips (2016)

Considerations When Selecting a Method

Validity and Reliability


Think About This


In a study of a state-level capacity-building program, the evaluators were asked to design a questionnaire
to see if program volunteers believed that it was achieving its intended objectives. The evaluators asked
the corporate office that was funding this program to sample a small number of participants to ensure the
questions were measuring what was intended to be measured and that participants understood the questions
being asked. Rather than count on the participants to test the questionnaire, the corporate office ran the
questionnaire up the ladder; all managers tied to the program said, "Yes, the questions represented the
correct measures." However, when the questionnaire was distributed to the volunteers, the volunteers indicated
that in no way did the questions represent what the program was intended to do.
Take care when developing your questionnaires to ensure that participants realize the intent of the
program and that subject matter experts realize the actual application of the program.


Time and Cost

Utility

Defining the Source


Performance Records

Participants

Participants’ Supervisors and Managers


Participants’ Peers and Direct Reports

Senior Managers and Executives

Customers

Other Sources


Determining the Time of Data Collection

Noted
Determining the timing of data collection for follow-up data can be tricky, so it is important to make the
timing decision when establishing the program objectives. When deciding on the timing, consider the current
state of the measure, the time it will take for participants to use what they learn on a routine basis,
the availability of the data, and the convenience and constraints of collecting it.

Getting It Done

4
Isolate Program Impact

What’s Inside This Chapter


This step in the ROI Methodology attempts to delineate
the direct contribution caused by the talent development
program, isolating it from other influences. This chapter
covers three critical areas:
• understanding why isolating impact is a key issue
• identifying the methods to do it
• building credibility with the process.

Understanding Why Isolating Impact Is a Key Issue

Other Factors Always Exist

Without Isolating Impact, There Is No Alignment


Evidence Versus Proof

Other Factors and Influences Have Protective Owners


Think About This


You have conducted a sales training program to improve sales competencies for client relationship manag-
ers. This program is designed to increase sales as the managers use the competencies. Three months after
the training, sales have increased. However, during the evaluation period, product marketing and promotion
increased. Also, prices were lowered in two key product lines and new technologies enabled the sales
representatives to secure quotes faster, thus increasing efficiency and boosting sales. All these factors influence
sales. From the perspective of the sales training function, the challenge is to determine how much of the
sales increase is due to the training. If a method is not implemented to show the contribution and talent
development claims full credit for improvement in measures, the talent development staff will lose credibility.

Without Isolating Impact, the Study Is Not Valid

Myths About Isolating the Effects of the Program


Applying the Techniques

Basic Rule 5
Use at least one method to
isolate the effects of a project.


Control Group Arrangement

Figure 4-1. Post-Test Only, Control Group Design

Control Group (Untrained) → Measurement
Experimental Group (Trained) → Program → Measurement
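As an illustration only (not from the book), the post-test-only design boils down to comparing the business measure for the two matched groups after the program. A minimal sketch, assuming the group averages are already available and the numbers are hypothetical:

# Minimal sketch: control group arrangement with hypothetical post-program averages.
experimental_avg = 9_367   # e.g., weekly sales per trained employee
control_avg = 8_940        # weekly sales per untrained (control group) employee

# The improvement attributed to the program is the difference between the groups.
program_impact = experimental_avg - control_avg
print(f"Impact attributed to the program: {program_impact} per employee per week")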


Case Study


Think About This


You have been tasked with developing the criteria to match the control and experimental groups in this case
study. What are your criteria for matching the two groups?




Disadvantages and Advantages


Noted
A challenge arises when the control group outperforms the experimental group. In some cases, the program
was, in fact, a poor solution to the opportunity. But more often than not, when the control group outperforms
the experimental group, there is a problem with the research design. Therefore, it is important to
have an alternative approach readily available to determine how much improvement is due to the program.


Trend Line Analysis


Case Study

Figure 4-2. Trend Line of Productivity

Shipment productivity (percentage of schedule shipped) is plotted by month, January through the following
January, with the team training program marked partway through the year. A trend line fitted to the
pre-program months is projected through the evaluation period. Key values: pre-program average 87.3%,
projected trend average 92.3%, actual post-program average 94.4%.
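For readers who want to see the arithmetic behind a trend line analysis, here is a minimal sketch with hypothetical monthly values (not the case study's raw data): fit a line to the pre-program months, project it over the evaluation period, and credit the program only with the difference between actual and projected performance.

# Minimal trend line analysis sketch (hypothetical percent-of-schedule-shipped data).
import numpy as np

pre_program = [86.5, 87.0, 87.1, 87.5, 87.8, 88.0]    # months 1-6, before the program
post_program = [93.8, 94.2, 94.5, 94.6, 94.7, 94.8]   # months 7-12, after the program

# Fit a straight line to the pre-program months.
x_pre = np.arange(1, len(pre_program) + 1)
slope, intercept = np.polyfit(x_pre, pre_program, 1)

# Project that trend over the evaluation period.
x_post = np.arange(len(pre_program) + 1, len(pre_program) + len(post_program) + 1)
projected = slope * x_post + intercept

# Credit the program with actual minus projected, not actual minus pre-program average.
impact = np.mean(post_program) - np.mean(projected)
print(f"Projected average: {np.mean(projected):.1f}%")
print(f"Actual average:    {np.mean(post_program):.1f}%")
print(f"Improvement attributed to the program: {impact:.1f} percentage points")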


Disadvantages and Advantages

Mathematical Modeling


Case Study

Disadvantages and Advantages


Expert Estimation


Focus Group Approach


Table 4-1. Example of a Participant’s Estimation

Factor That Influenced Improvement    Percentage of    Percentage of    Adjusted Percentage
                                      Improvement      Confidence       of Improvement
                                                       Expressed
1. Talent development program         50%              70%              35%
2. Change in procedures               10%              80%              8%
3. Adjustment in standards            10%              50%              5%
4. Revision to incentive plan         20%              90%              18%
5. Increased management attention     10%              50%              5%
Total                                 100%

Basic Rule 6
Adjust estimates of improvement for potential errors of estimation.

Questionnaire Approach


Case Study

Basic Rule 7
If no improvement data are available
for a population or from a specific
source, it is assumed that little or no
improvement has occurred.


Table 4-2. Sample of Input From Participants in a Leadership Program for New Managers

Columns: Participant Number | Annual Improvement Value | Basis for Value | Confidence | Isolation Factor | Adjusted Value

11  | $36,000  | Improvement in efficiency of group: $3,000 per month × 12 (group estimate)                         | 85%  | 50% | $15,300
42  | $90,000  | Turnover reduction: two turnover statistics per year; base salary × 1.5 = $45,000                   | 90%  | 40% | $32,400
74  | $24,000  | Improvement in customer response time (8 to 6 hours); estimated value $2,000 per month              | 60%  | 55% | $7,920
55  | $2,000   | 5% improvement in individual effectiveness ($40,500 × 5%)                                           | 75%  | 50% | $750
96  | $10,000  | Absenteeism reduction (50 absences per year × $200)                                                 | 85%  | 75% | $6,375
117 | $8,090   | Team project completed 10 days ahead of schedule; annual salaries $210,500 = $809 per day × 10 days | 90%  | 45% | $3,276
118 | $159,000 | Under budget for the year by this amount                                                            | 100% | 30% | $47,700

Total adjusted value: $113,721
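As a small illustration of how the table's figures are produced, here is a minimal sketch that discounts each participant's claimed annual improvement by both the confidence estimate and the isolation factor before totaling the benefits. The values are those in Table 4-2; in practice, extreme or unsupported claims would be removed first, and participants supplying no data would simply contribute zero.

# Minimal sketch: adjust each claim by confidence x isolation, then total (Table 4-2 values).
claims = [
    (11, 36_000, 0.85, 0.50),
    (42, 90_000, 0.90, 0.40),
    (74, 24_000, 0.60, 0.55),
    (55, 2_000, 0.75, 0.50),
    (96, 10_000, 0.85, 0.75),
    (117, 8_090, 0.90, 0.45),
    (118, 159_000, 1.00, 0.30),
]

total = 0.0
for participant, annual_value, confidence, isolation in claims:
    adjusted = annual_value * confidence * isolation
    total += adjusted
    print(f"Participant {participant}: adjusted value ${adjusted:,.0f}")

print(f"Total adjusted benefit: ${total:,.0f}")  # about $113,721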

Basic Rule 8
Avoid use of extreme data
items and unsupported claims
when calculating ROI.


Basic Rule 9
Use only the first year of annual
benefits in ROI analysis of
short-term solutions.

Disadvantages and Advantages


Data Collection From Other Experts

Building Credibility With the Process


Selecting the Techniques

Table 4-3. Best Practice Use of Techniques


Isolating the Effects of Talent Development Programs
Method1 Best Practice Use2
1. Control group arrangement 35%
2. Trend line analysis 20%
3. Expert estimation 50%
4. Other 20%
1. Listed in order of credibility.
2. Percentages exceed 100 percent.

Using Multiple Techniques


Strengthening Credibility

• Reputation of the source of the data
• Reputation of the source of the study
• Motives of the researchers
• Personal bias of audience
• Methodology of the study
• Assumptions made in the analysis
• Realism of the outcome data
• Type of data
• Scope of analysis


Getting It Done

Table 4-5. ROI Analysis Plan
Program: ____________ Responsibility: ____________ Date: ____________

Columns: Data Items (Usually Level 4) | Methods for Isolating the Effects of the Program | Methods of Converting
Data to Monetary Values | Cost Categories | Intangible Benefits | Communication Targets for Final Report |
Other Influences or Issues During Application | Comments
5
Calculate ROI

What’s Inside This Chapter


To continue building credibility for your talent
development programs, you need to demonstrate the
economic value they add to the organization. Specifically,
in this chapter you will learn the basic steps to move from
Level 4 to Level 5 by:
• converting data to monetary value
• tabulating fully loaded costs
• calculating the ROI.

Converting Data to Monetary Value

Hard Data Versus Soft Data


Table 5-1. Hard Data

Output: Units produced; Tons manufactured; Items assembled; Reports processed; Students graduated; Research
grants awarded; Tasks completed; Number of shipments; New accounts generated

Quality: Errors; Waste; Rejects; Rework; Shortages; Defects; Failures; Malicious intrusions; Accidents

Cost: Budget variances; Unit costs; Variable costs; Overhead costs; Operating costs; Penalties/fines; Project cost
savings; Accident costs; Sales expense

Time: Cycle time; Response time; Equipment downtime; Overtime; Processing time; Supervisory time; Meeting time;
Work stoppages; Order response time

Table 5-2. Soft Data

Work Habits: Absenteeism; Tardiness; First aid treatments; Safety violations; Communication

New Skills: Decisions made; Problems solved; Grievances resolved; Conflicts avoided; Interaction with staff

Climate: Number of grievances; Employee complaints; Employee engagement; Organizational commitment;
Employee turnover

Development: Number of promotions; Number of pay increases; Requests for transfer; Performance appraisal
ratings; Job effectiveness

Satisfaction: Job satisfaction; Customer satisfaction; Employee loyalty; Increased confidence

Initiative: Implementation of new ideas; Innovation; Goals achieved; Completion of projects


Think About This


Select whether you think the measure represents hard data or soft data. What is improvement in the measure worth?

Objective Hard Soft


Decrease error rates on reports by 20 percent.

Decrease the amount of time required to complete a project.

Increase the customer satisfaction index by 25 percent in three months.

Reduce litigation costs by 24 percent.

Improve teamwork.

Enhance creativity.

Increase the number of new patents.

Reduce absenteeism.

Tangible Versus Intangible Data


Noted
There are five levels of data. Intangible benefits are impact data not converted to money. They represent a
sixth type of data when reporting an ROI due to their importance to the organization.

Figure 5-1. Data Conversion

Revenue measures are converted to profit. Productivity, quality, time, and cost measures are converted to cost
savings and cost avoidance.


Think About This


Rank the following research results in order of credibility based on your definition of credibility. Have a
colleague do the same. Compare your rankings and discuss why you ranked the items as you did. These
are likely the same considerations others will give to your evaluation projects. Rank: 1 = most credible
and 4 = least credible.

Research Rank

Fatigued workers cost employers $136 billion per year.


Source: Fareed Zakaria, CNN Global Public Square, June 9, 2019.

Vulcan Materials Company produced 195 million tons of crushed stone during 2018.
Source: Annual Report.

IAMGOLD showed an ROI of 345 percent on a leadership program involving first-level managers.
Source: Parker, L., and C. Hubble. 2015. “Measuring ROI in a Supervisory Leadership Development
Program.” In Measuring the Success of Leadership Development, by P.P. Phillips, J.J. Phillips, and
R.L. Ray. Alexandria, VA: ATD Press.

St. Mary-Corwin’s Farm Stand Prescription Pantry saved money for the organization and avoided medical
costs for recipients of service, so much so that it resulted in a 650 percent ROI.
Source: Phillips, P.P., J.J. Phillips, G. Paone, and C.H. Gaudet. 2019. Value for Money: How to
Show the Value for Money for All Types of Projects and Programs. Hoboken, NJ: John Wiley & Sons.

Data Conversion Methods

Table 5-3. Techniques for Data Conversion

• Standard values
  » Output to contribution
  » Cost of quality
  » Employee’s time
• Historical costs
• Internal and external experts
• External databases
• Linking with other measures
• Estimations
  » Participants’ estimates
  » Supervisors’ and managers’ estimates
  » Talent development staff estimates


Standard Values

Basic Rule 10
When collecting and analyzing data,
use only the most credible sources.


Think About This


Using salary plus benefits as the basis for placing value on time for all positions enables you to standardize
your approach. One caveat to this approach arises when working with commissioned salespeople.
The value of their work is ultimately the profit gained from the sales they make. However, if they are selling
services or products that add no profit (for example, loss leaders), there is no direct value added by sell-
ing that specific product or service. When treating their time as money, use their average commission.
This will ensure you capture value for their time and in such a way that it can be standardized across all
commission sales positions. If they are paid base plus commission, use their salary plus benefits plus
commission as the basis for time value.

Historical Costs


Internal and External Experts

External Databases


Linking With Other Measures

Think About This


Effective database research takes time. Consider the following steps to help reduce time and increase the
effectiveness of your search:
1. Select a database that aligns with the measures you are trying to convert to money.
2. Formulate a specific research question or objective.
3. Define the key words in the research question.
4. Identify synonyms for the key words, just to ensure you get complete coverage on the topic.
5. Search your databases.
Keep track of your findings. It may even be helpful to document them in a software application database so
you can easily reference them in the future. If you don’t have time to search, call a librarian!

Estimations


Table 5-4. Absenteeism Is Converted Using Supervisor Estimates

Supervisor    Estimated Per-Day Cost    Percent Confidence    Adjusted Per-Day Cost
1             $1,000                    70%                   $700
2             $1,500                    65%                   $975
3             $2,300                    50%                   $1,150
4             $2,000                    60%                   $1,200
5             $1,600                    80%                   $1,280
                                        Total                 $5,305

Average adjusted per-day cost of one absence: $1,061

Figure 5-2. Estimated Value of Absenteeism

A bar chart comparing each supervisor’s estimated per-day cost of an absence with the confidence-adjusted
value, for supervisors 1 through 5 (values from $0 to $2,500).
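As a small illustration of the estimation technique, here is a minimal sketch that derives the standard value for one absence from the supervisor estimates in Table 5-4: each estimate is discounted by its confidence percentage, then the adjusted values are averaged.

# Minimal sketch: confidence-adjusted supervisor estimates (values from Table 5-4).
estimates = [          # (estimated per-day cost, confidence)
    (1_000, 0.70),
    (1_500, 0.65),
    (2_300, 0.50),
    (2_000, 0.60),
    (1_600, 0.80),
]

adjusted = [cost * confidence for cost, confidence in estimates]
average_cost = sum(adjusted) / len(adjusted)
print(f"Adjusted per-day cost of one absence: ${average_cost:,.0f}")  # $1,061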

Data Conversion Four-Part Test


Figure 5-3. To Convert or Not to Convert


Five Steps to Calculating Monetary Benefits

Basic Rule 11
In converting data to monetary value, when in doubt, leave it out!

Focus on the Unit of Measure

Determine the Value of Each Unit

Calculate the Change in the Performance of the Measure

Determine the Annual Improvement in the Measure


Calculate the Total Monetary Value of the Improvement

1. Focus on the unit of measure.


1 credit card account

2. Determine the value of each unit.


$1,000 profit per 1 credit card account per year

3. Calculate the change in the performance of the measure.


5 new credit card accounts per month (after isolating other variables)

4. Determine the annual improvement in the measure.


5 accounts per month × 12 months = 60 new credit card accounts per year

5. Calculate the total monetary value of the improvement


60 per year × $1,000 per account = $60,000 annual value of the improvement
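A minimal sketch (illustration only) that strings the five steps together, using the credit card example above:

# Minimal sketch of the five steps, using the credit card example.
def annual_monetary_benefit(unit_value, monthly_change, months=12):
    """Steps 2-5: change in performance, annualized, times the value of one unit."""
    annual_change = monthly_change * months          # step 4
    return annual_change * unit_value                # step 5

# Step 1: unit of measure = 1 credit card account
# Step 2: $1,000 profit per account per year; Step 3: 5 new accounts per month
benefit = annual_monetary_benefit(unit_value=1_000, monthly_change=5)
print(f"Annual value of the improvement: ${benefit:,.0f}")  # $60,000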


Exercise 5-1. Converting Data to Monetary Values

Scenario: Placing monetary value on grievance reduction

Step 1 Focus on the unit of measure


Our unit of measure is 1 grievance.

Step 2 Determine the value of each unit


The value of each unit is $6,500, as determined by internal experts.

Step 3 Calculate the change in the performance of the measure


The number of grievances declined by 10 per month; and after isolating the effects of the program, 7 of
the 10 fewer grievances were due to the program.

Step 4 Determine the annual improvement in the measure


The annual change in performance equals _____.

Step 5 Calculate the total monetary value of the improvement


The annual change in performance times the value equals _____.

The value that you put in step 5 is the value that goes in the numerator of the formula.

BCR = Program Benefits (Value From Step 5) / Program Costs

ROI (%) = (Program Benefits [Value From Step 5] – Program Costs) / Program Costs x 100
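For convenience, here is a minimal sketch of both formulas as reusable functions; the benefit and cost figures (hypothetical here) come from the monetary-value steps and the fully loaded cost tabulation described next.

# Minimal sketch of the BCR and ROI formulas.
def benefit_cost_ratio(program_benefits, program_costs):
    return program_benefits / program_costs

def roi_percent(program_benefits, program_costs):
    # Note: ROI is based on net benefits; it is not the BCR multiplied by 100.
    return (program_benefits - program_costs) / program_costs * 100

# Example with hypothetical numbers:
benefits, costs = 60_000, 25_000
print(f"BCR: {benefit_cost_ratio(benefits, costs):.2f}")   # 2.40
print(f"ROI: {roi_percent(benefits, costs):.0f}%")         # 140%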

Tabulating Fully Loaded Costs


Table 5-6. Cost Categories

Which Cost Category Is Appropriate for ROI?

A: Operating costs; Support costs
B: Administrative costs; Participant compensation; Facility costs; Classroom costs
C: Program development costs; Administrative costs; Classroom costs; Participant costs
D: Analysis costs; Development costs; Implementation costs; Delivery costs; Evaluation costs; Overhead and
administrative costs

Basic Rule 12
When developing the denominator, when in doubt, leave it in.


Table 5-7. Allocation of Overhead and Administrative Costs

Unallocated budget                                                              $548,061
Total number of participant-days                                                7,400
  (a 5-day program offered 10 times a year equals 50 participant days)
Per-day unallocated budget ($548,061 ÷ 7,400)                                   $74
Overhead and administrative costs allocated to a three-day program (3 × $74)    $222
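A minimal sketch of the allocation arithmetic in Table 5-7:

# Minimal sketch of the overhead allocation in Table 5-7.
unallocated_budget = 548_061
total_participant_days = 7_400

per_day_overhead = unallocated_budget / total_participant_days   # about $74
three_day_program_overhead = 3 * round(per_day_overhead)         # $222

print(f"Per-day overhead: ${per_day_overhead:,.2f}")
print(f"Overhead allocated to a three-day program: ${three_day_program_overhead}")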

Calculating the ROI

Noted
It is incorrect to multiply the BCR
by 100 and report it as an ROI.


Table 5-8. Cost Estimating Worksheet

Analysis Costs Total

Salaries and employee benefits—talent development staff


(no. of people × average salary × employee benefits factor × no. of hours on project)
Meals, travel, and incidental expenses
Office supplies and expenses
Printing and reproduction
Outside services
Equipment expenses
Registration fees
Other miscellaneous expenses
Total Analysis Cost
Development Costs Total

Salaries and employee benefits


(no. of people × avg. salary × employee benefits factor × no. of hours on project)
Meals, travel, and incidental expenses
Office supplies and expenses
Program materials and supplies
Film
Videotape
Audiotapes
Artwork
Manuals and materials
Other
Printing and reproduction
Outside services
Equipment expense
Other miscellaneous expenses

Total development costs

Delivery Costs Total

Participant costs
Salaries and employee benefits
(no. of participants × avg. salary × employee benefits factor ×
hrs. or days of training time)
Meals, travel, and accommodations
(no. of participants × avg. daily expenses × days of training)
Program materials and supplies
Participant replacement costs (if applicable)
Lost production (explain basis)


Instructor costs
Salaries and benefits
Meals, travel, and incidental expenses
Outside services
Facility costs
Facilities rental
Facilities expense allocation
Equipment expenses
Other miscellaneous expenses

Total delivery costs

Evaluation Costs Total

Salaries and employee benefits—talent development staff


(no. of people × avg. salary × employee benefits factor × no. of hours on project)
Meals, travel, and incidental expenses
Participant costs
Office supplies and expenses
Printing and reproduction
Outside services
Equipment expenses
Other miscellaneous expenses

Total evaluation costs

General Overhead Allocation

Total program costs
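The worksheet rolls up into a single fully loaded cost figure. Here is a minimal sketch of that tabulation, with hypothetical amounts standing in for the worksheet line items:

# Minimal sketch: fully loaded cost tabulation by worksheet category (hypothetical amounts).
cost_categories = {
    "analysis": 4_500,
    "development": 12_000,
    "delivery": 38_000,           # participant, instructor, and facility costs
    "evaluation": 5_200,
    "overhead_allocation": 1_800,
}

fully_loaded_cost = sum(cost_categories.values())
print(f"Total program costs (fully loaded): ${fully_loaded_cost:,.0f}")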



Getting It Done


Answers to Exercise 5-1. Converting Data to Monetary Values

Scenario: Placing monetary value on grievance reduction

Step 1 Focus on the unit of measure


Our unit of measure is 1 grievance.

Step 2 Determine the value of each unit


The value of each unit is $6,500, as determined by internal experts.

Step 3 Calculate the change in the performance of the measure


The number of grievances declined by 10 per month; and after isolating the effects of the program, 7 of
the 10 fewer grievances were due to the program.

Step 4 Determine the annual improvement in the measure


The annual change in performance equals 84.

Step 5 Calculate the total monetary value of the improvement


The annual change in performance times the value equals $546,000.

The value that you put in step 5 is the value that goes in the numerator of the formula.

BCR = Program Benefits ($546,000) / Program Costs

ROI (%) = (Program Benefits [$546,000] – Program Costs) / Program Costs x 100

6
Optimize Results

What’s Inside This Chapter


The chapter describes the basics of the last phase in
the ROI Methodology: Optimize results. Specifically, this
chapter covers:
• telling the story
• developing reports
• using black box thinking.

Telling the Story


Figure 6-1. People Analytics Competencies That Are Important But Lacking

Quantitative analysis 42%

Qualitative methodologies 35%

Communication and storytelling 35%

Psychometrics 24%

Change management 24%

Performance consulting 22%

Business operations 20%

Human resource management systems 17%

IT systems 14%

Relationship building 11%

General knowledge of human resources 6%

Employment law 2%

Other 11%
Source: i4cp and ROI Institute (2018)

What Do You Need?


Basic Rule 13
Communicate the results of the ROI
Methodology to all key stakeholders.

Table 6-1. Checklist of Needs for Communicating Results

1. Needs Related to Talent Development Programs
• Demonstrate accountability for client expenditures.
• Secure approval for a program.
• Gain support for all programs.
• Enhance reinforcement of the program.
• Enhance the results of future programs.
• Show complete results of the program.
• Explain a program’s negative ROI.
• Seek agreement for changes to a program.
• Stimulate interest in upcoming programs.
• Encourage participation in programs.
• Market future programs.

2. Needs Related to Talent Development Staff
• Build credibility for the staff.
• Prepare the staff for changes.
• Provide opportunities for staff to develop skills.

3. Needs Related to the Organization at Large
• Reinforce the need for system changes to support learning transfer.
• Demonstrate how tools, skills, and knowledge add value to the organization.
• Explain current processes.

Who Can Give It to You?


Talent Development Team

Participants


Participants’ Supervisors

Think About This


Just like there are guiding principles to the ROI Methodology, there are principles for communicating the
results of an ROI study. The following list provides a broad view of these principles:
• Keep communication timely.
• Target communication to specific audiences.
• Carefully select communication media.
• Keep communication consistent with past practices.
• Incorporate testimonials from influential individuals.
• Consider the talent development function’s reputation when developing the overall strategy.

Clients


How Do You Ask?

Meetings

Internal Publications


Electronic Media

Brochures

Formal Reports

Developing Reports


Detailed Reports

Need for the Program

Need for the Evaluation

Evaluation Methodology


Results


Basic Rule 14
Hold reporting the actual ROI until
the end of the results section.

Conclusions and Next Steps

Appendixes


Table 6-2. Detailed Report Outline

General Information
• Objectives of Study
• Background

Methodology for Impact Study (purpose: builds credibility for the process)
• Levels of Evaluation
• ROI Process
• Collecting Data
• Isolating the Effects of Training
• Converting Data to Monetary Values
• Costs
Assumptions (Guiding Principles)

Results (purpose: presents the results with six measures, Levels 1 through 5 plus intangibles)
• General Information
  » Response Profile
  » Relevance of Materials
• Participant Reaction
• Learning
• Application and Implementation
  » Success With Use
  » Barriers
  » Enablers
• Impact
  » General Comments
  » Linkage With Business Measures
• ROI
• Intangible Benefits

Conclusions and Recommendations
• Conclusions
• Recommendations

Appendix


Executive Summaries

General Audience Reports

Single-Page Reports


Table 6-3. Single-Page Report

Sexual Harassment Prevention Program

Level 1: Reaction and Planned Action—Results


• Overall rating of 4.11 out of a possible 5
• 93% provided list of action items

Level 2: Learning—Results
• Post-test scores average 84
• Pretest scores average 51
• Improvement 65%
• Participants demonstrated they could use skills successfully

Level 3: Application and Implementation—Results


• Survey distributed to a sample of 25% of participants (1,720)
• Response rate of 64% (1,102 returned)
» 96% conducted meetings with employees and completed meeting record
» On a survey of nonsupervisory employees, significant behavior change was noted (4.1 out of 5 scale)
» 68% of participants report that all action items were completed
» 92% reported that some action items were completed

Level 4: Impact—Results
Sexual Harassment Business Performance Measures | One Year Prior to Program | One Year After Program | Factor for Isolating the Effects of Program
Internal complaints | 55 | 35 | 74%
External charges | 24 | 14 | 62%
Litigated complaints | 10 | 6 | 51%
Legal fees and expenses | $632,000 | $481,000 |
Settlement/losses | $450,000 | $125,000 |
Total cost of sexual harassment prevention, investigation, and defense | $1,655,000 | $852,000 |
Turnover (nonsupervisory, annualized) | 24.2% | 19.9% |

Level 5: ROI—Results
• Total annual benefits $3,200,908
• Total costs $277,987
• ROI 1,052%

Intangible Benefits
• Increased job satisfaction
• Increased teamwork
• Reduced stress


Figure 6-2. Micro-Level Dashboard

Source: Explorance’s Metrics That Matter. Used with permission.

Macro-Level Scorecard


Table 6-4. Macro-Level Scorecard Outline

Seven Categories of Data

Indicators
• Number of employees involved
• Total hours of involvement
• Hours per employee
• Training investment as a percentage of payroll
• Cost per participant

Level 1: Reaction and Planned Action
• Percentage of programs evaluated at this level
• Ratings on seven items vs. targets
• Percentage with action plans
• Percentage with ROI forecast

Level 2: Learning
• Percentage of programs evaluated at this level
• Types of measurements
• Self-assessment ratings on three items vs. targets
• Pre/post average differences

Level 3: Application and Implementation
• Percentage of programs evaluated at this level
• Ratings on three items vs. targets
• Percentage of action plans completed
• Barriers (list of top 10)
• Enablers (list of top 10)
• Management support profile

Level 4: Impact
• Percentage of programs evaluated at this level
• Linkage with business measures (list of top 10)
• Types of measurement techniques
• Types of methods to isolate the effects of programs
• Investment perception

Level 5: ROI
• Percentage of programs evaluated at this level
• ROI summary for each study
• Methods of converting data to monetary value
• Fully loaded cost per participant

Intangible Benefits
• Intangibles (list of top 10)
• How intangibles were captured


Figure 6-3. Example of a Talent Development Operations Report


Columns: Effectiveness Measures | Unit of Measure | 2015 Actual | 2016 Plan | June YTD | Comparison to Plan | 2016 Forecast | Comparison to Plan

Level 1: Participant Feedback (all programs)
Quality of content    | % favorable | 76% | 80% | 79% | 1% below | 79% | 1% below
Quality of instructor | % favorable | 80% | 85% | 86% | 1% above | 85% | on plan
Relevance             | % favorable | 72% | 78% | 73% | 5% below | 75% | 3% below
Alignment to goals    | % favorable | 68% | 75% | 69% | 6% below | 71% | 4% below
Total for Level 1     | Average of measures | 74% | 80% | 77% | 3% below | 78% | 2% below

Level 1: Sponsor Feedback (select programs) | % favorable | 66% | 80% | 68% | 12% below | 75% | 5% below

Level 2: Learning (select programs) | Score | 78% | 85% | 83% | 2% below | 85% | on plan

Level 3: Application rate (select programs)
Intent to apply (from survey at end of course) | % top two boxes  | 70% | 75% | 70% | 5% below  | 72% | 3% below
Actual application (after 3 months)            | % who applied it | 51% | 65% | 55% | 10% below | 63% | 2% below

Level 4: Impact (select programs)
Estimate by participants (end of course) | % contribution to goal | 20% | 25% | 15% | 10% below | 20% | 5% below

Level 5: ROI (select programs)
Net benefits | Thousands $ | $546 | $800 | $250 | 31%       | $650 | 81%
ROI          | %           | 29%  | 35%  | 25%  | 10% below | 30%  | 5% below

Source: Center for Talent Reporting (www.centerfortalentreporting.org)

Think About This


There are fundamental guidelines in reporting the results of an ROI results study to senior management. Two
critical questions to consider prior to communicating with senior management are whether you will be believed
if you have an extremely high ROI and whether senior managers can handle it if you have a negative ROI.
With those two questions in mind, you need to consider the following guidelines:
• Plan a face-to-face meeting with senior managers (first one or two ROI studies).
• Hold results until the end of the presentation.
• Present the complete and balanced sets of measures beginning with Level 1.
• Emphasize the attributes of the methodology that ensure conservative results.
• Present a plan for program improvement.
For the first one or two ROI studies, present your detailed report during a regularly scheduled executive
staff meeting. If senior executives know that you have an ROI study to present, they will make room for you on
the agenda. Ask for one hour of their time. Present the study in full detail. Have a copy of the comprehensive
report for each senior manager available at the meeting. When you begin your presentation, be ready and have
copies of your detailed report, but don’t give it out before your presentation. If you give them the report, they
will be flipping through the pages to find the ROI calculation. Keep the reports beside you as you present your
results.
Present the results to the senior management team just as you have written the report: need for program,
need for evaluation, evaluation methodology, results, conclusion, and next steps. Be thorough in reporting Levels
1 through 4, and do not fixate on or hurry to the ROI calculation—the entire chain of impact is important to
reporting the success of the programs. Report Level 5: ROI and the intangible benefits. Then, present your
conclusions and next steps. At the end of your presentation, provide each senior manager a copy of your final report.
Do you really expect the senior management team to read this detailed report? No. At best, they will hand it
off to someone else to read and summarize the contents that you will have presented in the meeting. Why then
go to the trouble of preparing this printed copy of the detailed final report for senior managers? To build trust.
You’ve told them your story; now, all they have to do is look in the report to see that you covered the details and
that you provided a thorough and accurate presentation of the report’s contents.
After the first one or two studies, senior management will have bought into the ROI Methodology. Of
course, if you’ve worked the process well, they will have begun to learn the methodology long before your initial
presentation. Given that, after the first or second study, you can start distributing the executive summary. Limit
your report to senior management to a 10- to 15-page executive summary. Again, it has all the components, but not so
many details.
After about five ROI studies, you can begin reporting to senior management using the single-page report,
dashboard, scorecard, or even infographic. This will save time and money. Do remember, the talent development
staff will always have a copy of the detailed, comprehensive report. This will serve as a backup and a blueprint
for future studies.

Data Visualization


Tables

Noted
Only use visual displays of data if they make the information more accessible and better nudge the audience toward action.

Table 6-5. Frequency and Percentage Table

Test Scores    Frequency    Percent*    Valid Percent*    Cumulative Percent
70             2            11.1        11.1              11.1
77             2            11.1        11.1              22.2
82             1            5.6         5.6               27.8
85             5            27.8        27.8              55.6
87             1            5.6         5.6               61.1
88             2            11.1        11.1              72.2
90             1            5.6         5.6               77.8
92             1            5.6         5.6               83.3
93             2            11.1        11.1              94.4
95             1            5.6         5.6               100.0
Total          18           100         100

* Percentages may not total exactly 100 percent because of rounding.
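As a small illustration (not from the book), a table like Table 6-5 can be produced directly from raw scores. The score list below is simply reconstructed from the table’s frequencies; a minimal pandas sketch:

# Minimal sketch: build a frequency-and-percentage table from raw test scores.
import pandas as pd

scores = [70, 70, 77, 77, 82, 85, 85, 85, 85, 85, 87, 88, 88, 90, 92, 93, 93, 95]

freq = pd.Series(scores).value_counts().sort_index()
table = pd.DataFrame({
    "Frequency": freq,
    "Percent": (freq / freq.sum() * 100).round(1),
})
table["Cumulative Percent"] = (freq.cumsum() / freq.sum() * 100).round(1)
print(table)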


Table 6-6. One-Way Table

Variable 1 Variable 2

Participant Name Employment Date

Andrea Adams November 4, 2015

Benjamin Johnson January 26, 2019

Robert Ladnier August 19, 2014

Aisha Mizner March 15, 2011

Joannetta Ramsey June 23, 2018

Diagrams


Figure 6-4. A Phased Approach to a Comprehensive Evaluation


Project phase completion dates: Phase 1 (Summer 2016): evaluation implementation strategy and 1st course
evaluation; Phase 2 (Summer 2017): 2nd course evaluation; Phase 3 (Fall 2018): 3rd and 4th course evaluations;
Phase 4 (Fall 2018): certification ROI study.

Figure 6-5. Depiction of Conceptual Framework

The course, supported by continuous learning and practice, research, real-world exercises, knowledge
application, and reliable staff, leads to positive reaction, knowledge acquisition, and knowledge application.
These in turn produce results such as increased network security, alleviated work stoppage, reduced equipment
downtime, increased uptime, and reduced cost of troubleshooting.

As participants react positively to the course, acquire knowledge and skills, and apply knowledge and skills,
results occur. However, other intervening variables also influence measures; therefore, steps must be taken to
isolate the effects of the course on these measures.


Figure 6-6. Social Network Analysis

Graphs


Histograms

Figure 6-7. Histogram

A histogram of scores on a training exam (N = 60, mean = 72.1, standard deviation = 12.0), with scores grouped
in five-point bins from 35 to 90 and frequency (0 to 16) on the vertical axis.
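A minimal matplotlib sketch of how such a histogram is built; the scores are simulated for illustration, since the raw data behind Figure 6-7 are not reproduced here.

# Minimal sketch: histogram of simulated exam scores (illustrative only).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
scores = rng.normal(loc=72.1, scale=12.0, size=60).clip(0, 100)

plt.hist(scores, bins=np.arange(35, 95, 5), edgecolor="black")
plt.xlabel("Score on Training Exam")
plt.ylabel("Frequency")
plt.title("Histogram of Training Exam Scores (N = 60)")
plt.show()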

Box Plots


Figure 6-8. Box Plot

Box plots of scores on the training exam by sales training group (three groups, n = 20 each), with scores on the
vertical axis ranging from roughly 20 to 100.

Line Graphs


Figure 6-9. Line Graph

Goal versus actual: a line graph of average use of knowledge and skill (scale 0 to 20) across six skill areas
(determine performance gaps, define root causes, reconcile requests, manage implementation, troubleshoot
implementation, recommend solutions), with separate lines for supervisor, participant, and customer.

Using Black Box Thinking to Improve Performance


Elevate the Conversation

Uses of Data


Table 6-7. Uses of Evaluation Data

Use of Evaluation Data | Appropriate Level of Data: 1 2 3 4 5
Adjust program design
Improve program delivery
Influence application and impact
Enhance reinforcement
Improve management support
Improve stakeholder satisfaction
Recognize and reward participants
Justify or enhance budget
Develop norms and standards
Reduce costs
Market programs
Expand implementation to other areas

When to Act


Getting It Done


Exercise 6-1. Sample Communication Plan

Need for Communication | Target Audience | Communication Document | Distribution Method

7
Sustain Momentum

What’s Inside This Chapter


Now that you know the basics of developing an ROI
impact study, it’s time to learn how to keep up the
momentum. This includes:
• identifying resistance to implementation
• overcoming resistance to implementation
• making the ROI Methodology routine.

Identifying Resistance

Start With the Talent Development Team


Table 7-1. Typical Objections to ROI

• This costs too much.
• We don’t need this.
• This takes too much time.
• Who is asking for this?
• This is not in my job duties.
• I did not have input on this.
• I do not understand this.
• Our clients will never buy this.
• What happens when the results are negative?
• How can we be consistent with this?
• The ROI process is too subjective.
• Our managers will not support this.
• ROI is too narrowly focused.
• This is not practical.

Go to the Management Team


Table 7-2. Typical Accountability Reactions

Accountability Issues
• Is all this training really needed?
• How is talent development helping our business?
• Can we do this with less cost?
• Do we have what it takes?
• Why does this take so long?
• Show me the money.

Reaction to ROI
• Is this more new jargon?
• Is this the ROI that I know?
• How can you do this?
• Why didn’t you do this earlier?
• Is this credible?
• Can we do this for every program?

Conduct a Gap Analysis

Table 7-3. Typical Gap Categories

• Staff capability for ROI
• Results-based talent development
• Alignment with business needs
• Effective policies, procedures, and templates
• Appropriate environment for transfer of learning
• Effective management support
• The perception of value of talent development


Overcoming Resistance to Implementation


Figure 7-1. Building Blocks for Overcoming Resistance

The building blocks shown in the figure are: preparing the management team; sharing information; using
technology; completing ROI projects; establishing goals, plans, and timetables; revising policies and
procedures; preparing the staff; and identifying roles and responsibilities.

Identifying Roles and Responsibilities

Identifying a Champion

Table 7-4. Roles of the ROI Champion

• Technical expert • Initiator • Coordinator • Consultant • Designer • Problem solver • Developer
• Cheerleader • Planner • Teacher • Communicator • Analyst • Process monitor • Interpreter


Delegating Responsibilities to Ensure Success


Noted
It will only take 3 to 5 percent of your talent development budget to create and integrate a robust measurement
and evaluation practice. That’s pennies compared to the value of the opportunities lost if you don’t have one.

Preparing the Staff

Developing the ROI Leaders


Developing the Staff

Using ROI As a Learning Tool—Not a Performance Evaluation Tool

Revising Policies and Procedures


Table 7-5. Results-Based Internal Talent Development Policy

1. Purpose.
2. Mission.
3. Evaluate all programs, which will include the following levels:
• Level 1: Reaction and Planned Action (100%)
• Level 2: Learning (no less than 70%)
• Level 3: Application and Implementation (50%)
• Level 4: Impact (usually through sampling) (10%) (highly visible, expensive)
• Level 5: ROI (7%).
4. Evaluation support group (corporate) will provide assistance and advice in measurement and evaluation,
instrument design, data analysis, and evaluation strategy.
5. New programs are developed following logical steps beginning with needs analysis and ending with
communicating results.
6. Evaluation instruments must be designed or selected to collect data for evaluation. They must be valid,
reliable, economical, and subject to audit by evaluation support group.
7. Responsibility for talent development program results rests with facilitators, participants, and supervisors of
participants.
8. An adequate system for collecting and monitoring talent development costs must be in place. All direct costs
should be included.
9. At least annually, the management board will review the status and results of talent development. The review
will include plans, strategies, results, costs, priorities, and concerns.
10. Line management shares in the responsibility for program evaluation through follow-up, pre-program
commitments, and overall support.
11. Managers and supervisors must declare competence achieved through talent development programs. When
not applicable, the talent development staff should evaluate.
12. External consultants must be selected based on previous evaluation data. A central data or resource base
should exist.
13. All external programs of more than one day in duration will be subjected to evaluation procedures. In addition,
participants will assess the quality of external programs.
14. Talent development program results must be communicated to the appropriate target audience. As a
minimum, this includes management (participants’ supervisors), participants, and all learning staff.
15. Key talent development staff members should be qualified to do effective needs analysis and evaluation.
16. A central database for program development must be in place to prevent duplication and serve as a
program resource.
17. Union involvement is necessary in the total talent development plan.


Establishing Goals, Plans, and Timetables

Setting Targets


Table 7-6. Evaluation Targets

Level of Evaluation Percentage of Programs Evaluated at This Level


Level 1: Reaction and Planned Action 90–100%
Level 2: Learning 60–90%
Level 3: Application and Implementation 30–40%
Level 4: Impact 10–20%
Level 5: ROI 5–10%

Think About This


What percentage of your programs do you
evaluate at each level? How do your targets
compare to the recommended targets above?
• Level 1 ______ percent
• Level 2 ______ percent
• Level 3 ______ percent
• Level 4 ______ percent
• Level 5 ______ percent

Developing a Project Plan


Figure 7-2. ROI Implementation Project Plan for a Large Petroleum Company

A Gantt-style timeline spanning roughly two years (January through November of the following year) showing
when each milestone is scheduled: team formed; policy developed; targets set; network formed; workshops
developed; ROI projects A, B, C, and D; WLP staff trained; suppliers trained; managers trained; support tools
developed; and evaluation guidelines developed.

Completing ROI Projects


Noted
Not every offering of a program is evaluated to impact or ROI. This type of evaluation is typically conducted
on select offerings. So, while 20 unique programs may be targeted for ROI evaluation, it is likely only one
or two offerings of each will be evaluated to those levels.

Using Technology


Sharing Information


Preparing the Management Team


Making the ROI Methodology Routine

Making Planning Routine

Integrating Evaluation Into Talent Development Programs


Using Shortcuts


Take Shortcuts at Lower Levels

Fund Measurement and Evaluation With Program Cost Savings

Think About This


As a percentage of the total talent development budget, how much do you currently spend on evaluation? What
will it take to increase your funding?

Use Participants


Use Sampling

Use Estimates

Use Internal Resources

Use Standard Templates

Use Streamlined Reporting


Getting It Done

Exercise 7-1. Measurement and Evaluation Strategy and Plan


This document addresses a variety of issues that make up the complete measurement and evaluation strategy and
plan. Each of the following items should be explored and decisions made regarding the specific approach or issue.

Purposes of Evaluation
From the list of evaluation purposes, select the ones that are relevant to your organization:
Determine success in achieving program objectives.
Identify strengths and weaknesses in the talent development process.
Set priorities for talent development resources.
Test the clarity and validity of tests, cases, and exercises.
Identify the participants who were most (or least) successful with the program.
Reinforce major points made during the program.
Decide who should participate in future programs.
Compare the benefits to the costs of a talent development program.
Enhance the accountability of talent development.
Assist in marketing future programs.
Determine if a program was an appropriate solution.
Establish a database to assist management with decision making.

Are there any others?


Exercise 7-1. Measurement and Evaluation Strategy and Plan (cont.)


Overall Evaluation Purpose Statement
State the purpose for conducting an evaluation:

Stakeholder Groups
Identify specific stakeholders that are important to the success of measurement and evaluation:

Evaluation Targets and Goals


List the approximate percentage of programs currently evaluated at each level. List the number of programs you
plan to evaluate at each level by a specific date.

Level Current Use Planned Use Date


Level 1: Reaction and Planned Action
Level 2: Learning
Level 3: Application and Implementation
Level 4: Impact
Level 5: ROI

Staffing
Indicate the philosophy of using internal or external staff for evaluation work and the number of staff involved in
this process part time and full time.

• Internal versus external philosophy:

• Number of part-time staff:


» Names or titles:

• Number of full-time staff:


» Names or titles:

Responsibilities
Detail the responsibilities of different groups in talent development. Generally, specialists are involved in a leadership
role in evaluation, and others are involved in providing support and assistance in different phases of the process.

Group Responsibilities


Budget
The budget for measurement and evaluation in best-practice organizations is 3 to 5 percent of the learning and
development budget. What is your current level of measurement and evaluation investment? What is your target?

Data Collection Methods


Indicate the current data collection methods used and planned for the different levels of evaluation.

Current Use Planned Use


Level 1: Reaction and Planned Action
Questionnaires
Focus groups
Interviews
Level 2: Learning
Objective tests
Questionnaires and surveys
Simulations
Self-assessments
Level 3: Application and Implementation
Follow-up surveys
Observations
Interviews
Follow-up focus groups
Action planning
Level 4: Impact
Follow-up questionnaires
Action planning
Performance contracting
Performance records monitoring


Exercise 7-1. Measurement and Evaluation Strategy and Plan (cont.)


Building Capability
How will staff members develop their measurement and evaluation capability?

Action Audience Who Conducts or Organizes?


ROI briefings (one to two hours)
Half-day ROI workshop
One-day ROI workshop
Two-day ROI workshop
ROI certification
Coaching
ROI conferences
Networking

Use of Technology
How do you use technology for data collection, integration, and scorecard reporting, including technology for
conducting ROI studies? How do you plan to use technology?

Method Current Use Planned Use


Surveys
Tests
Other data collection
Integration
ROI
Scorecards

Communication Methods
Indicate the specific methods you currently use to communicate results. What methods do you plan to use?

Method Current Use Planned Use


Meetings
Interim and progress reports
Newsletters
Email and electronic media
Brochures and pamphlets
Case studies


Use of Data
Indicate how you currently use evaluation data by placing a check mark in the appropriate box. Indicate your
planned use of evaluation data by placing an “X” in the appropriate box.

Strategy | Appropriate Level of Data: 1 2 3 4 5
Adjust program design
Improve program delivery
Influence application and impact
Enhance reinforcement for learning
Improve management support for talent development
Improve satisfaction with stakeholders
Recognize and reward participants
Justify or enhance budget
Develop norms and standards
Reduce costs
Market talent development programs
Expand implementation to other areas

Questions or Comments

Appendix
ROI Forecasting Basics

Pre-Program Forecasts

Noted
When conducting a pre-program forecast, the step of isolating the effects of the program is omitted. It is
assumed that the estimated results reflect the influence of the program under evaluation.


Figure A-1. Basic ROI Forecasting Model

Develop Level 3 and Level 4 objectives → estimate business or organizational impact (Level 4) data → convert
data to monetary value → calculate the return on investment. Program costs are estimated (fully loaded) and
feed into the ROI calculation, and intangible benefits are identified alongside the monetary benefits.


Pilot Program

Level 1 Forecasting


Table A-1. Questions for Forecasting ROI at Level 1

1. As a result of this program, what specific actions will you attempt as you apply what you have learned?

2. Indicate what specific measures, outcomes, or projects will change as a result of your action.

3. As a result of these anticipated changes, estimate (in monetary values) the benefits to your organization over a
period of one year. $_______________________

4. What is the basis of this estimate?

5. What confidence, expressed as a percentage, can you put in your estimate? _______%
(0% = no confidence; 100% = complete certainty)
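A minimal sketch (with hypothetical questionnaire responses) of how these answers are typically rolled up into a Level 1 forecast: each participant's estimated annual benefit is discounted by the stated confidence, the adjusted benefits are summed, and the total is compared with the projected program cost.

# Minimal sketch: Level 1 ROI forecast from end-of-program questionnaires
# (hypothetical responses; question numbers refer to Table A-1).
responses = [            # (estimated annual benefit from question 3, confidence from question 5)
    (10_000, 0.80),
    (5_000, 0.60),
    (20_000, 0.50),
    (0, 0.0),            # participants with no usable estimate count as zero
]

projected_cost = 12_000
adjusted_benefits = sum(benefit * confidence for benefit, confidence in responses)
forecast_roi = (adjusted_benefits - projected_cost) / projected_cost * 100

print(f"Adjusted forecast benefits: ${adjusted_benefits:,.0f}")
print(f"Forecast ROI: {forecast_roi:.0f}%")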

Additional Approaches to Forecasting


Table A-2. Example of Forecasting ROI at Level 3

Ten supervisors attend a four-day learning program

1. Identify competencies: Supervisor skills

2. Determine percentage of skills used on the job: 80% (average of group)

3. Determine the monetary value of the competencies using the salary and benefits of participants:
$50,000 per participant

Multiply percentage of skills used on the job by the value of the job: $50,000 × 80% = $40,000
Calculate the dollar value of the competencies for the group: $40,000 × 10 = $400,000

4. Determine increase in skill level: 10% increase (average of group)

5. Calculate the monetary benefits of the improvement: $40,000

Multiply the dollar value of the competencies by the improvement in skill level: $400,000 × 10% = $40,000

6. Compare the monetary benefits to the cost of the program: The ROI is 166% and the cost of the program is
$15,000

ROI (%) = ($40,000 – $15,000) / $15,000 x 100 = 166%
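A minimal sketch of the Table A-2 arithmetic, using the values shown in the table:

# Minimal sketch of the Level 3 forecast in Table A-2.
participants = 10
salary_and_benefits = 50_000       # value of the job per participant
skills_used_on_job = 0.80          # percentage of skills used on the job
skill_increase = 0.10              # increase in skill level
program_cost = 15_000

value_of_competencies = participants * salary_and_benefits * skills_used_on_job  # $400,000
monetary_benefit = value_of_competencies * skill_increase                        # $40,000
roi = (monetary_benefit - program_cost) / program_cost * 100

print(f"Monetary benefit: ${monetary_benefit:,.0f}")
print(f"Forecast ROI: {roi:.0f}%")   # about 167%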

Noted
Forecasting ROI and the use of predictive analytics are becoming much more popular than in the past. Be
forewarned: Don’t rely on forecasting alone. While forecasting and predictive analytics are useful, they result
in mere estimates of what could be. The real meaning is in what actually occurs—hence, the need for
post-program evaluation.

References

Additional Resources

Additional Books From the ROI Institute


Case Studies Describing ROI Application


TD at Work (formerly Infoline): The How-to Reference Tool for Training and Performance Professionals


Books on Data Visualization

About the Authors


About ROI Institute
