
Lesson Notes - November 02, 2023 frimpong

Performance accountability is a system for evaluating policies and programs based on their outcomes against established standards, essential for informed decision-making in resource allocation. The Louisiana Performance and Accountability Act mandates performance accountability reporting for state-funded agencies, emphasizing the importance of measurable objectives and indicators. Successful performance accountability includes clear outcome definitions, regular reporting, and a feedback system to enhance program operations.


Lesson 11

Performance Accountability

Saturday- October 29, 2022


3:00 - 4:20 pm

Dr. Augustine Adu Frimpong [email protected]
Office Hours Monday-Thursday: 3pm-5pm
What is Performance Accountability?
• Performance accountability is a means of judging policies and
programs by measuring their outcomes against agreed upon
standards.
• Is there a need for a performance accountability system? Yes.
• This is because a performance accountability system provides the
framework for measuring results--not merely processes or
workloads--and organizes the information so that it can be used
effectively for making policy, management, and resource allocation
decisions.

Note
• Under the provisions of the Louisiana Performance and
Accountability Act (Act 1465 of 1997), performance
accountability is mandated as part of performance-
based budgeting.
• As a result, all Budget Units [or State Funded Agencies]
are required to provide a “Performance and
Accountability Reporting”.

Some Excellent Quotes: "Why the Need for SMART Objectives/Performance Indicators"

•If you can't measure it, you can't improve it.


•What gets measured, gets done.
•What gets measured, gets changed.
•What gets measured, gets improved.

Eight (8) Characteristics of Successful
Performance Accountability
• Successful performance accountability has the following characteristics:
• 1. It is built into policy planning and strategic planning processes.
• 2. It is established and used for internal management and decision making.
• 3. It is based on a clear understanding of process but focuses on outcomes (or results).
• 4. It uses a balanced set of performance indicators to measure performance.
• 5. It generates valid, reliable data consistently over time.
• 6. It includes both internal and external comparisons. It compares internal
performance over time; it compares performance against similar programs, activities,
or functions in public or private sectors.
• 7. It reports outcomes regularly and publicly.
• 8. It has a good feedback system. It quickly conveys information back to managers and
front-line employees who can use that information to improve program operations

Three (3) Components of the Performance
Accountability Process

• The performance accountability process is composed of three components:
• 1. Defining outcomes
• 2. Measuring and reporting performance
• 3. Evaluating performance and using results

Defining outcomes:
• Identifying the results that are targeted for achievement.
• This component is linked to the "Where do we want to be?" part of
policy development, strategic planning, and operational planning
processes.
• In addition, performance accountability, at its highest level, should
point toward fulfillment of the organization’s leadership vision and
core mission (part of “Who are we?”)—the very purpose for which
the department, agency, or program was created.

• [Read MANAGEWARE pages 8-9]

Discuss Goals

• Goals are the general end purposes (or results) toward which effort is directed.
• Goals establish the direction in which an
organization is heading in order to reach a
particular destination.

Discuss Objectives
• Objectives are specific and measurable targets for
accomplishment.
• Objectives identify milestones along the way toward
accomplishing goals.
• Both goals and objectives are inspired by the
organization's vision, mindful of the organization's
mission and philosophy, and based on the
organization's current internal situation and external
operating environment as well as projections of future
conditions.
Discuss Performance Standard
• A performance standard is the expected level of performance
(value) associated with a particular performance indicator for a
particular fiscal year and funding level.
• During the strategic planning process, a balanced set of performance
indicators is identified.
• These performance indicators are measured and reported on an annual
basis in order to track strategic progress and support both performance-
based budgeting and management decision making.
• Performance standards are proposed during the budget development
process and established during the appropriation process.
• Performance standards are commitments for service that are linked with
the level of funding budgeted/appropriated.
• See “Performance Standards: Guidelines for Development and Revision”
on the OPB website for more information on performance standards.

Measuring and Reporting Performance

Name the Louisiana State Law and Revised Statutes that mandate Quarterly
Performance Progress Reporting by State Agencies.

• Quarterly performance progress reporting is required under the provisions of the "Louisiana Government Performance and Accountability Act" (R.S. 39:87.1 et seq. or Act 1465 of 1997, as amended by Act 1169 of 1999).
• In addition, annual undersecretaries’ management and program
analysis reports are required under R. S. 36:8 (Act 160 of 1982, as
amended by Act 911 of 1995).
• Further, performance indicators must be included in annual
operational plans and other budget request forms.

Three (3) Steps to Measure and Track
Performance
• To measure and track performance, it is necessary to:
1. Identify and select balanced sets of performance
indicators to measure progress toward defined
outcomes;
2. Organize to gather performance information; and
3. Monitor and track performance on a regular basis.

Performance Indicators
• Performance indicators are the tools used to measure the
performance, progress, and accomplishments of policies, plans,
and programs.
• Performance indicators consist of two parts:
• indicator name, and
• indicator value.
• The indicator name describes what you are measuring.
• The indicator value is the numeric amount or level achieved or to
be achieved during a given measurement period.
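For illustration only, a minimal Python sketch of the two-part structure described above; the class, field names, and sample values are assumptions for teaching purposes, not part of LaPAS:

```python
from dataclasses import dataclass

@dataclass
class PerformanceIndicator:
    """Illustrative two-part indicator: a name (what is measured) and a value."""
    name: str     # indicator name: describes what you are measuring
    value: float  # indicator value: numeric amount or level for the measurement period

# Hypothetical output-type indicator for a job-training program
indicator = PerformanceIndicator(
    name="Number of participants completing job training",
    value=1250,
)
print(indicator.name, "=", indicator.value)
```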

Five (5) Types of Performance Indicators (PIs) used as Part of
Louisiana’s Accountability and Management Processes

• Types of Performance Indicators -- Louisiana's management processes use five types of indicators to measure performance:
• 1. Input
• 2. Output
• 3. Outcome
• 4. Efficiency
• 5. Quality
• These indicators are based on systems logic (how a process works)
and each type is designed to answer a different question or provide a
different perspective regarding performance.
• Together, these indicators provide a balanced view of performance.

Input Indicators

• Input indicators measure resource allocation


and demand for services.
• They identify the amount of resources needed to
provide a particular service.
• Inputs include labor, materials, equipment,
facilities, and supplies.

Examples of Input Indicators

Output Indicators
• Output indicators measure quantity. They measure
the amount of products or services provided or
number of customers served.
• Output indicators are volume-driven. They focus on
the level of activity in providing a particular
program.

Examples of Output Indicators

Outcome Indicators
• Outcome indicators measure success.
• They measure results and gauge program effectiveness.
• Outcome indicators are the most important performance measures
because they show whether or not expected results are being
achieved.
• Outcome indicators demonstrate return on investment.
• Policy and budget decision makers are generally most interested in
outcome indicators.
• Outcome indicators are the keystone of a balanced set of
performance indicators but must be supported by appropriate
indicators related to input, output, efficiency, and quality.

Examples of Outcome Indicators

Efficiency Indicators
• Efficiency indicators measure productivity and cost-effectiveness.
• They reflect the cost of providing services or achieving results. Cost can
be expressed in terms of dollars or time per unit of output or outcome.
• Efficiency measures can also portray the relationship of inputs to outputs
or outcomes. Ratios are sometimes used to express these relationships.
• Efficiency indicators can gauge the timeliness of services provided.
• Efficiency measures are important for management and evaluation.
• They help organizations improve service delivery. Often they are used to
justify equipment acquisitions or changes to systems or processes.
• Measuring “cost per unit of service” is critical for many programs and
activities. Yet this indicator is frequently omitted from balanced sets of
performance indicators.

Examples of Efficiency Indicators

Quality Indicators
• Quality indicators measure excellence.
• They reflect effectiveness in meeting the expectations of customers,
stakeholders, and expectation groups.
• Measures of quality include reliability, accuracy, courtesy,
competence, responsiveness, and completeness associated with the
product or service provided.
• A lack of quality costs money. For example, resources devoted to performing rework, correcting errors, or resolving customer complaints can be important to track.
• Quality measures are sometimes considered to be outcomes.
• However, quality indicators have been separately defined to reflect
the importance of quality improvement.

Examples of Quality Indicators

Performance Indicator Levels
• Performance indicator levels may be:
• key (K),
• supporting (S), or
• general performance information (GPI or G).

Key Indicators [K]
• Key indicators are included in the executive budget
supporting document and the general or ancillary operating
appropriations bill.
• For key indicators, performance standards are established
during the appropriation process.
• Key indicators are tracked for accountability purposes in the
Louisiana Performance Accountability System (LaPAS);
interim targets and actual performance must be reported in
each quarterly performance progress report.
• Key indicators generally are measures of outcome, measures related to big-ticket items or hot-button issues, and/or measures especially valued or expressly demanded by decision makers.

Supporting Indicators (S)
• Supporting indicators are included in the executive budget
supporting document but not in the general or ancillary
operating appropriations bill.
• For supporting indicators, performance standards are established
during the appropriation process. (Unless they are modified
during the appropriation process by language amendments in the
bill, the performance standard values proposed in the executive
budget supporting document become enacted performance
standards.)
• Supporting indicators are tracked in LaPAS, but interim targets
and actual performance must be reported in only second quarter
(midyear) and fourth quarter (yearend) performance progress
reports.
General Performance Information (GPI)
Indicators
• General performance information (GPI) indicators provide data
on an actual basis only.
• GPI indicators are reported in the executive budget supporting
document and may appear in the general or ancillary operating
appropriations bill for information only.
• No performance standards are developed or enacted for GPI
indicators.
• GPI indicators are reported in LaPAS so that history may be built.
• However, only actual data are reported at second quarter (prior
year actual) and fourth quarter (yearend actual) progress reports.

Aligning Types, Examples, and Measurement Levels of
Performance Indicators

Thank You
• Next Lesson
• 10-minute break

Lesson 12
Quarterly Performance Progress Reports– LaPAS
Application

Saturday- October 29, 2022


4:30 - 7:20 pm

Introduction to Performance Progress Quarterly Report

• Act 1465 of 1997 (the Louisiana Government Performance and Accountability Act) requires that
each agency (budget unit) receiving an appropriation
in the general appropriation act or the ancillary
appropriation act produce a series of performance
progress reports.

Connection Between an Agency’s Quarterly Performance Progress
Reporting and that Agency’s Appropriations

• The purpose of these reports is to provide the legislature with information on the agency's actual progress toward achievement
of performance standards for performance indicators contained
within the general appropriation act, the ancillary appropriation
act, and the executive budget supporting document.
• In fact, the availability of funds appropriated is conditioned upon
each agency’s compliance with statutory provisions relative to
reporting of performance.

What Louisiana State Agency is Considered the Official
Performance Data Record Keeper?
• The Office of Planning and Budget (OPB), in the Division of Administration, is
the official record keeper and repository of performance data.
• The OPB maintains an electronic performance database, the Louisiana
Performance Accountability System (LaPAS).
• Overall management of LaPAS is a joint venture of the Division of Administration's Office of Planning and Budget (OPB) and Office of Information Services (OIS).
• LaPAS is used to track performance standards, interim performance targets, and actual performance.
• To ensure the integrity of the performance database, the OPB also designates the medium
for transmission and storage and establishes the rules for electronic transmission of
progress reports and database access.
• Quarterly performance progress reports are submitted by state departments and agencies
via LaPAS.

Website of LaPAS

• LaPAS can be accessed on the OPB website:
• http://www.doa.la.gov/Pages/opb/lapas/lapas.aspx
• PI View (louisiana.gov)
• It is available for public viewing and searching.

Guidelines for Performance Progress
Reporting
• Act 1465 of 1997 provides official definitions
and sets specific requirements for
submission and content of the performance
progress reports.

Definitions and Explanations associated with
LaPAS
• Fiscal Year - The Fiscal Year for which the Performance Indicator is
being reported.
• State fiscal years run from July 1 through June 30 of every year, while the federal fiscal year runs from October 1 through September 30.
• Program - A grouping of activities directed toward the
accomplishment of a clearly defined objective or set of objectives.
• Objective - A specific and measurable target for achievement
which describes the exact results sought, which is expressed in an
outcome-oriented statement that may reflect effectiveness,
efficiency, or quality of work, and which may be either numeric or
non-numeric.

Definitions and Explanations associated with LaPAS..
• Performance Indicator - A statement identifying an activity,
input, output, outcome, achievement, ratio, efficiency, or
quality to be measured relative to a particular goal or objective
in order to assess an agency's performance.
• Performance indicators are the tools used to measure the
performance of programs.
• Performance indicators consist of two parts: indicator name
and indicator value.
• The indicator name describes what you are measuring.
• The indicator value is the numeric (or other) value or level
achieved within a given measurement period.

Definitions and Explanations associated with
LaPAS..
• Key Performance Indicator (K) –
• A performance indicator that is included in the executive budget supporting document and
the general appropriation act or the ancillary appropriation act.
• Key indicators are outcome indicators (indicators that directly relate to or measure the
outcome described in an objective) or other measures that provide especially valuable
information for budget decision making.
• Key indicators always have a performance standard (an expected level of performance at
the appropriation level).
• Key indicators are reported each quarter in LaPAS.
• Supporting Performance Indicator (S) –
• A performance indicator that is included in the executive budget supporting document but
not the general appropriation bill/act or ancillary appropriation bill/act.
• Many of these indicators are input, output, efficiency or quality indicators that help make up
a balanced set of indicators and provide important background information to support a key
indicator.
• Supporting indicators always have a performance standard (an expected level of
performance at the appropriation level).
• Supporting indicators are reported in LaPAS at second quarter (or midyear) and a yearend
actual is reported at fourth quarter (or yearend).

Definitions and Explanations associated with
LaPAS..
• Performance Standard - The expected level of performance
associated with a particular performance indicator for a
particular period.
• Performance standards are developed during the operating
budget development process and established during the
appropriation process.
• They represent the expected level of performance (performance
indicator values) to be achieved during the fiscal year for which a
budget estimate or an appropriation applies.
• Performance standards are commitments for service associated
with the level of funding budgeted/appropriated.

Definitions and Explanations associated with
LaPAS.. [Performance Standard - ]
• Performance standards and indicator values must be
numeric (numbers, dollars, and percentages).
• LaPAS cannot calculate a variance on nonnumeric values.
• Numeric performance standards and indicator values may
have various numeric formats (plain numbers, dollars,
percentages, etc.).
• # = plain number
• $ = dollar
• % = percentage

Definitions and Explanations associated with LaPAS..
• Interim Performance Targets - Intermediate service levels marked for
accomplishment.
• Annual performance standards are divided by the agency into quarterly
(for key indicators) or semi-annual (for supporting indicators)
performance targets.
• Actual year-to-date performance for each report period is compared to
the cumulative target for that same report period and variances are
calculated.
• Interim performance targets are set by agencies in their first quarter
performance progress reports.
• Because performance progress reports are cumulative, interim
performance targets should be cumulative. For information on how to
set targets, see “Guidelines: Quarterly Performance Progress Reports”
on the OPB website.
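As a hedged illustration of cumulative interim targets, the Python sketch below assumes an annual standard split evenly across four quarters; real agencies may split targets unevenly to reflect seasonality, and the even split is an assumption, not an official LaPAS rule:

```python
# Illustrative sketch: divide an annual performance standard into quarterly
# increments and report them cumulatively, as the guidelines above require.
def cumulative_quarterly_targets(annual_standard: float, quarters: int = 4) -> list:
    increment = annual_standard / quarters
    return [round(increment * q, 2) for q in range(1, quarters + 1)]

print(cumulative_quarterly_targets(1000))  # [250.0, 500.0, 750.0, 1000.0]
```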

Definitions and Explanations associated with
LaPAS..
• Prior Year Actual - Actual data (real data based on actual activity) for
the prior fiscal year.
• Quarter - This column identifies the four quarterly reporting periods
for the current fiscal year.
• Interim Target - This column contains quarterly interim targets for the
current fiscal year. Agencies may manage their own targets in a
forward manner. Targets for all key and supporting indicators are first
entered during the First Quarter. After that, agencies may modify
quarter targets in those quarters that occur after the one in which the
agency is reporting. For example, in the second quarter, an agency may
revise targets for Third Quarter and Fourth Quarter. Targets should be
cumulative.
• Actual Value - This column contains figures for quarterly actual
performance in the current fiscal year. Actuals should be cumulative.

Definitions and Explanations associated with
LaPAS..
• Variance - The percentage difference between a performance standard
or target and actual performance.
• Variance is calculated by dividing the actual performance by the standard or interim target and subtracting 1.00.
• Variance = (Actual / Target) - 1
• Variance for most numeric indicators is calculated automatically by
the Louisiana Performance Accountability System (LaPAS).
• This definition of "variance" does not conform to that of the term as
used by statisticians.
• Act 1465 uses the term "variance" but the actual calculation sought
for performance comparison purposes is that of percentage
difference.
• The Office of Planning and Budget regrets any discomfort this may
cause the statistical community.

Examples of Variance Calculations

Interpretation of Variance
• NOTE: For variances in which zero ("0") is part of the calculation formula (that is, when the standard, target, or actual is "0"), LaPAS defaults to a variance of 5.5%.
• This necessitates an explanatory note but does not skew the
variance by signaling either a 0% or 100% variance.
• Variance from the performance standard (or an interim target) can
be numerically positive or negative.
• However, a numerically positive variance may not represent a
positive outcome; likewise a negative variance does not necessarily
indicate a negative outcome.
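A minimal Python sketch of the percentage-difference calculation and the zero-default behavior described above; the function and variable names are illustrative, and the real calculation is performed by LaPAS itself:

```python
def lapas_variance(actual: float, target: float) -> float:
    """Percentage difference between actual performance and the standard or target.

    Returns a fraction (0.10 means +10%). When either value is zero, this sketch
    mimics the documented default of 5.5% (0.055), which flags the case for an
    explanatory note without signaling a 0% or 100% variance.
    """
    if actual == 0 or target == 0:
        return 0.055  # default when "0" is part of the calculation
    return (actual / target) - 1.0

print(f"{lapas_variance(actual=1100, target=1000):+.1%}")  # +10.0%
print(f"{lapas_variance(actual=900, target=1000):+.1%}")   # -10.0%
print(f"{lapas_variance(actual=0, target=1000):+.1%}")     # +5.5% (default)
```
Whether a positive or negative variance is desirable depends on the indicator, as the next slide explains.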

Interpretation of Variance
• For this reason, agencies should indicate for each indicator whether a positive or negative variance from the standard is desired or represents "a good thing or a bad thing."
• LaPAS defaults to the condition of a positive variance being
a desired outcome. However, agencies may modify this
setting to designate a desired negative variance.
• A positive variance means that the agency exceeded
target.
• A negative variance means the agency did not exceed
target (or failed to achieve its target).

Definitions and Explanations associated with
LaPAS..
• Approved (Y)- This column shows whether a designated
agency approval authority has approved the information
reported by the agency for a particular quarter. If it displays
a "NO," then any information reported by the agency for
that quarter should be considered preliminary and
unofficial.

• Quarter 1 Notes - Explanatory notes or comments related to Quarter 1.
• Quarter 2 Notes - Explanatory notes or comments related to
Quarter 2.

Definitions and Explanations associated with
LaPAS..
• Quarter 3 Notes - Explanatory notes or comments related to
Quarter 3.
• Quarter 4 Notes - Explanatory notes or comments related to
Quarter 4.
• Yearend Notes - Explanatory notes or comments related to
yearend performance and the variance between annual
performance standard and yearend actual performance.

Submission of Performance Progress Reports
• Title 39 requires that performance progress reports must be
submitted quarterly to the Joint Legislative Committee on the
Budget, the legislative fiscal officer, the legislative auditor, and the
commissioner of administration.
• Note: Electronic transmission of performance information through
the Louisiana Performance Accountability System (LaPAS) satisfies
this requirement.
• LaPAS permits secure entry and approval of actual performance
information, one quarter at a time, via a web-based software
application.
• To maintain the security and integrity of the performance database,
access to the LaPAS data entry/update and approval functions is
controlled through log-on identifications (IDs) and passwords.
Submission of Performance Progress Reports

• IDs and initial default passwords are issued by the OPB. Requests for creation or termination of LaPAS IDs must be submitted to the OPB on LaPAS forms, which are available on the OPB website.
• Quarterly performance progress reports may be submitted on or
before their due dates, which are set by statute.
• Ten (10) days after its deadline (or due date), a progress report is
considered delinquent.
• An official, complete submission must contain all required
information and have all information approved by the designated
approval authority.

Submission of Performance Progress Reports
• The schedule for submission of quarterly performance progress
reports appears below. (Act 1465 of 1997 set deadlines for the first
day of the months in which reports were due.
• However, Act 1169 of 1999 revised deadlines for submission of
performance progress reports from November 1, February 1, May
1, and September 1 to November 8, February 8, May 8, and
September 8.)
• For each quarterly report, LaPAS displays a code in the submission
period column for each performance indicator.
• This code identifies the period within that reporting schedule that
information for an indicator was submitted. These submission periods and
codes are: RP, LP, CP and “Blank”.

Submission Periods and Codes
• RP = Regular Period - the period from the opening of
LaPAS for data entry, update, and approval through the
end of the deadline (or due date) for the report (see
deadlines above). A report submitted on or before the
deadline (or due date) is considered on-time and coded
"RP."
• LP = Late Period - the ten-day period after the reporting
deadline (or due date), during which a report is
considered late but not officially delinquent. A report
submitted after the deadline (or due date) but before
the end of the ten-day late period is coded "LP."

Submission Periods and Codes
• Blank - signifies that no report was submitted during the
regular or late reporting periods. LaPAS displays a blank in the
submission period column when an agency fails to submit
agency-approved data for a performance indicator. If an agency
fails to submit a report during the regular or late reporting
periods, that agency's report is considered delinquent.
• CP = Closed Period - the period during which LaPAS is closed for
regular or late data entry, update, and approval. As explained
below, LaPAS is closed after the ten-day late period and special
permission must be obtained to enter, update, and approve data
following that closure. A report filed during this closed period is
coded "CP" and is considered delinquent.

General Information about Submission
• Generally, LaPAS is open for data entry, update, and approval
approximately thirty (30) days prior to the deadline (or due date)
for each quarterly report.
• The OPB uses a message marquee on the LaPAS main page to
provide information on LaPAS availability for quarterly reporting.
• Database access remains open to an agency until that agency has
officially submitted its performance data or the ten-day late period
has passed, whichever occurs first.
• Official submission is signified by approval of performance data by a
designated agency official using an approval authority ID.
• At that time, the agency's access to the LaPAS data entry and update
function will be closed.

LaPAS Schedule for Quarterly Performance Progress
Reporting

Application of LaPAS

Steps to Access LaPAS
• Step 1 = Open the OPB website below:
• http://www.doa.la.gov/Pages/opb/lapas/lapas.aspx
• PI View (louisiana.gov); then you will see the interface below:

Step 2= Click on “LaPAS View and Log-on” You’ll see the
interface below:

Step 3= Under “ LAPAS FUNCTIONS” Click on “VIEW”
Then You’ll see the interface below:

Step 4= Click on “any of the years (i.e. 2018)” Then You’ll
see the interface below:

Step 5= Click on “any of the Department (i.e. Department of
Transportation and Development (2018))” Then You’ll see the
interface below:

Step 6= Click on “any of the Agencies (i.e. Administration (2018))”
Then You’ll see the interface below:

Step 7= Click on “any of the Programs (i.e. A Office of the
Secretary (2018))” Then You’ll see the interface below:

Step 8= Click on “View” Then You’ll see the LaPAS interface below:

Submission of Reports
• Undersecretaries must submit annual management and
program analysis reports to their department secretaries
before November 25th of each year. Prior to December
5th of each year, department secretaries must submit the
report to:
• 1. the governor
• 2. the commissioner of administration,
• 3. the House Appropriations Committee,
• 4. the Senate Finance Committee and
• 5. the standing committee of each house of the legislature
having responsibility for oversight of the department.
• Submit reports electronically as attachments to e-mail
transmissions. The Office of Planning and Budget
provides more detailed submission instructions each year.
Thank You
• Good Night

Discussion of Final Project
• 20 minutes



Program Evaluation Designs
and Methodologies
Lesson 10

Saturday- October 29, 2022


1:30 - 3:00 pm



Lesson objectives
• Upon Completion of this lesson, students will be able to do the following:
• Articulate the major concepts associated with program evaluation designs and
methodologies
• Define the concept of evaluation designs
• Identify and discuss the various forms/types of evaluation designs
• Align the components of logic model to process and outcome evaluation designs
• Distinguish between process and outcome evaluation designs
• Discussion of qualitative evaluation designs—Thematic Analysis & Triangulation
• Discussion of the different quantitative evaluation designs
• Present a sample of process evaluation design— using either Thematic or
Triangulation Applications
• Present a sample of outcome evaluation design— using quantitative design
applications



What is Evaluation Design?
• Evaluation design is simply a plan for conducting evaluation.
• According to Evaluation Toolkits (2021), evaluation design is seen as a
blueprint for how one will conduct one’s program evaluation.
• Frankfort-Nachmias (2008) also argued that evaluation (or research)
design is the “blueprint” that enables the investigator to come up with
solutions.
• Note that selecting the appropriate design and working through and completing a well-thought-out logic plan provides a strong foundation for achieving a successful and informative program evaluation (Evaluation Toolkits, 2021).



What Actually Informs the Selection of
Evaluation Design?
• The decision for an evaluation design depends on the
following:
• the evaluation questions
• the standards of effectiveness,
• the resources available
• the degree of precision needed.



Standards for Evaluation Designs
• According to Spiel (2001), the important standards for evaluation
designs are internal and external validity.
• He explained that internal validity is attained when the evaluator can determine that a finding is due to the program and not to other factors or biases,
• while external validity refers to the generalizability of findings, for instance to participants of the program in other places and at other times.



Importance of Evaluation Designs
• Evaluation design is very important because it helps evaluators stay well organized.
• Evaluation design is the structure that provides the
information needed to answer each of your evaluation
questions (Cook and Campbell, 1979; von Eye and
Spiel, 1996).



Factors that Intended Evaluation Design
Should be based on and Aligned With:
• According to DiTommaso (2015), an evaluator's intended evaluation design should be based on and aligned with the following:
• (a) the program’s theory of change and logic model;
• (b) the primary purpose of the evaluation and key
research questions;
• (c) resources available for the evaluation;
• (d) funder’s evaluation requirements.



Factors that Intended Evaluation Design
Should be based on and Aligned With:
• According to Evaluation Toolkits (2021), it is important to choose an
evaluation design that aligns with the following:
• (a) Program goals;
• (b) Evaluation research questions;
• (c) Purpose of the evaluation;
• (d) Available resources

Note: Consequently, it is advised that any form of evaluation that is likely to be undertaken by an evaluator should be designed to answer the identified evaluation research questions.



A General Guide to the Selection of Evaluation
Design
• As a guide to assist the choice of evaluation design, it has been
suggested in the literature that evaluators can select a specific design
by considering the following:
• (a) Which design will provide me with the information I want?
• (b) How feasible is each option?
• (c) How valid and reliable do my findings need to be?
• (d) Are there any ethical concerns related to choosing a specific design?
• (e) How much would each option cost?



Types of Evaluation Designs
• It is observed from the available literature that whenever an evaluator
wants to decide on a particular type of evaluation design, it is advisable
and appropriate for the evaluator to consider the following two sides of a
program’s logic:
• The Process, and
• The Outcome’.
• Christie and Fierro (2010) also advise that whenever evaluators plan an evaluation, they should work with stakeholders to elucidate its envisioned purpose and generate a list of clearly articulated evaluation questions that align with that purpose; this helps prioritize the questions and, in turn, aids selection of an appropriate evaluation design.
Types of Evaluation Designs
• (In this table, the Inputs and Activities rows are grouped under process evaluation, and the Outcomes row under outcome evaluation.)
• Inputs -- Input evaluation
• Purpose: to assess the resources needed to implement the program activities or to meet the program goals.
• Data collection methods/strategies: document analysis, interviews, focus groups, and surveys.
• Evaluation questions: Are the resources adequate to implement the program? Are the program staff capable of implementing the program or addressing the needs of the beneficiaries?
• Activities -- Process evaluation
• Purpose: to assess the program activities, i.e., whether the program is being implemented according to plan.
• Data collection methods/strategies: document analysis, interviews, focus groups, and observations.
• Evaluation questions: Is the program being implemented as planned? Are the activities adequate to address the needs of participants?
• Outputs -- Output evaluation
• Purpose: to assess immediate results based on the activities carried out (accounting for actions taken).
• Data collection methods/strategies: document analysis, interviews, focus groups, observations, and surveys.
• Evaluation questions: How long did it take to provide support to each beneficiary? How many counseling sessions were offered? How many participants completed all the sessions?
• Outcomes -- Outcome evaluation
• Purpose: to assess the benefits of the activities.
• Data collection methods/strategies: document analysis, interviews, focus groups, and surveys.
• Evaluation questions: How have program participants benefited? What are the immediate effects of the program on them?
• Impacts -- Impact evaluation
• Purpose: to assess the effect of the program on beneficiaries.
• Data collection methods/strategies: mainly surveys, or instruments that measure the variable of interest.
• Evaluation questions: Did the program impact participants' (state the specific behavior you are interested in assessing)?
Process Evaluation Design
• According to Rural Health Information Hub (2021), process evaluation is
defined as a systematic, focused plan for collecting data to determine
whether the program model is implemented as originally intended and, if
not, how operations differ from those initially planned.
• It seeks to answer the question, “what services are actually being delivered
and to whom?”
• According to Rural Health Information Hub (2021), the following questions
may help assess the “how or the process” aspect of a program through
process evaluation.
• They include:
• (a) how is the program being implemented?
• (b) under what conditions does the program work?
• (c) is the target population participating at expected levels?
• (d) can the program be replicated?



Goal of Process Evaluation Design
• The goal of process evaluation designs, according to DiTommaso
(2015), is to:
• (a) document what the program is doing;
• (b) document to what extent and how consistently the program has been
implemented as intended; and
• (c) to inform or bring about changes or improvements in the program’s
operations.



Common Features of Process Evaluation
Design
• Unlike outcome evaluation, the common features among all process
evaluation designs consider the following attributes:
• (1) it does not require a comparison group;
• (2) it includes both qualitative and quantitative methods of data collection;
• (3) it does not require advanced statistical methods of analysis.



Common Qualitative and Quantitative Methods of
Data Collection for Process Evaluation Design
• The common qualitative and quantitative data
collection methods of the process evaluation include
the following:
• (a) review of program documents and records;
• (b) review of administrative data;
• (c) interviews, focus group;
• (d) direct observation.



Common Qualitative and Quantitative Methods of
Data Analysis of Process Evaluation Design
• In relation to the process evaluation the most
common types of data analysis of process evaluation
include:
• (a) Thematic identification (DiTommaso, 2015);
and
• (b) Confirmation of findings across sources, which
is also called triangulation (DiTommaso, 2015).



Thematic Analysis
• According to Caulfield (2019), thematic analysis is a method of analyzing
qualitative data.
• It is usually applied to a set of texts, such as interview transcripts.
• He further explained that the researcher closely examines the data to
identify common themes – topics, ideas and patterns of meaning that
come up repeatedly.
• The two major types of thematic analysis are inductive and deductive thematic
analysis.
• An inductive approach involves allowing the data to determine your
themes, while deductive approach involves coming to the data with
some preconceived themes you expect to find reflected there, based on
theory or existing knowledge (Caulfield, 2019).
Six-Steps in Conducting Thematic Analysis
• The six-step process includes the following steps:
• (1) familiarization– get to know your
data (transcribing audio, reading through
the text, scanning through the interview
documents, and taking initial notes, as
well as generally looking through the
data to get familiar with it)
• (2) coding– Coding simply means
highlighting sections of your text –
usually phrases or sentences – and
coming up with shorthand labels or
“codes” to describe their content.



Six-Steps in Conducting Thematic Analysis
• The six-step process includes the following steps:
• (3) generating themes– after the coding
of the interviews, then the next step is for
the evaluator or researcher to generate
themes from the coding. Here, the
evaluator or the researcher has to look
over the codes that they have created to
identify patterns among them, and to
start generating themes from them.
• (4) reviewing themes– once the evaluator has finished generating the themes, the next task is to review all the themes.



Six-Steps in Conducting Thematic Analysis
• The six-step process includes the following steps:
• (5) defining and naming themes– as soon as the final list of themes is
developed, then the evaluator will go ahead to name and define each of them
• (6) writing up– The final stage of the thematic analysis is to analyze the
generated themes. Here you will use the themes to lead the discussions. In
the end, the evaluator has to summarize the responses under each theme as
part of the data analysis.
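As a hedged illustration of steps 2 and 3 (coding and generating themes), the Python sketch below counts made-up codes grouped into hypothetical themes; it is not a substitute for careful qualitative judgment:

```python
from collections import Counter

# Hypothetical coded interview excerpts: (excerpt, list of codes applied to it)
coded_excerpts = [
    ("Staff were friendly but wait times were long", ["staff attitude", "wait time"]),
    ("I could not get an appointment for weeks", ["wait time", "access"]),
    ("The counselor explained everything clearly", ["staff attitude", "clarity"]),
]

# Hypothetical grouping of codes into broader themes (step 3)
theme_map = {
    "staff attitude": "Service quality",
    "clarity": "Service quality",
    "wait time": "Timeliness and access",
    "access": "Timeliness and access",
}

theme_counts = Counter(theme_map[code] for _, codes in coded_excerpts for code in codes)
for theme, count in theme_counts.most_common():
    print(theme, count)
```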



Triangulation
• As experts have pointed out, triangulation involves the use of more than one method to collect data on the same topic (Kulkarni, 2013).
• The application of triangulation is, indeed, a way of assuring the validity of
research or evaluation through the use of a variety of methods to collect
data on the same topic, which involves different types of samples as well as
methods of data collection (Kulkarni, 2013).
• The purpose of triangulation is not necessarily to cross-validate data but
rather to capture different dimensions of the same phenomenon to
facilitate the understanding of research and evaluation findings.
(Denzin, 1989; Goetz & LeCompte, 1984 cited in Denzin, 1998, p.97).
Sample of Process Evaluation Design—
Thematic and Triangulation Applications.



Outcome Evaluation Designs
•An outcome evaluation involves the
examination of how well a project achieved the
expected outcomes.
•It is generally a summative evaluation of the
program which can be used to make
recommendations for future program
improvements. (DiTommaso, 2015)
Goal of Outcome Evaluation Designs

•The goals of the outcome evaluation designs are to:
•(a) identify the results or effects of a
program;
•(b) measure program beneficiaries' changes
in knowledge, attitude(s), and/or behavior(s)
that result from a program.
Features of Outcome Evaluation Designs
• Unlike the process evaluation, the common features of
outcome evaluation design consider the following:
• (a) typically requires quantitative data;
• (b) often requires advanced statistical methods; and
• (c) may include a comparison group (impact evaluation)
(DiTommaso, 2015).



Designs of Outcome Evaluation Analysis
• Outcome evaluation analysis or designs consider only quantitative
analysis, evaluation, or research designs.
• There are four main types of Quantitative research:
• Descriptive,
• Correlational,
• Causal-Comparative/Quasi-Experimental,
• Experimental Research
(DiTommaso, 2015; Frankfort-Nachmias, 2008, p.103).



Descriptive research design
• Descriptive research design is used to describe the current status of an identified
variable.
• These research projects are designed to provide systematic information about a
phenomenon. The researcher does not usually begin with a hypothesis, but is
likely to develop one after collecting data. The analysis and synthesis of the data
provide the test of the hypothesis.
• Systematic collection of information requires careful selection of the units studied
and careful measurement of each variable.
• Examples of descriptive research include, but are not limited to:
• (a) A description of how second-grade students spend their time during summer vacation;
• (b) A description of the tobacco use habits of teenagers;
• (c) A description of how parents feel about the twelve-month school year;
• (d) A description of the attitudes of scientists regarding global warming;
• (e) A description of the kinds of physical activities that typically occur in nursing homes, and
how frequently each occurs;
• (f) A description of the extent to which elementary teachers use math manipulatives.



Correlational research/evaluation design
• Correlational research/evaluation design is used to determine the extent of a
relationship between two or more variables using statistical data.
• In this type of design, relationships between and among a number of facts are sought
and interpreted. This type of research will recognize trends and patterns in data, but it
does not go so far in its analysis to prove causes for these observed patterns (Kanner,
Coyne, Schaefer, & Lazarus, 1981; Cherry, 2020; Swaim, 2020).
• Sometimes correlational research is considered a type of descriptive research, and not as
its own type of research, as no variables are manipulated in the study.
• Examples of correlational research include, but are not limited to, the following:
• (a) the relationship between intelligence and self-esteem;
• (b) the relationship between diet and anxiety;
• (c) the relationship between an aptitude test and success in an algebra course;
• (d) the relationship between ACT scores and the freshman grades;
• (e) the relationships between the types of activities used in math classrooms and student
achievement;
• (f) The covariance of smoking and lung disease.
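For example (d) above, a minimal Python sketch of a correlational analysis using made-up ACT scores and freshman GPAs; it requires Python 3.10+ for statistics.correlation, and all numbers are hypothetical:

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical data for example (d): ACT scores and freshman GPAs
act_scores = [18, 21, 24, 26, 29, 31, 33]
freshman_gpa = [2.1, 2.4, 2.9, 3.0, 3.3, 3.5, 3.8]

r = correlation(act_scores, freshman_gpa)
print(f"Pearson r = {r:.2f}")  # a strong positive association, not proof of causation
```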



Causal-comparative/quasi-experimental
research/evaluation design
• Causal-comparative/quasi-experimental research/evaluation design is utilized to establish cause-effect relationships among the
variables.
• Quasi-experimental design does not have a random assignment component, but may involve comparing a treatment group to a
similar group that is not participating in the program. Quasi-experimental methods are used to estimate the effect of a treatment,
policy, or intervention when controlled experiments are not feasible (DiTommaso, 2015; Frankfort-Nachmias, 2008; Leona et al.,
1998).
• Examples of causal-comparative/quasi-experimental research include the following:
• (a) the effect of preschool attendance on social maturity at the end of the first grade;
• (b) the effect of gender on algebra achievement;
• (c) the effect of part-time employment on the achievement of high school students.

• There are different types of quasi-experimental designs, according to DiTommaso (2015), which include:
• (a) Regression discontinuity: a quasi-experimental pretest-posttest design that aims to determine the causal effects of an intervention by assigning a cutoff or threshold above or below which the intervention is assigned.
• (b) Differences-in-differences: one way to estimate the effects of new policies. To use diff-in-diff, we need observed outcomes of people who were exposed to the intervention (treated) and people not exposed to the intervention (control), both before and after the intervention (a numeric sketch follows below).
• (c) Comparative interrupted time series: a quasi-experimental design commonly used in public health to evaluate the impact of interventions or exposures.
• (d) Pre/post-test with a matched comparison group, using methods such as propensity score matching, case matching, or an instrumental variable, among others.
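A minimal arithmetic sketch of the differences-in-differences idea in (b); the group means below are invented for illustration:

```python
# Hypothetical mean outcomes before and after a policy for treated and control groups
treated_before, treated_after = 50.0, 60.0
control_before, control_after = 48.0, 52.0

# Diff-in-diff: change in the treated group minus change in the control group
did_estimate = (treated_after - treated_before) - (control_after - control_before)
print(f"Diff-in-diff estimate of the policy effect: {did_estimate:.1f}")  # 6.0
```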



Experimental design
• Experimental design is used to determine if a program or intervention is more effective
than the current process.
• It involves randomly assigning participants to a treatment or control group (DiTommaso,
2015; Frankfort-Nachmias, 2008). This type of design is often considered to be the gold
standard against which other research designs are judged, as it offers a powerful
technique for evaluating cause and effect (DiTommaso, 2015).
• This type of design uses the scientific method to establish the cause-effect relationship
among a group of variables that make up a study (see Table 9.5 for more details).
• The true experiment is often thought of as a laboratory study, but this is not always the
case; a laboratory setting has nothing to do with it.
• A true experiment is any study where an effort is made to identify and impose control
over all other variables except one. An independent variable is manipulated to determine
the effects on the dependent variables (DiTommaso, 2015; Frankfort-Nachmias, 2008).
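As a hedged sketch of the random-assignment logic, the Python example below simulates assigning participants to treatment and control groups and comparing mean outcomes; all values are simulated, not drawn from any real study:

```python
import random

random.seed(42)
participants = list(range(20))
random.shuffle(participants)
treatment, control = participants[:10], participants[10:]  # random assignment

def simulated_outcome(treated: bool) -> float:
    # Hypothetical measurement: the treatment adds about 5 points on average
    return random.gauss(70 + (5 if treated else 0), 10)

treatment_scores = [simulated_outcome(True) for _ in treatment]
control_scores = [simulated_outcome(False) for _ in control]

effect = (sum(treatment_scores) / len(treatment_scores)
          - sum(control_scores) / len(control_scores))
print(f"Estimated treatment effect: {effect:.1f} points")
```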





Sample of Outcome Evaluation Design—
Quantitative Design Applications



Thank You
• Next Lesson

Planning for Effective
Evaluation Measures
Lesson 9

Saturday- October 29, 2022


10:00 - 11:20 am



Lesson Objectives
• Upon Completion of this lesson, students will be able to do the
following:
• Discuss the steps involved in planning for effective evaluation measures
• Connect/link program’s needs, activities, and outcomes
• Explain how evaluation questions determine the method/measure
• Align evaluation questions, purposes and methods
• Align evaluation types, data sources, purposes and components of logic model
• Have a clear plan for how to use evaluation results



Introduction
• In striving to ensure effective evaluation practices, many academic institutions and educators face, and must subsequently address, the following critical question:
• "How do educators and education policymakers decide which alternative is likely to produce the most effective outcome?"
• Evaluation and self-assessment are at the heart of strong
programs (i.e. education systems).



Introduction
• To ensure an effective evaluation through scientific methods, the practice of evaluation should be:
• systematic,
• iterative, and
• used to inform the following:
• (a) whether a program is effective in meeting its stated
goals and outcomes for participants
• (b) future programming decisions (thus, future
allocation of resources).

Dr. Augustine Adu Frimpong Lecture Series 113


Four (4) prerequisite steps to carry out
effective evaluation
• There are four (4) prerequisite steps to carrying out an effective evaluation. They
include:
• (1) Start with a clear and measurable statement of objectives;
• (2) Connect activities and outcomes;
• (3) Let the evaluation questions determine the evaluation method;
• (4) Be open-minded about the findings and have a clear plan for how to use
the results.

Dr. Augustine Adu Frimpong Lecture Series 114


By starting with clear and measurable
objectives
• It may sound obvious, but understanding whether program activities
have been effective requires a clear understanding of what the
program is trying to achieve.
• The objectives also need to be measurable (refer to lesson 5 for more
details).
• In these instances, it is easy to start with a clear statement of
objectives (e.g., to improve students’ reading ability by ensuring that
the number of students who can read increases by 10% by the end
of the Fall 2021 semester).
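• As a hypothetical worked check of such an objective (the counts below are invented), the end-of-semester data can simply be compared against the 10% target:

# Hypothetical counts for the reading objective described above.
baseline_readers = 200          # students reading at standard before the program
end_of_semester_readers = 228   # students reading at standard at the end of the semester

percent_increase = (end_of_semester_readers - baseline_readers) / baseline_readers * 100
objective_met = percent_increase >= 10
print(f"Increase: {percent_increase:.1f}% -> objective {'met' if objective_met else 'not met'}")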

Dr. Augustine Adu Frimpong Lecture Series 115


By starting with clear and measurable
objectives

Dr. Augustine Adu Frimpong Lecture Series 116


Connecting activities and outcomes
• Effective programs have a clear line of sight between the needs they
are responding to, the resources available for implementation, the
activities undertaken with those resources, and how activities will
deliver outcomes.
• Logic modelling is one way to put these components down on a single
page (a minimal sketch follows below).
• Wherever possible, this should be done by those who are developing
and implementing a program or policy, in conjunction with an
experienced evaluator.
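• A minimal sketch of a logic model captured as a simple data structure (the program details are hypothetical and only illustrate the line of sight from needs to outcomes):

# A hypothetical reading-support program written out as a minimal logic model.
logic_model = {
    "need":       "Too few students read at grade level",
    "inputs":     ["Reading tutors", "Classroom space", "Grant funding"],
    "activities": ["Twice-weekly tutoring sessions", "Parent reading workshops"],
    "outputs":    ["Number of sessions delivered", "Number of students served"],
    "outcomes":   ["10% increase in students reading at grade level"],
}

for component, items in logic_model.items():
    print(f"{component}: {items}")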

Dr. Augustine Adu Frimpong Lecture Series 117


Connecting activities and outcomes

Dr. Augustine Adu Frimpong Lecture Series 118


Let the evaluation questions determine the
method:
• Once a clear problem statement has been developed, the inputs and
activities have been identified, and the intended outcomes have been
established, coherent evaluation questions can be developed.
• Good evaluation will ask questions such as:
• (a) Did the program deliver what was intended? If not, why not?
• (b) Did the program reach the right recipients? If not, why not?
• (c) Did the program achieve the intended outcome and were there any
unintended (positive or negative) outcomes?
• (d) For whom did it work and under what circumstances?
• (e) Is this the most efficient way to use limited resources?

Dr. Augustine Adu Frimpong Lecture Series 119


Be open-minded and have a clear plan for how
to use the results:
• Indeed, in general, open-mindedness helps the evaluator learn,
brainstorm, and try new evaluation techniques.
• Open-mindedness is the willingness to search actively for evidence
against one's favored beliefs, plans, or goals, and to weigh such
evidence fairly when it is available.
• Stakeholders need to understand the questions the evaluation sought
to answer, the methods employed to answer them, any assumptions
that were made, what the evaluation found and the consequences of
those findings in order to make use of the findings.

Dr. Augustine Adu Frimpong Lecture Series 120


Factors that inform Evaluation Plans
• Evaluation plans typically consider one or more of the following
questions:
• (a) What kinds of decisions do you need or want to make?
• (b) What resources do you have on hand?
• (c) What purpose is the evaluation meant to serve, or what do
evaluators want to be able to decide as a result of the
evaluation?
• (d) Who are the audiences or stakeholders for the
evaluation?

Dr. Augustine Adu Frimpong Lecture Series 121


Ensuring Use of Evaluation Findings
• In planning an evaluation with an open mind, and in order to ensure
that the findings will be used, one should also pay special attention to
the data sources for the evaluation and the strategies used to gather
the data that support it.
• Since program evaluation is data-driven, the evaluator must pay
special attention to the credibility of the data sources.
• In short, to ensure use of evaluation findings, the purpose of the
evaluation must be linked with the evaluation data sources and the
data-gathering strategies, so that the findings are credible (see the
tables below for more details).

Dr. Augustine Adu Frimpong Lecture Series 122


Ensuring Use of Evaluation Findings
• In choosing an appropriate strategy or data
collection method to ensure use of
evaluation results, it is important to keep in
mind the overall goal or purpose of the
evaluation.
• Sometimes the choice of the method is
guided by the need to get the most useful
information to assist key decision-makers in
the most cost-effective and realistic fashion.
• Ideally, evaluators use a combination of
methods to allow for “triangulation” (i.e.,
drawing information from multiple sources).

Dr. Augustine Adu Frimpong Lecture Series 123


Planning for Use Based on the Evaluation
Purpose
• Inputs Evaluation. Purpose: to assess the resources needed to implement the program activities or to meet the program goals. Data sources: program documents, staff, and stakeholders. Evaluation questions: Are the resources adequate to implement the program? Are the program staff capable of implementing the program or addressing the needs of the beneficiaries?

• Process Evaluation. Purpose: to assess the program activities, i.e., whether the program is being implemented according to plan. Data sources: program monitoring documents, staff, and stakeholders. Evaluation questions: Is the program being implemented as planned? Are the activities adequate to address the needs of participants?

• Outputs Evaluation. Purpose: to assess immediate results based on the activities carried out (accounting for actions taken). Data sources: participants, program documents, and staff. Evaluation questions: How long did it take to provide support to each beneficiary? How many counseling sessions were offered? How many participants completed all the sessions?

• Outcomes Evaluation. Purpose: to assess the benefits of the activities. Data sources: participants, program documents, and staff. Evaluation questions: How have program participants benefited? What are the immediate effects of the program on them?

• Impacts Evaluation. Purpose: to assess the effect of the program on beneficiaries. Data sources: participants and program documents. Evaluation question: Did the program impact participants?

Dr. Augustine Adu Frimpong Lecture Series 124


Planning for Use Based on the Evaluation
Purpose
Formative evaluation:
• Inputs: Input evaluation. Purpose: to assess the resources needed to implement the program activities or to meet the program goals. Data collection methods/strategies: document analysis, interviews, focus groups, and surveys.
• Activities: Process evaluation. Purpose: to assess the program activities (whether the program is being implemented according to plan). Data collection methods/strategies: document analysis, interviews, focus groups, and observations.
• Outputs: Output evaluation. Purpose: to assess immediate results based on the activities carried out (accounting for actions taken). Data collection methods/strategies: document analysis, interviews, focus groups, observations, and surveys.

Summative evaluation:
• Outcomes: Outcome evaluation. Purpose: to assess the benefits of the activities. Data collection methods/strategies: document analysis, interviews, focus groups, and surveys.
• Impacts: Impact evaluation. Purpose: to assess the effect of the program on beneficiaries. Data collection methods/strategies: mainly surveys, or instruments that measure the variable of interest.
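• The mapping in the table above can also be written down as a small lookup structure when drafting an evaluation plan; this sketch simply restates the table and is not prescribed by the slides:

# Illustrative lookup restating the table above.
evaluation_plan = {
    "inputs":     {"type": "input evaluation (formative)",
                   "methods": ["document analysis", "interviews", "focus groups", "surveys"]},
    "activities": {"type": "process evaluation (formative)",
                   "methods": ["document analysis", "interviews", "focus groups", "observations"]},
    "outputs":    {"type": "output evaluation (formative)",
                   "methods": ["document analysis", "interviews", "focus groups", "observations", "surveys"]},
    "outcomes":   {"type": "outcome evaluation (summative)",
                   "methods": ["document analysis", "interviews", "focus groups", "surveys"]},
    "impacts":    {"type": "impact evaluation (summative)",
                   "methods": ["surveys", "validated measurement instruments"]},
}

component = "outcomes"
print(evaluation_plan[component]["type"], "->", ", ".join(evaluation_plan[component]["methods"]))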

Dr. Augustine Adu Frimpong Lecture Series 125


Steps to guide one in choosing the
appropriate method to support evaluations:
• More systematically, it is advisable to think through the following
questions when choosing an appropriate method to support the
evaluation:
• (a) What information is needed to make current decisions about a program?
• (b) Of this information, how much can be collected and analyzed in a low-cost
and practical manner, e.g. using questionnaires, surveys and checklists?
• (c) How accurate will the information be? Can I live with the disadvantages of
my method (as there are always disadvantages)?
• (d) Will the method(s) get me all of the information I need?

Dr. Augustine Adu Frimpong Lecture Series 126


Steps to guide one in choosing the
appropriate method to support evaluations:
• (e) What additional methods should and could be used if additional
information is needed?
• (f) Will the information appear as credible to decision-makers, e.g. to
funders or top management?
• (g) Will the nature of the audience conform to the methods? For example, will
they fill out questionnaires carefully, engage in interviews or focus groups, or let
you examine their documentation? Also consider who the participants are.
• (h) Can someone administer the methods immediately, or is special training
required to do so?
• (i) How can the resulting data be analyzed? These questions need not be
answered definitively in advance; they simply serve as guides in the process.
Dr. Augustine Adu Frimpong Lecture Series 127
Formative and Summative
CDC’s Healthy Community Program

• Evaluation falls into one of two broad categories: formative and summative.
• Formative evaluations are conducted
during program development and
implementation and are useful if you
want direction on how to best achieve
your goals or improve your program.
• Summative evaluations should be
completed once your programs are well
established and will tell you to what
extent the program is achieving its
goals.

Dr. Augustine Adu Frimpong Lecture Series 128


How do I conduct an evaluation?
Examples from the “Violence Against Women” campaign implemented in Western Australia

• Engage stakeholders—This first step involves identifying and engaging stakeholders.
These individuals have a vested interest in the evaluation.
• Find out what they want to know and how they will use the information.
• Involve them in designing and/or conducting the evaluation.
• For less involved stakeholders, keep them informed about activities through meetings, reports and
other means of communication (CDC, 2008; McDonald et al., 2001).
• EX: The program planners of the Violence Against Women campaign included internal
and external partners as stakeholders. Internal partners were the Director of the
Domestic Violence Prevention Unit and the Family and Domestic Violence Task Force.
External partners were experts in the field of social marketing/behavior change, health
promotions, communication, and women’s issues; the Department of Family and
Children’s Services; service providers including trained counselors, therapists and social
workers; and the police. The program planners kept in touch with stakeholders and got
input from them throughout the campaign (Turning Point Social Marketing Collaborative,
Centers for Disease Control and Prevention, and Academy for Educational Development,
2005).

Dr. Augustine Adu Frimpong Lecture Series 129


How do I conduct an evaluation?
• Identify program elements to monitor—In this step, you and/or the team decide what’s worth
monitoring.
• To decide which components of the program to oversee, ask yourself who will use the information and how,
what resources are available, and whether the data can be collected in a technically sound and ethical
manner.
• Monitoring, also called process evaluation, is an ongoing effort that tracks variables such as funding received,
products and services delivered, payments made, other resources contributed to and expended by the
program, program activities, and adherence to timelines.
• Monitoring during program implementation will let you know whether the program is being implemented as
planned and how well the program is reaching your target audience.
• If staff and representative participants see problems, you are able to make mid-course program corrections
(CDC, 1999, 2008).
• EX: A needs assessment was conducted using focus groups of general population males and
perpetrators. It identified the need for a prevention focus targeting both violent and potentially
violent men. The messages would need to avoid an accusatory or blaming tone because that
would cause the target audiences to reject the information. Process evaluation would be
implemented to monitor the campaign’s reach, the messages’ effectiveness, the audiences’
awareness of the Men’s Domestic Violence Helpline, and changes in attitudes toward domestic
violence (Turning Point Social Marketing Collaborative et al., 2005).
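• As a hypothetical sketch of what lightweight process monitoring could look like in code, tracking a few of the variables named above (all field names and figures here are invented for illustration):

from dataclasses import dataclass
from datetime import date

@dataclass
class MonitoringRecord:
    """One hypothetical monitoring entry for an ongoing process evaluation."""
    period_end: date
    funding_received: float
    sessions_delivered: int
    sessions_planned: int
    participants_reached: int
    target_audience_size: int
    notes: str = ""

    def delivery_rate(self) -> float:
        return self.sessions_delivered / self.sessions_planned

    def reach_rate(self) -> float:
        return self.participants_reached / self.target_audience_size

record = MonitoringRecord(date(2022, 10, 31), 25_000.0, 18, 20, 140, 400,
                          notes="Two sessions postponed")
print(f"Delivered {record.delivery_rate():.0%} of planned sessions; "
      f"reached {record.reach_rate():.0%} of the target audience")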

Dr. Augustine Adu Frimpong Lecture Series 130


How do I conduct an evaluation?
• Select the key evaluation questions—Basic evaluation questions, which should be
adapted to your program content, include:
• What will be evaluated? (i.e., What is the program and in what context does it exist?)
• Was fidelity to the intervention plan maintained?
• Were exposure levels adequate to make a measurable difference?
• What aspects of the program will be considered when judging performance?
• What standards (type or level of performance) must be reached for the program to be considered
successful?
• What evidence will be used to indicate how the program has performed?
• How will the lessons learned from the inquiry be used to improve public health effectiveness?
(CDC, 1999, 2008).
• EX: The evaluation measured the following: (1) General awareness of, attitudes towards,
and professed behaviors relating to domestic violence; (2) awareness of how to get help,
such as knowledge about available support services and where to telephone for help; (3)
inclination to advise others to telephone the Helpline; and (4) advertising reach and
impact, message take-away, attitudes toward the campaign, calls to the Helpline, and
acceptance of referrals to counseling (Turning Point Social Marketing Collaborative et al.,
2005).

Dr. Augustine Adu Frimpong Lecture Series 131


How do I conduct an evaluation?
• Determine how the information will be gathered—In this step, you and/or the team must decide
how to gather the information.
• Decide which information sources and data collection methods will be used.
• Develop the right research design for the situation at hand. Although there are many options, typical choices
include: (1) Experimental designs (use random assignment to create intervention and control groups,
intervention is administered to only one group, and then compare the groups on some measure of interest to
see if the intervention had an effect); (2) quasi-experimental designs (same as experimental but does not
necessarily involve random assignment of participants to groups); (3) Surveys (a quick cross-sectional
snapshot of an individual or a group of people on some measure via telephone, Internet, face-to-face, etc.);
and (4) case study designs (an individual or a situation is investigated deeply and considered substantially
unique).
• The choice of design will determine what will count as evidence, how that evidence will be gathered and
processed, and what kinds of claims can be made on the basis of the evidence (CDC, 1999, 2008; Yin, 2003).
• EX: In the first seven months of the campaign, a three-wave statewide random telephone survey
was conducted. In each wave, approximately 400 males, 18-40 years old who were in a
heterosexual relationship were interviewed. The three surveys took place (1) prior to the
campaign to serve as a baseline; (2) four weeks into the campaign to assess initial impact,
including advertising reach so that any deficiencies could be detected and modified; and (3) seven
months into the campaign to identify any significant changes in awareness of sources of
assistance, particularly the Men’s Domestic Violence Helpline as well as any early changes in
beliefs and attitudes (Turning Point Social Marketing Collaborative et al., 2005).
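• A hypothetical sketch of how Helpline awareness could be tabulated across the three waves; the Wave 2 figure of roughly 53% and the wave size of about 400 respondents come from the text, baseline awareness is reported as essentially none, and the Wave 3 count is invented for illustration:

# Hedged sketch: awareness of the Helpline by survey wave (~400 respondents each).
waves = {
    "Wave 1 (baseline)":    {"aware": 0,   "n": 400},
    "Wave 2 (4 weeks in)":  {"aware": 212, "n": 400},   # ~53%, as reported
    "Wave 3 (7 months in)": {"aware": 240, "n": 400},   # invented for illustration
}

for wave, counts in waves.items():
    pct = 100 * counts["aware"] / counts["n"]
    print(f"{wave}: {pct:.0f}% aware of the Men's Domestic Violence Helpline")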

Dr. Augustine Adu Frimpong Lecture Series 132


How do I conduct an evaluation?
• Develop a data analysis and reporting plan—During this step, you and/or
the team will determine how the data will be analyzed and how the results
will be summarized, interpreted, disseminated, and used to improve
program implementation (CDC, 1999, 2008).
• EX: Standard research techniques were used to analyze the data and
develop a report on the findings. The report was disseminated to the
program managers as well as to all partners/stakeholders. Feedback was
collected from stakeholders and, as appropriate, used to modify the
strategies, messages and interventions. For example, findings from
evaluating the first two sets of commercials were used to identify the
timing of a third set of ads and their messages. The evaluation results also
were used in developing Phase 2 of the campaign (Turning Point Social
Marketing Collaborative et al., 2005).

Dr. Augustine Adu Frimpong Lecture Series 133


How do I conduct an evaluation?
• Ensure use and share lessons learned—Effective evaluation requires time, effort,
and resources.
• Given these investments, it is critical that the evaluation findings be disseminated
appropriately and used to inform decision making and action.
• Once again, key stakeholders can provide critical information about the form, function, and
distribution of evaluation findings to maximize their use (CDC, 1999, 2008).
• EX: Awareness of the Men’s Domestic Violence Helpline increased significantly
from none before the campaign to 53% in Wave 2. The research also showed that
a number of positive belief and attitude effects began to emerge: By Wave 2, 21%
of respondents exposed to the campaign stated that the campaign had “changed
the way they thought about domestic violence” and 58% of all respondents
agreed that “domestic violence affects the whole family” rather than just the
children of the female victim. These results and their implications provided
guidance for revising future activities. Phase 2 utilized lessons learned from the
first phase and was designed to establish additional distribution channels for
counseling services such as Employee Assistance Programs and rural/remote
areas (Turning Point Social Marketing Collaborative et al., 2005).
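• A minimal sketch of how one might test whether the reported jump in Helpline awareness (from essentially none at baseline to 53% in Wave 2, with roughly 400 respondents per wave) is statistically significant; the two-proportion z-test shown here is a standard technique, not something the campaign write-up specifies:

import math

# Two-proportion z-test comparing Wave 1 and Wave 2 awareness of the Helpline.
aware_1, n1 = 0, 400      # baseline: essentially no awareness reported
aware_2, n2 = 212, 400    # ~53% of roughly 400 respondents

p1, p2 = aware_1 / n1, aware_2 / n2
p_pool = (aware_1 + aware_2) / (n1 + n2)
standard_error = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p2 - p1) / standard_error
print(f"z = {z:.1f}")  # a very large z, so the increase is clearly significant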

Dr. Augustine Adu Frimpong Lecture Series 134


Why should I conduct an evaluation?
• Experts stress that evaluation can:
• Improve program design and implementation—It is important to
periodically assess and adapt your activities to ensure they are as
effective as they can be. Evaluation can help you identify areas for
improvement and ultimately help you realize your goals more
efficiently (Hornik, 2002; Noar, 2006).
• Demonstrate program impact—Evaluation enables you to demonstrate
your program’s success or progress. The information you collect allows
you to better communicate your program’s impact to others, which is
critical for public relations, staff morale, and attracting and retaining
support from current and potential funders (Hornik & Yanovitzky,
2003).

Dr. Augustine Adu Frimpong Lecture Series 135


Thank You
• Next lesson
• Next Lesson is Scheduled After Academic Support Session
• @ 1:30 - 4:20 pm

Dr. Augustine Adu Frimpong Lecture Series 136
