Fundamentals of MEL

The document outlines the fundamentals of Monitoring, Evaluation, and Learning (MEL) through a training session held on May 15-17, 2023. It covers various aspects of MEL, including project cycles, stakeholder engagement, and the importance of indicators in measuring progress and outcomes. The training emphasizes the role of MEL in enhancing accountability, learning, and improving program design and implementation.


Fundamentals of Monitoring, Evaluation and Learning


Day 1
May 15, 2023
Warm Up!
• Have you ever attended a training on Monitoring and Evaluation?

• Were you ever engaged in developing a Monitoring or an Evaluation Plan?

• Have you developed project, monitoring, or evaluation indicators before?

• Have you conducted data analysis before?

Quiz
1. Which of the following should not be a criterion for a good research project?
a) Demonstrates the abilities of the researcher
b) Is dependent on the completion of other projects
c) Demonstrates the integration of different fields of knowledge
d) Develops the skills of the researcher

2. Cyber bullying at work is a growing threat to employee job satisfaction. Researchers want to find out why people do this and how they feel about it. The primary purpose of the study is:
a) Description
b) Prediction
c) Exploration
d) Explanation

3. Which research method is a bottom-up approach to research?
a) Deductive method
b) Explanatory method
c) Inductive method
d) Exploratory method

4. How much confidence should you place in a single research study?
a) You should trust research findings after different researchers have replicated the findings
b) You should completely trust a single research study
c) Neither a nor b
d) Both a and b

Quiz
5. Monitoring should ideally be conducted:
a) Every six months
b) Every year
c) Once in three years
d) Depends

6. Evaluation should ideally be conducted:
a) Every six months
b) Every year
c) Once in three years
d) Depends

7. What are indicators?
a) Questions for monitoring
b) Unit of information
c) Meter to depict change
d) It changes over time

8. You have designed and implemented a program for increasing uptake of family planning services. When you started the programme three years ago, the percentage of eligible people using contraception was 32% of the total number of eligible people in a block. The present percentage of workers using contraception is 59%. The change because of your programme is:
a) 27 percent
b) 13 percent
c) 59 percent
d) Can't say
Research + MEL Buzzwords!

Project

A project is a:
• Set of activities
• Implemented with specific resources
• Within a specific time
• Towards achievement of a specific objective

[Diagram: the project sits at the centre of four elements – Activity, Time, Objective and Resources]

Project Objective

High poverty incidence (core problem) → Reduction in poverty (objective)
Project Cycle

Stage I: Situation Analysis
Identify what is 'wrong'

Stage II: Problem Analysis
Identify possible causes of the situation or problem

Stage III: Project Identification
Identify the problem the project can address

Stage IV: Project Design
Identify the project strategy and stakeholders, plan activities and allocate resources

Stage V: Implementation and Monitoring
Execute activities and monitor the progress of activities

Stage VI: Evaluation
Evaluation helps take stock of whether the situation has changed or not
MEL in Project Cycle
When does MEL start?

• Ideally, throughout the project cycle
• Practically, at baseline
Why MEL?
Maximizing benefits per INR spent

• Learning: How can the investment be better?
• Accountability: Whether services are being delivered as per mandate?
• Scalability/replication: Can the investment be scaled up to cover more regions/units/people?
• Benefit by costs: Whether costs justify benefits?

These four purposes intersect.
India: a popular destination for interventions and evaluations

Source: Drew B. Cameron, Anjini Mishra & Annette N. Brown (2015). https://www.tandfonline.com/doi/full/10.1080/19439342.2015.1034156
Demand for MEL

Civil Society: To advocate grassroots solutions for government
Examples:
• Capacity building of community platforms (e.g., SHGs)
• Nurturing leadership skills in communities

Government: To enhance accountability & learning for policy
Examples:
• Evaluation of central sector and centrally sponsored schemes, State schemes
• Fact-finding missions on relevant sectors/policies/schemes, etc.

Private Sector/Donors: Evidence creation for field building
Examples:
• Education/health/nutrition/renewable/technology/energy-based pilots
The Logic of Logic Model
Program Planning, Implementation and Program Management Tool

• Hierarchy of results: Immediate to long-term
• Causal linkages: If-then; cause-effect/means-end

Some ways of depicting a logic model:

Input → Process → Result → Outcome → Impact
Resources → Activities → Output → Objective → Goal

What we invest → What we do and whom we reach → How we improve the situation → What we want to change at societal level
Is a logical framework different from a logic model?

Yes:
• Logic model provides a big picture, while logframe illustrates implementation detail
• Logic model follows a flow; logframe is built as a matrix
• Logic model could include change pathways external to the program; logframe only depicts components directly connected to the program
• Logic model can begin at the top (impact); logframe starts at the bottom (inputs)

No, in that both:
• Are program management tools
• Follow a 'logical' sequence (X leads to Y)
• Outline the trajectory of change
What is an Indicator?

An indicator is a:
• Unit of information
• Measured over time
• To depict change in the condition under observation

A good indicator is:
• Simple: Easy to understand
• Measurable: Provides a metric to depict change
• Precise: Defined in the same way by all
• Value neutral: Without any positive or negative value attached
Monitoring and Evaluation

Monitoring:
▪ Systematically tracks the key elements in the performance of a given program/project
▪ Focuses on activities and outputs
▪ Generally an internal activity
▪ A systematic activity
▪ More frequent; forms the basis of evaluation

Evaluation:
• Sequential validation of change in the proposed results that may be attributed to the program/project
• Focuses on outcomes and impacts
• Generally an external activity
• An episodic activity, not very frequent
• Requires more resources and time

Input → Activity → Output → Outcome → Impacts
[Diagram: along this results chain, monitoring (M) covers inputs through outputs, evaluation (E) covers outcomes and impacts, and PE (process evaluation) spans the stages in between.]
Role of MEL in Policy Making and Designing Government Programs

Policy Making: "Process by which governments, organizations, or institutions develop and implement policies or plans to achieve specific goals or address particular issues. It involves identifying problems, analyzing information, and developing and implementing strategies to address the identified issues."
Policy making is multi-dimensional: ACROSS different areas, AT various levels, INVOLVING various stakeholders, AND IN various forms.

Elements of the policy making cycle:
• Agenda setting - Prioritization and Formulation
• Implementation
• Evaluation
What is a Program Cycle?

Situation Analysis: Setting the context by knowing what the existing situation or condition is.

Problem/Gap Analysis: Ascertain the causes of the situation or the problem. Identify the gap between the current situation and the desired situation.

Project Identification & Design: Ascertain which specific problem the project will solve for, and formulate and design the framework of action.

Implementation & Monitoring: Activities are executed and progress is monitored.

Evaluation: An evaluation helps us take stock of whether the situation has changed and whether the objective has been achieved.
Program Design: Critical Steps
Identifying the problem or challenge that the program aims to address.

Conducting a needs assessment to determine the context and the needs of the target population or
community.

Identifying the program goals and objectives based on the needs assessment and the problem identified.

Developing program strategies and activities that are based on evidence-based practices, proven
interventions, or innovative approaches.

Developing a program budget, including resources needed for program implementation, monitoring, and
evaluation.

Identifying potential partners and stakeholders who may be involved in program implementation or support.

Developing a monitoring and evaluation plan to track program performance and measure impact.
Purposes for M&E in Policy and Programs
• Being (financially) accountable
• Improving operations
• Learning
• Readjusting strategy
• Strengthening capacity
• Understanding and documenting the context
• Deepening understanding
• Building and sustaining trust
• Advocacy
• Sensitising for action


Breaking it down – Policy Making

Monitoring: Enables identification of potential problems and challenges in the formulation and implementation of policies.

Evaluation: Helps understand factors that contribute to success, informs decisions on the continuation or modification of policies, and provides recommendations for future policy development.

Learning: Sharing knowledge and best practices between different stakeholders helps build a collective understanding of what works and what doesn't.
Breaking it down – Developing Government Programs

Needs Assessment: Helps in identifying the needs and priorities of the target population - gather data on the current situation, identify gaps and challenges, and develop an evidence-based approach to program design.

Program Design: Understanding the context, target population, and available resources to design programs that are tailored to the needs of the intended beneficiaries, ensure the effective use of resources, and maximize program impact.

Implementation: Helps in monitoring the implementation of programs - setting up monitoring systems, tracking program progress, identifying challenges, and making timely adjustments to program design to achieve better outcomes.

Evaluation: Evaluating the effectiveness of programs and their impact - whether programs are achieving their intended outcomes, identifying areas for improvement, and adjusting program design to maximize impact.

Learning: Helps in learning from experience to improve program design and implementation - continuously learning and improving the approach to program design, implementation, and evaluation.
Design: Conducting a baseline and needs assessment through a survey to understand the current situation and identify areas for improvement. This data can be used to develop the program's objectives, strategies, and targets.

Implementation: We monitor the implementation of the program by setting up a robust monitoring framework and mechanism such as a dashboard. The framework can include indicators to measure progress towards program objectives and targets, with regular data collection to track program implementation.

Evaluation: Evaluating the impact by conducting independent evaluations to assess program effectiveness. The results of these evaluations can be used to identify strengths and weaknesses of the program and make recommendations for program improvement, such as focus on ...

Learning: Promote learning and continuous improvement in the program by sharing best practices and lessons learned among stakeholders. Create a knowledge management system to share information and best practices, and conduct capacity building workshops to promote learning and innovation.
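The implementation column above describes a monitoring dashboard with indicators, targets and regular data collection. As a minimal illustration of that idea (not part of the original training material; the indicator names and figures are hypothetical), a tracker might compare each indicator's latest value against its baseline and target:

```python
# Minimal sketch of a dashboard-style monitoring tracker.
# Indicator names, baselines, latest values and targets are hypothetical.

indicators = {
    # name: (baseline, latest, target)
    "households with toilet access (%)": (48.0, 63.0, 80.0),
    "villages declared open-defecation free (count)": (12, 35, 60),
}

def progress_report(data):
    for name, (baseline, latest, target) in data.items():
        achieved = latest - baseline   # change observed since baseline
        required = target - baseline   # change needed to reach the target
        share = 100 * achieved / required if required else float("nan")
        print(f"{name}: latest {latest} (baseline {baseline}, target {target}) "
              f"-> {share:.0f}% of the required change achieved")

progress_report(indicators)
```

In practice such a report would be refreshed at each data-collection round so that implementation teams can spot indicators that are falling behind their targets.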

Let's build an example on the role of MEL in a Sanitation Program

Strategy and Direction: Are we doing the right thing?

Sample M&E questions:
• Is the programme theory appropriate, logical and credible? How has it been developed? Has it changed?
• How appropriate and relevant are programme strategies for meeting the goals of the project?
• Are the right stakeholders being engaged?
• Are the selected research questions/themes in line with the country's priorities or strategies?

Common ways to assess:
✓ Reviewing (quarterly or annual) reports, key documents, strategies and programme theories, and how they have been developed/adapted over time
✓ Conducting workshops/meetings with key partners and stakeholders to identify gaps in implementation and where strategies and plans need adapting
✓ Stakeholder analysis and social network analysis
Management: Are we implementing the plan as effectively as possible?

Sample M&E questions:
• To what extent are deliverables being completed to comply with programme timetables?
• How are risks managed?
• Are there capacity needs to be addressed?
• Is the budget spent against plans? If not, why not?
• How are decisions made, with what criteria, and how are they documented? Are they consistent, inclusive and transparent?

Common ways to assess:
✓ Monitoring and reviewing agendas and minutes of internal meetings
✓ Assessing performance and capacity of partner organizations, and organizational self-assessments
✓ Reviewing internal strategies, work plans, risk registers, procedures and processes
Outputs: Do they meet required standards and appropriateness for the audience?

Sample M&E questions:
• What outputs have been produced? What has been their quality and relevance?
• Are outputs aligned with policy and program strategies (overall strategy, gender strategy)?
• To what extent are the outputs being delivered in a way that represents value for money?

Common ways to assess:
✓ Monitoring and validation exercises
✓ Review against plans
✓ Cost-benefit analysis
✓ Review of implementation pathways – through process evaluation
Outcomes: What kinds of effects or changes did the work have, or contribute to?

Sample M&E questions:
• What are the changes and at what level (mainly at individual or institutional levels) over time?
• What differences are there in results seen in different contexts (sectors, sites, partners)? What has produced these differences?
• How sustainable are the observed changes likely to be?

Common ways to assess:
✓ Structured stakeholder interviews or surveys, or impact evaluations
✓ Process evaluations – linking processes and pathways of implementation to outcomes
✓ Outcome mapping, outcome harvesting
✓ Contribution analysis
Fundamentals of Monitoring, Evaluation and Learning
Day 2
May 16, 2023
Recap of Day 1
• Form Pairs
• Tell each other 3 learnings you had from Day 1
• Write down any 3 learnings from Day 1 and put them on the chart paper attached at the front of the room
Stakeholder Mapping

Who is a Stakeholder? Any individuals, groups of people, institutions or organisations that may have a significant interest in the success or failure of a potential program/policy. They may be affected either positively or negatively by a proposed project.
Stakeholders should represent a diversity of perspectives, such as:
• Technical specialists
• Development partners
• Non-governmental organizations (NGOs), private voluntary organizations
• District and regional administration organizations
• National Program
• Ministry of Health, National government
• Champions for change

Stakeholders should be included from various levels—national, regional and local—as appropriate to the activity.


Engaging stakeholders: throughout the project process, from design to follow-up
• Support a three-stage process: identify, engage and follow up.
• Plan stakeholder engagement not just in the design phase of the activity, but through an action plan for engaging throughout the project and post-implementation as well.
• The further we progress with stakeholder engagement principles, the more powerful the outcomes can be.
• Engaging stakeholders throughout the process, not just at the beginning and end, can raise awareness of the activity and facilitate the use of data and information produced by the activity.
Mapping the Stakeholders

Mapping the stakeholders allows us to plot stakeholders based on their 'power & interest', and this can help you prioritise your level of engagement with them. Interest relates to the stakeholder's level of interest in the issue.
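As a small illustration of a power/interest mapping (not from the training material; the quadrant labels and the example stakeholders below are assumptions drawn loosely from the deck's own stakeholder examples), each stakeholder can be assigned to an engagement priority:

```python
# Minimal sketch of a power/interest mapping.
# Stakeholder examples and quadrant labels are illustrative assumptions.

stakeholders = {
    # name: (power: "high"/"low", interest: "high"/"low")
    "Ministry of Health": ("high", "high"),
    "District administration": ("high", "low"),
    "Community platforms (SHGs)": ("low", "high"),
    "General public": ("low", "low"),
}

QUADRANTS = {
    ("high", "high"): "manage closely (engage fully)",
    ("high", "low"): "keep satisfied",
    ("low", "high"): "keep informed",
    ("low", "low"): "monitor with minimum effort",
}

for name, (power, interest) in stakeholders.items():
    print(f"{name}: power={power}, interest={interest} -> {QUADRANTS[(power, interest)]}")
```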
Exercise

Table columns: Stakeholder | Category | Interests | Participation or Role

1. Fill in the names/category of your stakeholders in the first column.

2. Identify each stakeholder's category. These might be employees, senior leadership, the organization's partners, or funders. You can customize your categories to suit your organization's identified stakeholders.
Exercise

3. You might also indicate whether a stakeholder
a) Is an integral part of the organization;
b) Is interested in, and committed to, the organization;
c) Knows the organization but is not committed to it; or
d) Has a vested interest in destroying the organization, that is, competitors, etc.

4. Indicate each stakeholder's interest in the M&E results, that is, whether a stakeholder
a) Will use the results for planning;
b) Will use them to support the organization; or
c) Will use the M&E results to design new programs, introduce change, or develop future strategies, etc.

5. Each stakeholder may have several interests.

6. Identify each stakeholder's possible participation or role in the M&E implementation, that is, whether the stakeholder can
a) Be a data or information provider;
b) Make decisions based on M&E findings; or
c) Become a beneficiary of change arising from the M&E findings, etc.

7. Each stakeholder may have several roles in the implementation process. One person can be listed more than once.

Table columns: Stakeholder | Category | Interests | Participation or Role
Results Based Management and Theory of Change

Results Based Management (RBM)
Focusing on results instead of only actions

Core RBM features:
• RBM shifts the language from simply actions to results
  - E.g., a focus shift from 'providing supplementary nutrition' to 'improved nutrition of children below 6 years'
• A performance-based management approach
• Defines a result as a describable or measurable change
• A result is the product of a cause-effect relationship

Questions RBM asks:
• Timeline of change: Have results been defined realistically and based on appropriate analysis?
• Relevance of strategy: Do the designed strategies meet the needs of the beneficiaries?
• Areas of improvement: How can results information help make effective management decisions?
• Identification of KPIs: How can progress towards expected results be monitored?
• Fidelity of assumptions: How can risks be identified and managed?
• Reporting on results: To what extent has the goal been achieved by utilizing x% of resources?
Results chain
Articulating a project, addressing the what, when, why, how, who and where

Resources → Activities → Output → Objective → Goal
(Hierarchy of Results)

HOW DO I RBM?
LOGIC MODEL/RESULTS CHAIN -> THEORY OF CHANGE
What is a Theory of Change?
• A theory of change is an articulation of what change is sought to be achieved and how it is to be effected through the project
• It depicts pathways of change based on sound cause-effect/means-end logic
• A ToC is a flexible tool that can be used before, during and after an intervention, but it is most effective at the design stage of an intervention
Theory of Change (ToC) also includes:
• A strong theory of change requires surfacing hidden assumptions and challenges from people in different roles, levels, and perspectives, facilitating agreement between them, and negotiating shared commitment among them.
• A theory of change also highlights the iterative learning process embedded in the program.
Why do we need a Theory of Change?
It matters because:
• It is explicit - it describes program inputs, activities, indicators, direct and indirect results, etc.
• It reduces the risk of being biased towards results by including direct and indirect causes of change
• It is the basis on which monitoring and evaluation plans are devised
• It is embedded in the program design, is iterative and can evolve over time
• It creates a pathway for systems change
• It serves as a guide to measuring success
• It provides a framework for decision-making
Why do we need a Theory of Change?
It helps a program in:
1. Identifying long-term goals
2. Backwards mapping and connecting the preconditions or requirements necessary to achieve that goal, and explaining why these preconditions are necessary and sufficient
3. Identifying your basic assumptions about the context
4. Identifying the interventions that your initiative will perform to create your desired change
5. Developing indicators to measure your outcomes to assess the performance of your initiative
6. Writing a narrative to explain the logic of your initiative
Components of a Theory of Change
• Inputs are the resources that we use in the project
• Processes are the activities that we implement in the project
• Outputs are the immediate effects of the activities implemented (and not the completed activities) in a project, and form the deliverables of the project
• Outcome is the project objective to be achieved and can be understood as the inverted image of the core problem
• Impact is the goal to be contributed to, or the long-term objective of the project
Components of a Theory of Change
• The core of the theory of change focuses on the links between activities and results
• How the particular contexts in which the intervention is implemented affect activities and results
• Potential unintended results, both positive and negative
• Assumptions about the conditions under which, or how, the change happens, and the major risks that may affect it
• Area of Control and Sphere of Influence
How to create a Theory of Change?
1. Identify stakeholders and agree on the intended impact
2. Collect evidence and establish the context for your proposed Theory of Change
3. Identify impacts, or long-term goals
4. Define your outcomes
5. Identify outputs and activities
6. Identify inputs
7. Clearly state your assumptions
8. Identify risks
9. Apply causal links


Results framework and Indicators

• A results framework is a representation of the flow of changes that you intend to deliver through your project to achieve the overall program goal.
• A key feature of the results framework is to always maintain a cause-and-effect relationship among each of its levels.
• Indicators are specific markers that measure the achievement of inputs, outputs/intermediate outcomes, outcomes and impact of the program.
• Each level of the results framework has specific indicators assigned to it to measure the success of the implementation work.
• Indicators provide an objective way of showing program achievement and are extremely helpful in demonstrating organisational success.
Indicator

An indicator is a:
▪ Unit of information
▪ Measured over time
▪ To depict change in the condition under observation

A good indicator would be:
• Simple: Easy to understand
• Measurable: Provides a metric for depicting change
• Precise: Defined in the same way by all
• Value neutral: Defined without any positive or negative value attached
Indicators are Not
• Just anything you can think of to measure. Every measure is not an indicator (# of school desks).
• Indicators are not objectives or targets, but the actual results.
• Indicators are not biased; they are neutrally worded, i.e., they do not specify a particular level of achievement. Words such as improved, increased, gained, etc. do not belong in an indicator. Indicators measure whether there has been an increase or a decrease.
Concepts and Definitions
An indicator can be a:
• Number
• Ratio
• Percentage
• Average
• Rate
• Index (composite of indicators)
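As a small illustration of these forms (not from the original slides; every figure below is hypothetical), the same monitoring data can be expressed as several indicator types:

```python
# Hypothetical monitoring data for one block; every figure is made up.
toilets_built = 450
households = 600
women_reporting_toilet_use = 390
months_of_implementation = 12

number_indicator = toilets_built                                       # number (plain count)
ratio_indicator = toilets_built / households                           # ratio: toilets per household
percentage_indicator = 100 * women_reporting_toilet_use / households   # percentage
average_indicator = toilets_built / months_of_implementation           # average: toilets built per month
rate_indicator = toilets_built / (households / 1000)                   # rate per 1,000 households

# A simple unweighted index combining two normalised components (illustrative only).
index_indicator = 0.5 * ratio_indicator + 0.5 * (women_reporting_toilet_use / households)

print(number_indicator, round(ratio_indicator, 2), percentage_indicator,
      round(average_indicator, 1), rate_indicator, round(index_indicator, 2))
```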
SMART Results and Indicators
(SMART: Specific, Measurable, Achievable, Relevant, Time-bound)
Do we need specific indicators for each specific project or M&E level?
Yes. Specific indicators for each output and outcome.

How many indicators should we have for one output or outcome?
For each level of result, we should have at least one indicator.

Can indicators change over time?
No. A set of indicators would measure the same thing and would not change over time.

Can indicators be qualitative and quantitative in nature?
Yes. Based on the nature of information that a particular indicator relates to, it can be quantitative or qualitative.
Indicator
• Let's define some indicators for evaluation. Based on your understanding and the criteria mentioned above, mark the indicators given below as good, bad or worst indicators.

Indicators (mark each as Good / Bad / Worst):
• Reduction in the number of open defecation
• Positive change in the cleanliness index
• Improved access to Swachh Bharat scheme
• More diversified sources of drinking water facility
• X% of workers feel empowered
• Increased awareness on hygiene and sanitation practices
• % of women in the households using toilets
• % of respondents reported improved safety due to toilet access
Exercise
1. Draw a simple Theory of Change for your area of interest.

• Input: Quantifiable resources going into your activities – the things you budget for.
• Activity: What you do to accomplish your objectives.
• Output: Immediate results from your activity.
• Outcomes: Longer-term expected results related to changes. Related to the program goal.
• Impact: Long-term, population-level result. Can relate to a program or organization vision/mission statement.

Assumptions
Exercise
1. Under each column, let's add at least one indicator.

Level:
• Input: Quantifiable resources going into your activities – the things you budget for.
• Activity: What you do to accomplish your objectives.
• Output: Immediate results from your activity – people trained, services provided.
• Outcomes: Longer-term change in attitude, behaviour, etc. Related to the program goal.
• Impact: Long-term, population-level change. Can relate to a program or organization's vision/mission statement.

Indicator (example):
• Input: # of hubs for training; money spent on training implementation
• Activity: Training on improved agricultural practices
• Output: # of farmers trained; % change in knowledge
• Outcomes: Measure of change in crop yield for farmers
• Impact: Increased farm income
Exercise: Example of the Present Training on M&E

Level:
• Input: Infrastructure – training hall; budget for conducting the training
• Activity: Training conducted in May 2023
• Output: Government officials trained on M&E; improved knowledge on M&E
• Outcomes: Increased monitoring and evaluation of government programs; increased use of evaluation findings
• Impact: Improvement in program outcome

Indicator (example):
• Input: # of people engaged for training; money spent on training implementation
• Activity: # of trainings conducted
• Output: # of people trained; # of people with knowledge on M&E
• Outcomes: # of government programs being monitored; % of schemes being evaluated out of total schemes
• Impact: Improved program outcome
Research Approach

Qualitative research is an approach for exploring and understanding the meaning individuals or groups ascribe to a social or human problem.

Quantitative research is an approach for testing objective theories by examining the relationship among variables. These variables, in turn, can be measured, typically on instruments, so that numbered data can be analyzed using statistical procedures.

Mixed methods research is an approach to inquiry involving collecting both quantitative and qualitative data, integrating the two forms of data, and using distinct designs.
Qualitative
• Reality is socially constructed
• Primacy of subject
• Variables are complex, interwoven and difficult to measure
• Relationships between variables are generally described as observed patterns or cases

Quantitative
• Social facts have an objective reality
• Primacy of method
• Variables are measured with existing tools
• Relationships between variables can be assessed using standard statistics
Qualitative
Purpose
• Contextualization
• Interpretation
• Understanding peoples' perspectives
Researcher's Role
• Personal involvement and partiality
• Empathic understanding
Design
• Flexible and emergent

Quantitative
Purpose
• Generalizable findings
• Prediction
• Causal explanations
Researcher's Role
• Detachment and impartiality
• Objective portrayal
Design
• A priori and inflexible
Qualitative Approach
• Ends with hypotheses and a theory
• Emergence and portrayal
• Naturalistic
• Inductive
• Searches for patterns
• Seeks pluralism, complexity
• Minor use of numerical indices
• Thick description through writing

Quantitative Approach
• Begins with hypotheses and theories
• Manipulation and control
• Experimentation
• Deductive
• Component analysis
• Seeks consensus, the norm
• Reduces all data to numerical indices
• Precise technical language, numerical presentation
Impact Evaluation

Evaluator's task: Estimate the difference (impact) observed in the outcome of interest with, and in the absence of, an intervention (Project - BAU).

➔ Both randomization and quasi-experiments draw a counterfactual to estimate this difference
➔ Key principles:
◆ Choosing similar groups (balance)
◆ Creating a baseline and following up with an end-line
◆ Keeping track of the adequacy of the "dose" and the effect of any alternate "doses" that exist
➔ Why are they different:
◆ Randomization means randomly allocating groups into treatment and counterfactual to reduce the risk of bias. The evaluator has more control.
◆ Randomization requires a placebo or a similar condition to be created for the counterfactual
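Written as a formula (with hypothetical figures, not taken from the slides), the evaluator's task is:

```latex
% Impact is the project outcome minus the counterfactual (business-as-usual) outcome.
\[ \text{Impact} \;=\; Y_{\text{project}} \;-\; Y_{\text{counterfactual}} \]
% Hypothetical example: if 70% of eligible households use a service with the project,
% while a comparable counterfactual group reaches 55% without it, then
\[ \text{Impact} \;=\; 70\% - 55\% \;=\; 15 \text{ percentage points} \]
```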
Popular evaluative designs in the market today

Summative or outcome evaluations:
Cluster randomized trials: The treatment group receives the project and the control group receives the placebo.
Difference-in-differences: Drawing a "matched" comparison group to your project group, capturing baseline and end-line results, and estimating the difference between them (a minimal numerical sketch follows below).
Repeated measures on large samples: Used for national surveys (NFHS, NSSO); involves assessing the target population multiple times, cross-sectionally or longitudinally, to estimate trends over time. Used in case we can't draw a counterfactual.

Formative or diagnostic evaluations:
Process evaluations: To check the quality, fidelity, and adoption of the project both by implementers and the intended target audience. They make use of mixed methods and focus on qualitative inquiry.

Popular designs that are systems-focused:
Theory based evaluations: Design evaluative approaches/solutions aligned to the needs and dynamics of the ToC. This principle is now widely followed across multiple designs like developmental evaluation, realist evaluation, contribution analysis and outcome harvesting.
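To make the difference-in-differences idea concrete, here is a minimal sketch; it is not from the training material, and the group means below are assumed purely for illustration.

```python
# Minimal difference-in-differences (DiD) sketch with hypothetical group means.
# Outcome: % of eligible households using a service, at baseline and end-line.

treatment_baseline, treatment_endline = 40.0, 65.0    # project ("treatment") group
comparison_baseline, comparison_endline = 38.0, 48.0  # matched comparison group

# Change observed in each group over the same period
treatment_change = treatment_endline - treatment_baseline     # 25 points
comparison_change = comparison_endline - comparison_baseline  # 10 points

# DiD estimate: the project group's change net of the change that would have
# happened anyway, as proxied by the comparison group
did_estimate = treatment_change - comparison_change           # 15 points

print(f"Difference-in-differences estimate: {did_estimate:.1f} percentage points")
```

The naive before-after change in the project group (25 points here) overstates the impact whenever the comparison group also improves, which is exactly why the slides stress drawing a counterfactual.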
The Organization for Economic Co-operation and Development (OECD)
Development Assistance Committee (DAC) Evaluation Criteria

Relevance, Coherence, Effectiveness, Efficiency, Impact and Sustainability

RELEVANCE: IS THE INTERVENTION DOING THE RIGHT THINGS?
COHERENCE: HOW WELL DOES THE INTERVENTION FIT?
EFFECTIVENESS: IS THE INTERVENTION ACHIEVING ITS OBJECTIVES?
EFFICIENCY: HOW WELL ARE RESOURCES BEING USED?
IMPACT: WHAT DIFFERENCE DOES THE INTERVENTION MAKE?
SUSTAINABILITY: WILL THE BENEFITS LAST?
List of Readings
OECD. (2002). Glossary of Key Terms in Evaluation and Results Based Management. Organization for Economic Cooperation and Development. https://www.oecd.org/development/evaluation/qualitystandards.pdf
OECD. (2010). Quality Standards for Development Evaluation. Organization for Economic Cooperation and Development. https://www.oecd.org/development/evaluation/qualitystandards.pdf
Gugerty, MK., Karlan, D., Welsh, D. (2016). Guiding Your Program to Build a Theory of Change. Innovations for Poverty Action. https://www.poverty-action.org/sites/default/files/publications/Goldilocks-Deep-Dive-Guiding-Your-Program-to-Build-Theory-of-Change_2.pdf
Chandurkar, D., Sen, N. (n.d.). Development Monitoring and Evaluation Framework for Budget Work Projects. National Foundation for India. http://nfi.org.in/sites/default/files/publication/Developing%20Monitoring%20and%20Evaluation%20Framework%20for%20Budget%20Work%20Projectspdf.pdf
Davidson, JE. (2009). Causal inference: Nuts and bolts. Better Evaluation. https://www.betterevaluation.org/en/resources/guides/causal_inference_nuts_and_bolts
Gertler, PJ., Martinez, S., Premand, P., Rawlings, LB., Vermeersch, CMJ. (2016). Impact Evaluation in Practice, Second Edition. The World Bank Group. https://openknowledge.worldbank.org/bitstream/handle/10986/25030/9781464807794.pdf?sequence=2&isAllowed=y
Khandker, SR., Koolwal, GB., Samad, HA. (2010). Handbook on Impact Evaluation, Quantitative Methods, and Practices. The World Bank Group. https://openknowledge.worldbank.org/bitstream/handle/10986/2693/520990PUB0EPI1101Official0Use0Only1.pdf?sequence=1&isAllowed=y
Lopez Acevedo, G., Krause, P., Mackay, K. (2012). Building Better Policies: The Nuts and Bolts of Monitoring and Evaluation Systems. The World Bank Group. https://openknowledge.worldbank.org/bitstream/handle/10986/6015/681660PUB0EPI004019020120Box367902B.pdf?sequence=1&isAllowed=y
Kusek, JZ., Rist, RC. (2004). Ten Steps to a Results-Based Monitoring and Evaluation System. The World Bank Group. http://documents1.worldbank.org/curated/en/638011468766181874/pdf/296720PAPER0100steps.pdf
UNDG. (2011). Results-based Management Handbook. The United Nations Development Group. https://unsdg.un.org/sites/default/files/UNDG-RBM-Handbook-2012.pdf
Mehrotra, S. (2013). The Government Monitoring and Evaluation System in India: A Work in Progress. The World Bank Group. https://openknowledge.worldbank.org/bitstream/handle/10986/19000/884180NWP0Box300ecd0wp280india0me00.pdf?sequence=1&isAllowed=y
AusAID. (2005). The Logical Framework Approach. Commonwealth of Australia. https://sswm.info/sites/default/files/reference_attachments/AUSAID%202005%20The%20Logical%20Framework%20Approach.pdf
UNAIDS. (n.d.). An Introduction to Indicators. Joint United Nations Programme on HIV/AIDS. https://www.unaids.org/sites/default/files/sub_landing/files/8_2-Intro-to-IndicatorsFMEF.pdf
Fundamentals of Monitoring, Evaluation and Learning
Day 3
May 17, 2023
Recap of Day 1 and Day 2
Exercise 1
Steps – Exercise 1
• We will show you a group of numbered pictures showing MEL in a project
• After all the images have been shown, you have to put them in the correct sequence

(Image sources: IFRC, https://www.flaticon.com/, https://www.annmurraybrown.com/)
Exercise 1 – Correct Sequence
Steps – Exercise 2
• We will show you a group of numbered pictures depicting a Theory of Change
• After all the images have been shown, you have to put them in the correct sequence
Exercise 2 – Theory of Change

(Image source: New Yorker)
Exercise 2 – Correct Sequence
Exercise
• Problem: 35% of children <5 years in India are stunted
• Very few have good dietary diversity or feeding practices
• Donor Zen decides to make a grant to make and distribute nutritious food & dietary behaviour change communication
• Zen wants an independent evaluation of the grant
• Implementer Goodthoughts proposes solutions
Exercise
- What approach would you adopt to get the results you want to see?
- What type of data would you collect to check on these results?
- How will you report on results?
- Can you think of any indicators to report on?
