A Monitoring and Evaluation (M&E) Guide
Front cover photo: Susan Thubi, a Clinical Officer at the Nazareth Holy Family
Clinic in Limuru, Kenya, checks the records of a patient who has come to the
center for treatment. David Snyder for CRS.
Step 10.3.3 Use M&E Information for Learning and Decision Making
Glossary
References
Acronyms
HR Human Resources
IR Intermediate Results
SO Strategic Objective
Chapter 10: Monitoring and Evaluation
Monitoring and Evaluation
Process Map
Do you have objectives that will drive your Monitoring and Evaluation processes? (see page 9)
Do you have a plan for turning the indicators into the information forms? (see page 18)
Do you have a plan to carry out the M&E system you have designed? (see page 27)
Purpose of This Guide

Good M&E starts during project design. At this stage, you define the project's purpose and develop a strategy to address identified opportunities and challenges. The processes for developing a set of objectives are conducted in a way that supports the design, establishment, and implementation of a high-quality M&E system for the project. Table 10.1 summarizes the steps of project design.
Table 10.1 Key Elements of Project Design That Underpin High-Quality Project M&E

Planning the project design: A calendar detailing all activities in project design, such as stakeholder analysis, assessments, etc. For each activity, assign a responsible person and determine the budget.

Undertaking necessary assessments: Assessments gather information for project design decisions. This will help you to understand a situation in terms of geographic, political, social, economic, and cultural concerns.

Analyzing and setting project objectives: Identification of the range of possible objectives for the project you are designing. Objectives should be analyzed by asking, "Why have you selected these project objectives?"

Reviewing different project strategies: There are likely many possible strategies for accomplishing your objectives. In agriculture, for example, there are many approaches to improving productivity. It is at this point in the process that you will decide which specific approaches will be used in this particular project.
The project M&E system is founded on the project design work. The remainder of this chapter will guide you through the following three business processes¹ required for high-quality M&E: developing an M&E framework (Business Process 10.1), establishing the M&E system (Business Process 10.2), and implementing the M&E system (Business Process 10.3).

1 The processes described in the M&E chapter have drawn heavily from Catholic Relief Services materials. In particular, the following documents have been used: Hagens et al. (2009), Hahn and Sharrock (2010), and Stetson et al. (2004 and 2007).
What Function Does M&E Serve?

Monitoring: High-quality monitoring of information encourages timely decision-making, ensures project accountability, and provides a robust foundation for evaluation and learning. It is through the continuous monitoring of project performance that you have an opportunity to learn about what is working well and what challenges are arising. Job descriptions of staff involved in managing and implementing projects should include assigned M&E responsibilities.
Most evaluations will consider one or more of the following criteria:²

1. Relevance: Did the project address the needs of community members?
2. Efficiency: Did the project do so in a manner that was as low-cost as possible?
3. Effectiveness: Did the project change existing practices in a beneficial manner?

Photo: Members of the St. Patrick's All-Stars marching band in Old Harbour Bay, Jamaica, after Hurricane Dean in September 2007.

2 For further information on these five evaluation criteria, see OECD/DAC (1991).
Table 10.2 Differences Between Monitoring and Evaluation

Monitoring: Generally focuses on the question "Are we doing things right?"
Evaluation: Generally focuses on the question "Are we doing the right thing?"
Summary of This Guide

Business Process 10.1 explains in three steps how to develop the Results Framework, Logical Planning Framework, and an M&E narrative. This section assumes that the initial project design work has already been completed. The steps outlined in Business Process 10.1 will enable you to establish a comprehensive M&E system.

Step 10.1.1 breaks down the development of an M&E system into easily understood parts for development of an M&E Operating Manual. Step 10.1.2 describes a process to ensure the active participation of community members in designing an M&E system, and Step 10.1.3 outlines a process to make the system operational, including the field-testing of the system. Maintaining a transparent and participatory process helps to make certain that each staff member has a clear understanding of the project and his or her role in the project's monitoring, evaluation, and learning activities.
Key Principles of the M&E Function

As agents of development you stand in service to the poor and marginalized. Your responsibility is to engage and empower communities in programs that improve and enrich their lives. You are entrusted with significant resources to support humanitarian and development efforts for which you are held accountable. Undertaking good quality M&E will promote better learning and strengthen accountability to stakeholders.

M&E is guided by the following key principles:³

1. Systematic Inquiry: Staff conduct site-based inquiries that gather both quantitative and qualitative data in a systematic and high-quality manner.
2. Honesty/Integrity: Staff display honesty and integrity in their own behavior and contribute to the honesty and integrity of the entire M&E business process.
3. Respect for People: Staff respect the security, dignity, and self-worth of respondents, program participants, clients, and other M&E stakeholders.
4. Responsibilities to Stakeholders: Staff members articulate and take into account the diversity of different stakeholders' interests and values that are relevant to project M&E activities.
M&E Business Process 10.1 - Developing an M&E Framework

Process Description

You have learned that the project's M&E system is built on the work undertaken during project design. It is now time to develop the M&E framework. A key feature of this work is the identification of a set of objectives, structured in a manner that provides a firm foundation for the design, establishment, and operation of the M&E system. A number of M&E tools can be used to develop the M&E framework; this section describes two of the most well-known M&E planning tools and provides guidance on how to develop a narrative describing the M&E system for inclusion in the project proposal.

Process Flow

Process 10.1 Developing an M&E Framework (flowchart: the M&E Team moves from Start Process through the steps below to End Process)
Step 10.1.1 Develop a Results Framework
Intermediate Results: These reflect the uptake/use of project outputs.

Start early: Draft a results framework early in the project design stage, soon after your assessment and problem analysis. This will allow you to clearly lay out your thinking regarding your theory of change. Include the results framework in your project ideas note.

Allow time: The results framework appears simple but takes time. Your initial version will likely be revised during the design process. Such revisions reflect the discussions and debates that will occur as the objectives of the project become more focused.

Do not be intimidated: The more you use results frameworks the more comfortable you will be with them.
Be mindful of the terminology: It is important to note that every donor has its own terminology for describing the different levels in an objectives hierarchy (see Appendix B: Master Translator). Always check for the latest information in this regard.

Use the Example Results Framework and the outputs from the project design effort to develop a results framework that reflects your proposed project design and strategy.
Step 10.1.2 Create a Logical Planning Framework
Table 10.3 Logical Planning Framework Matrix
Goal
Strategic Objectives
Intermediate Results
Outputs
Activities
chiefs approve access to land for demonstration plots (the critical assumption linking Activities to IR), then the farmers will see how no-till agriculture improves productivity compared with traditional methods. Critical assumptions are most important at the lower level of objectives because this is where assumptions about uncontrollable events have the most influence. Based on your discussions about critical assumptions, you may need to revisit column one to add other activities to lessen the risk to the project.
3. Fill in columns two and three. Start from the top and work down because, in the process of selecting performance indicators and measurement methods, you may find objectives that cannot be measured as stated and therefore need revision. This, in turn, may require revision of others farther down the matrix. Include a balance of both quantitative and qualitative data. It can take time to decide on all of the indicators and measurement methods and then ensure they match the objective statement. Take sufficient time to complete these columns, because they are the driving force for your project's M&E system. Note that the heading for Column 3 in the Example Logical Planning Framework indicates that all the indicator data will be described in the M&E Operating Manual (see Step 10.2.1). Note too that there can be more than one indicator for a single objective.

4. Finalize the logical planning framework. Once you have finished the framework, reconcile any change in objective statements with those in the draft of the results framework.
Step 10.1.3 Write the M&E Narrative

Broadly speaking, the M&E narrative should describe your plans for the following:

Project monitoring

Project Monitoring
The narrative will be based on the completed results framework and logical
planning framework. It describes the proposed system that will ensure
performance indicator data are collected, analyzed, and reported. The M&E
narrative is an opportunity to describe in more detail the methods that you
are intending to use to collect and analyze data, and report ongoing project
progress and achievement of the strategic objectives. The narrative should
describe the participation of beneficiaries in the M&E system so that their
contributions can inform internal decision-making and project reporting.
The narrative will include a description of how the data generated will be
used by project staff to assess the need for modifications to planned project
operations. It is important to briefly describe your plans for internal project
meetings and other reflective events that will utilize information generated by
the project monitoring system.
You may decide that the project duration merits a mid-term review. Use the
narrative to describe its purpose, the kind of information that you would expect
it to generate, and how that information will be used to guide subsequent
project operations.
You have now completed the process for developing the framework for your
M&E system. In the next section, you will learn how to establish the M&E
system once the award has been made.
M&E Business Process 10.2 - Establishing the M&E System

Process Description

In Business Process 10.1, you laid the foundation for the establishment of the M&E system. You defined project objectives, their indicators, and how those indicators would be measured. In Process 10.2, you will learn about the essential steps to establish an M&E system that connects the defined indicators with the forms required to collect and report on data. This M&E system incorporates features for learning and decision-making based on robust and reliable evidence.

Project M&E systems are best when they balance the needs of project staff and donors in generating timely field-level information on progress and success with those of community members to influence project learning and direction. These are critical elements of a high-performing, dynamic learning organization.
Process Flow

Process 10.2 Establishing the M&E System (flowchart: the M&E Team moves from Start Process through the steps below to End Process)
Step 10.2.1 Develop an M&E Operating Manual⁵

Inputs: All the relevant documents that were developed for the project proposal
Appendix E: Example Data Flow Map
Appendix F: Example Instruction Sheet
Appendix G: Example Communication and Reporting Map
Appendix H: Sample Prompt Questions for Learning to Action Discussions

Outputs: Hard and soft copy of all documents in the M&E operating manual, including data-gathering forms and report formats. See Table 10.4 below for a full list of documents in the manual.

Step Summary: The M&E Operating Manual contains all of the documents needed to implement the M&E system. While this work takes time, it will ensure that your data are collected and analyzed in a comprehensive and rigorous way. The manual includes steps to ensure the data are turned into useful information that is used in making decisions about project direction and in reporting on project results and impact.
The M&E system is the backbone of the project because the objectives and their indicators are linked to a transparent system to collect, analyze, and report on data. By carefully designing and developing the data-gathering tools for M&E you ensure that all required data are collected and extraneous data are not. The system will include mechanisms to turn data into useful evidence that supports sound project decision-making and ensure that all staff members have a clear understanding of the project and their role in M&E.

All of the documents developed for the M&E system will be organized into an M&E operating manual. This manual becomes the central source of information about your M&E system. The manual should be made available in a ring binder as well as electronically. Both hard and soft copies should be shared with others when Step 10.2.1 is finished. Assign a staff member the responsibility for keeping the hard and soft copies of the manual updated.

5 See Hahn and Sharrock (2010) for a more detailed description of the content and process of an M&E Operating Manual.
Experience has shown that it is best to meet with a small group of project staff, including M&E, management, and technical staff, to draft the first version of the M&E operating manual. Request a facilitator who can help manage the process and oversee the development of the manual.

Table 10.4 lists the documents that represent the key elements of your M&E system. Include the appropriate documentation for each element in your M&E operating manual, organized under three headings as seen in the table.
M&E System Element: M&E Operating Manual Documentation

Table of Contents: List all of the documents in your project M&E operating manual in a Table of Contents with the correct title and date of the most recent version.

M&E Purpose Statement: In writing this brief statement, answer the broad question of why you are setting up an M&E system for this particular project. There will be more obvious reasons (e.g., to monitor and report progress), but also less obvious ones, such as your desire to experiment with new community-based approaches.

M&E Working Group: List those people who agree to help oversee the operation of the M&E system, along with a list of tasks they plan to address.

Component 2: Setting Up

Indicator Performance Tracking Table (IPTT): The IPTT may have been developed earlier for inclusion in the project proposal. It shows indicator targets and accomplishments for each year of the project. If available, the earlier draft will be reviewed; if not, the IPTT may have to be developed as part of Step 10.2.1.⁶

Detailed Implementation Plan, Including M&E: An M&E calendar/schedule may have been prepared earlier for reference in the M&E narrative of the proposal. The annual detailed implementation plan builds on this by listing activities and the people responsible for them for each project output. It also contains detailed activities for setting up and operating the M&E system.

Data Flow Maps: Data maps show the flow of indicators through the data-gathering forms and report formats and how they are connected. The maps ensure a process for collecting data for the indicators listed in the project proposal. Depending on the scale and complexity of the project, there may be several data flow maps. See Appendix E: Example Data Flow Map.

Data-Gathering Forms: These forms are created to collect data based on the data flow maps. These may include monitoring forms, medical records, or survey questionnaires. There may be existing forms that you can already use.

Report Formats: These are created to be filled out by project staff or project participants to transmit data and information to the next reporting level, including the management team and the donor.

Instruction Sheets: These provide clear instructions on how to fill out each item in the data-gathering forms and report formats. See Appendix F: Example Instruction Sheet.

Communication and Reporting Maps: These diagrams show the flow of reports and other communications to all relevant stakeholders, including responsible persons and dates. See Appendix G: Example Communication and Reporting Map.

Learning to Action Discussions: This is a list of questions that might be posed to prompt productive discussion and analysis of the data and action required. Appendix H: Sample Prompt Questions for Learning to Action Discussions lists prompt questions that provide a structure for analyzing data and discussing their implications for responsive project management with all levels of stakeholders. For more discussion on Learning to Action Discussions see Step 10.3.3.

Capacities and Resources: All too often M&E is under-resourced. An effective M&E system requires human resources, staff training, funding, and material resources. Staff with M&E responsibilities must have the knowledge, skills, tools, and support to undertake their respective tasks. This should be discussed with your colleagues who have been assigned HR and Finance responsibilities.

Reports and Evaluations: Progress reports are an important vehicle for analyzing, summarizing, and communicating monitoring data to different stakeholders. You will already have discussed reports and evaluations as you worked on the data flow maps, data-gathering forms, and report formats. These reports and evaluations ultimately represent key outputs from your M&E system once it is up and running.

6 For further information on IPTTs, see Stetson et al. (2004, pp. 140–143) and Willard (2008a).
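To make the IPTT element above concrete, the following minimal sketch (in Python, an illustrative assumption; the guide does not prescribe any tool or format) shows how an indicator's yearly targets and accomplishments might be tracked and compared. The indicator name and numbers are invented.

from dataclasses import dataclass, field

@dataclass
class IndicatorRow:
    indicator: str
    targets: dict = field(default_factory=dict)   # project year -> target value
    achieved: dict = field(default_factory=dict)  # project year -> achieved value

    def percent_of_target(self, year: int) -> float:
        """Achievement as a percentage of the target for one project year."""
        return 100.0 * self.achieved[year] / self.targets[year]

iptt = [
    IndicatorRow(
        indicator="Number of farmers trained in no-till techniques",
        targets={1: 200, 2: 400},
        achieved={1: 180, 2: 410},
    ),
]

for row in iptt:
    for year in sorted(row.targets):
        print(f"{row.indicator} (Year {year}): "
              f"{row.percent_of_target(year):.0f}% of target")

A structure like this makes it easy to spot indicators falling behind target before a reporting deadline, which is exactly the comparison the IPTT is meant to support.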
With guidance from the M&E Working Group, periodically review the M&E system to confirm that it is providing useful and timely information. If the M&E system is not effectively providing a service that meets the needs of staff and other stakeholders, take the opportunity to assess why this might be the case and seek possible solutions.
Step 10.2.2 Set Up the Community M&E System

Individuals and communities are the primary stakeholders of the project, but accountability to them is often overlooked. Community engagement allows communities to play a more active role in project management, reflect upon progress, and assess changes in their situation. Community involvement in monitoring also builds the community's capacity to direct their own development, increases the community's sense of ownership of the project, and builds accountability and transparency.
Step 10.2.3 Pre-Test Forms, Train Staff, and Communicate the M&E System

Step Summary: The data-gathering forms and report formats are pre-tested using the draft instruction sheets to ensure that consistent data are collected across the project. All data-gathering staff members need to be trained on the system and feel competent to carry out their responsibilities. All project staff members, particularly supervisors and managers, should understand the M&E system and understand their roles in the system.
The M&E operating manual (Step 10.2.1) is complete, but forms and
processes may change as the M&E system is implemented. Changes may
also occur as you train staff on the use of the system and as you inform all
staff about the system and how it will work. Each time a change is made,
the relevant tools need to be tested and adjusted if necessary. The M&E
operating manual is also updated with the new or revised tools.
Work with these same staff members to test the reporting formats. Field staff and volunteers are often responsible for collecting and reporting on the source data for much of the project's monitoring system. Work through the report formats with them so they clearly understand the formats and how to use them. Field test each tool in a nearby community to avoid extended travel time.

Following the field test, hold a team discussion to solicit feedback on how the tools worked overall and any suggestions for revising or altering specific instruction sheets. Make final revisions to the instruction sheets based on this discussion.

The forms may have to be translated into one or more local languages. Spend sufficient time on this step to ensure that all data gatherers are interpreting the questions in the same fashion so data collection is consistent and reliable.
Train staff in collecting, analyzing, and reporting on data
Staff capacity to implement the project M&E system often requires significant
strengthening. Even staff with extensive experience in M&E should be trained
on the specific objectives, tools, and protocols for each M&E activity to ensure
that there is consistency and quality.
There are three key tasks involved in planning and delivering quality staff training. You must assess and identify training needs and resources; deliver high-quality training; and follow up, monitor, and evaluate. In an initial training session, you may want to cover the following topics:

The observations and decisions of the M&E working group should also be communicated on a regular basis. Use project meetings and agency meetings to keep technical and management staff up to date on the M&E system, findings, and use of the information.
M&E Business Process 10.3 - Implementing the M&E System

Process Description

Process Flow

Process 10.3 Implementing the M&E System (flowchart: the M&E Team moves from Start Process through Organize Data Management in the Project M&E System (10.3.1) and Manage Evaluations (10.3.2))
Step 10.3.1 Organize Data Management in the Project M&E System

Set up databases to manage the flow of data and ensure that the system produces good quality and timely information. The database houses data that are checked, validated, and securely stored. Good data management enables project staff to run simple calculations to produce summaries for the purposes of analysis and interpretation (see Step 10.3.3). Step 10.3.1 is concerned with the following three main tasks:
Developing a Database
1. Determine the purpose of the database. Do not merge monitoring and evaluation data in one database. Instead, create separate monitoring and evaluation databases for the same project (see the sketch after this list). See Appendix I: Data Management for a comparison of monitoring and evaluation databases.

2. Check with other staff in your organization to see if there are well-functioning and efficient monitoring databases already in use. If the structure of an existing database is appropriate for your program, use its format as a starting point.
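A minimal sketch of task 1's "separate databases" advice, using Python's built-in sqlite3 module (an illustrative assumption; the guide does not prescribe any software). Table and column names are invented for illustration.

import sqlite3

# Separate files keep routine monitoring data apart from evaluation data.
monitoring = sqlite3.connect("monitoring.db")
evaluation = sqlite3.connect("evaluation.db")

monitoring.execute(
    """CREATE TABLE IF NOT EXISTS activity_reports (
           report_month   TEXT,     -- e.g., '2024-03'
           community      TEXT,
           trainings_held INTEGER,
           attendees      INTEGER
       )"""
)

evaluation.execute(
    """CREATE TABLE IF NOT EXISTS household_survey (
           survey_round    TEXT,    -- 'baseline', 'midterm', or 'final'
           household_id    TEXT,
           meals_per_day   REAL,
           adopted_no_till INTEGER  -- 1 = yes, 0 = no
       )"""
)

# Simple monitoring-style summary: total trainings per month (sums/frequencies).
monitoring.execute(
    "INSERT INTO activity_reports VALUES ('2024-03', 'Village A', 2, 45)"
)
for row in monitoring.execute(
    "SELECT report_month, SUM(trainings_held) "
    "FROM activity_reports GROUP BY report_month"
):
    print(row)

monitoring.commit()
evaluation.commit()

Keeping the two stores physically separate, as the step recommends, prevents frequently updated monitoring records from contaminating the less frequently collected evaluation data.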
Data Storage and Security

1. Organize record keeping from the start. Set up filing and record-keeping systems for both paper-based forms and digitized information. Just as you have a well-organized M&E operating manual, it is equally important that you develop a data-storage system that facilitates secure access.
Step 10.3.2 Manage Evaluations

Step Summary: The levels of effort for baseline, midterm, and final evaluations depend on the scale and significance of the project. Your goal is to increase the quality of your organization's M&E activities through well-managed evaluations. A checklist summarizes the tasks involved in managing evaluations.

In the project proposal, plans may have been outlined for baseline, midterm, and final evaluations. These are costly and time consuming, so careful thought and planning are needed for a worthwhile evaluation.⁸

8 Readers should refer to Willard (2008b and 2008c) and Stetson (2008) for information on preparing for, managing, and reporting and communicating on an evaluation, respectively.
Table 10.5 Sample Checklist of Evaluation Tasks

Scope of Work:
Scope of work drafted, reviewed by staff and management, and finalized
Consultants identified
Team assembled

Psychological Elements:
Evaluation manager mentor chosen
Safety valve for evaluation team developed (weekend options, half-day excursions, etc.)
Step 10.3.3 Use M&E Information for Learning and Decision Making

An M&E system stands or falls by its usefulness to the end users of the information.

Project staff members are more likely to use M&E data if they feel confident about its quality and if the information is available in a timely fashion (see Step 10.3.1). Their willingness to engage with the information is improved if they feel involved in the M&E process. This ensures that they better understand the data. These points increase the likelihood that staff will use M&E information for learning and decision-making.

The M&E system will produce different types of data at different points during the life of the project, as follows:
Prompt Questions for Learning to Action Discussions should be tailored for local use. Field staff will discuss the data they have collected with their supervisors; in turn, supervisors will consolidate the data from all their field staff and discuss the aggregated data report with the person to whom they report; and so on. These are known as Learning to Action Discussions (LADs). Engaging with the data in this way is enriching and will inform decisions about follow-up action.

These LADs are a time set aside to understand and analyze the data and to discuss implications for the management of the project. While LADs can take place at any time in the project, it is good practice to link the LADs to the communication and reporting map (see Step 10.2.1). With this map, you see excellent opportunities for discussing data, findings, and their implications for next steps in the project. The LADs are particularly valuable to staff on site visits for involving community members in discussions about project progress.

With LADs, staff members are encouraged to use the data they have been collecting to reflect on their own work. Junior staff members observe that supervisors and managers use the data to make project decisions. This active use of the data serves to reinforce the collection of data and appreciation of its use in meaningful project management.

Photo: In Egypt, where refugees often experience prejudice and a lack of opportunity, a Peace Camp brings Iraqi, Sudanese, Egyptian and other children together for summer fun.
Baseline, Mid-Term and Final Evaluation data
Data generated in evaluation surveys will provide a rich source of information
for project staff. Consider the following three points:
1. Analyze all the data collected. All data are included in the evaluation analysis to get as complete a picture as possible.
2. Interpret data in a way that reflects the limitations and biases of
the data. When interpreting data, do not hide any limitations or bias
in the data collection methods. They are common to all data collection
exercises. The best approach is to be transparent about such
limitations, bear them in mind when interpreting the data, and note
them in any M&E reports.
3. Plan for an evaluation lessons learned workshop. Discussing lessons
learned offers an opportunity for invited stakeholders to validate the
survey information, discuss the findings, and use this knowledge to
inform decision-making.
Capture key points from the LADs and evaluation lessons learned workshops and disseminate them to others. Each staff member will view the information through his or her own personal lens, which will enrich the interpretation and learning that takes place among project staff.
Compliance Checklist for M&E
The M&E compliance checklist supports your efforts to reach a high standard
in your M&E work by raising questions for discussion and critical review.
Use the checklist to review the work you undertook in developing the M&E
framework and establishing the M&E system and to help guide you through
M&E system implementation.
While all projects require good M&E, the size of the project and the resources
available must be considered when establishing M&E components and
tailoring them to the specific needs of each project.
You will ask the following three questions regarding the M&E system
depending on the timing of your review:
Does your project have an M&E framework?

Does your project have an M&E operating manual?
Does your project have a system for listening to and learning from community members and for responding to their concerns in a transparent manner? (Step 10.2.2)

Are staff and other stakeholders using the data generated by your project's M&E system?
Are staff members using the M&E data through LADs, evaluation lessons learned workshops, and/or other learning events? (Step 10.3.3)
Glossary
Activities
A logical planning framework term for the functions that need to be undertaken and managed to deliver the project's outputs to targeted beneficiaries and participants.
Critical Assumptions
Factors that project designers cannot (or choose not to) control but that could
endanger project success if assumptions about those factors are incorrect.
Data-Gathering Forms
Forms to be filled out by project participants or staff to collect data.
Evaluation
A periodic, systematic assessment of a project's relevance, efficiency, effectiveness, and impact on a defined population. Evaluation draws from data collected during monitoring as well as data from additional surveys or studies to assess project achievements against set objectives.
Goal
A logical planning framework term for the longer-term, wider development change in people's lives or livelihoods to which a project will contribute.
Implement
Involves translating plans into performance by carrying out the detailed implementation plan (DIP). Implementation is more than simply following a plan or recipe; it requires much discipline, judgment, and creativity.
Instruction Sheets
Sheets that give clear instructions on how to fill out each of the data-gathering
forms and report formats.
Intermediate Results
A crucial bridge between lower- and higher-level objective statements in results and logical planning frameworks. Learning processes are explicitly built in to project implementation. After implementation has commenced, feedback received from project beneficiaries helps ensure that the project is on track toward achieving its strategic objectives.
M&E System
Well-organized, interdependent activities or components and clear
procedures that contribute to a well-defined purpose of M&E within a project.
An M&E system integrates more formal, data-oriented tasks (for example,
collecting information on logical planning framework indicators) with informal
monitoring and communication. It ensures that people responsible for M&E
can do their jobs.
Monitoring
A continuous process of collecting, analyzing, documenting, and reporting
information on progress to achieve set project objectives. This information
assists timely decision-making, ensures accountability, and provides the basis
for evaluation and learning. Monitoring provides early indications of progress
and achievement of objectives.
Objectives Hierarchy
The vertical arrangement of different levels of objective statements in results and logical planning frameworks. One objective level is seen as the means to achieving the next higher-level objective.
Objectives Statements
The first column of the logical planning framework matrix. These statements
provide a concise commentary on what the project is aiming to achieve and
how it intends to do so.
Outputs
A logical planning framework term meaning the goods, services, knowledge,
skills, attitudes, enabling environment, or policy improvements that not only
are delivered by the project, but also are demonstrably and effectively received
by the intended beneficiaries and participants.
Performance Indicators
Something observed or calculated that acts as an approximation of, or proxy
for, changes in the phenomenon of interest.
Project Accountability
The notion that managers are responsible for using intermediate results as feedback
to check that their project is on track toward achieving the strategic objectives.
Project Proposal
A structured, well-argued, and clearly presented document prepared to
obtain approval and funding for a proposed project strategy. It stands as
the agreement among the relevant stakeholders about the analysis of the
situation and the resulting plan of action.
Report Formats
Reports to be filled out by project participants or staff to report data and
information to the next level.
Results Framework
An organigram that gives a snapshot of the top three levels of a projects
objectives hierarchy in a way that makes it easy to understand the overarching
thrust of the project.
Stakeholders
Individuals, groups, and institutions important to, or who have influence over,
the success of the project.
Strategic Objectives
The central purpose of the project, described as the noticeable or significant benefits that are actually achieved and enjoyed by targeted groups by the end of the project.
Theory of Change
An articulation of how a proposed project strategy will lead to the achievement of the project's Strategic Objectives.
References
American Evaluation Association. (2004). Guiding principles for evaluators.
Retrieved from http://www.eval.org/GPTraining/GP%20Training%20Final/
gp.principles.pdf
OECD/DAC. (1991). DAC criteria for evaluating development assistance. In
OECD/DAC Network on Development Evaluation, Evaluating development
cooperation, summary of key norms and standards (pp. 13–14). Retrieved
from http://www.oecd.org/dataoecd/12/56/41612905.pdf
The references cited below are available from the following website: http://
www.crsprogramquality.org/monitoring-evaluation/
Catholic Relief Services. (2009a). Monitoring and evaluation standards.
(Version 1.0. June).
Catholic Relief Services. (2009b). Monitoring and evaluation standards
support tool: Using the standards to improve monitoring and evaluation.
(Version 1.0. July).
Hagens, C. (2008). M&E and ethics: A framework for addressing ethical concerns
in M&E. (Short Cuts Series). Baltimore, MD: Catholic Relief Services.
Hagens, C., Morel, D., Causton, A., & Way, C. (2009). CRS Asia M&E guidance
series. Baltimore, MD: Catholic Relief Services.
Hahn, S., & Sharrock, G. (2010). ProPack III: A guide to creating a SMILER M&E
system. Baltimore, MD: Catholic Relief Services.
Stetson, V. (2008). Communicating and reporting on an evaluation: Guidelines
on developing an evaluation reporting and communication strategy.
(Short Cuts Series). Baltimore, MD: Catholic Relief Services.
Stetson, V., Sharrock, G., & Hahn, S. (2004). ProPack: The CRS project
package: Project design and proposal guidance for CRS project and
program managers. Baltimore, MD: Catholic Relief Services.
Stetson, V., Hahn, S., Leege, D., Reynolds, D., & Sharrock, G. (2007). ProPack
II: The CRS project package: Project management and implementation
guidance for CRS project and program managers. Baltimore, MD: Catholic
Relief Services.
Willard, A. (2008a). Using indicator performance tracking tables: Guidelines
and tools for the preparation and use of indicator performance tracking
tables. (Short Cuts Series). Baltimore, MD: Catholic Relief Services.
Willard, A. (2008b). Preparing for an evaluation: Guidelines and tools to
plan for an evaluation. (Short Cuts Series). Baltimore, MD: Catholic
Relief Services.
Willard, A. (2008c). Managing and implementing an evaluation: Guidelines on
managing and implementing a successful evaluation. (Short Cuts Series).
Baltimore, MD: Catholic Relief Services.
Appendix A
Results Framework
Improving Agricultural Productivity and Incomes through the No-Till Agriculture Project,
DR Congo
Appendix B

Master Translator: Comparison of Logical Planning Frameworks

Column headings: Wider or Long-Term Effect | End of Project Effect | Intermediate Effect | Outputs | Interventions
Appendix C

Cheat Sheet for Working with Logical Planning Frameworks

Columns: Objectives Statements | Performance Indicator Statements | Measurement Methods/Data Sources | Critical Assumptions

Goal
Objectives Statements: This describes the longer-term, wider, development change in people's lives or livelihoods to which the project will contribute. This could be in a given region, or in the nation as a whole. Think of the Goal as a larger, longer-term hope or aspiration. How to write: Write as a full sentence, as if it has already been achieved. Use the general population of intended beneficiaries as the subject of the sentence.
Performance Indicator Statements: Performance Indicator Statements and associated data are drawn from appropriate, pre-existing sources such as Amnesty International, FAO, Freedom House, IFPRI, Transparency International, World Bank, UN, national government reports, and so on.
Measurement Methods/Data Sources: It is not necessary to complete this box.
Critical Assumptions: It is not necessary to complete this box.

Strategic Objectives (SOs)
Objectives Statements: These describe the noticeable or significant benefits that are actually achieved and enjoyed by targeted groups by the end of the project (EOP). These benefits are achieved due to IR-level changes that have taken place as a result of the Outputs from well-executed Activities. Each SO expresses an aim that is realistic, specific to the project, and measurable. SOs are the central purpose of the project; that is, why it was designed and implemented in the first place. How to write: Write as a full sentence, as if it has already been achieved. Use the targeted primary beneficiary group(s) as the subject of the sentence.
Performance Indicator Statements: SO indicators reflect the benefit(s) expected to occur for beneficiary subgroups by EOP as a result of behavioral change(s) (achieved at IR-level, prompted by successful delivery and receipt of the project's Outputs).
Measurement Methods/Data Sources: SO indicators are generally monitored and/or evaluated via field visits as well as mid-term and final evaluations. To measure these benefits against the set targets, EOP results are always compared with the corresponding baseline findings (whether from primary measurement methods or other data sources) in the final project evaluation.
Critical Assumptions (SOs-to-Goal): Assumptions that will affect achievement of the Goal concern: (a) the longer-run sustainability of the project; and (b) the contributions of national governments and/or other organizations that may be critical to achieving the Goal.
Appendix D

Logical Planning Framework: Improving Agricultural Productivity through the No-Till Agriculture Project, DR Congo

Columns: Objectives | Performance Indicators | Measurement Methods/Data Sources (indicator data will be gathered via the project M&E system as described in the M&E operating manual) | Critical Assumptions

Project Goal: Poor rural communities in Manam Province improve their food security and natural resource base.

Strategic Objective: Farm families increase agricultural productivity and income.
Performance Indicators: Percent of project beneficiaries report an improvement in their food security and income. Percent of households in target communities adopt no-till agriculture on their farms.
Measurement Methods/Data Sources: Baseline and final evaluation (FANTA quantitative data collection instruments and qualitative data); field data.

Intermediate Result 1: Farm families practice and evaluate their experiences with no-till agriculture farming.
Performance Indicators: Number of farm associations actively engaged in no-till agriculture plot management. Percent of area on which no-till agriculture techniques are applied on home farm. Documentation of the results of each no-till agriculture demonstration plot. Documentation of no-till agriculture and of best bet no-till agriculture practices.
Measurement Methods/Data Sources: Field data and observations; mid-term review and final evaluation; reports from project technical staff.

Output 1.1: Farmers have knowledge of and skills in no-till agriculture.
Performance Indicators: Number of participating farmers who actively engage in weekly meetings. Number of participating farmers who actively engage in discussions on the no-till agriculture demonstration plot.
Measurement Methods/Data Sources: Field data and focus groups.

Activities: Caritas staff to train farmers in no-till agriculture techniques and establishing demonstration plots.
Performance Indicators: Number of farmers in associations trained, disaggregated by sex and location. Number of interested farmers (not in association) trained, disaggregated by sex and location. Number of study tours and participation by farmers, disaggregated by sex and location.
Measurement Methods/Data Sources: Trainers' reports; study tour report and observation.

Output 1.2: Farmer groups design, install, and manage demonstration plots.
Performance Indicators: 42 paired demonstration plots are in operation for use by extension workers to train project farmers.
Measurement Methods/Data Sources: Field data.

Activities: 1. Caritas to work with farm associations to establish demonstration plots. 2. Caritas to sensitize communities in target area about no-till agriculture planned activities.
Performance Indicators: Number of farmer groups that agree to work with Caritas on demonstration plots. Number of members in each farmer group, disaggregated by sex and location. Number of sensitization meetings.
Measurement Methods/Data Sources: Caritas monthly report.
Critical Assumptions: Security situation allows for travel by Caritas staff. Use of land for demonstration is approved by village chief.
Appendix E

Data Flow Map: The No-Till Agriculture Project

(Flow diagram; the recoverable elements are listed by box.)
Farmer Group: field notebook (list of participants at weekly meetings, farm data, baseline, farmers' responses, list of farmers who use no-till agriculture, soil analyses).
Field Agent: activities of the field agent; checklist of NTA techniques.
Supervisor: supervisor monthly report; supervisor quarterly report; training reports (NTA staff); soil analyses.
CRS Project Manager: six-month report for donor.
Appendix F

Instructions for the Daily Attendance Registers

General information
Month/year: Enter the month and year in which the form is completed.
Page: If the number of pupils exceeds the space on this page, add another form and give it a new page number.
Class: State the class; complete one set of forms for every class.

Data Table
Number: For the first time in attendance, copy the name from the attendance register.
Name: Enter student name (Last Name and First Name) and use the same order for all the following months.
Transfer: Draw a line and write "transferred" when a pupil moves to another school.
New Intake: Draw a thick line after the current pupils' names. Write "New Intake" and write the names of new pupils below the line.
Total attendance for month: Add the total attendance for all pupils for the month and write the resulting number.

Attendance average
Number attending less than 50 percent of school days in the month
Number attending fewer than 12 days
Number attending fewer than 10 days
Female form only: Number attending at least 80 percent of school days in the month
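As a minimal illustration of the attendance summaries listed above, the following Python sketch computes each register line from per-pupil monthly attendance counts (pupil names, counts, and the 22 school days are invented assumptions, not values from the register).

school_days_in_month = 22  # assumed number of school days this month
attendance = {"Pupil A": 20, "Pupil B": 9, "Pupil C": 11, "Pupil D": 18}

# Each summary below mirrors one line of the register's attendance section.
total_attendance = sum(attendance.values())
below_half = sum(1 for d in attendance.values() if d < 0.5 * school_days_in_month)
fewer_than_12 = sum(1 for d in attendance.values() if d < 12)
fewer_than_10 = sum(1 for d in attendance.values() if d < 10)
at_least_80 = sum(1 for d in attendance.values() if d >= 0.8 * school_days_in_month)

print("Total attendance for month:", total_attendance)
print("Attending less than 50 percent of school days:", below_half)
print("Attending fewer than 12 days:", fewer_than_12)
print("Attending fewer than 10 days:", fewer_than_10)
print("Attending at least 80 percent of school days:", at_least_80)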
Appendix G

Communication and Reporting Map (abridged): The No-Till Agriculture Project

(Flow diagram; the recoverable elements are listed below.)
M&E agriculturalist: collects data from notebook by the 28th of each month; reports to supervisor by the first week of the following month (monthly agriculturalist report; LAD with supervisor).
Project technical advisor and M&E technical advisor: review and provide feedback on the quarterly report before it is submitted to the CRS project manager.
CRS DR Congo Project Manager: six-month and annual reports (submitted March 30th and September 30th) to the Ministry of Agriculture, Kinshasa, and the District Agriculture Officer, Ministry of Agriculture.
Appendix H

Sample Prompt Questions for Learning to Action Discussions
Learning
1. What did we plan for the month? Quarter? Six months?
2. What did we achieve?
a. Review the data on the monthly data reports
What do these data tell us?
What don't the data tell us?
Who do the data represent?
Who don't the data represent?
What else do we need to know?
b. Are these data consistent with our observations during field visits?
c. Review of successes and challenges. Focus on the facts!
Successes:
What is going well?
Why is this happening?
So, what does this mean?
How does it affect us?
Issues/Challenges:
What problems/issues are we having?
Why is this happening?
So, what does this mean?
How does this affect us?
3. What happened (both good and bad) that we did not expect?
4. How are these results contributing to our objectives?
Action
1. What initiatives are successful?
a. How can they be reinforced?
b. Are there other places in the project area that might adopt these
initiatives?
2. What initiatives are not going well?
a. What needs to change?
b. Should any activities be dropped?
3. If activities change, who needs to be informed and how do we plan this?
4. If activities change, is there a budget to support the work?
5. How best can community members be informed of our current thinking?
a. What is the best way to inform different members of the community?
b. What issues/questions are likely to surface?
c. How should we respond to opportunities and concernshow much
room is there for negotiation?
d. Which project staff and partners should be involved in follow-up
discussions?
Appendix I

Data Management: Summary of Monitoring and Evaluation Databases

Description
Monitoring database: tracks project activities, outputs completed, and progress toward objectives. It also houses project management information.
Evaluation database: useful for analyzing assessment or evaluation data; can track progress toward the project's strategic objectives and intermediate results.

Frequency of use
Monitoring database: often on a monthly basis or more frequently. In an emergency response, information may be needed on a daily or weekly basis.
Evaluation database: based on the frequency of assessments and evaluations. Often used at project baseline, mid-term, and end.

Common source(s) of data
Monitoring database: monthly activity report; project records; field monitoring reports.
Evaluation database: household surveys (baseline, mid-term, final); community-level surveys (baseline, mid-term, final).

Type of analysis
Monitoring database: sums, frequencies, percents, mean values. For example: number of community-wide meetings held; percent of communities that have elected committees; number of trainings conducted; average number (or mean number) of attendees at community meetings.
Evaluation database: frequencies, percents, mean values, statistical significance tests, comparisons between sub-groups. For example: comparison between the average number of meals per day for female-headed households and the average for male-headed households; comparison of sources of loans (in percent) for households in the lowest, middle, and highest socio-economic groups.

Technical considerations
Monitoring database: can require minimal technical expertise or advanced technical skills to set up and utilize, depending on the complexity of the system.
Evaluation database: utilization generally requires advanced analysis skills.
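The sub-group comparison named in the evaluation rows above can be sketched as follows, assuming pandas and scipy are available (an illustrative assumption; the guide does not prescribe software). All data values are invented.

import pandas as pd
from scipy import stats

survey = pd.DataFrame({
    "head_of_household": ["female", "male", "female", "male", "female", "male",
                          "female", "male", "female", "male"],
    "meals_per_day":     [2.0, 3.0, 2.5, 3.0, 2.0, 2.5, 3.0, 3.5, 2.0, 3.0],
})

# Monitoring-style summary: mean values per sub-group.
means = survey.groupby("head_of_household")["meals_per_day"].mean()
print(means)

# Evaluation-style comparison: two-sample t-test between the sub-groups.
female = survey.loc[survey["head_of_household"] == "female", "meals_per_day"]
male = survey.loc[survey["head_of_household"] == "male", "meals_per_day"]
t_stat, p_value = stats.ttest_ind(female, male, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

The significance test is what separates evaluation analysis from simple monitoring summaries: it indicates whether an observed difference between sub-groups is likely to be more than chance.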
Appendix I continued

Software
Program: Microsoft Excel
Advantages: The software is readily available. Most staff members have Excel on their computers. Staff members are more likely to be familiar with the basic functions of Excel than with the other software programs.
Disadvantages: Few staff members are familiar with the Excel functions for more complex analyses (comparisons between groups, etc.). Excel allows for more error in data entry or while analyzing/using data.
Recommended Use: Monitoring databases
Appendix J

4 Steps to Effectively Communicate and Report on Evaluation Results

Introduction

This edition of Short Cuts provides practical instructions on how to design an evaluation communication and reporting strategy using tailored reporting formats that are responsive to audience profiles and information needs. Most donors require midterm and final evaluations, and best practice indicates that these periodic assessments provide the most detailed information about a particular project's progress. An evaluation represents a large investment in time and funds, yet private voluntary organizations (PVOs) often report that evaluation reports are not read or shared, and in some cases, a report's recommendations are not used.
Failure to plan from the start: Not communicating regularly with stakeholders can cause disengagement, disinterest, and, ultimately, the non-use of findings. Evaluation teams can find out too late that no budget was allocated for report production, verbal presentations, or dissemination.

Organizational culture (defined as management operating style, the way authority and responsibility are assigned, or how staff are developed): Preconceptions are held about the project that are resistant to change. Staff may view negative or sensitive evaluation results as shameful criticism and resist discussing them openly. Communication may be inefficient due to the loss of institutional memory because of rapid staff turnover or other reasons. Leaders who do not want to share performance information in open meetings hinder dissemination of performance findings. Ongoing communication during an evaluation is inhibited by the organization's dysfunctional information-sharing systems.
Overcoming Challenges
In theory, anxiety and resistance should be lessened by the participatory, utilization-focused
evaluation approach and mitigated by a focus on evaluation as dialogue and learning, rather than on
judgment and accountability. Treating evaluation stakeholders respectfully, in a way that protects their
dignity, will also help to lessen anxiety.
For example, if the group has a high degree of literacy, written communication can be used. If the audience is largely illiterate, however, visual and oral communications will be better communication methods.

Example row (abridged): The program donor, located in Washington, D.C., needs to review the final evaluation report for decision making about future funding. Findings and recommendations are to be shared via a final evaluation report with executive summary (June 15th) and a debriefing meeting held at donor offices to present findings, recommendations, and intended actions (June 30th). The evaluation team will prepare the written reports, and PVO headquarters staff will prepare the debriefing meeting agenda and presentation. Resources needed: printing costs for 25 copies of the written report; travel costs of staff to Washington, D.C., for the meeting; and time to prepare and debrief.

Table 3, below, presents a wide range of reporting options and descriptions of each option. Use Table 3 to choose formats that fulfill the evaluation purposes and meet the needs of different stakeholders and dissemination audiences (Patton 1997).
WRITTEN REPORTING

The final evaluation report presents the full view of the evaluation. It serves as the basis for the executive summary, oral presentations, and other reporting formats, and is an important resource for the program archives. Many program donors have a prescribed format for required reports; follow this format carefully. Usually, at least one draft evaluation report is circulated to stakeholders for comments and additional insights prior to the final report production.

An executive summary is a short version (usually one to four pages) of the final evaluation report, containing condensed versions of the major sections. Placed at the beginning of the final evaluation report, it communicates essential information accurately and concisely. Executive summaries are typically written for busy decision-makers and enable readers to get vital information about the evaluation without having to read the entire report. The executive summary may be disseminated separately from the full report and should be understandable as a stand-alone document.
Interim or progress reports present the preliminary, or initial, draft evaluation findings. Interim reports are scheduled according to specific decision-making needs of evaluation stakeholders. While interim reports can be critical to making an evaluation more useful, they can also cause unnecessary difficulties if interpreted incorrectly. To avoid this problem, begin interim reports by stating the following:
Which data collection activities are being reported on and which are not
When the final evaluation results will be available
Any cautions for readers in interpreting the findings (Torres et al. 2005).
Human interest, success, and learning stories are different ways to communicate
evaluation results to a specific audience. Donors are increasingly interested in using short narratives
or stories that put a human face on M&E data.
Human interest stories document the experiences of individuals affected by PVO projects and
help to personalize the successes and challenges of PVO work.
Success stories are descriptions of when, what, where, how, and why a project succeeded in
achieving its objectives.
Learning stories narrate cases of unanticipated project difficulties or negative impacts, how these were
identified and overcome, and what was learned that might be helpful in the future to others (De Ruiter
and Aker 2008; Long et al. 2006). These stories can be included in the final report or in an appendix.
For more information on how to write these stories, consult Human Interest Stories: Guidelines and Tools for Effective Report Writing (De Ruiter and Aker 2008), Success and Learning Story Package: Guidelines and Tools for Writing Effective Project Impact Reports (Long et al. 2006), and Writing Human Interest Stories for M&E (Hagens 2008).
Short communications (newsletters, bulletins, briefs, and brochures) serve to highlight evaluation information, help to generate interest in the full evaluation findings, and serve an organization's public relations purposes. Their format can invite feedback, provide updates, report on upcoming evaluation events, or present preliminary or final findings. However, the short formats may be less useful if the evaluation is primarily qualitative, and when a full description of the evaluation context is critical to interpreting results (Torres et al. 2005). These types of communication use photos, graphs, color, and formatting to be attractive and eye-catching to the reader.
News media communications are another method for disseminating evaluation results. The project can send the evaluation report to the news media, send them press releases on the report findings, or encourage interviews of evaluation team members or evaluation stakeholders (Torres et al. 2005). The news media provides access to a larger audience, such as the general public or a specific professional group.

Use of media can also be tricky: there are no guarantees of what the reporter will write. For this reason, it is important to promote a clear message to the media, to brief the evaluators and stakeholders on the main points to speak on, and to contact the media only after other key stakeholders have reviewed the evaluation findings; no one likes to be surprised by reading about their program in the press.
The following tips apply to common chart types:

Pie Chart: Place the most important slice starting from 12 o'clock. Use bright contrasting colors. Use
different colors or textures if printing in black and white.

Bar Chart/Cluster Bar Chart: Compares differences between similar information (for example, percent
distribution); a cluster bar chart compares several items. Tips: Use as few bars as possible. Use color or
texture to emphasize data aspects. Place numbers showing bar values at the top of or inside the bars.

Other Charts (flow, time series, scatterplot): Show processes, elements, roles, or parts of a larger
entity. Tips: Use white space effectively. Convey the message in the title. Add the data source.
Illustrations (diagrams, maps, or drawings): Effectively convey messages or ideas that are difficult to
express in words; show organizational structures, demonstrate flows, and show direction. Use flow
charts to show issues and map charts to show results comparable across geographic regions or
countries. Tips: Keep it simple; if a lot of explanation is needed, use text instead. Use illustrations
creatively, as they help to communicate. Include a legend to define any symbols used. Use white space.
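To make the bar chart tips above concrete, here is a minimal sketch of a cluster bar chart written in
Python with the matplotlib and numpy libraries (assumed to be available); the region names, quarterly
values, title, and file name are invented purely for illustration. It applies the tips from the table: few
bars, textures that survive black-and-white printing, value labels at the top of each bar, a title that
conveys the message, and a visible data source.

# Minimal sketch: a cluster bar chart following the tips above.
# All data below are hypothetical and for illustration only.
import matplotlib.pyplot as plt
import numpy as np

quarters = ["Q1", "Q2", "Q3", "Q4"]
east = [20, 35, 34, 28]   # hypothetical visits per quarter, East region
west = [25, 32, 30, 20]   # hypothetical visits per quarter, West region

x = np.arange(len(quarters))   # positions of the bar groups
width = 0.35                   # width of each bar

fig, ax = plt.subplots()
bars_east = ax.bar(x - width / 2, east, width, label="East")
# Hatching gives a texture that stays readable in black and white
bars_west = ax.bar(x + width / 2, west, width, label="West", hatch="//")

# Place numbers showing bar values at the top of each bar
ax.bar_label(bars_east, padding=2)
ax.bar_label(bars_west, padding=2)

# Convey the message in the title, not just the variable names
ax.set_title("East overtook West in clinic visits from the 2nd quarter")
ax.set_xticks(x)
ax.set_xticklabels(quarters)
ax.set_ylabel("Visits (hundreds)")
ax.legend()

# Add the data source
fig.text(0.99, 0.01, "Source: hypothetical project monitoring data",
         ha="right", fontsize=8)

plt.tight_layout()
plt.savefig("cluster_bar_chart.png", dpi=150)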
VERBAL PRESENTATIONS
Oral or verbal presentations communicate evaluation progress and findings to stakeholders and other
audiences. With this method, audiences can ask questions and communication is more interactive.
Oral presentations with facilitated discussions can lead to dialogue among stakeholders and
commitment to actions (see critical reflection, below) (Torres et al. 2005).
Debriefing meetings typically begin with a brief presentation, followed by discussion of key
findings or other issues. Ongoing debriefing meetings may be held to communicate evaluation
progress to program managers. A final debriefing meeting can be held with stakeholders to share and
discuss key findings and recommendations from the final evaluation report.
Panel presentations can be used to bring together evaluation stakeholders to present key
evaluation findings and recommendations or other evaluation components. A panel is usually composed
of three to four panelists, each of whom makes a short presentation on some aspect of the evaluation.
A moderator then facilitates discussion among panelists and between panelists and the audience
(Kusek and Rist 2004).
Broadcast media can be useful when evaluation findings need to be disseminated beyond the
primary stakeholders. Radio is a very effective way to disseminate information. Community radio
stations, with a mandate for development, provide low-cost production and often have local
language translation capacity.
65
Communicating and Reporting on an Evaluation Page 9
CREATIVE REPORTING
Consider using creative but less-traditional communication formats to report on evaluation findings.
These formats can be crucial when reporting information to illiterate stakeholders, as they show
respect for local communication traditions such as oral history. Information on how to use video
presentations, dramas or role plays, poster sessions, writeshops, critical reflection events, after action
reviews, and working sessions is presented below.
Video presentations bring the combined power of visual imagery, motion, and sound. Videos
can be shot in digital formats, edited on computers, and disseminated in CD-ROM or digital videodisk
(DVD) formats. Although it is advantageous to have a presenter, videos can be distributed to and
viewed by a wide range of audiences. Tips for producing effective evaluation videos follow (Torres et al. 2005):
Video Tips
Establish the video purpose and criteria for selecting program events to be filmed.
Obtain permission from program participants before videotaping.
Ensure that stand-alone videos include sufficient background information about the program
and the evaluation.
Consider the intended audience when determining length; shorter videos (20 to 30 minutes) have a
better chance of being included in meeting agendas.
Dramas or role plays are powerful ways to portray evaluation findings and to illustrate potential
applications of recommendations. Torres et al. (2005) describe three theatrical formats in which
evaluation findings are presented and used to spark dialogue.
1. Traditional sketches are developed from evaluation data (especially interviews and focus
groups) and may also portray evaluation findings. Actors perform a sketch and then exit. The
sketch is followed by a facilitator-guided discussion with audience members.
2. Interactive sketches are provocative scenarios that engage audience members in thinking and
talking about evaluation issues and findings. Following an interactive sketch, the audience
discusses their reactions with the actors, who stay in character, again guided by a facilitator who
also provides evaluation data. After the facilitated discussions, actors repeat the sketch, changing
it according to the audience discussion outcomes.
3. Forum theater workshops use role playing. A facilitator presents evaluation findings; participants
can be both actors and audience members. Participants create mini-scenes based on evaluation
findings and their own experiences. These are dynamic scenarios; participants can move in and
out of acting roles, and actors can change strategies mid-scene. A facilitator then elicits questions
and leads discussions about each mini-scene.
Critical reflection events work best when the facilitator or group establishes an environment of trust,
respect, and collaboration among evaluators and stakeholders. Critical reflection is enhanced when
people feel safe to question assumptions and to express their ideas openly.
After action reviews are a sequence of reflective activities that can be used during an evaluation
to process an evaluation teams initial findings or to review progress or obstacles in the evaluation
process. As with other critical reflection events, after action reviews work best in a safe environment
where people can express their ideas openly; a facilitator poses open questions and leads the group
discussions. After action reviews are conducted while memories are still fresh. The facilitator asks a
series of sequenced questions and records the key points made by the group.
Working sessions with evaluation stakeholders are the hallmark of a collaborative participatory
evaluation and can be conducted at any time during the evaluation (Torres et al. 2005). Effective
working sessions apply adult learning principles, such as those used for workshops. Guidance for
conducting productive working sessions is described in the box, below.
A chat room is an area on the Internet where two or more people can have a typed conversation
in real time; this method is ideal for routine conversations about data collection or evaluation
procedures.
Web conferences are meetings between people at different locations, conducted through an Internet
connection that allows them to view the same document or presentation on their computer monitors
simultaneously, along with audio communication. Features of Web conferencing software vary
and may include a chat room feature or video and/or audio communication. Web conferences
can be used for planning, presenting information, soliciting input and reactions, and editing
evaluation plans and reports. They can be arranged through companies specializing in the service
or directly over the Internet.
Podcasts are digital media files distributed over the Internet for playback on
portable media players (e.g., iPods) and computers. Podcasts enable evaluators to communicate
and report information to stakeholders at any time. For example, a stakeholder who is unable to
attend a final debriefing meeting can download a podcast of the event and listen to it later.
Although used infrequently at present, this electronic format holds much promise for
the future.
This publication is part of a series on key aspects of monitoring and evaluation (M&E) for
humanitarian and socioeconomic development programs. The American Red Cross and Catholic
Relief Services (CRS) produced this series under their respective USAID/Food for Peace
Institutional Capacity Building Grants. The topics covered were designed to respond to field-
identified needs for specific guidance and tools that did not appear to be available in existing
publications. Program managers as well as M&E specialists are the intended audience for the
modules; the series can also be used for M&E training and capacity building. The Short Cuts series
provides a ready reference tool for people who have already used the full modules, those who simply
need a refresher in the subject, or those who want to fast-track particular skills.
The M&E modules and Short Cuts series were produced by CRS and the American Red Cross with
financial support from Food for Peace grants: CRS Institutional Capacity Building Grant (AFP-A-
00-03-00015-00) and ARC Institutional Capacity Building Grant (AFP-A-00-00007-00). The views
expressed in this document are those of the authors and do not necessarily represent those of the U.S.
Agency for International Development or Food for Peace.
Catholic Relief Services (CRS)
228 W. Lexington Street
Baltimore, MD 21201, USA
Tel: (410) 625-2220
www.crsprogramquality.org