
12 Evaluation framework

12.1 Introduction
Evaluation is an important component of health promotion activities aimed at reducing the
incidence of CVD and type 2 diabetes. Evaluation enables us to learn about the
effectiveness of activities, as well as the reasons why programs achieve or fail to achieve
their objectives. This information provides a valuable knowledge base for planning and
implementing future activities. In addition, evaluation enables practitioners to meet
accountability requirements and to more systematically document, disseminate and
promote effective practice.
As described in this guide, the evidence base for health promotion interventions to reduce
CVD and diabetes is dominated by relatively large intervention trials conducted by
universities and other research organisations. Smaller, community-based initiatives can be
very effective, but are rarely included in the published evaluation literature. Evaluation and
documentation of these interventions will help to provide a more balanced evidence base
for effective action to improve efforts to reduce the incidence of CVD and diabetes.
The evaluation planning guide described below sets out a stepwise process for planning
and conducting a program evaluation. The following characteristics are necessary for
achieving optimum benefits from evaluation:
• Evaluation planning is conducted in parallel with program planning. This interaction
improves both the program and the evaluation.
• Evaluation planning is realistic and strategic. Many evaluation plans simply list an
evaluation activity for every program activity. This approach dilutes the value and impact of
the evaluation; it is better to invest limited evaluation resources where they will be most
useful. Answering the question ‘What do we really need to know from this evaluation?’ is a
key component of evaluation planning.
• Some aspects of data collection are standardised. The use of standardised measures of
dietary intake, physical activity and tobacco use, for example, will allow comparisons to be
made over time, across programs and between national and state data. These comparisons
will contribute to the new generation of health promotion evaluation, which is seeking to
build an evidence base around what program–context–population group combinations are
most effective.
A summary of the evaluation planning guide is in figure 4. This is followed by a more
detailed description of the evaluation planning process, and a worked example of an
evaluation plan.
Other useful evaluation planning resources include ‘Measuring health promotion impacts:
a guide to impact evaluation in integrated health promotion’
(http://hnb.dhs.vic.gov.au/rrhacs/phkb/).

Figure 4: Evaluation planning guide

Step 1: Evaluation preview
• Engage stakeholders
• Identify evaluation resources
• Clarify the purpose of the evaluation

Step 2: Describe the program
• Identify the program plan—program aims and objectives, components, resources, and
process and outcome indicators

Step 3: Focus the evaluation design
• Specify the evaluation objectives
• Specify the evaluation design
• Specify the data collection methods
• Locate or develop data collection instruments

Step 4: Collect data
• Coordinate the data collection

Step 5: Analyse and interpret data
• Analyse the data
• Interpret the findings

Step 6: Disseminate lessons learned
• What reports will be prepared?
• What formats will be used?
• How will the findings be disseminated?

12.2 Evaluation planning guide


Step 1: Evaluation preview
• Engage stakeholders. Seek the opinions and participation of people and organisations
involved in the program who are in a position to shape and support the evaluation and to
act on the evaluation findings.
• Identify evaluation resources. The nature and scope of an evaluation depend on the human
and financial resources available. Larger programs generally warrant more comprehensive
evaluations; as a rule of thumb, an evaluation requires about 10–15 per cent of the total
program budget. Identify who will coordinate the evaluation and whether the appropriate
skills are available.
• Clarify the purpose of the evaluation. Why is the evaluation being conducted? To meet the
accountability requirements of funding bodies and program management? To improve
practice? To assess program effectiveness? To determine program sustainability? And/or to
document, disseminate or promote the program?

Step 2: Describe the program


• Identify the program plan. A clear statement of program aims and objectives, components,
resources, and process and outcome indicators provides the basis for evaluation planning.
Program logic models provide an excellent framework for both program development and
evaluation planning.

Step 3: Focus the evaluation design


Based on the information collected in steps 1 and 2:
• Specify the evaluation objectives
– Program plans should specify program goals and objectives (that is, what the program
aims to achieve) and program strategies (that is, what the program aims to do to achieve
its goals and objectives).
– Impact/outcome evaluation involves assessing the extent to which the program has
achieved its goals and objectives, while process evaluation involves assessing to what
extent and how well the planned activities have been implemented.
– Evaluation objectives often include both impact and process evaluation questions.
– Long term outcomes (outcome measures) can include changes in health status, such as
reduced mortality, morbidity or disability, and improved quality of life.
– Short or intermediate term outcomes (impact measures) can include changes in
awareness, knowledge, attitudes, behaviours, policies, environments, services, networks
and community participation/action.

• Specify the evaluation design.
– Evaluation designs include quantitative designs (for example, pre/post design with or
without a comparison group, trend analysis) and qualitative designs (for example, case
study, participatory action research and evaluation). A sketch at the end of this step
illustrates how the headline result of a pre/post comparison design can be summarised.
– Quantitative designs are usually used to measure impacts, while qualitative designs are
useful within process evaluation, but this distinction is not definitive.
– Case studies, for example, can be used to qualitatively detect (rather than quantitatively
measure) program impacts.
– Similarly, qualitative designs can be used to help understand why certain (quantitatively
measured) impacts have occurred.
• Specify the data collection methods (sample/participants, data collection instruments, data
collection procedures).
– Data collection methods are usually categorised into quantitative methods (data in the
form of numbers) and qualitative methods (data in the form of words, pictures and
so on).
– Quantitative data collection methods commonly used in health promotion evaluation
include surveys, structured observation, health statistics or other record analysis,
environmental audits and quantitative content analysis (for example, analysis of policies).
– Qualitative data collection methods commonly used in health promotion evaluation
include individual interviews, focus group discussions, participant observation, and
qualitative document and record analysis.
– Regardless of whether quantitative or qualitative data collection methods are used, each
method should specify the sample (for example, people, documents, observation times),
the instrument (for example, questionnaire, interview format) and the procedures (for
example, how, when and where data will be collected, ethical procedures).
– See appendix E for a summary of data collection methods commonly used in health
promotion evaluation.
Also refer to ‘Measuring health promotion impacts: a guide to impact evaluation in
integrated health promotion’ (http://hnb.dhs.vic.gov.au/rrhacs/phkb/).
• Locate or develop data collection instruments. If appropriate, it is desirable to use
standardised, widely used instruments for data collection to facilitate comparisons across
programs and over time. Questionnaire items assessing dietary intake, physical activity and
tobacco use have been developed and widely used in Australia. The publication Monitoring
food habits in the Australian population using short questions (Marks et al. 2001), for
example, lists:
– questions about fruit and vegetable intake
– questions about foods that contribute to fat intake
– questions about cereals and cereal foods
– proposed indicators for monitoring key aspects of breastfeeding in Australia
– questions about food security.
See section 7.8 for evaluation tools used in physical activity promotion.
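
The pre/post design with a comparison group mentioned above can be summarised with
some simple arithmetic. The sketch below is illustrative only: the survey values are
hypothetical daily serves of vegetables, and a real analysis would also report confidence
intervals or significance tests.

```python
# A sketch of summarising a pre/post design with a comparison community.
# All response values are hypothetical daily serves of vegetables.

def mean(values):
    return sum(values) / len(values)

intervention_pre = [2.1, 3.0, 2.4, 1.8, 2.6]   # baseline survey, program community
intervention_post = [2.9, 3.4, 3.1, 2.5, 3.0]  # follow-up survey, program community
comparison_pre = [2.2, 2.8, 2.5, 2.0, 2.4]     # baseline survey, comparison community
comparison_post = [2.3, 2.9, 2.4, 2.1, 2.6]    # follow-up survey, comparison community

change_intervention = mean(intervention_post) - mean(intervention_pre)
change_comparison = mean(comparison_post) - mean(comparison_pre)

# The comparison community estimates the background trend, so the change
# attributable to the program is the difference between the two changes.
net_change = change_intervention - change_comparison
print(f"Program community change:    {change_intervention:+.2f} serves")
print(f"Comparison community change: {change_comparison:+.2f} serves")
print(f"Net change:                  {net_change:+.2f} serves")
```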

Step 4: Collect data


Coordinate data collection by specifying:
• what tasks need to be done
• who should undertake the tasks
• when tasks should be undertaken
• the required resources.
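
One way to keep these specifications together and reviewable is to record them as
structured data. The sketch below is purely illustrative; the tasks, roles and timings
are hypothetical.

```python
# A sketch of recording the data collection plan as structured data, so
# that each task's owner, timing and resources stay in one place.
from dataclasses import dataclass

@dataclass
class DataCollectionTask:
    what: str       # the task to be done
    who: str        # who should undertake it
    when: str       # when it should be undertaken
    resources: str  # what the task requires

plan = [
    DataCollectionTask("Mail baseline questionnaires", "Project officer",
                       "Month 1", "Printing and postage budget"),
    DataCollectionTask("Send reminder letters", "Data collection assistant",
                       "Months 2-3", "Postage budget"),
    DataCollectionTask("Conduct key informant interviews", "Evaluation consultant",
                       "Final month", "Audio recorder, transcription"),
]

for task in plan:
    print(f"{task.when}: {task.what} ({task.who}; needs {task.resources})")
```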

Step 5: Analyse and interpret data


• Analysing the data. This step involves calculating descriptive statistics (such as frequencies
and means) for quantitative data, and identifying and describing key themes in qualitative
data.
• Interpreting the findings. This step involves comparing the findings with other evaluation
findings; comparing them with standards and similar programs; making judgements and
recommendations; and using the lessons learned for the ongoing development of the
knowledge and evidence base for health promotion practice.
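
As a minimal illustration of the analysis step above, the sketch below computes
frequencies for a categorical survey item and a mean for a numeric item; the items
and responses are hypothetical.

```python
# A sketch of basic descriptive statistics for quantitative survey data:
# frequencies for categorical items, means for numeric items.
from collections import Counter

# Hypothetical responses to "How do you usually make short journeys?"
transport_mode = ["car", "walk", "car", "cycle", "walk", "car", "walk", "car"]

# Hypothetical daily serves of fruit reported by the same respondents.
fruit_serves = [2, 1, 3, 2, 4, 1, 2, 3]

for mode, count in Counter(transport_mode).most_common():
    print(f"{mode}: {count} respondents ({count / len(transport_mode):.0%})")

print(f"Mean daily fruit serves: {sum(fruit_serves) / len(fruit_serves):.1f}")
```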

Step 6: Disseminate lessons learned


Deliberate effort is required to ensure evaluation findings are disseminated and used to
inform decision making and guide appropriate action. Lessons learned from the evaluation
should be communicated to relevant audiences in a timely, unbiased and consistent way.
This step requires specifying:
• the reports that will be prepared
• the formats that will be used
• how the lessons learned will be disseminated.

12.3 ‘Healthy people, healthy places’ program evaluation plan—an example

For each planning step, the example below lists the decision making process/options to
consider and the resulting evaluation plan.

Step 1: Evaluation preview

Planning steps
• Engage stakeholders.
• Clarify purpose of the evaluation.
• Identify evaluation resources.
Decision making process/options to consider
• Conduct a focus group discussion with program stakeholders aimed at answering the
question ‘What do we need to know from the evaluation?’.
• Match resources with evaluation information needs and priorities identified in the
focus group discussion.
Evaluation plan
Stakeholders
• Program funding body
• Program manager
• Project officer
• Collaborating partners
Goals of evaluation
• Meet accountability requirements.
• Contribute to the evidence base regarding what works.
• Add to knowledge about critical success factors.
Resources
• Project officer
• Evaluation consultant
• Casual data collection assistant
• Evaluation budget of $10,000
Step 2: Describe the program

Decision making process/options to consider
• Clarify with program staff that the program was/is to be implemented as documented,
because many programs evolve and change over time.
• Summarise the program.
Evaluation plan
Program: Healthy people, healthy places
Goal: To reduce the incidence of cardiovascular disease and diabetes in Banksia Bay.
Outcome objectives
1. To increase fruit and vegetable consumption among Banksia Bay residents by 20%.
2. To increase the number of adults using walking or cycling for short journeys (less
than 2 kilometres) by 20%.
Strategies
1. Assist all organisations and settings that provide food within Banksia Bay to promote
fruit and vegetable consumption through increased availability, access, variety, quality
and favourable pricing.
2. Assist workplaces to promote active commuting and provide facilities (e.g. showers,
bike parking).

Step 3: Focus the evaluation design

Planning steps
• Evaluation objectives
• Evaluation design
• Data collection methods
– Sample (who?)
– Instrument (what?)
– Procedures (how?)
Decision making process/options to consider
• Develop evaluation objectives based on consultation, setting priorities and resources
available as described in the above steps.
• Decide on the most rigorous, practical design to meet the evaluation objectives.
• Usually, use probability sampling (e.g. random sampling) for quantitative measurement.
Nonprobability sampling suits qualitative assessment.
• Review existing instruments or develop your own if necessary.
• Obtain ethical approval. Specify when, where and how data collection will take place.
Evaluation plan
Evaluation objectives
1. To assess whether the program has led to increased fruit and vegetable consumption
and active transport in Banksia Bay.
2. To document critical success factors and barriers to successful program
implementation.
Data collection
1. Pre and post mailed survey of Banksia Bay residents and a comparison community.
2. Post-program qualitative interviews with all stakeholders.
Sample
1. Random sample of 500 residents in each community obtained from the electoral roll.
2. Key stakeholders in the program.
Instrument
1. Standardised questions about fruit and vegetable intake and modes of transport.
2. Key informant interview format focusing on what did and didn’t work and why.
Procedure
1. Mail a self-complete questionnaire to a random sample of 500 adult residents in each
community, followed by two reminder letters.
2. Conduct audio-tape recorded key informant interviews at the end of the project.
Step 4: Collect data

Planning steps
• What tasks need to be done?
• By whom?
• When?
• What resources are required?
Decision making process/options to consider
• Develop a timeline and detailed budget, and allocate tasks.

Step 5: Analyse and interpret data

Planning steps
• Analyse data.
• Interpret what the findings mean.
Decision making process/options to consider
• Has the program had the desired impacts? Why?
• What key lessons have been learned?
• What are the critical success factors?
• What are the barriers?
• What should be done differently in future?
Evaluation plan
1. Summary of quantitative data using descriptive statistics such as frequencies
and means.
2. Key themes identified from qualitative data.

Step 6: Disseminate lessons learned

Planning steps
• What reports will be produced?
• What formats will be used?
• How will the lessons learned be disseminated?
Evaluation plan
• Print the executive summary and full report and send them to key stakeholders.
• Post the report on the website.
• Present the findings at a management meeting.
• Present a paper at a professional association annual conference.
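
The plan above draws a random sample of 500 residents per community from the
electoral roll. As a minimal sketch, assuming each roll extract is a CSV file with one
resident per row (the file names and format are hypothetical):

```python
# A sketch of simple random sampling from an electoral roll extract.
# File names and columns are hypothetical.
import csv
import random

def draw_sample(roll_path: str, n: int = 500, seed: int = 1) -> list[dict]:
    """Draw a simple random sample of n residents from a roll file."""
    with open(roll_path, newline="") as f:
        residents = list(csv.DictReader(f))
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    return rng.sample(residents, n)

banksia_bay_sample = draw_sample("banksia_bay_roll.csv")
comparison_sample = draw_sample("comparison_community_roll.csv")
print(len(banksia_bay_sample), len(comparison_sample))
```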

12.4 Resources
Health promotion and public health evaluation planning guidelines:
• ‘Measuring health promotion impacts: a guide to impact evaluation in integrated health
promotion’ (http://www.hnb.dhs.vic.gov.au/rrhacs/phkb/)
• Guide to Evaluating Drug Prevention Projects in Victoria
(http://www.dhs.vic.gov.au/phd/pdpc/publication.htm). This award-winning resource
shows a simple layout for planning an evaluation.
• The US Centers for Disease Control and Prevention ‘Evaluation framework for public health
interventions’
• The US Centers for Disease Control and Prevention ‘Evaluation framework for physical
activity promotion activities’
• Central Sydney Area Health Service ‘Program Management Guidelines for Health
Promotion’, NSW Health, Sydney.

Data collection methods


• Neuman, WL 2003, Social research methods: qualitative and quantitative approaches, Allyn
and Bacon, Boston. Provides a comprehensive description of a wide range of research
designs and methods applicable to health promotion evaluation.
• Robson, C 2002, Real world research, Blackwell Publishers, Oxford. Provides a very user
friendly overview of designs and methods suitable for health promotion evaluation.
• Hawe, P, Degeling, D & Hall, J 1990, Evaluating health promotion: a health worker’s guide,
MacLennan & Petty, Sydney. A practical guide to planning and conducting evaluations of
health promotion programs.
See appendix E for a summary of commonly used data collection methods.

Indicators and measures


• Monitoring food habits in the Australian population using short questions (Marks et al. 2001),
which lists:
– questions about fruit and vegetable intake
– questions about foods that contribute to fat intake
– questions about cereals and cereal foods
– proposed indicators for monitoring key aspects of breastfeeding in Australia
– questions about food security
• Measures of physical activity (see chapter 7 — Resources)
