
Qualitative Methods in Evaluation of Public Health Programs

A Curriculum on Intermediate Concepts and Practices: Syllabus

Jessica A. Fehringer
Pilar Torres-Pereda
Phyllis Dako-Gyeke
Elizabeth Archer
Carolina Mejia
Liz Millar
Brittany Schriver Iskarpatyoti
Emily A. Bobrow

December 2018

MEASURE Evaluation
University of North Carolina at Chapel Hill
123 West Franklin Street, Building C, Suite 330
Chapel Hill, North Carolina, USA 27516
Phone: +1 919-445-9350
[email protected]
www.measureevaluation.org

This publication was produced with the support of the United States Agency for International Development (USAID) under the terms of MEASURE Evaluation cooperative agreement AID-OAA-L-14-00004. MEASURE Evaluation is implemented by the Carolina Population Center, University of North Carolina at Chapel Hill in partnership with ICF International; John Snow, Inc.; Management Sciences for Health; Palladium; and Tulane University. Views expressed are not necessarily those of USAID or the United States government. MS-17-121A

ISBN: 978-1-64232-082-4

© 2018 by MEASURE Evaluation
ACKNOWLEDGMENTS
The short course, “Qualitative Methods in Evaluation of Public Health Programs,” was developed jointly by
MEASURE Evaluation (funded by the United States Agency for International Development [USAID] and
based at the University of North Carolina at Chapel Hill) and Global Evaluation and Monitoring Network
for Health, in collaboration with experts from the Instituto Nacional de Salud Pública (INSP), in Mexico City;
the University of Ghana in Accra; the Public Health Foundation of India (PHFI), in New Delhi; and the
University of Pretoria, in South Africa.

We thank our Curriculum Advisory Committee (CAC) members Elizabeth Archer, Phyllis Dako-Gyeke,
Sunil George, and Pilar Torres, who guided the conceptualization of the curriculum. Elizabeth Archer,
Phyllis Dako-Gyeke, and Pilar Torres also wrote components of the current course and reviewed it.
We also thank CAC members Hemali Kulatilaka, Emily Bobrow, and Jen Curran (all of MEASURE
Evaluation) for their contribution to curriculum conceptualization, evaluation, and logistics. Jessica Fehringer
(MEASURE Evaluation) led the CAC activities and overall course development, with the assistance of
Carolina Mejia (formerly of MEASURE Evaluation). Jessica Fehringer, Carolina Mejia, and Liz Millar
(MEASURE Evaluation) also contributed content to and edited the curriculum. Heather Biehl assisted with
editing as well. Brittany Iskarpatyoti (MEASURE Evaluation) also contributed content and Susan Pietrzyk
and Eva Silvestre (both of MEASURE Evaluation) gave feedback on the course outline and selected sessions.

We also thank the Knowledge Management team of MEASURE Evaluation for editorial and
production services.

We thank the Global Evaluation and Monitoring Network for Health members who participated in the March
2017 curriculum review meeting in Mexico, as well as the participants in the October 2017 pilot workshop in
Ghana. Their invaluable feedback was used to improve the course to its current version.

We particularly thank USAID for supporting this strategic activity on strengthening qualitative methods in
evaluation and Amani Selim (USAID) for her feedback during the curriculum review meeting.



CONTENTS
Acknowledgments
Introduction
Course Description
Appendix 1. Core Competencies and Learning Objectives
Appendix 2. Session Overviews
Appendix 3. Qualitative Methods in Public Health Evaluation Short Course: Agenda
Appendix 4. Advisory Committee
Appendix 5. Curriculum Contributors
References

INTRODUCTION
Health organizations around the globe rely on evidence-based decisions for effective health programming,
and qualitative evaluation plays an important role in the rigorous evaluation of programs. Its strength is the
ability to provide valuable insight into complex issues that quantitative methods may miss. Qualitative data
sources can answer the "why" behind program successes or challenges. They also illuminate the uniquely
human side of health programming and bring to light important contextual factors, such as culture, gender,
and societal norms. Qualitative evaluation may be used to complement quantitative data, to answer a question
not accessible quantitatively, or to provide a cost-effective data source when one would not otherwise be
available.

This syllabus describes a training designed to help health professionals apply qualitative skills in the sound
and rigorous evaluation of their programs. The sessions go beyond basic concepts to explore important
considerations for qualitative methods in the context of rigorous evaluation. Through session content and
participatory exercises, participants will gain basic skills in rigorous qualitative data collection, analysis,
and use.

This syllabus provides an overview of the ten-day (8.5 working days) training workshop, including
presentations, facilitator guides, practical sessions, case studies, and sample agendas.



COURSE DESCRIPTION
Objectives
The purpose of this course is to build participants' knowledge of the core competencies listed below, thereby
enhancing their capacity to conceptualize, design, develop, govern, and manage qualitative methods in
evaluation and to use the information generated to improve public health practice and service delivery. The
course contextualizes qualitative methods within rigorous evaluation, rather than offering the basics of a
qualitative approach.
This course includes a practical component. Participants are asked to contribute a specific program evaluation
need that they are aware of. Course organizers choose five program evaluation concepts that are best suited
to the course. Small groups will be formed on day 1, and each group will select a real qualitative evaluation for
which they will develop a protocol. Throughout the course, time will be allotted to develop the various protocol
components, based on sessions covered that day. On the final day of the course, groups will present their draft
protocols to the rest of the participants for feedback.

Definition of Rigorous Evaluation


MEASURE Evaluation defines “rigorous evaluation” as an evaluation that follows a clearly specified protocol
that is appropriate to address the evaluation question(s) of interest in the context in which the evaluation
is being conducted. The protocol should use scientifically recognized methods to address the question(s)
of interest objectively. The protocol should be comprehensive and should discuss threats to the evaluation
findings, the extent to which these threats are addressed by the design, and design limitations and their
implications for the interpretation of results. Implementation of the evaluation should also be “rigorous.”
This means the evaluation should follow recognized scientific standards to ensure that the data quality is good,
procedures are ethical, analysis is correctly implemented, results are interpreted appropriately, and information
products are well written. Rigorous evaluations should be designed and implemented to ensure that they
yield information that is relevant and can inform program decisions. This can be accomplished by engaging
stakeholders from the outset and sharing results in appropriate formats for different audiences. Rigorous
evaluation can include formative evaluations, process evaluations, outcome evaluations, and impact evaluations.

Core Competencies
At the end of this course, participants will have acquired the qualitative program evaluation competencies
listed below.

Competency Categories
• Concepts, approaches, and purposes of qualitative methods in evaluation
• Creating and conceptualizing evaluation questions
• Troubleshooting selected qualitative methods for evaluation
• Discussing the nature of sampling participants in qualitative evaluations
• Developing data collection tools
• Qualitative data analysis techniques
• Fieldwork considerations
• Presentation and dissemination of data
• Quality standards for qualitative inquiry
• Ethical principles for qualitative evaluation, including gender integration

Audience
The course curriculum is designed for participants who have a basic knowledge of program evaluation and
qualitative methods. The intended audience is professionals from the monitoring and evaluation (M&E) and
health and development fields.

Course Prerequisites
Prior experience (academic or professional) with qualitative methods and public health program evaluation is
required. For example, it would be beneficial for participants to have already taken a basic course in qualitative
methods and have conducted evaluations.
The course includes a short list of required reading, which should be completed beforehand. Additional
references are included for participants wishing to learn more about each session topic. In addition, participants
ideally should come with information and resources on the program for which they will design an evaluation
for the groupwork component of the course.

Curriculum Summary
The course consists of 12 sessions covering the key aspects of rigorous qualitative evaluation. The total
duration of the course is 65 hours, to be covered over 10 days of in-person instruction, including time
for practical application. Detailed competencies and learning objectives are included in the appendices, along
with the agenda.

Sessions
1. Introduction to Paradigms and Qualitative Evaluation
2. Creating and Conceptualizing Qualitative Evaluation Questions
3. Troubleshooting in Selected Qualitative Methods for Evaluation
4. Developing Data Collection Tools
5. Sampling Strategies and Saturation
6. Qualitative Data Analysis Techniques for Drawing Themes
7. Qualitative Data Analysis: Hands-On
8. Quality Research Standards for Qualitative Inquiry: Trustworthiness
9. Developing a Fieldwork Plan for Qualitative Evaluation
10. Data Presentation and Dissemination
11. Key Ethical Principles in Qualitative Evaluation
12. Integrating Gender into Your Evaluation

Teaching Methods
Course delivery is based on adult learning principles. A range of teaching methods, such as lectures,
discussions, case studies, exercises, and group work, will address participants’ varying learning styles. Each
module includes varied teaching approaches for its activities.



Course Materials
The course materials include digital copies of the following:
• Course syllabus
• Facilitators’ guide
• PowerPoint presentations
• Case study
• Group exercises
• Examples of relevant tools/guides
• Additional reference materials

Course Evaluation
The following are the recommended course evaluation methods:
• Pretests and post-tests covering all 12 sessions
• Simple daily participants' evaluation form for facilitators to review, covering the following:
o Was the content clear?
o Were the facilitators prepared and organized in conducting the session?
o Overall impression of the day (use a scale)
• Final evaluation, stressing the following:
o Overall impressions
o Comments on specific module presentations
o Group comments and ranking
o What worked best; what did not work
o Suggestions for improvement (general and specific)
• Assessment of facilitators

APPENDIX 1. CORE COMPETENCIES AND LEARNING OBJECTIVES
Characterization of qualitative evaluation
Discuss major concepts, approaches, and types of qualitative methods in evaluation, including the purpose
of using qualitative methods in evaluation and the use of mixed methods.
LO1: Understand and compare the four major paradigms of evaluation.
LO2: Compare and contrast the use of qualitative methods for evaluation with other approaches.
LO3: Establish the appropriateness of the use of mixed-methods of evaluation.
Evaluation questions and theory of change
Identify evaluation questions that are appropriate for qualitative methods. Analyze the theory of change of
the program in order to identify relevant evaluation question(s) for qualitative assessment.
LO1: Use the program’s theory of change to identify key questions that can be answered using different
types of qualitative evaluation.
LO2: Conceptualize key components of evaluation questions.
Methods
Assess and select appropriate methods for qualitative evaluations.
LO1: Explain the pros and cons of selected qualitative methods for rigorous evaluation.
LO2: Describe methods to mitigate common problems in qualitative evaluation.
Data collection tools: Develop data collection tools that reflect the evaluation question
Design various data collection tools appropriate for addressing specific evaluation questions: in-depth
interviews, focus group discussions, and observation guides.
LO1: Identify specific tools used for various qualitative data collection approaches.
LO2: Describe the structure and components of qualitative data collection tools.
LO3: Demonstrate the use of probes to elicit in-depth responses.
Utilize appropriate data collection tools to address an evaluation question.
LO4: Outline sets of questions that can address specific study objectives in data collection instruments.
LO5: Demonstrate the logical flow of questioning in a data collection tool.
Methods/design: Sampling considerations
Discuss the nature of sampling participants in qualitative evaluations.
LO1: Discuss types of sampling strategies employed in qualitative evaluations.
LO2: Explain the concept of data saturation and how to identify it.
LO3: Discuss factors that have an impact on the sampling strategy, including the emergent nature of
qualitative evaluation.
LO4: Discuss strategies to reduce bias in sampling.
Analysis: Appropriately select qualitative data analysis techniques to develop evaluation question–relevant
themes drawing on the evidence
Demonstrate the relevance of various qualitative data analysis techniques for evaluation; validate and utilize
themes that can address the evaluation questions.
LO1: Explain qualitative data analysis and its approaches.
LO2: Describe stages in conducting qualitative analysis.
LO3: Develop a coding structure for categorizing data.
LO4: Apply an analytic method for drawing themes.



Develop a data analysis plan for a qualitative evaluation.
LO5: Design an analysis plan using a selected analytical technique.
LO6: Understand main practicalities of analysis for evaluation.
LO7: Demonstrate use of different qualitative analysis software and their applicability to specific analytical steps.
Applying qualitative norms in research: Understand and apply approaches to strengthen trustworthiness of the
findings from qualitative evaluation
Debate the philosophical underpinnings of trustworthiness (quality research standards for qualitative inquiry).
LO1: Describe the various approaches and principles of establishing quality in qualitative evaluation.
Illustrate the practical application of trustworthiness in qualitative evaluation.
LO2: Justify the choice of approach to qualitative norms to be applied for a particular study.
LO3: Develop a plan for establishing trustworthiness in a qualitative component of an evaluation.
Fieldwork considerations
Discuss practical constraints and requirements in qualitative evaluation, and develop a fieldwork plan that
takes this into consideration.
LO1: Understand what qualitative data collection in evaluation requires.
LO2: Outline field data collection, identify timeline components, and find potential solutions
to timing constraints.
LO3: Describe key components of a field data collection budget and potential solutions to
budget-related constraints.
LO4: Describe the interviewer field team: hiring, training, and field supervising needs.
LO5: Understand considerations related to the funding agency or government regulatory body requirements.
LO6: Recognize the special considerations, including gender issues, required for qualitative methods and the
management of crisis during fieldwork.
Data presentation and dissemination
Evaluate the appropriateness of various types of data presentation for particular audiences.
LO1: Organize evaluation findings in a coherent and clear story line.
LO2: Propose and negotiate the report format and dissemination plan with stakeholders.
LO3: Demonstrate how dissemination will be appropriate for various stakeholders, including potentially
vulnerable or special populations.
LO4: Formulate a dissemination plan that provides actionable recommendations based on qualitative data.
Illustrate ethical principles for qualitative evaluation and how those apply to evaluation
Identify and address ethical, gender-related, and political implications of, and considerations in,
evaluation work.
LO1: Specify the basic tenets of ethical protocols for field data collection.
LO2: Identify special ethical considerations in qualitative evaluation when using methods such as case studies,
focus group discussions, interviews, or observations.
LO3: Describe ethical and gender-related issues in evaluation design, data collection, analysis, and
dissemination/use.
LO4: Understand the potential influence of political and cultural contexts in evaluation.
Design an ethically acceptable qualitative component of an evaluation.
LO5: Given a specific evaluation context or area/location, identify potentially vulnerable
or special populations.
LO6: Describe types of consent for data collection and basic components of a consent form.
LO7: Explain data security considerations and steps to ensure data security.

APPENDIX 2. SESSION OVERVIEWS
Session 1. Introduction to Paradigms and Qualitative Evaluation
Session Objectives
By the end of this session, participants will be able to do the following:
• Understand and compare the four major paradigms of evaluation
• Compare and contrast the use of qualitative methods for evaluation with other approaches
• Establish the appropriateness of the use of mixed-methods of evaluation

Topics Covered
• Four major paradigms with respect to evaluation in health systems
• Strengths and weaknesses of various philosophical approaches to evaluation
• Introduction to qualitative evaluation
• Introduction to mixed-methods evaluation
• Types of qualitative assessment

Required Reading
Onwuegbuzie, A.J. (2002). Why can’t we all get along? Towards a framework for unifying research paradigms.
Education; 122(3):518–531. Retrieved from http://files.eric.ed.gov/fulltext/ED452110.pdf

Further Reading
None

Session 2. Creating and Conceptualizing Qualitative Evaluation Questions


Session Objectives
By the end of this session, participants will be able to do the following:
• Use the program theory of change to identify key questions that can be answered using different types
of qualitative evaluation
• Conceptualize key components of evaluation questions

Topics Covered
• Creating questions appropriate to the type of evaluation planned
• Aligning evaluation questions with program theory of change
• Conceptualizing evaluation questions

Required Reading
Centers for Disease Control and Prevention, National Center for HIV/AIDS, Viral Hepatitis, STD, and TB
Prevention. (2018). Types of Evaluation. Retrieved from https://www.cdc.gov/std/Program/pupestd/Types%20
of%20Evaluation.pdf



Further Reading
Agee, J. (2009). Developing qualitative research questions: a reflective process. International journal
of qualitative studies in education; 22(4):431–447. Retrieved from http://www.tandfonline.com/doi/
pdf/10.1080/09518390902736512

Session 3. Troubleshooting in Selected Qualitative Methods for Evaluation


Session Objectives
By the end of this session, participants will be able to do the following:
• Explain the pros and cons of selected qualitative methods for rigorous evaluation
• Describe methods to mitigate common problems in qualitative evaluation

Topics Covered
• Strengths, challenges, and considerations in using selected qualitative methods of data collection,
such as participant observation, focus group discussions, and interviews
• Techniques for mitigating or managing challenges in qualitative data collection

Required Reading
None

Further Reading
Rimando, M., Brace, A., Namageyo-Funa, A., Parr, T.L., Sealy, D.A., Davis, T.L., & Christiana, R.W. (2015).
Data collection challenges and recommendations for early career researchers. The Qualitative Report; 20(12):2025.
Retrieved from http://nsuworks.nova.edu/tqr/vol20/iss12/8

Session 4. Developing Data Collection Tools


Session Objectives
By the end of this session, participants will be able to do the following:
• Identify specific tools for qualitative data collection
• Describe the structure and components of qualitative data collection tools
• Outline sets of questions that can address specific evaluation components in data collection instruments
• Demonstrate the use of probes to elicit in-depth responses
• Design tool with logical flow of questions

Topics Covered
• Structure of qualitative data collection tools
• Techniques for achieving flexibility
• Using enabling techniques
• Preparing data collection tools
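
To make the structure concrete, here is a small illustrative sketch in Python (the questions and probes are invented for illustration and are not part of the course materials). It treats a semi-structured interview guide as an ordered list of main questions, each with optional probes used to elicit depth:

# Toy representation of a semi-structured in-depth interview guide.
# The questions and probes below are hypothetical examples only.

interview_guide = [
    {
        "question": "Tell me about your most recent visit to the health facility.",
        "probes": [
            "What made you decide to go that day?",
            "Who, if anyone, went with you?",
        ],
    },
    {
        "question": "How did the staff treat you during the visit?",
        "probes": [
            "Can you give an example?",
            "How did that make you feel?",
        ],
    },
]

for item in interview_guide:
    print(item["question"])
    for probe in item["probes"]:
        print("    probe:", probe)

Holding the guide in a structured form like this makes it easier to check the logical flow of questions and to confirm that every evaluation objective is covered by at least one question.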

Required Reading
Ritchie, J., Lewis, J., Nicholls, C.M., & Ormston, R. (Eds.). (2013). Qualitative research practice: A guide for
social science students and researchers. Sage. Retrieved from https://mthoyibi.files.wordpress.com/2011/10/
qualitative-research-practice_a-guide-for-social-science-students-and-researchers_jane-ritchie-and-jane-lewis-
eds_20031.pdf

Further Reading
DiCicco-Bloom, B., & Crabtree, B.F. (2006). The qualitative research interview. Medical education; 40(4):314–
321. Retrieved from https://onlinelibrary.wiley.com/doi/epdf/10.1111/j.1365-2929.2006.02418.x

Session 5. Sampling Strategies and Saturation


Session Objectives
By the end of this session, participants will be able to do the following:
• Discuss types of sampling strategies employed in qualitative evaluation
• Explain the concept of data saturation and how to identify this
• Discuss factors that have an impact on the sampling strategy, including the emergent nature
of qualitative evaluation
• Discuss strategies to reduce bias in sampling

Topics Covered
• Types of qualitative sampling approaches
• The concept of data saturation
• Factors to consider when sampling
• Reducing biases in sampling
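
One pragmatic way to think about data saturation is to track how many previously unseen codes each additional interview contributes once coding is under way. The sketch below is illustrative only (it is not part of the course materials, and the code labels are invented); it assumes each interview's codes have already been assigned and stored as a set:

# Illustrative sketch: counting new codes contributed by each interview.
# Code labels are hypothetical; the coding itself is done by analysts.

def new_codes_per_interview(coded_interviews):
    """Return, for each successive interview, the number of codes
    not seen in any earlier interview."""
    seen = set()
    new_counts = []
    for codes in coded_interviews:
        fresh = set(codes) - seen
        new_counts.append(len(fresh))
        seen.update(fresh)
    return new_counts

interviews = [
    {"stigma", "cost", "distance"},        # interview 1
    {"cost", "staff_attitude", "stigma"},  # interview 2
    {"distance", "wait_time"},             # interview 3
    {"cost", "stigma"},                    # interview 4 adds nothing new
]
print(new_codes_per_interview(interviews))  # -> [3, 1, 1, 0]

A run of several consecutive interviews yielding few or no new codes is one signal that saturation may have been reached, though the judgment remains interpretive and depends on the sampling strategy and the evaluation questions.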

Required Reading
Patton, M. (1990). Purposeful Sampling. In Qualitative evaluation and research methods (pp. 169–186). Beverly Hills,
CA: Sage. Retrieved from https://legacy.oise.utoronto.ca/research/field-centres/ross/ctl1014/Patton1990.pdf

Further Reading
Guest, G., Namey, E., & McKenna, K. (2017). How many focus groups are enough? Building an evidence base
for nonprobability sample sizes. Field methods; 29(1):3–22.

Devers, K.J., & Frankel, R.M. (2000). Study design in qualitative research—2: Sampling and data collection
strategies. Education for health; 13(2):263.

Teddlie, C., & Yu, F. (2007). Mixed methods sampling: A typology with examples. Journal of mixed methods
research; 1(1):77–100.



Session 6. Qualitative Data Analysis Techniques for Drawing Themes
Session Objectives
By the end of this session, participants will be able to do the following:
• Explain qualitative data analysis and its approaches
• Describe stages in conducting qualitative analysis
• Develop a coding structure for categorizing data
• Apply analytical method for drawing themes

Topics Covered
• Overview of qualitative analysis
• Techniques for drawing themes
• Coding qualitative data
• Identifying and reviewing themes
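
Coding is an interpretive task carried out by analysts, not an automated procedure, but a toy example can show what a coding structure (codebook) looks like as a data object and how codes are attached to transcript segments. Everything below (labels, definitions, and quotes) is invented for illustration:

# Toy codebook and manually coded transcript segments.
# All labels, definitions, and quotes are hypothetical.

codebook = {
    "ACCESS_COST": "Financial barriers to using the service",
    "ACCESS_DISTANCE": "Travel time or distance to the facility",
    "STAFF_ATTITUDE": "Perceptions of provider behavior",
}

coded_segments = [
    {"respondent": "R01",
     "quote": "The bus fare alone is half a day's wages.",
     "codes": ["ACCESS_COST", "ACCESS_DISTANCE"]},
    {"respondent": "R02",
     "quote": "The nurse explained everything patiently.",
     "codes": ["STAFF_ATTITUDE"]},
]

# Tally how often each code was applied across segments.
counts = {code: 0 for code in codebook}
for segment in coded_segments:
    for code in segment["codes"]:
        counts[code] += 1
print(counts)

A working codebook typically also records inclusion and exclusion rules and example quotes for each code, along the lines described by MacQueen et al. (1998) in the further reading.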

Required Reading
Braun, V., Clarke, V., & Terry, G. (2012). Thematic analysis. APA handbook of research methods in psychology;
2:57–71. Retrieved from https://www.researchgate.net/profile/Victoria_Clarke2/publication/269930410_
Thematic_analysis/links/5499ad060cf22a83139626ed/Thematic-analysis

Further Reading
Fereday, J., & Muir-Cochrane, E. (2006). Demonstrating rigor using thematic analysis: A hybrid approach of
inductive and deductive coding and theme development. International journal of qualitative methods; 5(1):80–92.

MacQueen, K.M., McLellan, E., Kay, K., & Milstein, B. (1998). Codebook development for team-based
qualitative analysis. CAM Journal; 10(2):31–36.

Starks, H., & Trinidad, S.B. (2007). Choose your method: A comparison of phenomenology, discourse
analysis, and grounded theory. Qualitative Health Research; 17(10). Retrieved from http://journals.sagepub.com/
doi/abs/10.1177/1049732307307031

Session 7. Qualitative Data Analysis: Hands-On


Session Objectives
By the end of this session, participants will be able to do the following:
• Design an analysis plan using a selected analytical technique
• Understand main practicalities of analysis for evaluation
• Demonstrate use of different qualitative analysis software and their applicability to specific
analytical steps

Topics Covered
• Review of analysis process and main analytical techniques
• Designing the steps of an analysis plan using selected analytical techniques and strategies including
content analysis, thematic analysis, and discourse analysis
• Deciding on an analysis plan: creating an analysis chart
• Finding gaps and emerging data
• Using qualitative software to help with analysis (demonstration using qualitative software)
• Creating and applying codes
• Generating outputs
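
An analysis chart of the kind mentioned above is, at its simplest, a matrix of codes (or themes) by respondents or cases. The sketch below is illustrative only and uses invented respondents and codes; qualitative analysis software produces equivalent outputs, but the underlying structure is the same:

# Toy analysis chart: a code-by-respondent matrix built from coded segments.
# Respondents and codes are hypothetical.

from collections import defaultdict

coded_segments = [
    {"respondent": "R01", "codes": ["ACCESS_COST", "ACCESS_DISTANCE"]},
    {"respondent": "R02", "codes": ["STAFF_ATTITUDE"]},
    {"respondent": "R01", "codes": ["STAFF_ATTITUDE"]},
]

matrix = defaultdict(lambda: defaultdict(int))  # code -> respondent -> count
for segment in coded_segments:
    for code in segment["codes"]:
        matrix[code][segment["respondent"]] += 1

respondents = sorted({s["respondent"] for s in coded_segments})
print("CODE".ljust(18), *respondents)
for code in sorted(matrix):
    print(code.ljust(18), *[matrix[code].get(r, 0) for r in respondents])

Filling such a chart with short quotes rather than counts turns it into a framework-style matrix, which supports comparing cases, spotting gaps, and identifying emerging themes before writing up.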

Required Reading
Patton, M.Q. (2002). Qualitative Research & Evaluation Methods. In Qualitative evaluation and research methods;
3rd Ed;440–447;462–481. Thousand Oaks, CA: Sage. Retrieved from https://www.researchgate.net/profile/
Masoumeh_Bahman/post/What_Is_Qualitative_Research/attachment/59d6277279197b8077985b9d/AS
%3A325803062644739%401454688912157/download/qualitative-research-evaluation-methods-by-michael-
patton.pdf

Further Reading
Miles, M.B., & Huberman, A.M. (1994). Qualitative Data Analysis: An Expanded Sourcebook, 2nd Ed.
Thousand Oaks, CA: Sage.
Kozinets, R.V. (2015). Netnography. In The International Encyclopedia of Digital Communication and Society (eds P. H.
Ang and R. Mansell). John Wiley & Sons, Ltd.
Salmons, J. (2014). Qualitative online interviews: Strategies, design, and skills. Thousand Oaks, CA, USA:
SAGE Publications, Inc.

Session 8. Quality Research Standards for Qualitative Inquiry (Trustworthiness)


Session Objectives
By the end of this session, participants will be able to do the following:
• Discuss the relevance of trustworthiness in qualitative evaluations
• Justify the choice of qualitative approach to be applied to a particular evaluation
• Develop a plan for establishing trustworthiness in a qualitative component of an evaluation

Topics Covered
• Trustworthiness with respect to evaluation in health systems
• Practical application of trustworthiness and programs
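
Trustworthiness in the sense of Lincoln and Guba is established through strategies such as triangulation, member checking, audit trails, and thick description rather than a single statistic. Even so, teams sometimes report intercoder agreement as one practical check on the dependability of coding. The sketch below is illustrative only (hypothetical coding decisions for a single code) and computes simple percent agreement and Cohen's kappa for two coders:

# Illustrative sketch: percent agreement and Cohen's kappa for two coders
# deciding whether one (hypothetical) code applies to each segment.

def percent_agreement(coder_a, coder_b):
    agree = sum(a == b for a, b in zip(coder_a, coder_b))
    return agree / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    n = len(coder_a)
    p_o = percent_agreement(coder_a, coder_b)
    # Chance agreement from each coder's marginal rate of applying the code.
    p_yes_a = sum(coder_a) / n
    p_yes_b = sum(coder_b) / n
    p_e = p_yes_a * p_yes_b + (1 - p_yes_a) * (1 - p_yes_b)
    return (p_o - p_e) / (1 - p_e)

coder_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]  # 1 = code applied, 0 = not applied
coder_b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(percent_agreement(coder_a, coder_b), 2))  # 0.8
print(round(cohens_kappa(coder_a, coder_b), 2))       # about 0.58

An agreement statistic speaks only to the consistency of code application; credibility, transferability, and confirmability still rest on the qualitative strategies covered in this session.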

Required Reading
Robert Wood Johnson Foundation. (2008). Lincoln and Guba’s Evaluative Criteria. Retrieved from
http://www.qualres.org/HomeLinc-3684.html

Coryn, C.L. (2007). The Holy Trinity of Methodological Rigor: A Skeptical View. Journal of MultiDisciplinary
Evaluation; 4(7):26–31. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.899.
2553&rep=rep1&type=pdf
Further Reading
Rolfe, G. (2006). Validity, trustworthiness and rigour: quality and the idea of qualitative research. Journal of
advanced nursing; 53(3):304–310.

Tracy, S.J. (2010). Qualitative quality: Eight “big-tent” criteria for excellent qualitative research. Qualitative
inquiry; 16(10):837–851. Retrieved from http://journals.sagepub.com/doi/abs/10.1177/1077800410383121

Session 9. Developing a Fieldwork Plan for Qualitative Evaluation


Session Objectives
By the end of this session, participants will be able to do the following:
• Understand what qualitative data collection in evaluation requires
• Outline field data collection, identify timeline components, and find potential solutions
to timing constraints
• Describe key components of a field data collection budget and potential solutions
to budget-related constraints
• Describe the interviewer field team—hiring, training, and field supervising needs
• Understand considerations related to the funding agency or government regulatory body requirements
• Recognize the special considerations required for qualitative methods and the management of crisis
during fieldwork

Topics Covered
• From A to Z in qualitative evaluation fieldwork
• Fieldwork: time and budget
• Fieldwork team: aspects of quality and care
• Agencies and government regulatory aspects
• Special considerations in qualitative evaluation
• Management of crisis during fieldwork
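
Timeline and budget estimates for qualitative fieldwork usually reduce to simple arithmetic over a few drivers: the number of interviews, how many a team can complete per day, and unit costs for the field team, transport, and transcription. The sketch below is illustrative only; every figure is a placeholder, not course data or a recommended rate:

# Back-of-the-envelope fieldwork time and budget estimate.
# All quantities and unit costs are hypothetical placeholders.

import math

n_interviews = 40              # planned in-depth interviews
interviews_per_team_day = 3    # realistic pace incl. travel and debriefing
n_teams = 2                    # interviewer + note-taker pairs

field_days = math.ceil(n_interviews / (interviews_per_team_day * n_teams))

people_in_field = n_teams * 2
per_diem_per_person = 50       # per person, per field day
vehicle_per_day = 80           # per team, per field day
transcription_per_interview = 35

budget = (field_days * per_diem_per_person * people_in_field
          + field_days * vehicle_per_day * n_teams
          + n_interviews * transcription_per_interview)

print("Estimated field days:", field_days)
print("Estimated direct cost:", budget)

Even a rough model like this makes the trade-offs behind timing and budget constraints (fewer interviews, more teams, or shorter instruments) explicit when negotiating with funders or program staff.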

Required Reading
Bamberger, M., Rugh, J., & Mabry, L. (2012). RealWorld Evaluation: Working Under Budget, Time, Data,
and Political Constraints, 2nd edition: A Condensed Overview. SAGE Publications, Inc. Retrieved from
https://usaidlearninglab.org/sites/default/files/resource/files/Condensed_Summary_Overview_of_
RealWorld_Evaluation_2nd_edition.pdf

Further Reading
Corbin, J., & Strauss, A. (2008). Basics of Qualitative Research: Techniques and Procedures for Developing
Grounded Theory (3rd ed.). Thousand Oaks, CA, USA: SAGE Publications, Inc.

Patton, M.Q. (2015). Qualitative Research & Evaluation Methods: Integrating Theory and Practice (4th
Edition). Thousand Oaks, CA, USA: SAGE Publications, Inc.

Session 10. Data Presentation and Dissemination
Session Objectives
By the end of this session, participants will be able to do the following:
• Organize evaluation findings in a coherent and clear storyline
• Propose and negotiate the report format and dissemination plan with stakeholders
• Demonstrate how dissemination will be appropriate for various stakeholders, including potentially
vulnerable or special populations
• Formulate a dissemination plan that provides actionable recommendations based on qualitative data

Topics Covered
• Writing a report for the funding agency; writing a report for government program
• Report review: clarifications and changes after external reviewers’ comments
• Presenting results with funders and mandatory evaluations: using evaluation results for recommended
changes and program modification
• How to disseminate results (report, sharing results with community, scientific paper)
• Presenting results to different audiences (presenting sensitive results)
• What to show, how to show, and where to show in order to ensure the use of results

Required Reading
Tong, A., Sainsbury, P., & Craig, J. (2007). Consolidated criteria for reporting qualitative research (COREQ):
a 32-item checklist for interviews and focus groups. International journal for quality in health care; 19(6):349–357.
Retrieved from https://academic.oup.com/intqhc/article/19/6/349/1791966

Further Reading
Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health
Promotion, Office on Smoking and Health, Division of Nutrition, Physical Activity and Obesity. (2013).
Developing an effective evaluation report: Setting the course for effective program evaluation. Atlanta,
Georgia: CDC. Retrieved from https://www.cdc.gov/eval/materials/developing-an-effective-evaluation-
report_tag508.pdf

Reid, A., & Gough, S. (2000). Guidelines for reporting and evaluating qualitative research: what are the
alternatives? Environmental Education Research; 6(1):59–91.

Spencer, L., Ritchie, J., Lewis, J., & Dillon, L. (2003). Quality in qualitative evaluation: a framework for assessing
research evidence. Government Chief Social Researcher’s Office, London: Cabinet Office. Retrieved from
https://www.heacademy.ac.uk/system/files/166_policy_hub_a_quality_framework.pdf

Session 11. Key Ethical Principles in Qualitative Evaluation


Session Objectives
By the end of this session, participants will be able to do the following:
• Specify the basic tenets of ethical protocols for field data collection
• Given a specific evaluation context or area/location, identify potential vulnerable or special populations
• Describe types of consent for data collection and basic components of a consent form



• Identify special ethical considerations in qualitative evaluation when using methods such as case studies,
focus group discussions, interviews, or observations
• Explain data security considerations and steps to ensure data security
• Describe ethical issues in evaluation design, data collection, analysis, and dissemination/use
• Understand potential influence of political and cultural context in evaluation

Topics Covered
• What a protocol/evaluation plan must have in respect to the basics of ethics in evaluation (informed
consent, freedom/leaving the evaluation, equal opportunities, anonymity, confidentiality, no harm/
harm reduction)
• Cultural aspects of evaluation topics, how evaluation and qualitative techniques can lead
to subject vulnerability
• Ethical aspects of qualitative inquiry
• Institutional review and informed consent
• Reporting sound data, reviewing with funding agency and government
• Confidentiality and anonymity in reporting
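
One small, concrete piece of data security and confidentiality is removing direct identifiers from transcripts before analysis and reporting, replacing them with participant codes and keeping the linking key in a separate, secured file. The sketch below is a toy illustration (the names and quote are invented); a real protocol would also address indirect identifiers, storage, and access control:

# Toy de-identification of a transcript excerpt.
# Names and text are invented; this is not a complete anonymization protocol.

import re

name_to_id = {"Akosua": "P-01", "Dr. Mensah": "P-02"}  # linking key, stored separately

def deidentify(text, mapping):
    for name, participant_id in mapping.items():
        text = re.sub(re.escape(name), participant_id, text)
    return text

excerpt = "Akosua said Dr. Mensah never explained the side effects."
print(deidentify(excerpt, name_to_id))
# -> P-01 said P-02 never explained the side effects.

Automated replacement is only a helper; transcripts still need manual review, and the broader points in this session on consent, secure storage, and access apply regardless of tooling.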

Required Reading
Hewitt, J. (2007). Ethical components of researcher-researched relationships in qualitative interviewing.
Qualitative health research; 17(8):1149–1159. Retrieved from http://journals.sagepub.com/doi/abs/10.1177/1049
732307308305?url_ver=Z39.88-2003&rfr_id=ori:rid:crossref.org&rfr_dat=cr_pub%3dpubmed

Further Reading
General Assembly of the World Medical Association. (2014). World Medical Association Declaration of
Helsinki: ethical principles for medical research involving human subjects. The Journal of the American College of
Dentists; 81(3):14. Retrieved from http://jamanetwork.com/journals/jama/fullarticle/1760318

Session 12. Integrating Gender into Your Evaluation


Session Objectives
By the end of this session, participants will be able to do the following:
• Define gender and related terms
• Identify why gender is important to qualitative evaluation of public health programs
• Describe gender issues in qualitative evaluation design, data collection, analysis, and dissemination/use

Topics Covered
• Key gender-related definitions
• Importance of gender to health outcomes
• Sex-disaggregation in qualitative data
• Gender-sensitive measures in qualitative data
• How gender matters in the qualitative evaluation design
• Impact of gender-related norms on data collection logistics
• Gender integration in analysis and use of qualitative data
• Gender biases in data collection and analysis

Required Reading
Day, S., Mason, R., Lagosky, S., & Rochon, P.A. (2016). Integrating and evaluating sex and gender in health
research. Health Research Policy and Systems; 14:75. Retrieved from http://doi.org/10.1186/s12961-016-0147-7

Further Reading
MEASURE Evaluation. (2018). Standard Operating Procedure for Integrating Gender in Monitoring, Evalua-
tion, and Research. Chapel Hill, NC, USA: MEASURE Evaluation. Retrieved from https://www.measureeval-
uation.org/resources/publications/fs-17-247b

MEASURE Evaluation. (2017). Gender in Series. Chapel Hill, NC, USA: MEASURE Evaluation. Retrieved
from https://www.measureevaluation.org/our-work/gender/gender-in-series

Morgan, R. et al. (2016). How to do (or not to do)… gender analysis in health systems research. Health Policy
and Planning; 31(8)1069–1078. Retrieved from https://academic.oup.com/heapol/article/31/8/1069/2198200

World Bank. (2005). Module 16. Gender issues in monitoring and evaluation overview. In Gender, Monitoring,
Evaluation and Learning Key Resources. Washington, DC: World Bank. Retrieved from http://siteresources.world-
bank.org/INTGENAGRLIVSOUBOOK/Resources/Module16.pdf



APPENDIX 3. QUALITATIVE METHODS IN PUBLIC HEALTH EVALUATION SHORT COURSE: AGENDA
All days: tea breaks 10:30–10:45a and 2:45–3:00p; lunch 12:30–1:30p.

Day 1 (Thursday)
8:30–9:00a: Registration
9:00–10:30a: Opening, workshop objectives, agenda, logistics, and introductions
10:45a–12:30p: Session 1: Introduction to Paradigms and Qualitative Evaluation
1:30–2:45p: Session 1 continued
3:00–4:30p: Group work organization

Day 2 (Friday)
9:00–10:30a: Session 1 continued
10:45a–12:30p: Session 2: Creating and Conceptualizing Qualitative Evaluation Questions
1:30–2:45p: Session 2 continued
3:00–5:00p: Group work

Day 3 (Saturday)
9:00–10:30a: Session 3: Troubleshooting in Selected Qualitative Methods for Evaluation
10:45a–12:30p: Session 3 continued
1:30–2:45p: Session 4: Developing Data Collection Tools
3:00–4:30p: Session 4 continued

Day 4 (Sunday)
Off

Day 5 (Monday)
9:00–10:30a: Session 5: Sampling Strategies and Saturation
10:45–11:45a: Session 5 continued
11:45a–12:30p: Session 6: Qualitative Data Analysis Techniques for Drawing Themes
1:30–2:45p: Session 6 continued
3:00–5:00p: Group work

Day 6 (Tuesday)
9:00–10:00a: Session 6 continued
10:00–10:30a: Session 7: Qualitative Data Analysis: Hands-On
10:45a–12:30p: Session 7 continued
1:30–2:45p: Session 7 continued
3:00–5:00p: Group work

Day 7 (Wednesday)
9:00–10:30a: Sessions 6/7: time for data analysis recap, questions, and facilitator-led discussion
10:45a–12:30p: Session 8: Quality Research Standards for Qualitative Inquiry: Trustworthiness
1:30–2:45p: Session 8 continued
3:00–5:00p: Off

Day 8 (Thursday)
9:00–9:30a: Recap of Session 8 and questions
9:30–10:30a: Session 9: Developing a Fieldwork Plan for Qualitative Evaluation
10:45a–12:30p: Session 9 continued
1:30–2:45p: Session 10: Data Presentation and Dissemination
3:00–4:30p: Session 10 continued

Day 9 (Friday)
9:00–10:30a: Session 11: Key Ethical Principles in Qualitative Evaluation
10:45a–12:30p: Session 11 continued
1:30–2:45p: Session 12: Integrating Gender into Your Evaluation
3:00–5:00p: Group work

Day 10 (Saturday)
9:00–10:30a: Group presentations (3)
10:45a–12:30p: Group presentations (2)
12:30–1:00p: Closing and evaluation

Group dinner: 5:30p, offsite—location TBD.
APPENDIX 4. ADVISORY COMMITTEE

Name Position Organization

Jessica Fehringer Chair MEASURE Evaluation

Carolina Mejia Assistant Chair MEASURE Evaluation

Elizabeth Archer Member University of Pretoria, South Africa

Emily Bobrow Member MEASURE Evaluation

Jen Curran Member MEASURE Evaluation (formerly)

Phyllis Dako-Gyeke Member University of Ghana

Sunil George Member Public Health Foundation of India (PHFI), India

Hemali Kulatilaka Member MEASURE Evaluation

Liz Millar Member MEASURE Evaluation

Pilar Torres Member National Institute of Public Health (INSP), Mexico



APPENDIX 5. CURRICULUM CONTRIBUTORS

Session Name Organization

Session 1 Elizabeth Archer University of Pretoria

Session 1 Emily Bobrow MEASURE Evaluation

Session 1 Carolina Mejia MEASURE Evaluation (formerly)

Session 2 Jessica Fehringer MEASURE Evaluation

Session 2 Carolina Mejia MEASURE Evaluation (formerly)

Session 2 Liz Millar MEASURE Evaluation

Session 3 Jessica Fehringer MEASURE Evaluation

Session 3 Carolina Mejia MEASURE Evaluation (formerly)

Session 3 Liz Millar MEASURE Evaluation

Session 4 Phyllis Dako-Gyeke University of Ghana

Session 4 Jessica Fehringer MEASURE Evaluation

Session 5 Phyllis Dako-Gyeke University of Ghana

Session 5 Jessica Fehringer MEASURE Evaluation

Session 5 Liz Millar MEASURE Evaluation

Session 6 Phyllis Dako-Gyeke University of Ghana

Session 6 Pilar Torres National Institute of Public Health (INSP), Mexico

Session 7 Elizabeth Archer University of Pretoria

Session 7 Phyllis Dako-Gyeke University of Ghana

Session 7 Pilar Torres National Institute of Public Health (INSP), Mexico

Session 8 Elizabeth Archer University of Pretoria

Session 8 Carolina Mejia MEASURE Evaluation (formerly)

Session 9 Jessica Fehringer MEASURE Evaluation

Session 9 Pilar Torres National Institute of Public Health (INSP), Mexico

Session 10 Elizabeth Archer University of Pretoria

Session 10 Carolina Mejia MEASURE Evaluation (formerly)

Session 10 Pilar Torres National Institute of Public Health (INSP), Mexico

Session 11 Jessica Fehringer MEASURE Evaluation

Session 11 Pilar Torres National Institute of Public Health (INSP), Mexico

Session 12 Jessica Fehringer MEASURE Evaluation

Session 12 Brittany Iskarpatyoti MEASURE Evaluation

REFERENCES
Braun, V., & Clarke, V. (2012). Thematic analysis. In H. Cooper, Ed. APA Handbook of Research Methods in Psychology,
Vol. 2. American Psychological Association. Retrieved from https://www.researchgate.net/profile/Victoria_
Clarke2/publication/269930410_Thematic_analysis/links/5499ad060cf22a83139626ed/Thematic-analysis

Coryn, C.L.S. (2007). The holy trinity of methodological rigor: A skeptical view. Journal of MultiDisciplinary Evaluation;
4(7). Retrieved from http://evaluation.wmich.edu/jmde/

Devers, K.J., & Frankel, R. (2000). Study design in qualitative research—2: Sampling and data collection strategies.
Education for Health; 13(2):263–271. Retrieved from https://www.ncbi.nlm.nih.gov/pubmed/14742088

Escobar, A., & González de la Rocha, M. (2002). Seguimiento de impacto 2001–2002: Comunidades de 2,500 a
50,000 habitantes. Retrieved from http://lanic.utexas.edu/project/etext/oportunidades/2002/escobar2.pdf

Fereday, J., & Muir-Cochrane, E. (2006). Demonstrating rigor using thematic analysis: A hybrid approach of
inductive and deductive coding and theme development. International Journal of Qualitative Methods; 5(1):80–92. Alberta,
Canada: International Institute for Qualitative Methodology. Retrieved from http://journals.sagepub.com/doi/
pdf/10.1177/160940690600500107

González de la Rocha, M. (2008). La vida después de Oportunidades: Impacto del programa a diez años de
su creación. In Evaluación Externa del Programa Oportunidades, Volumen 1, (pp.121–199). México DF: SEDESOL.
Retrieved from http://lanic.utexas.edu/project/etext/oportunidades/

Guest, G., Namey, E., & McKenna, K. (2016). How many focus groups are enough? Building an evidence
base for nonprobability sample sizes. Field Methods; 1–20. Retrieved from https://www.researchgate.net/
publication/301719869_How_Many_Focus_Groups_Are_Enough_Building_an_Evidence_Base_for_
Nonprobability_Sample_Sizes

Guzmán, J.M. (2002). Envejecimiento y desarrollo en América Latina y el Caribe, Serie Población y Desarrollo No.
28. Santiago, Chile: United Nations, CELADE-División de Población. Retrieved from http://gerontologia.org/
portal/archivosUpload/uploadManual/10_envejecimiento_y_desarrollo.pdf

Guzmán, J.M., Huenchuan, S., & Montes de Oca, V. (2003). Redes de apoyo social de las personas mayores: Marco
conceptual, en revista notas de población de la Economic Commission for Latin America No. 77. CELADE
División de Población de la CEPAL. Retrieved from http://repositorio.cepal.org/handle/11362/12750
Harries, E., Hodgson, L., & Noble, J. (2014). Creating your theory of change: NPC’s practical guide. London,
England: New Philanthropy Capital (NPC). Retrieved from http://www.thinknpc.org/publications/creating-your-
theory-of-change/

Lincoln, Y. & Guba, E.G. (1985). Chapter 11. Naturalistic Inquiry. Newbury Park, CA: SAGE

MacQueen, K.M., McLellan, E., Kay, K., & Milstein, B. (1998). Codebook development for team-based
qualitative analysis. Cultural Anthropology Methods; 10(2):31–36. Retrieved from https://www.researchgate.net/
publication/215666089_Codebook_Development_for_Team-Based_Qualitative_Analysis

Martínez, I. (2003). Recomendaciones sobre métodos e instrumentos para el estudio de redes de apoyo social de
personas mayores. In Redes de Apoyo Social de las Personas Mayores en América Latina y el Caribe (pp.67–75). Santiago
Chile. Retrieved from https://www.cepal.org/publicaciones/xml/2/14182/lcl1995_2.pdf



Mpembeni, R.N.M., Bhatnagar, A., LeFevre, A., Chitama, D., Urassa, D.P., Kilewo, C., & George, A. (2015).
Motivation and satisfaction among community health workers in Morogoro Region, Tanzania: Nuanced needs
and varied ambitions. Human Resources for Health, 13(1):44. Retrieved from https://human-resources-health.
biomedcentral.com/articles/10.1186/s12960-015-0035-1

Onwuegbuzie, A. J. (2002). Why can’t we all get along? Towards a framework for unifying research paradigms.
Education; 122(3):518. Retrieved from http://eds.a.ebscohost.com/eds/detail/detail?vid=0&sid=814bda50-24b9-
41b1-9c34-ebebfd49bbda%40sessionmgr4008&bdata=JnNpdGU9ZWRzLWxpdmUmc2NvcGU9c2l0ZQ%3d%
3d#AN=6763557&db=aph

Palomba, R. (2003). Recomendaciones para investigaciones sobre redes de apoyo y calidad de vida: Agenda de
investigación, métodos e instrumentos para estudios cualitativos y cuantitativos. In Redes de Apoyo Social de las
Personas Mayores en América Latina y el Caribe (pp.77–83). Santiago, Chile: United Nations.
Retrieved from http://repositorio.cepal.org/handle/11362/12757

Patton, M.Q. (1990). Qualitative Evaluation and Research Methods (pp.169–186). Beverly Hills, CA, USA: Sage.

Patton, M.Q. (2002). Chapter 2. Qualitative Design and Data Collection. In Qualitative Research and Evaluation
Methods (pp.207–339). Sage, Thousand Oaks, London, New Delhi, 3rd Edition.

Ritchie, J., & Lewis, J. (2003). Qualitative Research Practice: A Guide for Social Science Students and Researchers. Thousand
Oaks: Sage Publications.

Rogers, P. (2014). Theory of Change, Methodological Briefs: Impact Evaluation 2. Florence, Italy: United Nations
Children’s Fund, Office of Research. Retrieved from http://www.entwicklung.at/fileadmin/user_upload/
Dokumente/Evaluierung/Theory_of_Change/UNICEF_Theory_of_change.pdf

Starks, H., & Trinidad, S.B. (2007). Choose your method: A comparison of phenomenology, discourse analysis,
and grounded theory. Qualitative Health Research; 17(10):1372–1380. Retrieved from https://www.ncbi.nlm.nih.gov/
pubmed/18000076
Teddlie, C., & Yu, F. (2007). Mixed methods sampling: A typology with examples. Journal of Mixed Methods Research;
1(77). Retrieved from http://mmr.sagepub.com/cgi/content/abstract/1/1/77

Vera, M. (2007). Significado de la calidad de vida del adulto mayor para sí mismo y para su familia. Anales de la
Facultad de Medicina; 68(3):284–290. Lima, Peru: Universidad Nacional Mayor de San Marcos. Retrieved from
http://revistasinvestigacion.unmsm.edu.pe/index.php/anales/article/view/1218
