
Program Evaluation: The How

6 Steps & 4 Standards


Adapted from
http://www.nwcphp.org/evaluation/learn-evaluation/program-evaluation-tips

A Framework for Program Evaluation

Program Evaluation
Effective program evaluation is a systematic way to improve and account for actions. Evaluation involves procedures that are useful, feasible, ethical, and accurate. A practical, non-prescriptive tool, the evaluation framework summarizes and organizes the steps and standards for effective program evaluation.

Program Evaluation
Definitions
Evaluation is the systematic investigation of the merit, worth, or significance of an object (Scriven, 1999). Assigning value to a program's efforts therefore means addressing three inter-related domains:
Merit (or quality)
Worth (or value, i.e., cost-effectiveness)
Significance (or importance)

Program Evaluation cont


A strong evaluation approach ensures that the following questions will be addressed as part of the evaluation, so that the value of program efforts can be determined and judgments about value can be made on the basis of evidence:
1. What will be evaluated? (i.e., what is "the program" and in what context does it exist?)
2. What aspects of the program will be considered when judging program performance?
3. What standards (i.e., type or level of performance) must be reached for the program to be considered successful?
4. What evidence will be used to indicate how the program has performed?
5. What conclusions regarding program performance are justified by comparing the available evidence to the selected standards?
6. How will the lessons learned from the inquiry be used to improve public health effectiveness?

Evaluation Framework
The evaluation framework provides a systematic way to approach and answer these questions using a set of 6 steps and 4 standards.

Six Steps of Program Evaluation


Program planning and evaluation go together. These six steps can help put your organization on the right track for continuous quality improvement.

6-Step Process

Step 1: Define your stakeholders


Your stakeholders are the supporters, implementers, recipients, and decision-makers related to your program. Getting them involved early on will help you gather different perspectives on the program and establish common expectations. This helps to clarify the goals and objectives of the program you'll evaluate, so everyone understands its purpose.

Step 2: Describe the program


Taking the time to articulate what your program does and what you want to accomplish is essential to establishing your evaluation plan. Your descriptions should answer questions like:
What is the goal of our program?
Which activities will we pursue to reach our goal? How will we do it?
What are our resources?
How many people do we expect to serve?
Articulating the answers to those questions will not only help with accountability and quality improvement, but it will also help you promote the program to its beneficiaries.

Step 3: Focus the design of your evaluation


Evaluations can focus on process (means, resources, activities, and outputs) or on outcomes (how well you achieved your goal). You may also choose to evaluate both process and outcomes. As you begin formulating your evaluation, think about its specific purpose: What questions are you trying to answer? How will the information be used? What information-gathering methods are best suited for collecting what your organization needs to know?

Step 4: Gather evidence


Qualitative and quantitative data are the two main forms of data you may collect. Qualitative data offers descriptive information that may capture experience, behavior, opinion, value, feeling, knowledge, sensory response, or observable phenomena. Three commonly used methods for gathering qualitative evaluation data are key informant interviews, focus groups, and participant observation. Quantitative data refers to information that can be measured in numbers or tallies. Methods for collecting quantitative data include counting systems, surveys, and questionnaires.
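As a minimal sketch of the quantitative side of evidence gathering, the snippet below tallies responses from a simple questionnaire. The questions, rating scale, and numbers are invented for illustration only; they are not part of the original framework.

```python
from collections import Counter

# Hypothetical Likert-scale responses (1 = strongly disagree ... 5 = strongly agree)
# from a post-program questionnaire; the questions and values are invented examples.
responses = {
    "The sessions were relevant to my work": [5, 4, 4, 3, 5, 4, 2, 5],
    "I intend to apply what I learned": [4, 4, 5, 3, 4, 5, 4, 3],
}

for question, scores in responses.items():
    tally = Counter(scores)                   # simple counting system
    mean = sum(scores) / len(scores)          # average rating
    agree = sum(1 for s in scores if s >= 4)  # ratings of 4 or 5
    print(question)
    print("  tallies:", dict(sorted(tally.items())))
    print(f"  mean rating: {mean:.1f}; agreement: {agree}/{len(scores)}")
```

Even a tally this simple gives the evaluation team comparable figures to carry into the next step, alongside the qualitative evidence.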

Step 5: Draw conclusions


This is the step where you answer the bottom-line question: Are we getting better, getting worse, or staying the same? Data comparisons reveal trends, gaps, strengths, and weaknesses. You can compare evaluation data with targets set for the program, against standards established by your stakeholders or funders, or with other programs.
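A minimal sketch of such a comparison is shown below; the indicator names, targets, baselines, and observed values are hypothetical examples, not data from any real program.

```python
# Hypothetical indicators: the target set during planning, the baseline value,
# the value observed in the evaluation period, and the desired direction.
# All names and numbers are invented for illustration.
indicators = [
    ("Participants completing training",   80, 55, 72, "higher"),
    ("Clinics adopting the new protocol",  10,  3, 11, "higher"),
    ("Average client wait time (minutes)", 20, 35, 28, "lower"),
]

for name, target, baseline, observed, direction in indicators:
    if direction == "higher":
        met, improving = observed >= target, observed > baseline
    else:  # "lower": smaller values are better
        met, improving = observed <= target, observed < baseline
    status = "target met" if met else "target not yet met"
    trend = "getting better" if improving else "getting worse or staying the same"
    print(f"{name}: baseline {baseline}, now {observed}, target {target} -> {status}; {trend}")
```

The point of the sketch is the comparison itself: each indicator is judged against both its baseline (trend) and its target (success), which is what lets you answer the better/worse/same question with evidence rather than impressions.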

Step 6: Present findings and ensure use


It is important that all the work you put into program evaluation gets used for quality improvement. When you present your findings and recommendations, it is important to know the values, beliefs, and perceptions of your group; to build on the group's background and on common ground; and to state the underlying purpose of your recommendations before you get to the details.

Standards of Program Evaluation

Program Evaluation Standards


The Program Evaluation Standards were developed by the American Joint Committee on Standards for Educational Evaluation (AJCSEE) and have increasingly been promoted through professional evaluation associations, including the American and African evaluation associations. These standards can be used both as a guide for managing the evaluation process and as a way to assess an existing evaluation. They highlight the considerations that must be weighed in formulating an evaluation design.

Program Evaluation Standards


1. Utility: seek to ensure that an evaluation will serve the information needs of intended users.
2. Feasibility: seek to ensure that an evaluation will be realistic, prudent, diplomatic, and frugal.
3. Propriety: seek to ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results.
4. Accuracy: seek to ensure that an evaluation will reveal and convey technically adequate information about the features that determine the worth or merit of the program being evaluated.

1. Utility
A. Stakeholder Identification: Persons involved in or affected by the evaluation should be identified so their needs can be addressed.
B. Evaluator Credibility: Persons conducting the evaluation should be both trustworthy and competent to perform the evaluation so its findings achieve maximum credibility and acceptance.
C. Information Scope and Selection: Information collected should be broadly selected to address pertinent questions about the program and be responsive to the needs and interests of clients and other specified stakeholders.
D. Values Identification: The perspectives, procedures, and rationale used to interpret the findings should be carefully described so the bases for value judgements are clear.

1. Utility cont
E. Report Clarity: Evaluation reports should clearly describe the program being evaluated, including its context, purposes, procedures, and findings, so that essential information is provided and easily understood.
F. Report Timeliness and Dissemination: Significant interim findings and evaluation reports should be disseminated to intended users so they can be used in a timely fashion.
G. Evaluation Impact: Evaluations should be planned, conducted, and reported in ways that encourage follow-through by stakeholders, to increase the likelihood that the evaluation will be used.

2. Feasibility
A. Practical Procedures: The evaluation procedures should be practical, to keep disruption to a minimum while needed information is obtained.
B. Political Viability: The evaluation should be planned and conducted with anticipation of the different positions of various interest groups, so their co-operation may be obtained and possible attempts by any of these groups to curtail evaluation operations or to bias or misapply the results can be averted or counteracted.
C. Cost Effectiveness: The evaluation should be efficient and produce information of sufficient value so that the resources expended can be justified.

3. Propriety
A. Service Orientation: Evaluations should be designed to help organisations address and effectively serve the needs of the full range of participants.
B. Formal Agreement: The obligations of the formal parties to an evaluation (what is to be done, how, by whom, when) should be agreed to in writing, to ensure that they adhere to all conditions of the agreement or that they formally renegotiate it.
C. Rights of Human Subjects: Evaluations should be designed and conducted to respect and protect the rights and welfare of human subjects.
D. Human Interactions: Evaluators should respect human dignity and worth in their interactions with other persons associated with an evaluation, so that participants are not threatened or harmed.

3. Propriety cont
E. Complete and Fair Assessment: The evaluation should be complete and fair in its examination and recording of strengths and weaknesses of the program being evaluated, so that strengths can be built upon and problem areas addressed.
F. Disclosure of Findings: The formal parties to an evaluation should ensure that the full set of evaluation findings, along with pertinent limitations, are made accessible to the persons affected by the evaluation and any others with expressed legal rights to receive the results.
G. Conflict of Interest: Conflict of interest should be dealt with openly and honestly so that it does not compromise the evaluation processes and results.
H. Fiscal Responsibility: The evaluator's allocation and expenditure of resources should reflect sound accountability procedures and otherwise be prudent and ethically responsible, so that expenditures are accounted for and appropriate.

4. Accuracy
A. Program Documentation: The program being evaluated should be described and documented clearly and accurately.
B. Context Analysis: The context of the program should be examined in enough detail so that its likely influences can be identified.
C. Described Purposes and Procedures: The purposes and procedures of the evaluation should be monitored and described in enough detail so that they can be identified and assessed.
D. Defensible Information Sources: The sources of information used in a program evaluation should be described in enough detail so that their adequacy can be assessed.
E. Valid Information: The information-gathering procedures should be chosen or developed and implemented to ensure that the interpretation is valid for the intended use.

4. Accuracy cont
F. Reliable Information: The information-gathering procedures should be chosen or developed and implemented to ensure that the information is sufficiently reliable for the intended use.
G. Systematic Information: The information collected, processed, and reported in an evaluation should be systematically reviewed, and any errors found should be corrected.
H. Analysis of Quantitative Information: Quantitative information should be appropriately and systematically analysed so that evaluation questions are effectively answered.
I. Analysis of Qualitative Information: Qualitative information should be appropriately and systematically analysed so that evaluation questions are effectively answered.

4. Accuracy cont
J. Justified Conclusions: The conclusions reached in an evaluation should be explicitly justified so that stakeholders can assess them.
K. Impartial Reporting: Reporting procedures should guard against distortion caused by the personal feelings and biases of any party to the evaluation, so that evaluation reports fairly reflect the evaluation findings.
L. Meta-evaluation: The evaluation itself should be formatively and summatively evaluated against these and other pertinent standards, so that its conduct is appropriately guided and, on completion, stakeholders can closely examine its strengths and weaknesses.
