
Program Development and Evaluation

What is Program Evaluation?
Program evaluation is the examination of the worth, merit, or significance of an object or program. It can be applied to many kinds of programs and initiatives, for example:

• Direct service interventions (e.g., a program that offers a midday meal to school children)
• Community mobilization efforts (e.g., an effort to organize a Swachhata Abhiyan, a cleanliness drive)
• Research initiatives (e.g., an effort to find out whether disparities in education are based on demography)
• Advocacy work (e.g., a campaign to influence the state legislature to pass legislation regarding LGBT rights)
• Training programs (e.g., a job or skill training program to reduce unemployment in urban neighborhoods)
To sum up: program evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future program development.

• Many different questions can be part of a program evaluation, depending on how long the program has been in existence, who is asking the question, and why the information is needed.
Evaluation questions fall into these groups:

• Implementation: Were your program's activities put into place as originally intended?
• Effectiveness: Is your program achieving the goals and objectives it was intended to accomplish?
• Efficiency: Are your program's activities being produced with appropriate use of resources such as budget and staff time?
• Cost-Effectiveness: Does the value or benefit of achieving your program's goals and objectives exceed the cost of producing them? (A worked sketch of this and the attribution question follows this list.)
• Attribution: Can progress on goals and objectives be shown to be related to your program, as opposed to other things that are going on at the same time?
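
The last two questions lend themselves to simple arithmetic. The sketch below uses entirely hypothetical figures (the cost, benefit, and before/after outcome values are made-up illustrations, not data from any real program) to show a benefit-cost ratio for cost-effectiveness and a difference-in-differences calculation for attribution.

# Hypothetical worked example: cost-effectiveness and attribution.
# All numbers are illustrative assumptions.

# Cost-effectiveness: compare the monetized benefit of achieving the
# program's objectives against the cost of producing them.
program_cost = 250_000.0       # hypothetical annual budget (staff + activities)
monetized_benefit = 400_000.0  # hypothetical value of outcomes achieved

benefit_cost_ratio = monetized_benefit / program_cost
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")  # above 1.0: benefits exceed costs

# Attribution: a simple difference-in-differences estimate separates the
# program's effect from other things going on at the same time, assuming
# a comparison group exposed to the same background trends.
program_before, program_after = 42.0, 58.0         # outcome in the program group
comparison_before, comparison_after = 41.0, 47.0   # outcome in the comparison group

program_change = program_after - program_before           # 16.0
background_change = comparison_after - comparison_before  # 6.0
attributable_effect = program_change - background_change  # 10.0
print(f"Change attributable to the program: {attributable_effect:.1f}")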
Some reasons to evaluate a program
• To monitor progress toward the program's goals
• To determine whether program components are producing the desired progress on outcomes
• To permit comparisons among groups, particularly among populations with disproportionately high risk factors and adverse outcomes
• To justify the need for further funding and support
• To find opportunities for continuous quality improvement
• To ensure that effective programs are maintained and resources are not wasted on ineffective programs
Research & Program Evaluation

Planning
• Research principles: scientific method. State hypothesis, collect data, analyze data, draw conclusions.
• Program evaluation principles: framework for program evaluation. Engage stakeholders, describe the program, focus the evaluation design, gather credible evidence, justify conclusions, ensure use and share lessons learned.

Decision making
• Research: investigator-controlled; authoritative.
• Program evaluation: stakeholder-controlled; collaborative.

Standards
• Research: validity and repeatability. Internal validity (accuracy, precision); external validity (generalizability).
• Program evaluation: program evaluation standards. Utility, feasibility, propriety, accuracy.

Questions
• Research: facts. Descriptions, associations, effects.
• Program evaluation: values. Merit (i.e., quality), worth (i.e., value), significance (i.e., importance).
Design
• Research: isolate changes and control circumstances. Narrow experimental influences; ensure stability over time; minimize context dependence; treat contextual factors as confounding (e.g., randomization, adjustment, statistical control); understand that comparison groups are a necessity.
• Program evaluation: incorporate changes and account for circumstances. Expand to see all domains of influence; encourage flexibility and improvement; maximize context sensitivity; treat contextual factors as essential information (e.g., system diagrams, logic models, hierarchical or ecological modeling); understand that comparison groups are optional (and sometimes harmful).

Data collection
• Research: limited number of sources (accuracy preferred); sampling strategies are critical; concern for protecting human subjects. Indicators/measures: quantitative, qualitative.
• Program evaluation: multiple sources (triangulation preferred); sampling strategies are critical; concern for protecting human subjects, organizations, and communities. Indicators/measures: mixed methods (qualitative, quantitative, and integrated).

Analysis and synthesis
• Research: timing is one-time (at the end); scope focuses on specific variables.
• Program evaluation: timing is ongoing (formative and summative); scope integrates all data.

Judgments
• Research: implicit. Attempt to remain value-free.
• Program evaluation: explicit. Examine agreement on values; state precisely whose values are used.

Conclusions
• Research: attribution. Establish time sequence; demonstrate plausible mechanisms; control for confounding; replicate findings.
• Program evaluation: attribution and contribution. Establish time sequence; demonstrate plausible mechanisms; account for alternative explanations; show similar effects in similar contexts.

Uses
• Research: disseminate to interested audiences; content and format vary to maximize comprehension.
• Program evaluation: feedback to stakeholders; focus on intended uses by intended users; build capacity; disseminate to interested audiences; content and format vary to maximize comprehension; emphasis on full disclosure; requirement for balanced assessment.
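
The logic models mentioned in the Design row can be made concrete. Below is a minimal sketch, assuming a plain inputs-activities-outputs-outcomes chain; the LogicModel class and the midday-meal entries are hypothetical illustrations, not a standard template or library.

# Minimal sketch of a program logic model as a plain data structure.
# The stage names follow the common inputs -> activities -> outputs ->
# outcomes chain; every entry below is a hypothetical example.

from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)      # resources invested
    activities: list[str] = field(default_factory=list)  # what the program does
    outputs: list[str] = field(default_factory=list)     # direct products of activities
    outcomes: list[str] = field(default_factory=list)    # changes the program aims for

midday_meal = LogicModel(
    inputs=["funding", "kitchen staff", "food supplies"],
    activities=["prepare meals", "serve meals at school"],
    outputs=["meals served per day", "children reached"],
    outcomes=["improved attendance", "better nutrition"],
)

# An evaluator can walk the chain to pair each stage with indicators.
for stage in ("inputs", "activities", "outputs", "outcomes"):
    print(f"{stage}: {', '.join(getattr(midday_meal, stage))}")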
