Program Evaluation: Eltecon 2016/17 Autumn Instructor: Dániel Horn (Slides: Gábor Kézdi)

This document provides an introduction to a course on program evaluation. Key points:
• The course introduces students to the logic and methods of modern social program evaluation, including regression methods, matching, and regression discontinuity.
• Students learn analytic and practical skills to evaluate other researchers’ work, design their own studies, and present findings.
• Impact evaluation assesses whether a program achieves its goals and whether it has unintended consequences, which matters for deciding whether demonstration programs should influence policy.
• Evaluating impact is challenging, but the course teaches how to do so using methods such as randomized controlled trials and regression analysis.


01 Introduction

Program Evaluation

ELTEcon 2016/17 Autumn


Instructor: Dániel Horn
(slides: Gábor Kézdi)
Content
• The course in a nutshell
• Admin
• The policy context

ProgEval 01 Intro 2
In a Nutshell
• An introduction to the logic and methods of modern social program evaluation
• a.k.a. impact assessment
In a Nutshell
• This is an applied methods course
• Methods
• New look at good old regression methods
• Some (new) methods
• Matching
• (Regression Discontinuity)

• Applications
• Classical case studies
• Actual evaluations
• Reproduction
In a Nutshell
• Outline
• Intro, context, concepts
• The potential outcomes framework
• Randomized controlled trials (RCT)
• Linear regression methods
• (Regression Discontinuity)
• Matching

Learning Outcomes
• Analytic skills and practical knowledge
• Read and evaluate other people’s research
• understand any program evaluation research output and judge its merit

• Think about your own research and assist others’ research
• carry out simple analyses
• be able to participate in more complex analyses

• Think about and assist research design
• evaluate the design of, and carry out, such research projects

• Present and discuss findings
Material
• Lecture slides
• Textbook
• Impact Evaluation in Practice
• published by The World Bank
• available online for free (the site moves around; search for the title)

• Methods summary article
• Guido W. Imbens and Jeffrey M. Wooldridge, “Recent Developments in the Econometrics of Program Evaluation.” Journal of Economic Literature 2009, 47:1, 5–86

• Case studies (journal articles)
Course Administration
• Coospace

Prerequisites
• Knowledge of statistics at an undergraduate level,
including regression analysis.
• Some knowledge of Stata (or R)

Assessment
• Active participation (5%)
• 5-minute quizzes on an important concept from the previous lecture (10%)
• Individual work

• Term project (35%)
• Hands-on data analysis exercise
• to evaluate the merits of experimental vs. non-experimental control groups
• and the various non-experimental methods covered in the course
• group presentation on the last day
• individually submitted reports due during the exam period
• (+ “things I take away and things that are still unclear” at the end of each day)
The Policy Context
• Social programs
• Designed and implemented by government agencies or
NGOs
• Examples
• Assisting the unemployed in job search
• Teaching entrepreneurship skills to high-school students
• Giving tax breaks to companies for innovation expenditures
• Helping mothers improve their parenting skills

• “Stakeholders”
• People and institutions that have a stake
• may affect or be affected by the program or its evaluation
Example: Job Corps, USA
• An intensive training program in the U.S.
• Target group: 16–24-year-old high school dropouts
• Increase their employment chances
• through developing their skills

• 6-month intensive program
• Focusing on general skills + flexible curriculum
• Participants live in dormitories

• Started small in 1964; now large
• 60,000 participants, budget of $1.5 billion per year

• Federally funded
Example: Job Corps, USA
• Who are its stakeholders?
• What are outcomes to look at?
• What are policy questions that an impact
assessment can help answer?

Example: Job Corps, USA
• Outcome measures
• at the level of the individual
• Average
• Fraction positive, fraction negative
• etc.

• What group to consider for impact?
• Participants
• All HS dropouts in the US in the relevant age group
• All secondary-school dropouts in developed countries
• etc.
Program Impact
• The extent to which the program led to changes in “outcomes”
• employment, well-being, skills, attitudes, etc. of individuals
• profitability, growth, productivity, etc. of firms
• etc.

• Changes attributed to the program
• Causal relationship

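The causal notion on this slide can be made concrete with a small simulated sketch (all numbers hypothetical). Each unit has two potential outcomes, one with and one without the program, but only one is ever observed. Under random assignment, the difference in observed group means recovers the average effect:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical potential outcomes: y0 without the program, y1 with it.
y0 = rng.normal(50, 10, n)           # e.g. weekly earnings without training
y1 = y0 + rng.normal(5, 3, n)        # program raises earnings by 5 on average

true_avg_effect = np.mean(y1 - y0)   # knowable only because this is a simulation

# Randomized assignment: we observe y1 for the treated, y0 for the controls.
treated = rng.random(n) < 0.5
observed = np.where(treated, y1, y0)

# Difference in observed group means estimates the average effect.
estimate = observed[treated].mean() - observed[~treated].mean()
print(true_avg_effect, estimate)     # both close to 5
```

In real data the individual effect y1 − y0 is never observable for any unit; the comparison works here only because the simulation generated both potential outcomes.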
Program Impact
• Levels of aggregation
• Effect on each individual, firm, etc.
• Effect aggregated to all individuals, firms, etc.

• Statistics if aggregated
• Average effect
• Other statistics are often impossible to assess
• smallest (most negative) effect, largest (most positive) effect, median effect, fraction with negative/positive effect, etc.

• Aggregated to what group?
• Actual participants, potential participants, total population
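Why the average effect is recoverable while, say, the median effect is not: the data only ever reveal the distribution of outcomes with the program and the distribution without it, and many pairings of individual effects are consistent with the same two distributions. A toy illustration with hypothetical numbers:

```python
import numpy as np

y0 = np.array([1.0, 2.0, 3.0])   # outcomes without the program

# Scenario A: every unit gains exactly 1.
y1_a = np.array([2.0, 3.0, 4.0])

# Scenario B: same *distribution* of treated outcomes, paired differently.
y1_b = np.array([4.0, 2.0, 3.0])

eff_a = y1_a - y0                # [1, 1, 1]
eff_b = y1_b - y0                # [3, 0, 0]

# Data can only reveal the two marginal distributions, which are identical:
assert sorted(y1_a) == sorted(y1_b)

print(eff_a.mean(), eff_b.mean())           # average effects agree
print(np.median(eff_a), np.median(eff_b))   # median effects differ
```

The mean of the individual effects equals the difference of the two means, so it is pinned down by the marginals; the median (or the fraction helped) depends on the unobservable pairing.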
Impact Evaluation, Monitoring, etc.
• Monitoring
• Keeping track of what’s happening
• Continuous

• Evaluation
• Objective assessment of the program and its effects
• Periodic (often after the program ends)

• Process vs. Impact
• Process evaluation: the extent to which the program was implemented the way it was supposed to be
• Impact evaluation is different
Validity of an Evaluation
• Internal validity
• The extent to which the effect shown by the evaluation is
really the effect of the program
• As it was implemented, there and then

• External validity
• generalizability
• The extent to which the effect shown there and then is
expected to carry over to other programs
• Implemented some other time or place

• Language
• high validity versus low validity
Impact Evaluation and Policy Decisions
• Evidence-based policymaking
• Implements policies based on their expected impacts
• Uses evidence to assess expected impacts

• Prospective evaluation
• “ex ante:” before implementation
• use evidence from past, data on participants, and assumptions

• Retrospective evaluation
• “ex post:” after implementation
• effect of program as it was implemented
• the focus of our course

Cost-Benefit Analysis
• Costs of program
• Collect information on costs from implementation
• (May add negative impacts on non-participants)

• Benefits of program
• Assess impact of program on outcomes
• Assign benefit values ($) to effects on outcomes
• (may include negative effects on participants as well as non-participants – an alternative to including the latter in costs)

• Impact assessment is a necessary ingredient
• We won’t cover the rest in this (part of the) course
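A back-of-the-envelope sketch of how an estimated impact feeds into the cost–benefit arithmetic; all figures are hypothetical, not taken from any actual evaluation:

```python
# Hypothetical cost-benefit sketch for a training program.
participants = 1_000
cost_per_participant = 8_000.0       # implementation cost, $

# Suppose an impact evaluation estimated this average effect:
earnings_gain_per_year = 1_200.0     # $ per participant per year
years = 10                           # horizon over which the gain persists
discount_rate = 0.03

# Present value of the earnings gain per participant.
pv_gain = sum(earnings_gain_per_year / (1 + discount_rate) ** t
              for t in range(1, years + 1))

total_benefit = pv_gain * participants
total_cost = cost_per_participant * participants
net_benefit = total_benefit - total_cost

print(round(pv_gain, 2), round(net_benefit, 2))
```

Note that the horizon and discount rate drive the answer as much as the impact estimate itself, which is part of why full cost–benefit analysis sits outside this part of the course.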
The Policy Context
• Not every stakeholder is always keen on learning
about impact
• The job of the analyst can be sensitive
• Sometimes goals of impact assessment can (or appear
to) interfere with implementation
• Sometimes evaluation does not show expected positive
effects or shows unintended negative effects

• Analyst(s) cannot work in isolation


• Need to work closely with the implementing agency

When To Evaluate
• Not every program should be evaluated
• Evaluation is costly

• Evaluate programs with demonstration purposes


• Innovative
• Replicable
• Strategically relevant
• Untested
• Expected to influence policy

General Applicability of Methods
• Interventions abound in business
• Many are very similar to social programs
• Employee training, marketing campaigns, etc.
• Methods learned here can often be directly applied to evaluate the effects of business interventions
• The “policy context” matters there, too

• Causal analysis of policies


• Not just “programs”
• Examples: changes in the tax system or other regulations

• Causal analysis in general


Take-away
• Assess impact to see if
• program achieves its ultimate goal(s)
• there are unintended consequences
• Especially important for demonstration programs
• And cost-benefit analysis

• Impact assessment is different from


• Monitoring
• Process evaluation

• Assessing impact is difficult


• But we’ll learn how to do it (when possible)

• Methods applicable more generally


