
Institutional Research and Decision Support (in collaboration with

Center for Teaching and Learning)

Developing an Effective
Evaluation/Assessment Plan

IUPUI
Webinar outcomes
Upon completion of this webinar, attendees should be able to:

1. Differentiate between assessment and evaluation.
2. Obtain a basic understanding of the key components of an evaluation/assessment plan.
3. Distinguish between formative and summative evaluation/assessment.
4. Describe logic models (as used in a program evaluation).
5. Develop an effective Evaluation/Assessment Plan [Section 5 of the Curriculum Enhancement Grant RFP].
6. Describe the requirements for the dissemination, timeline, and budget sections of a CEG proposal.

CEG RFP: Assessment/Evaluation Plan
1. Address how the overall project effectiveness will be measured
2. Describe the strategy that will be used to monitor the effectiveness of the project as it evolves (formative evaluation/assessment)
3. Describe the evidence that will be used to measure impact on student learning and/or success, e.g., measures of student performance, enrollment change, course DFW rates, program graduation rates (for multi-course series)

Difference Between Assessment and Evaluation
(in an instructional setting)
Assessment is a systematic process of acquiring, documenting, reviewing, and using information about someone or something in order to make improvements where necessary. Assessment is more process oriented, improves quality, and is used to provide feedback.
Evaluation is derived from the word ‘value’; hence, evaluation focuses on making a judgment or conducting an examination of something to determine its utility, value, or merit. Evaluation is more product oriented and is mainly judgmental.

What is an evaluation plan?

• A written plan or document that provides details of the project (or intervention) being evaluated
• Describes and justifies the evaluation approach selected
• Provides instructions for the evaluation or a guide for each step of the evaluation process

Key components of an evaluation plan
• Project goals
• Description of intervention / impact theory (logic
model)
• Evaluation methods (design, data collection,
analysis)
• Data analysis
• Timeline

Posing Evaluation Questions
There are two types of evaluation questions: formative questions help you improve your program, while summative questions help you prove whether your project worked the way you planned.

Benefits of Formative and Summative Evaluation Questions*

Formative Evaluation (Improve):
• Provides information that helps you improve your program. Generates periodic reports, so information can be shared quickly.
• Focuses most on program activities, outputs, and short-term outcomes for the purpose of monitoring progress and making mid-course corrections when needed.
• Helpful in bringing suggestions for improvement to the attention of project staff.

Summative Evaluation (Prove):
• Generates information that can be used to demonstrate the results of your program to funders and your community.
• Focuses most on a program’s intermediate-term outcomes and impact. Although data may be collected throughout the program, the purpose is to determine the value and worth of a program based on results.
• Helpful in describing the quality and effectiveness of your program by documenting its impact on participants and the community.

*Adapted from Bond, Boyd & Montgomery (1997)


Description of intervention/program theory

Your Planned Work


What resources you need to implement your project and what
activities you intend to do to accomplish your project goal(s).
• Resources/Inputs include the human, financial,
organizational, and community resources a project has
available for doing the proposed work.
• Activities are the processes, tools, events, and actions that
are used to bring about the intended program changes or
results.

Source: W.K. Kellogg Foundation (2004) Logic Model Development Guide.

Description of intervention/program theory
(continued)

Your Intended Results


All of the project’s desired results (outputs, outcomes, and
impact).
• Outputs are the direct products of a project’s activities and
may include types, levels and targets of services to be
delivered by the project.
• Outcomes/Impacts are the specific changes in project participants’ knowledge, skills, attitude(s), and behavior(s).
• Long-term goal is the intended or unintended change
occurring in study participants, organizations, communities or
systems as a result of project activities over time.

Source: W.K. Kellogg Foundation Evaluation Handbook (2004)

Description of Intervention or Program Theory

• Problem: The issue being addressed by the program


• Goal(s): Intended aims or impacts over the duration of the program

• Resources/Inputs: the resources dedicated to or consumed by the program
• Activities: the actions that the program takes to achieve desired outcomes
• Outputs: the tangible, direct products of a program’s activities
• Outcomes: the expected changes among a program’s study participants

Rationale and Assumptions: What are your underlying assumptions and rationale regarding the intervention and how it works?
External Factors: What else might affect the program?
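The logic model components above can also be captured as a simple data structure. The sketch below is illustrative only: the field names and example values are invented for this sketch and are not part of the CEG template.

```python
# Illustrative sketch: a logic model's components held in a dataclass.
# All example values below are invented, not taken from a real proposal.
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    problem: str                                  # the issue being addressed
    goals: list = field(default_factory=list)     # intended aims or impacts
    inputs: list = field(default_factory=list)    # resources dedicated/consumed
    activities: list = field(default_factory=list)  # actions taken
    outputs: list = field(default_factory=list)   # direct products of activities
    outcomes: list = field(default_factory=list)  # expected participant changes


model = LogicModel(
    problem="Low retention in gateway STEM courses",
    goals=["Reduce DFW rates through active-learning redesign"],
    inputs=["Instructional staff", "CEG funding"],
    activities=["Redesign labs", "Train teaching assistants"],
    outputs=["Redesigned lab modules delivered to 200 students"],
    outcomes=["Improved exam performance", "Lower DFW rate"],
)
print(model.problem)
```

Writing the model down in this structured form makes it easy to check that every activity maps to at least one output and outcome before the plan is finalized.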

Example: Logic Model for Project EPIC at IUPUI

Inputs:
• STEM leadership & faculty
• Money (NSF ADVANCE grant, department enhancement grants)
• Partners (Advisory Committees, ARC Network)
• Research

Outputs:
• Developed leadership training workshops on Diversity, Equity & Inclusion (DEI)
• Conducted leadership training workshops; targeted STEM leaders & faculty attended
• Provided mentoring / networking

Outcomes:
• Short-term: STEM leaders increased their knowledge of equity & inclusion, identified appropriate actions to take, better understood their own leadership styles, and gained skills through practice in effective leadership & organizational strategies
• Medium-term: STEM leaders implemented effective DEI strategies; improved department leadership, climate & accountability
• Long-term: increased representation & retention of STEM faculty

(Assumptions underlie each stage of the model.)

Evaluating the Effectiveness of Learning
Interventions
A “multiple methods” approach is
recommended to assess student learning
outcomes (directly and indirectly).
• Direct measures vs. Indirect measures

Theoretical Foundations / Evaluation Considerations
• Learning Goals (Knowledge, Skills or Attitudes)
• Short Term vs Long Term Learning
• Evaluation vs Improvement (Evaluation Purposes)

Mixed Methods Approaches
• A mixed methods approach combines statistical trends (quantitative data) and stories (qualitative data) to study research problems.
• Core assumption: When an investigator combines both statistical
trends and stories, that combination provides a more complete
understanding of the research problem than either statistical trends
or stories alone.
• Convergent mixed methods – the investigator converges or
merges quantitative and qualitative data in order to provide a
comprehensive analysis of the research problem.
• Explanatory mixed methods – the researcher first conducts
quantitative research, analyzes the results and then builds on the
results to explain them in more detail with qualitative research.

Mixed Methods Approaches
Note: Multi-Methods research designs employ multiple
quantitative or multiple qualitative approaches …
o Example: The Rebman CEG Proposal used a variety of direct and indirect measures of student success (i.e., multiple methods / data sources), including course-based assessments, national standardized examinations, pre-test/post-test surveys, student course evaluations, and graduate exit interviews (a qualitative data source), to evaluate the effectiveness of a new pedagogical method used in the SHRS K504 course.

4 Key Features of Mixed Methods Approach*

1. Collecting and analyzing quantitative and qualitative data (closed- and open-ended) in response to research questions
2. Using rigorous quantitative and qualitative methods
3. Combining or integrating quantitative and qualitative data using a specific type of mixed methods design
4. Framing the mixed methods design within a broader framework (e.g., experiment, causal-comparative approach, content analysis, grounded theory, etc.)

*Source: Creswell, J. W. (2013, Spring). “What is Mixed Methods Research” [YouTube video - https://www.youtube.com/watch?v=1OaNiTlpyX8]

Mixed Methods Approaches

Mixed methods approaches allow investigators to:

o Triangulate findings from multiple data sources.
o Converge or corroborate findings.
o Strengthen the internal validity of the studies.
o Create elaborated understandings of complex constructs for assessing/evaluating student success, such as “critical thinking” or “integrative learning.”

Evaluation Considerations
1. Learning Goals (Student Learning Outcomes)
o Knowledge: “what facts and concepts students should understand”
o Skills: “what tasks students should be able to perform”
o Attitudes: “what attitudes, beliefs & motivations students should possess”

2. Short Term vs Long Term Learning
o Short-term learning (internal validity): Was the intervention successful in achieving its learning goals/objectives?
(This relates to the effectiveness of specific strategies addressed by a project.)
o Long-term learning (external validity): Did the intervention(s) contribute to the students’ overall learning experiences?
(This addresses the issue of relevance and/or broader impact.)

3. Evaluation Purpose(s) [Use(s) of assessment/evaluation data]
• Formative Evaluation vs Summative Evaluation

Types of Assessment / Evaluation
Formative vs Summative Evaluation:
• The aim of formative evaluation is to improve what is being learned, whereas the aim of summative evaluation is to prove how much learning has taken place.
• Formative evaluation aims to validate the goals of instruction and to improve its quality. The goal of formative evaluation is to monitor student learning and provide ongoing feedback that faculty can use to improve their teaching and students can use to improve their learning.
• Summative assessment or evaluation is a cumulative technique (to evaluate student learning) performed at the end of a semester or other instructional unit, to see how much a student has gained from the instruction. The focus of summative evaluation is on the outcome.
Types of Assessment / Evaluation
Example: CEG Proposal (by Gina Londino-Smolar), Development of Investigating Forensic Science Laboratory Online

Note: The Evaluation and Assessment Plan includes components on both formative and summative assessments.

Types of Assessment / Evaluation
• Summative Assessment (Assessment of learning):
o Summative evaluation collects data to ascertain how
things went.
o Assessments or tests are generally taken by students at the end of a unit or term to demonstrate the “sum” of what they have or have not learned.
o Summative data [e.g., via use of end of course
evaluation surveys, Quality Matters (QM) rubric,
Student Assessment of Learning Gains (SALG), NSSE,
Academic Self-Efficacy / Self-Confidence Scales, etc. ]
may also reflect students’ levels of satisfaction with the
class and/or outline specific elements or actions
students took in support of their own learning.

Types of Evaluation Measures
Direct Measures:
• Course-embedded assessments (quizzes/tests/exams, papers, assignments, oral/written presentations, project work, etc.)
• Pre-test/post-test measures of academic achievement/proficiency
• Standardized achievement tests
• Common final exams
• Student ePortfolio assessments
• Quality Matters rubric

Indirect Measures:
• Pre/post knowledge surveys (or national standardized competency measures, e.g., PACKRAT & PANCE for PA students)
• Participant satisfaction surveys
• Interviews (e.g., graduate exit interviews)
• Focus groups with students
• Usage data records (e.g., Canvas LMS tools)
• Course evaluations / preceptor evaluations
• Extant data (e.g., class enrollment, in-class participation, completion, retention, demographic data, grades, GPAs, %DFW rates, national norms, data from prior cohorts & related data)
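Several of the extant-data measures listed above, such as %DFW rates, are straightforward to compute. A minimal sketch (the grade data below are invented for illustration):

```python
# Hypothetical example: computing a course DFW rate, i.e., the share of
# D, F, and Withdrawal grades among all final grades in a section.
from collections import Counter


def dfw_rate(grades):
    """Return the percentage of D, F, and W grades in a list of final grades."""
    counts = Counter(grades)
    dfw = counts["D"] + counts["F"] + counts["W"]
    return 100 * dfw / len(grades)


grades = ["A", "B", "B", "C", "D", "F", "W", "A", "C", "B"]
print(f"DFW rate: {dfw_rate(grades):.1f}%")  # 3 of 10 grades -> 30.0%
```

Comparing this rate before and after an intervention (or against prior cohorts) is one simple summative measure of impact.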

• Examples of CEG Proposals (with acceptable Evaluation/Assessment Plans):
  o Higbee & Miller (BME Department Proposal 2018)
  o Rebman CEG Proposal (IU Master of Physician Assistant Studies, MPAS)
  o Gina Londino-Smolar (Development of Investigating Forensic Science Lab Online)

Example of a CEG Proposal*
*(Adapted from Higbee & Miller, BME Department Proposal 2018)
• Note a very useful presentation format for the Evaluation/Assessment Plan

Outcome 1: Students will demonstrate knowledge of the engineering design process.
• Performance Indicator: Student teams will identify definitions of design control.
• Method of Assessment: Quiz question(s)
• Targeted Course(s): BME 24100, BME 22200, BME 38300, BME 35400
• Target for Performance: 70% of students will score at least 70% on assessed problem(s).

Outcome 2: Students will appropriately integrate BME coursework knowledge within the engineering design.
• Performance Indicator: Student teams will apply knowledge of mathematics, science, and engineering to deliver a working prototype of a design.
• Method of Assessment: Project report or presentation (instructor rubric)
• Targeted Course(s): BME 24100, BME 22200, BME 38300, BME 35400
• Target for Performance: 75% of teams will deliver a working prototype; 75% of teams will appropriately identify prior knowledge and concepts applied towards design.
Use Authentic, Embedded Assessment
• The goal of many undergraduate programs is for students to become lifelong learners by enhancing students’ communication skills, critical thinking, and problem solving abilities.
• With authentic, embedded assessment tasks, students are asked to demonstrate what they know and are able to do in meaningful ways.
• Authentic assessment tasks are often multidimensional and require higher levels of cognitive thinking such as problem solving and critical thinking.
• Embedded assessment means that “opportunities to assess student progress and performance are integrated into the instructional materials and are virtually indistinguishable from the day-to-day classroom activities” (Wilson & Sloane, 2000).

• Example: See Section 5 of the CEG Proposal by Gina Londino-Smolar (Development of Investigating Forensic Science Laboratory Online)
Evaluation Methods and Data Analysis
• Data Collection Methods and Analysis: each of your evaluation questions should address the following:
  o When will the data be collected, and by whom? (specific dates, times, and persons)
  o How are the data to be analyzed?
    • Statistical analysis for quantitative data (descriptive & inferential statistical procedures such as mean, median, chi-square, t-test, ANOVA, regression, and calculation and reporting of Effect Size (ES) statistics)
    • Content analysis of qualitative data (thematic analysis to identify common themes, ideas, topics/categories, and patterns of responses obtained from interviews, focus groups, open-ended surveys, or case study data)
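As a small illustration of the quantitative side, the sketch below (with invented pre-test/post-test scores) computes a mean gain, a paired t statistic, and a Cohen's d effect size using only the Python standard library:

```python
# Illustrative sketch (invented pre/post scores): descriptive statistics,
# a paired t statistic, and an effect size for a pre-test/post-test comparison.
from statistics import mean, stdev

pre = [62, 70, 58, 75, 66, 71, 60, 68]
post = [74, 78, 65, 85, 72, 80, 70, 75]

diffs = [b - a for a, b in zip(pre, post)]    # per-student gain scores
n = len(diffs)
mean_gain = mean(diffs)
t_stat = mean_gain / (stdev(diffs) / n ** 0.5)  # paired-samples t statistic
cohens_d = mean_gain / stdev(diffs)             # effect size on the differences

print(f"mean gain = {mean_gain:.2f}, t = {t_stat:.2f}, d = {cohens_d:.2f}")
```

In a real analysis the t statistic would be compared against a t distribution with n - 1 degrees of freedom (or computed with a statistics package), and the effect size reported alongside the significance test, as the slide recommends.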

Planning for Learning and Assessment
©T. W. Banta

1. What general outcome are you seeking?
2. How would you know it (the outcome) if you saw it? (What will the student know or be able to do?)
3. How will/could you help students learn it? (in class or out of class)
4. How could you measure each of the desired behaviors listed in #2?
5. What are the assessment findings?
6. What improvements might be made based on assessment findings?

Selected References
Banta, T. W., & Palomba, C. A. (2014). Assessment essentials: Planning,
implementing, and improving assessment in higher education (2nd ed.). San Francisco:
Jossey-Bass.

Bond, S. L., Boyd, S. E., & Montgomery, D. L. (1997). Taking Stock: A Practical Guide
to Evaluating Your Own Programs. Chapel Hill, NC: Horizon Research, Inc.

Creswell, J. W. & Creswell, J. D. (2018). Research Design: Qualitative, Quantitative, and Mixed Methods Approaches (5th ed.). Thousand Oaks, CA: Sage.

Suskie, L. (2018). Assessing student learning: A common sense guide. (3rd ed.). San
Francisco: Jossey-Bass.

Walvoord, B. E. (2010). Assessment clear and simple: A practical guide for institutions,
departments, and general education (2nd ed.). San Francisco: Jossey-Bass.

Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.

Selected References (continued)
Nitko, A. J. (1996). Educational Assessment of Students (2nd ed.). Englewood Cliffs,
NJ: Merrill/Prentice Hall.

Wilson, M. & Sloane, K. (2000). From principles to practice: An embedded assessment system. Applied Measurement in Education, 13(2), 181–208.

W.K. Kellogg Foundation (2010). Evaluation Handbook. Battle Creek, MI: Author.

W.K. Kellogg Foundation (2004). Logic Model Development Guide. Battle Creek, MI:
Author.

Contact Information
• Howard Mzumara, Ph.D.
Director, Evaluation and Psychometric Services

Office of Institutional Research and Decision Support (IRDS)

Indiana University – Purdue University Indianapolis (IUPUI)

(317) 278-2214 (office phone)


[email protected]
irds.iupui.edu

Section 6: Dissemination

1. A part of any scholarly process
2. Dissemination in research projects
3. Dissemination in teaching projects
4. Your work presented to peers in the field
5. Your work presented to colleagues nearer to home

Section 7: Timeline

1. Managing your time
2. A timeline is also a way to envision what your project entails from start to finish
3. A planning tool and a series of prompts for your ongoing reflection
4. A measure for readers to judge your readiness to undertake and succeed with a CEG project

Budget Worksheet
Download the Budget Worksheet

Additional Guidelines for Developing or Reviewing a CEG
Proposal
1. Rationale: the proposal should thoroughly define the learning issue/challenge the project addresses and make a compelling case for why the project is important.
2. Problem Statement: the project should focus on an
interesting and testable research question rooted in the
literature.
3. Literature Review: the proposal should demonstrate a
firm understanding of prior research relating to the
teaching and learning topic.

Additional Guidelines for Developing or Reviewing a
CEG Proposal

4. Student Outcomes: the proposal should indicate what measures of student learning and success will be in place and how they clearly align with the goals and methods of the project.
5. Teaching Intervention: the proposal should make clear how the project’s intervention aligns with the project rationale, problem statement, and student learning outcomes.
6. Project Methods: the proposed methods (intervention, assessment plan, data collection and analysis) should be appropriate and rigorous enough to answer the research question.
7. Results: the proposal should indicate some preliminary theories of what results the project will find.

*Adapted from CIRTL Teaching-as-Research (TAR) Rubric

Teaching@IUPUI
Questions and Discussion
Howard Mzumara
[email protected]

Terri A. Tarr Richard Turner


[email protected] [email protected]

317-274-1300 | UL 1125
ctl.iupui.edu

Thank you for joining us!
Please take a few minutes
to complete webinar
evaluations at
http://go.iupui.edu/2dae
