
Assessment-in-Learning

The document outlines the foundations and practices of assessment in education, emphasizing its role in understanding student learning and informing instruction. It categorizes assessments into types such as formative, summative, diagnostic, and placement, and discusses principles of effective assessment including validity, reliability, fairness, and practicality. Additionally, it highlights the importance of aligning assessments with learning objectives and utilizing statistical tools to analyze assessment effectiveness.


Assessment in Learning: Foundations and Practices
CECILLE B. DELGADO
LIMAY SHS STEM TEACHER


Session Overview

Identify types, purposes, and principles of assessment.

Explore alignment of objectives, teaching strategies, and assessments.

Analyze qualities of effective assessment tools and practices.

Apply statistical tools in test analysis, including difficulty and discrimination indices.

Craft assessment items aligned with specific learning outcomes.

Discuss classroom-based assessment scenarios.
Assessment
A TOOL THAT HELPS TEACHERS UNDERSTAND WHAT
STUDENTS KNOW AND CAN DO.
The Nature of Assessment

Assessment is a process of collecting, interpreting, and using information to make decisions about students’ learning.

It provides evidence of student progress and informs instruction.

In the Philippine context, DepEd Order No. 8, s. 2015 guides classroom assessment.
What Kind of Assessment Is This?
• Scenario 1: A short quiz after a lesson to check
understanding.
• Scenario 2: A test at the start of the year to
gauge prior knowledge.
• Scenario 3: Reviewing a student’s portfolio at
semester-end.
• Scenario 4: A test to determine math level for
placement.
Types of Assessment:

1. Formative: e.g., exit tickets, class polls, concept maps during lessons
2. Summative: e.g., end-of-term test, final project presentation
3. Diagnostic: e.g., pre-tests, learning style inventories
4. Placement: e.g., grade-level entry tests, language proficiency assessment
Types of Assessment

Diagnostic: Conducted before instruction (e.g., pre-tests to check prior knowledge).

Formative: Ongoing checks during instruction (e.g., seatwork, group discussions).

Summative: Conducted at the end of a learning unit (e.g., periodical exams).

Placement: Used to assign students to appropriate levels (e.g., entrance tests).


Types of Assessment:

Assessment for Learning (AfL): Focuses on the use of assessment to inform and improve ongoing learning. It is typically formative and guides both teachers and students to identify areas for improvement.

Assessment as Learning (AaL): Involves students in the process of assessment as they reflect on and monitor their own learning. This type of assessment encourages students to actively participate in their learning journey through self-assessment and metacognition.

Assessment of Learning (AoL): Typically summative; focuses on evaluating and measuring the outcomes of learning. It is often used to assign grades or certify achievement, such as final exams or standardized tests.
High-quality assessment ensures accurate, fair, and actionable results.
What Makes an Assessment Effective?

• As a student, have you ever struggled with assessments that didn’t accommodate your learning style? How can you, as a future teacher, create assessments that allow all students to showcase their strengths?
Principles of Effective Assessment

Validity: Assessment must measure what it intends to measure.

Reliability: Assessment results should be consistent and reproducible.

Fairness: Tasks should be free from bias and accessible to all learners.

Practicality: Assessment should be feasible in terms of time and resources.
Applying the Principles of Assessment

Validity: Does a math test include only math questions?

Reliability: Will two teachers score the same essay similarly?

Fairness: Can students with IEPs complete the test equally?

Practicality: Can it be done in 30 minutes using available tools?
Which Assessment is Valid?
Objective: Students will explain the process of photosynthesis.

ASSESSMENT A: SPELLING QUESTIONS ABOUT PLANT NAMES.
ASSESSMENT B: DRAW AND LABEL STAGES OF PHOTOSYNTHESIS.
ASSESSMENT C: CREATE A PLANT CARE ROUTINE.
Teaching is like planning
a trip: objectives are the
destination, activities are
the path, and
assessments tell us if
we arrived.
Learning Targets and Assessment Alignment

Learning Objective example: 'Students will be able to describe the water cycle using a diagram.'
Aligned Assessment example: Students draw and explain the stages.
Do They Match?
LEARNING OBJECTIVE BEST ASSESSMENT TASK
1. Describe plant parts a) Debate & reflection
2. Demonstrate handwashing b) Video campaign
3. Analyze pollution effects c) Multiple choice
4. Recall scientific method d) Oral recitation
Assessment Methods
Traditional Assessment
Definition: A formal method of evaluating student learning through structured tests focused on recalling or understanding content.
Common Examples:
• Multiple-choice tests
• True/false items
• Short answer questions
• Periodic or final exams
• Diagnostic or standardized tests

Non-Traditional Assessment
Definition: An alternative approach that evaluates how students apply knowledge and skills through real-life, creative, or performance-based tasks.
Common Examples:
• Portfolios
• Performance tasks (e.g., role-plays, demonstrations)
• Journals or reflections
• Experiments and investigations
• Creative outputs (e.g., posters, models, brochures)


Tips for Future Teachers

Use traditional assessments to quickly check for understanding and prepare students for standardized tests.

Incorporate non-traditional assessments to foster engagement, deeper thinking, and real-world connections.

Give students a voice and choice—let them propose how they want to be assessed for certain tasks.

Be mindful of fairness and clarity in rubrics and expectations.

Blend both approaches to support diverse learners.


Have you ever taken a test that felt confusing or unfair? What made it so?

Situational Analysis: Are We Assessing Fairly?

In a Grade 9 classroom in a public high school in Bataan, a teacher administers a quarterly exam composed mostly of multiple-choice items taken from online sources.

After releasing the scores, students complain that many items were unfamiliar and not discussed in class.

The teacher realizes that the scores are very low, with most students failing.
Essentials of Good Test Items

Use clear instructions, fair content, and proper coverage.
Avoid tricky or biased items.
Follow item formats: MCQ, True/False, Matching, Essays.
Use a Table of Specifications (TOS).
What is TOS (Table of
Specifications)?
• A Table of Specifications (TOS) is a planning
tool used by teachers to ensure that a test is
balanced, aligned with learning objectives, and
fair. It’s basically a two-way chart that maps
content areas against cognitive levels (like
remembering, understanding, applying, etc.),
helping teachers determine the number and
type of test items to include.
Components of a Table of Specifications (TOS)

Learning Objectives – What students are expected to learn

Content Areas or Topics – Units or lessons to be covered

Cognitive Levels – Based on Bloom’s Taxonomy

Number of Items – How many test questions per topic/skill level

Percentage or Weight – Distribution of test items per topic or objective


Revised Bloom’s Taxonomy Levels (Cognitive Domain):

Remembering – Recall of facts and basic concepts

Understanding – Explain ideas or concepts

Applying – Use information in new situations

Analyzing – Draw connections among ideas

Evaluating – Justify a decision or course of action

Creating – Produce new or original work


Sample computation:

No. of Items = 50 x (3/40) = 3.75, rounded to 4

No. of Items per Cognitive Level = 4 x (75/100) = 3.00

No. of Items per Cognitive Level = 4 x (25/100) = 1.00
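These figures follow the usual proportional-allocation rule for a TOS: a topic's share of items equals the total number of test items multiplied by the topic's share of instructional time, and the topic's items are then split across cognitive levels by weight. Below is a minimal Python sketch of that arithmetic; the 50-item test, 3 of 40 hours, and the 75/25 split come from the example above, while the function names are only illustrative.

def items_for_topic(total_items, topic_hours, total_hours):
    # Items allotted to a topic, proportional to its instructional time.
    return total_items * (topic_hours / total_hours)

def items_per_level(topic_items, weight_percent):
    # Items allotted to one cognitive level within a topic.
    return topic_items * (weight_percent / 100)

raw = items_for_topic(50, 3, 40)            # 50 x (3/40) = 3.75
topic_items = round(raw)                    # rounded to 4 items for this topic
lower = items_per_level(topic_items, 75)    # 4 x (75/100) = 3.0
higher = items_per_level(topic_items, 25)   # 4 x (25/100) = 1.0
print(raw, topic_items, lower, higher)      # 3.75 4 3.0 1.0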
How Do We Score and Grade?

Use rubrics for performance tasks.

Other tools: rating scales, checklists, answer keys.
A rubric is a scoring guide used to evaluate
a student's performance based on specific
criteria. It outlines the expectations for an
assignment or task, provides detailed
descriptions of different performance
levels, and often includes a point system
for grading.
Key Features of a Rubric:

Criteria: The aspects or components being evaluated (e.g., organization, content, grammar).

Performance Levels: Describes different levels of quality (e.g., Excellent, Good, Needs Improvement).

Descriptors: Specific descriptions of what is expected for each level of performance.

Point System: Points or scores assigned to each level to quantify the assessment.
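One way to see how criteria, performance levels, descriptors, and a point system fit together is as a small data structure. The sketch below is only illustrative; the criteria, level labels, and point values are hypothetical and not taken from the slides.

# A minimal analytic-style rubric: each criterion maps performance levels
# (with their point values) to descriptors. All values are hypothetical.
rubric = {
    "Content": {
        ("Excellent", 5): "Ideas are accurate, complete, and well supported.",
        ("Good", 3): "Ideas are mostly accurate with minor gaps.",
        ("Needs Improvement", 1): "Ideas are inaccurate or incomplete.",
    },
    "Organization": {
        ("Excellent", 5): "Clear structure with logical flow throughout.",
        ("Good", 3): "Generally organized; some transitions are unclear.",
        ("Needs Improvement", 1): "Little or no discernible structure.",
    },
}

# Scoring a sample output: the teacher picks one level per criterion.
chosen_points = {"Content": 5, "Organization": 3}
total = sum(chosen_points.values())
print(total, "out of", 5 * len(rubric))     # 8 out of 10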
Rubric Tip: Instead of
just 'neatness,' try:
'Work is clean, legible,
and free from doodles
of cats or anime
characters.'
Why Use Rubrics?

Clarify Expectations: Rubrics define what students need to do to succeed.

Consistency: They provide a standardized way to assess all students, reducing bias.

Feedback: Rubrics give specific, constructive feedback on areas for improvement.

Self-Assessment: Students can use rubrics to assess their own work before submitting it.
Types of Rubrics:

ANALYTIC RUBRIC: BREAKS DOWN THE TASK INTO INDIVIDUAL CRITERIA AND EVALUATES EACH CRITERION SEPARATELY. THIS PROVIDES DETAILED FEEDBACK.

HOLISTIC RUBRIC: ASSESSES THE OVERALL PERFORMANCE BASED ON AN OVERALL IMPRESSION, WITHOUT SEPARATING THE COMPONENTS.
Examples shown: analytic rubric, holistic rubric, rating scale, answer key.
From Scores to Action

DATA-DRIVEN TEACHING MEANS NOT JUST TEACHING HARDER BUT TEACHING SMARTER.
Can you think of a time a
teacher adjusted their lesson
based on class performance?
Example

• If most students score low on 'photosynthesis,' the teacher might reteach it using a video or a hands-on plant demo.
Introduction to Statistical Tools in Assessment

Why use statistics in assessment? To evaluate item quality and overall test effectiveness.

Commonly used tools include:
- Difficulty Index (P)
- Discrimination Index (D)
- Measures of Central Tendency (Mean, Median, Mode)
Step-by-Step: Difficulty Index (P)

1. Count the number of students who answered the item correctly.
2. Divide by the total number of students who answered the item.
3. Interpret:
   - 0.81 - 1.00: Easy
   - 0.41 - 0.80: Moderate
   - 0.00 - 0.40: Difficult
Example: Difficulty Index

1. 30 out of 40 students answered Item #1 correctly.
2. P = 30 / 40 = 0.75 → Moderate Difficulty
3. Implication: Item is acceptable but may need revision if misaligned with objectives.
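A short Python sketch of the same computation; the cut-offs mirror the interpretation ranges on the previous slide, and the function names are only illustrative.

def difficulty_index(correct, total):
    # P = number of correct responses / total number of responses
    return correct / total

def interpret_p(p):
    if p >= 0.81:
        return "Easy"
    if p >= 0.41:
        return "Moderate"
    return "Difficult"

p = difficulty_index(30, 40)    # Item #1: 30 of 40 students answered correctly
print(p, interpret_p(p))        # 0.75 Moderate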
Step-by-Step: Discrimination Index (D)
1. Rank students by total score and divide into upper and lower groups.
2. Count how many in each group answered the item correctly.
3. Use the formula: D = (Correct_U - Correct_L) / n, where n is the number of students in each group.
4. Interpret:
- 0.40 and above: Excellent
- 0.30–0.39: Good
- 0.20–0.29: Acceptable
- Below 0.20: Poor (item may be discarded or revised)
Example: Discrimination Index

Upper Group (n=10): 8 correct
Lower Group (n=10): 3 correct

D = (8 - 3) / 10 = 0.5 → Excellent

Implication: Item effectively differentiates between high and low performers.
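A matching sketch for the Discrimination Index, assuming upper and lower groups of equal size n as in the example; again, the function names are only illustrative.

def discrimination_index(correct_upper, correct_lower, group_size):
    # D = (correct in upper group - correct in lower group) / group size
    return (correct_upper - correct_lower) / group_size

def interpret_d(d):
    if d >= 0.40:
        return "Excellent"
    if d >= 0.30:
        return "Good"
    if d >= 0.20:
        return "Acceptable"
    return "Poor (revise or discard)"

d = discrimination_index(8, 3, 10)   # Upper: 8 of 10 correct; Lower: 3 of 10
print(d, interpret_d(d))             # 0.5 Excellent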
Exercise: Difficulty Index
Instructions:

1. You gave a 10-item quiz to 50 students.
2. For Item #7, 35 students answered correctly.

Task: Compute the difficulty index and interpret the result.

Guide formula: P = Correct Answers / Total Responses
Exercise: Discrimination Index
Instructions:

1. 40 students took a quiz.
2. Item #4 was answered correctly by 9 students in the upper group and 4 in the lower group.

Task: Compute the discrimination index.

Guide formula: D = (Correct_U - Correct_L) / n
Mean Percentage Score

MPS is the average percentage score of all students who took an assessment. It tells you how well a class performed on a test on average and is commonly used in analyzing test results and reporting performance.
Why It’s Useful:

Gives an overview of class performance.

Helps determine if the test was too easy or too difficult.

Used in data-based decision-making and improvement planning.
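The slides do not give an MPS formula, but following the definition above (the average of each student's percentage score), one common computation looks like this sketch; the sample raw scores are made up purely for illustration.

def mean_percentage_score(raw_scores, total_items):
    # Convert each raw score to a percentage, then average the percentages.
    percentages = [score / total_items * 100 for score in raw_scores]
    return sum(percentages) / len(percentages)

scores = [38, 42, 35, 47, 40]                         # hypothetical raw scores on a 50-item test
print(round(mean_percentage_score(scores, 50), 2))    # 80.8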


DepEd’s Grading System
Using assessment data for
instruction means analyzing the
results of student assessments to
guide teaching decisions, improve
learning outcomes, and provide
targeted support. It's not just about
grading—it's about using the data
to adjust instruction meaningfully.
Why Use Assessment Data in Instruction?

Identify learning gaps
→ Which concepts do students struggle with?

Differentiate instruction
→ Who needs enrichment, remediation, or extra support?

Monitor progress
→ Are students improving over time?

Inform lesson planning
→ What should I reteach or approach differently?

Support evidence-based teaching
→ What strategies or methods are working?
Key Take-aways
“WHAT KIND OF TEACHER DO YOU WANT TO BE WHEN IT
COMES TO ASSESSMENT?”
Real-Life Classroom Example

A COLLEAGUE ONCE SAID HER STUDENTS UNDERSTOOD SUPPLY AND DEMAND, BUT HER QUIZ JUST ASKED THEM TO DEFINE THE TERMS. AFTER REALIGNING, SHE USED A GRAPH-BASED PROBLEM WHERE STUDENTS HAD TO ANALYZE A PRICE DROP. THAT MATCHED THE INTENDED SKILL.

THINK OF A TOPIC YOU’LL TEACH NEXT WEEK. WHAT’S A POSSIBLE LEARNING OUTCOME? CAN YOU WRITE ONE ASSESSMENT ITEM THAT TRULY REFLECTS THAT OUTCOME?
Exit Ticket
Craft assessment items aligned with specific learning outcomes.

Given at least 3 objectives, 15 items, and 5 hours of teaching, provide the TOS and the type of test for each objective.
