
Cavite State University-Indang


Education Circle
2021-2022

THE TEACHER’S ARCHIVE

Assessment in Learning 1
EDUC 75


Dinampo, Ryan Jay V. (BSNED 2-1)


Noble, Danielle Jana C. (BSNED 2-1)
Mallo, Thea P. (BSNED 2-2)
Bacala, Kaila Marie T. (BTLED 2-1)
Madrazo, Melroette Faith P. (BSNED 2-2)
Tapalla, Mark Gilbert R. (BSE SS 2-1)
Cabral, Kristofherson Carlo A. (BSE SS 2-1)

Project Heads:

PYAR CHING, Vice Chairman-External
KIMBERLY NICOLE TAN, Auditor


Republic of the Philippines
CAVITE STATE UNIVERSITY
Don Severino de las Alas Campus
Indang, Cavite

CvSU Vision: The premier university in historic Cavite recognized for excellence in the development of globally competitive and morally upright individuals.

CvSU Mission: Cavite State University shall provide excellent, equitable and relevant educational opportunities in the arts, science and technology through quality instruction and relevant research and development activities. It shall produce professional, skilled and morally upright individuals for global competitiveness.

Acknowledgement of Responsibility

DISCLAIMER
This document is strictly confidential. Any review, retransmission, dissemination, or other use of this document without the prior written consent of the Education Circle of Cavite State University-Indang is prohibited.

Informed Consent
I understand that my participation is voluntary and that I am free to withdraw at any time, without cost. I understand that I
will be given a copy of this consent form.

I voluntarily agree to take part in this academic matter.

Participant’s signature: ___JOVAN B. ALITAGTAG____________________ Date: March 1, 2022

Prepared by:

PYAR CHING, Vice Chairman-External
KIMBERLY NICOLE TAN, Auditor

Conformed by:

ANNE JADE NICOLE MANICAD, Chairman
MAE ZAVILL CRISTORIA, Vice Chairman-Internal
ANGELO DE VILLA, Secretary
JAMES WARREN CRUSPE, Treasurer
LAARNI JANE PAREJA, Public Relations Officer
DANA MARGARETTE JUGANAS, Business Manager-Internal
ELLA MARIE PAMPLINA, Business Manager-External
ROVI MAE PEREY, Sergeant at Arms
ABEGAIL VALENZONA MELANIO, Second Year Representative
VAL PATRICK DELA REA, Third Year Representative
YUAN ANGELO, First Year Representative

Approved by:

JAKE RAYMUND F. FABREGAR, DEM
Adviser, Education Circle

JOVAN B. ALITAGTAG, DEM, LPT
Adviser, Education Circle


SYLLABUS COPY

Republic of the Philippines
CAVITE STATE UNIVERSITY
Indang, Cavite

CvSU Vision: The premier university in historic Cavite recognized for excellence in the development of globally competitive and morally upright individuals.

CvSU Mission: Cavite State University shall provide excellent, equitable and relevant educational opportunities in the arts, science and technology through quality instruction and relevant research and development activities. It shall produce professional, skilled and morally upright individuals for global competitiveness.

COLLEGE OF EDUCATION
TEACHER EDUCATION DEPARTMENT
COURSE SYLLABUS
First Semester, AY 2021-2022
Course Code: EDUC 75
Course Title: Assessment in Learning 1
Type: Lecture
Credit Units: 3

Course Description: This course discusses the principles, development, and utilization of conventional assessment tools to improve the teaching-learning process. It emphasizes the use of assessment of, as, and for learning in measuring knowledge, comprehension, and other thinking skills in the cognitive, psychomotor, and affective domains. It allows students to go through the standard steps in test construction and development and their application in grading systems.

Pre-requisites: None
Course Schedule: Wed 7:00-10:00 (Lecture: 3 hrs; Laboratory: none)

Core Values

Students are expected to live by and stand for the following University tenets:

TRUTH is demonstrated by the student's objectivity and honesty during examinations, class activities, and in the development of projects.

EXCELLENCE is exhibited by the students' self-confidence, punctuality, diligence, and commitment to the assigned tasks, class performance, and other course requirements.

SERVICE is manifested by the students' respect, rapport, fairness, and cooperation in dealing with their peers and members of the community.

In addition, they should exhibit love and respect for nature and support for the cause of humanity.

Goals of the College/Campus

The College shall endeavor to achieve the following goals:
1. Offering of varied undergraduate and graduate degree courses leading to various professions that will cater to the needs of society;
2. Offering short-term courses that will directly benefit the client system;
3. Improvement of student performance;
4. Improvement of facilities for both students and facilitators of learning;
5. Strengthening of linkage between research and the client system; and
6. Conduct of community development services to the different clienteles.
Objectives of the Department

The department shall endeavor to:
1. provide relevant and quality course offerings at the graduate and undergraduate levels to improve student performance;
2. conduct relevant research in the different areas in education to enrich the learning process;

3. conduct relevant community services to disseminate information and technologies to target clienteles to improve their well-being;
4. publish research journals and other related publications to disseminate relevant information; and
5. produce instructional materials to improve student performance.

Program Educational Objectives (Based on CMO No. 74 Series of 2017)

The program aims to produce graduates who can:

1. Articulate and discuss the latest developments in the specific field of practice.
2. Effectively communicate in English and Filipino, both orally and in writing.
3. Work effectively and collaboratively with a substantial degree of independence in multi-disciplinary and multi-
cultural teams.
4. Act in recognition of professional, social, and ethical responsibility.
5. Preserve and promote Filipino historical and cultural heritage.
6. Articulate the rootedness of education in philosophical, socio-cultural, historical, psychological and political
contexts.
7. Demonstrate mastery of subject matter/discipline.
8. Facilitate learning using a wide range of teaching methodologies and delivery modes appropriate to specific
learners and their environments.
9. Develop innovative curricula, instructional plans, teaching approaches, and resources for diverse learners.
10. Apply skills in the development and utilization of ICT to promote quality, relevant and sustainable educational
practices.
11. Demonstrate a variety of thinking skills in planning, monitoring, assessing and reporting learning processes
and outcomes.
12. Practice professional and ethical teaching standards sensitive to the local, national and global realities.
13. Pursue lifelong learning for personal and professional growth through varied experiential and field-based
opportunities.

Student Outcomes and Relationship to Program Educational Objectives

Program/Student Outcomes (Based on CMO No. 74 s. 2017), each mapped to the Program Educational Objectives (1-13) above. The students will be able to:

a. demonstrate in-depth understanding of the diversity of learners in various learning areas;
b. manifest meaningful and comprehensive pedagogical content knowledge (PCK) in the different subject areas;
c. utilize appropriate assessment and evaluation tools to measure learning outcomes;
d. manifest skills in communication, higher-order thinking, and the use of tools and technology to accelerate learning and teaching;
e. demonstrate positive attributes of a model teacher, both as an individual and as a professional; and
f. manifest a desire to continuously pursue personal and professional development.
Course Outcomes and Relationship to Student Outcomes

Program outcomes addressed by the course: a-f above. After completing this course, the students will be able to:

1. explain the basic concepts and principles of assessing student learning (I);
2. identify the different learning targets and their appropriate assessment techniques/test formats (I);
3. design an examination with a Table of Specifications on a specific field of specialization (I/E);
4. evaluate given test items based on the principles and guidelines in test construction (E); and
5. compute and explain the values of the mean, median, mode, range, and standard deviation and interpret the meaning of such values (I/E/D).

*Level: I - Introductory; E - Enabling; D - Demonstrative

COURSE COVERAGE

Week 1
Intended Learning Outcomes: After the completion of the chapter, students will be able to: 1. inculcate in their minds and hearts the mission and vision of the university; 2. get oriented with the course requirements and proper classroom decorum; and 3. prepare and be equipped for the distance learning discussion.
Topic: Overview of the course: A. CvSU VGMO; B. College Goals; C. Program Objectives; D. Course Content; E. GAD; F. Classroom Rules/Netiquette; G. Course Requirements
Teaching and Learning Activities: Presentations; surveying students' online access; creating FB group
Mode of Delivery: Distance mode
Resources Needed: Student handbook; course syllabus; electronic device; data network connection
Outcomes-based Assessment: Survey
Due Date of Submission of Output: Week 1

Week 2
Intended Learning Outcomes: After the completion of the chapter, students should be able to: 1. state the importance of OBE; 2. distinguish among measurement, assessment, and evaluation; and 3. explain the approaches to assessment.
Topic: I. Measurement, Assessment, and Evaluation: A. Measurement; B. Assessment; C. Assessment Principles; D. Approaches to Assessment; E. Evaluation
Teaching and Learning Activities: Brainstorming; dimensional question approach (KWL); buzz sessions; group dynamics and lecture
Mode of Delivery: Distance mode
Resources Needed: KWL chart; infographic; access to Google Classroom
Outcomes-based Assessment: Research assignment; Portfolio: infographic and KWL chart; quiz challenge; pop quiz
Due Date of Submission of Output: Week 2

Weeks 3-5
Intended Learning Outcomes: After the completion of the chapter, students should be able to: 1. differentiate the different kinds of test; and 2. discern the proper application of the different kinds of assessment (e.g., when to conduct a placement test).
Topic: II. Roles of Assessment in Instructional Decisions: A. Placement; B. Diagnostic; C. Formative; D. Summative
Teaching and Learning Activities: Think-Pair-Share; Write It Out; individual/group discussion
Mode of Delivery: Distance mode
Resources Needed: Module; handouts; PowerPoint presentation
Outcomes-based Assessment: Research assignment; advance organizer; Portfolio: Write-It-Out essay; quiz
Due Date of Submission of Output: Week 5

Weeks 6-8
Intended Learning Outcomes: After the completion of the chapter, students should be able to: 1. identify the taxonomy of educational objectives; and 2. develop a lesson plan using the appropriate learning objectives.
Topic: III. Instructional Objectives and Learning Outcomes: A. Taxonomy of Educational Objectives: 1. Cognitive Domain; 2. Psychomotor Domain; 3. Affective Domain
Teaching and Learning Activities: Recitation; lecture; buzz session; note-taking; brainstorming; interactive discussion; writing objectives
Mode of Delivery: Distance mode
Resources Needed: Module; handouts; PowerPoint presentation
Outcomes-based Assessment: Learning log; portfolio entries; enrichment activities
Due Date of Submission of Output: Week 8

Week 9: MIDTERM EXAMINATION

Weeks 10-12
Intended Learning Outcomes: After the completion of the chapter, students should be able to: 1. develop classroom assessment tools for measuring learning outcomes; 2. apply the guidelines in making a Table of Specifications; and 3. analyze the difficulty and discrimination of test items.
Topic: IV. Development of Classroom Assessment Tools for Measuring Learning Outcomes: A. Test Planning: Table of Specifications (1. Guidelines; 2. Preparation); B. Steps in Constructing Assessment Tools; C. Test Designing and Construction (1. Types of Test; 2. Guidelines on Test Construction); D. Item Analysis
Teaching and Learning Activities: Questions and discussion; lecture; test planning and construction; semantic webbing
Mode of Delivery: Distance mode
Resources Needed: Module; handouts; PowerPoint presentation
Outcomes-based Assessment: Research; assessment tool construction; TOS construction; written test construction
Due Date of Submission of Output: Week 12

Weeks 13-14
Intended Learning Outcomes: At the end of the unit, the students will be able to: 1. use graphical organizers to organize assessment data; and 2. compute for the mean, median, and mode.
Topic: V. Organization, Analysis, and Interpretation of Assessment Data: A. Organization of Test Data Using Tables and Graphs; B. Introduction to Statistical Concepts: 1. Measures of Central Tendency: Mean, Median, Mode
Teaching and Learning Activities: Interactive discussion; problem-based activity; note-taking
Mode of Delivery: Distance mode
Resources Needed: Module; handouts; PowerPoint presentation
Outcomes-based Assessment: Graphical presentation of assessment data; reporting
Due Date of Submission of Output: Week 14

Weeks 15-17
Intended Learning Outcomes: At the end of the unit, the students will be able to: 1. distinguish the difference between norm- and criterion-referenced tests; 2. show familiarity with the grading system of DepEd; and 3. compute students' grades based on the grading system of DepEd and interpret results based on statistical concepts.
Topic: VI. Utilization and Communication of Assessment Data: 1. Grading and Reporting of Test Results: A. Purpose of Grading; B. Grading System of the Philippine K-12 Program
Teaching and Learning Activities: Discussion; peer learning; assigned reading; problem-based activity; forum
Mode of Delivery: Distance mode
Resources Needed: Module; handouts; PowerPoint presentation
Outcomes-based Assessment: Self/peer assessment; quiz; enrichment activity; problem analysis
Due Date of Submission of Output: Week 17

Week 18: FINAL EXAMINATION

COURSE REQUIREMENTS

Suggested Lecture Requirements:


1. Mid-Term Examination
2. Final Examination
3. Quizzes/Seat works/Recitations
4. Video presentation
5. Fact Sheet
6. Class Reporting/Reaction Paper
7. Assignments
8. Class or Group Project (Term Paper/Project Design/Case Study/Feasibility Study/Culminating
Activity/Portfolio)
9. Class Attendance

Suggested Laboratory Requirements:


1. Laboratory Reports
2. Individual Performance
3. Quizzes
4. Mid-Term Examination
5. Final Examination
6. Video presentation
7. Fact Sheet
8. Attendance

*All exams must follow a Table of Specifications (TOS), and rubrics must be used for the evaluation of students' performance or projects.
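One common way to prepare a Table of Specifications is to allocate the number of test items to each topic in proportion to the instructional hours spent on it. The sketch below only illustrates that idea and is not a prescribed departmental format; the topic names, hour counts, and function name are hypothetical.

    # Illustrative only: allocate TOS items in proportion to instructional hours.
    # Topic names and hour counts below are hypothetical examples.
    def allocate_items(topic_hours, total_items):
        """Return {topic: item count} proportional to hours, summing to total_items."""
        total_hours = sum(topic_hours.values())
        raw = {t: total_items * h / total_hours for t, h in topic_hours.items()}
        items = {t: round(v) for t, v in raw.items()}
        # Fix any rounding drift so the counts add up exactly to total_items.
        items[max(raw, key=raw.get)] += total_items - sum(items.values())
        return items

    hours = {"Measurement and Evaluation": 6, "Roles of Assessment": 9,
             "Instructional Objectives": 9, "Assessment Tool Construction": 12}
    print(allocate_items(hours, total_items=50))
    # e.g. {'Measurement and Evaluation': 8, 'Roles of Assessment': 12,
    #       'Instructional Objectives': 12, 'Assessment Tool Construction': 18}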

GRADING SYSTEM
A. Grading system for 2 units lecture and 1 unit laboratory (i.e. DCIT 21; 3 units; Lec - 2 hrs & Lab - 3
hrs)
Lecture – 60%
Laboratory – 40%

B. Grading system for 1 unit lecture and 2 units laboratory (i.e. DCIT 22; 3 units; Lec -1 hr & Lab - 6 hrs)
Lecture – 40%
Laboratory – 60%

C. Grading system for 2 units lecture and 3 units laboratory (i.e. ELEX 50; 5 units; Lec – 2 hrs & Lab – 9 hrs)
Lecture – 30%
Laboratory – 70%
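As a quick illustration of how the lecture and laboratory percentages above combine into a single course grade, here is a minimal sketch; the function name and sample scores are hypothetical, and option A's 60/40 split is used as the default.

    # Illustrative only: weighted combination of lecture and laboratory grades.
    def final_percentage(lecture_pct, laboratory_pct, lecture_weight=0.60):
        """Weighted final percentage; the laboratory weight is the remainder."""
        return lecture_pct * lecture_weight + laboratory_pct * (1.0 - lecture_weight)

    # Example: 85% lecture, 90% laboratory under option A (Lecture 60% / Lab 40%).
    print(round(final_percentage(85.0, 90.0, lecture_weight=0.60), 2))  # 87.0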

STANDARD TRANSMUTATION TABLE FOR ALL COURSES

96.7 – 100.0   1.00
93.4 – 96.6    1.25
90.1 – 93.3    1.50
86.7 – 90.0    1.75
83.4 – 86.6    2.00
80.1 – 83.3    2.25
76.7 – 80.0    2.50
73.4 – 76.6    2.75
70.0 – 73.3    3.00
50.0 – 69.9    4.00
Below 50       5.00
INC: Passed the course but lacks some requirements.
Dropped: If unexcused absences reach at least 20% of the total class hours.
Total Class Hours/Semester: (3 units Lec – 54 hrs; 2 units Lec – 36 hrs) (1 unit Lab – 54 hrs; 2 units Lab – 108 hrs; 3 units Lab – 162 hrs)
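The transmutation table above can also be applied mechanically, as in the small sketch below; the lower bounds are taken from the table, percentages are assumed to be rounded to one decimal place before lookup, and the function name is illustrative.

    # Illustrative only: look up the transmuted grade from a final percentage,
    # using the lower bound of each bracket in the table above.
    def transmute(pct):
        brackets = [(96.7, 1.00), (93.4, 1.25), (90.1, 1.50), (86.7, 1.75),
                    (83.4, 2.00), (80.1, 2.25), (76.7, 2.50), (73.4, 2.75),
                    (70.0, 3.00), (50.0, 4.00)]
        for lower, grade in brackets:
            if pct >= lower:
                return grade
        return 5.00  # below 50

    print(transmute(87.0))   # 1.75
    print(transmute(72.5))   # 3.0
    print(transmute(49.9))   # 5.0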

CLASS POLICIES
A. Attendance
Students are not allowed to have 20% or more unexcused absences of the total face to face class hours;
otherwise, they will be graded as “DROPPED”.

B. Classroom Decorum

During face to face mode


Students are required to:
1. wear identification cards at all times;
2. wear face mask at all times;
3. observe physical/social distancing at all times;
4. clean the classroom before and after classes;
5. avoid unnecessary noise that might disturb other classes;
6. practice good manners and right conduct at all times;
7. practice gender sensitivity and awareness inside the classroom; and
8. come to class on time.

During distance mode


Students are required to:
1. sign an honor system pledge;
2. avoid giving or receiving unauthorized aid of any kind on their examinations, papers, projects, and assignments;
3. observe proper netiquette during online activities; and
4. submit take-home assignments on time.

C. Examination/ Evaluation
1. Quizzes may be announced or unannounced.
2. Mid-term and Final Examinations are scheduled.


3. Cheating is strictly prohibited. A student who is caught cheating will be given a score of "0" for the first offense. For the second offense, the student will be automatically given a failing grade in the subject.
4. Students who will miss a mid-term or final examination, a laboratory exercise, or a class project may
be excused and allowed to take a special exam, conduct a laboratory exercise or pass a class project
for any of the following reasons:
a. participation in a University/College-approved field trip or activity;
b. due to illness or death in the family; and
c. due to force majeure or natural calamities.

REFERENCES & SUPPLEMENTARY READINGS


References:

A. Required Textbook/Workbook – None


B. Laboratory Manual (if with laboratory) – NA
C. Reference Books

Buenaflor, R. C. (2012). Assessment of learning book one: The conventional approach. Quezon City: Great Books Publishing.
Calmorin, L. P. (2011). Assessment of student learning 1. Manila: Rex Book Store, Inc.
Garcia, C. D. (2013). Measuring and evaluating learning outcomes: A textbook in educational assessment 1 & 2 (2nd ed.). Mandaluyong City: Books Atbp. Publishing Corp.
Navarro, R. L., & Santos, R. G. (2012). Assessment of learning outcomes (Assessment 1). Quezon City: Lorimar Publishing, Inc.
Navarro, R. L., et al. (2017). Assessment of learning 1. Quezon City: Lorimar Publishing, Inc.

D. Electronic References (E-books/Websites)

DepEd Order. (2016). Guidelines on the request and transfer of learner's school records. Retrieved from http://www.deped.gov.ph/2016-guidelines-on-the-request-and-transfer-of-learners-school-records/

REVISION HISTORY
Revision Number: 1; Date of Revision: June 2020; Date of Implementation: 1st Semester, 2020-2021; Highlights of Revision: Format and Flexible Learning Mode of Delivery

Prepared by:
MARIA JOVYLYN M. TAN
Instructor, Department of Teacher Education
CP #: 09664921554
E-mail Address: [email protected]
Consultation Schedule: Thurs. 10:00-12:00
Date Prepared: September 6, 2021

Evaluated by:
FLORENCIO R. ABANES, EdD
Department Chairperson, Department of Teacher Education
E-mail Address: __________________
Date Evaluated: ___________________

Approved:
AMMIE P. FERRER, Ph.D.
College Dean, College of Education
Date Approved: ________________________


ASSESSMENT IN LEARNING 1
Table of Contents

I. Shift of Educational Focus (Content to Learning Outcomes)
   a) Outcome-Based Education
   b) Outcomes of Education
   c) Institutional, Program, Course, and Learning Outcomes
   d) Sample Educational Objectives
II. Measurement, Assessment, and Evaluation
   a) Measurement
   b) Assessment
   c) Assessment Principles
   d) Approaches to Assessment
   e) Evaluation
III. Roles of Assessment in Instructional Decisions
   a) Placement
   b) Diagnostic
   c) Formative
   d) Summative
IV. Instructional Objectives and Learning Outcomes
   A. Taxonomy of Educational Objectives
      a) Cognitive Domain
      b) Psychomotor Domain
      c) Affective Domain
      d) Principles of Good Assessment
V. Development of Classroom Assessment Tools for Measuring Learning Outcomes
   a) Test Planning
      1) Table of Specifications
      2) Guidelines
      3) Preparation
   b) Steps in Constructing Assessment Tools
   c) Test Designing and Construction
      1) Types of Test
      2) Guidelines on Test Construction
VI. Organization, Analysis, and Interpretation of Assessment Data
   a) Item Analysis
   b) Organization of Test Data Using Tables and Graphs
   c) Introduction to Statistical Concepts
VII. Measures of Central Tendency: Mean, Median, Mode
VIII. Utilization and Communication of Assessment Data
   1. Grading and Reporting of Test Results
      a) Purpose of Grading
      b) Grading System of the Philippine K-12 Program
IX. References

EDUC 75 - ASSESSMENT IN LEARNING I


I. Shift of Educational Focus from Content to Learning Outcomes

INTRODUCTION
• Education originated from the terms "educare" or "educere," which meant "to draw out."
• However, for centuries we succeeded in perpetuating the belief that education is a "pouring in" process wherein the teacher was the infallible giver of knowledge and the student was the passive recipient.
• With the knowledge explosion, students are surrounded by various sources of facts and information accessible through user-friendly technology. The teacher has become a facilitator of knowledge who assists in the organization, interpretation, and validation of acquired facts and information.

OUTCOME-BASED EDUCATION
Matching Intentions with Accomplishment. The change in educational perspective called Outcome-based Education (OBE) has three (3) characteristics:
1. Student-centered: it places the students at the center of the process by focusing on Student Learning Outcomes (SLO).
2. Faculty-driven: it encourages faculty responsibility for teaching, assessing program outcomes, and motivating participation from the students.
3. Meaningful: it provides data to guide the teacher in making valid and continuing improvements in instruction and assessment activities.

Procedure to Implement Outcome-Based Education:

1. Identification of the educational objectives of the subject/course.
Educational objectives are the broad goals that the subject/course expects to achieve, and they define in general terms the knowledge, skills, and attitude that the teacher will help the students to attain. The objectives are stated from the point of view of the teacher, such as: "to develop, to provide, to enhance, to inculcate," etc.

2. Listing of learning outcomes specified for each subject/course objective, using active verbs.
Since subject/course objectives are broadly stated, they do not provide a detailed guide to be teachable and measurable. Learning outcomes are stated with concrete active verbs such as: to demonstrate, to explain, to differentiate, to illustrate, etc. A good source of learning outcome statements is the taxonomy of educational objectives by Benjamin Bloom (Bloom's Taxonomy), which covers three domains:

Cognitive: also called knowledge, refers to mental skills such as remembering, understanding, applying, analyzing, evaluating, and synthesizing/creating.
Psychomotor: also referred to as skills, includes manual or physical skills, which proceed from mental activities and range from the simplest to the complex, such as observing, imitating, practicing, adapting, and innovating.
Affective: also known as attitude, refers to growth in feelings or emotions, from the simplest behavior to the most complex, such as receiving, responding, valuing, organizing, and internalizing.

3. Drafting the outcomes assessment procedure.
This procedure will enable the teacher to determine the degree to which the students are attaining the desired learning outcomes. It identifies for every outcome the data that will be gathered, which will guide the selection of the assessment tools to be used and at what point assessment will be done.


THE OUTCOMES OF EDUCATION

Outcomes-based education focuses classroom instruction on the skills and competencies that students must demonstrate when they exit. There are two (2) types of outcomes: immediate and deferred outcomes.

Immediate Outcomes are competencies/skills acquired upon completion of an instruction, a subject, a grade level, or a program. Examples:
• Ability to communicate in writing and speaking
• Mathematical problem-solving skill
• Skill in identifying objects by using the different senses
• Ability to produce artistic or literary work
• Ability to do research and write the results

Deferred Outcomes refer to the ability to apply cognitive, psychomotor, and affective skills in various situations many years after completion of a degree program. Examples:
• Success in professional practice or occupation
• Promotion in a job
• Success in career planning, health and wellness
• Awards and recognition

Example of Educational Objectives and Learning Outcomes

1. Educational objective: To provide instruction that will enable the students to understand their immediate physical environment by using their senses, questioning, sharing ideas, and identifying simple cause-and-effect relationships (cognitive objective).
Learning outcomes: 1.1 The students can describe the characteristics of their physical environment using their five senses. 1.2 The students can formulate questions about how the physical environment impacts individual well-being.

2. Educational objective: To equip the students with the skill to conduct a guided investigation by following a series of steps that includes making and testing predictions, collecting and recording data, discovering patterns, and suggesting possible explanations (psychomotor objective).
Learning outcomes: 2.1 The students can perform simple experiments efficiently. 2.2 The students can make use of results to discover patterns that support the hypothesis.

3. Educational objective: To encourage among the students a deep understanding and appreciation of the differences of the plant and animal groups found in the locality (affective objective).
Learning outcomes: 3.1 The students can demonstrate sensitivity towards the difference between plants and animals. 3.2 The students can appreciate the distinct characteristics of plants and of animals.

The outcomes in Outcome-based Education come in four levels:

Institutional Outcomes: statements of what graduates of the institution are supposed to be able to do.
Program Outcomes: statements of what graduates from a particular degree program should be able to do.
Course Outcomes: statements of what students should be able to do after the completion of a given course.
Lesson Outcomes: statements of what a student is expected to be able to do as a result of a learning activity.

II. DETERMINING THE PROGRESS TOWARDS THE ATTAINMENT OF LEARNING OUTCOMES

INTRODUCTION
● Measurement, assessment, and evaluation play a major role in the attainment of learning outcomes.
● There are different types of measurements, as well as assessments, that an educator can use to attain a specific learning outcome.
● There are three approaches to assessment: Assessment AS learning, Assessment FOR learning, and Assessment OF learning.

MEASUREMENT
• Measurement is the process of describing as well as determining the attributes or characteristics of an object in terms of quantity.
• Aside from physical objects, we can also measure abstract ones. Abstract attributes can be measured through numbers (e.g., grades).

There are two TYPES of measurement:
1. OBJECTIVE
• Standardized tests; they yield more or less the same outcome regardless of who scores them.
• Ex.: height, weight, answers in a multiple-choice test.
2. SUBJECTIVE
• Dependent on the assessor, so the assessor's bias can affect the result; outcomes vary.
• Ex.: scores of each group during a performance task.

MEASUREMENT INDICATORS

VARIABLES
• A quantity or factor that can assume any given value or set of values.
• Educational variables (denoted by a letter of the English alphabet, like X) are measurable characteristics of a student.
• A variable can be directly measurable (e.g., X = age of student), but most often it cannot be directly measured (e.g., X = class participation of student A).

INDICATORS
• The building blocks of educational measurement upon which other forms of measurement are built.
• A group of indicators constitutes a variable.
• Indicators (I) denote the presence or absence of a measured characteristic.

FACTORS
• Formed through a group of variables.

ASSESSMENT
• From the Latin word ASSIDERE, meaning "to sit beside."
• The process of gathering evidence on the performance of a student over a period of time to identify learning and mastery of skills.

There are two kinds of assessment:
1. ASSESSMENT OF UNDERSTANDING: a test is a common kind of assessment; it measures the students' understanding of a certain topic.
2. ASSESSMENT OF SKILL: the students' skills are tested, typically through performance-based tasks.

IMPORTANCE OF ASSESSMENT
We assess to IMPROVE, to INFORM, and to PROVE. Aside from those three, assessment is also done:
1. To find out what the students know (KNOWLEDGE)
2. To determine what students can do and how they can do it (SKILL; PERFORMANCE)
3. To find out how students will do the task (PROCESS)
4. To determine how students feel about their work (MOTIVATION; EFFORT)


TYPES OF ASSESSMENT / APPROACHES TO ASSESSMENT

1. ASSESSMENT AS LEARNING
Self-monitoring; the use of assessment tools in helping students to monitor their own performance, strengths, and weaknesses.

2. ASSESSMENT FOR LEARNING
Formative, meaning it is done before, during, and after the learning process.

3. ASSESSMENT OF LEARNING
Summative, meaning it is done after a unit or semester, for grading purposes and to judge learning.

PRINCIPLES OF ASSESSMENT

1. Assessments should be valid.
An assessment must measure what it intends to measure; it must be clear, and the skill or knowledge to be assessed must match the assessment task in order to come up with a valid result.

2. Assessments should be reliable.
The assessment tools must produce stable, consistent results.

3. Assessments should be equitable.
Equitable means fair and free from discrimination. Every student must have an opportunity to share their knowledge and skills with the class.

4. Assessments should be explicit and transparent.
The teacher's expectations must be known and clear to the students, as well as how the students will be graded.

5. Assessments should support the learning process.
Assessment should be a main driver of the students' progress; feedback helps students improve.

III. STUDENT OUTCOMES AND STUDENT LEARNING OUTCOMES

PROGRAM OUTCOMES AND STUDENT LEARNING OUTCOMES

Program outcomes examine what a program or process is to do, achieve, or accomplish for its own improvement and/or in support of institutional or divisional goals; they are generally numbers-, needs-, or satisfaction-driven. They can address quality, quantity, fiscal sustainability, facilities and infrastructure, or growth.

While a learning outcome describes the learning impact that a program or service has on students and stakeholders, a program or operational outcome describes what the program will do. As with learning outcomes, these outcomes can serve as an important agreement between a unit and its stakeholders. Related to learning goals, these are the specific, measurable statements of the knowledge, skills, attitudes, and habits of mind that students acquire as a result of the learning experience.

Objectives are the tasks to be accomplished in order to achieve the goal. They are the means to the end, the process that will lead to an outcome. Outcomes are the results of instruction, without regard to process. A standard is a clear definition of expectations, or targets for student performance, against which we gauge success in achieving an outcome. Often based upon a rating scale or grading rubric, standards differentiate outstanding, adequate, and poor performance.

THE THREE TYPES OF LEARNING

There is more than one type of learning. A committee of colleges, led by Benjamin Bloom (1956), identified three domains of educational activities:
1. Cognitive: mental skills (Knowledge)
2. Affective: growth in feelings or emotional areas (Attitude)
3. Psychomotor: manual or physical skills (Skills)

Since the work was produced by higher education, the words tend to be a little bigger than we normally use. Domains can be thought of as categories. Trainers often refer to these three domains as Knowledge, Skills, and Attitude (KSA). This taxonomy of learning behaviors can be thought of as "the goals of the training process." That is, after the training session, the learner should have acquired new skills, knowledge, and/or attitudes.

They also produced an elaborate compilation for the cognitive and affective domains, but none for the psychomotor domain. Their explanation for this oversight was that they had little experience in teaching manual skills at the college level.

This compilation divides the three domains into subdivisions, starting from the simplest behaviors to the most complex. The divisions outlined are not absolutes, and there are other systems or hierarchies that have been devised in the educational and training world. However, Bloom's taxonomy is easily understood and is probably the most widely applied one in use today.

The original Bloom's educational taxonomy. Bloom et al. (1956) published the taxonomy of educational objectives for the cognitive domain, classifying forms and levels of learning based on the cognitive processes that learners engage in when they learn. Bloom considered his initial effort to be a starting point, as evidenced in a memorandum from 1971 in which he stated, "Ideally each major field should have its own taxonomy in its own language – more detailed, closer to the special language and thinking of its experts, reflecting its own appropriate sub-divisions and levels of education, with possible new categories, combinations of categories and omitting categories as appropriate."

Anderson's revised taxonomy as a match to Bloom's taxonomy. Anderson (1990), a former student of Bloom, updated and revised the taxonomy, reflecting its relevance to 21st-century work for both students and teachers (Anderson & Krathwohl, 2001). Anderson changed the taxonomy in three broad categories: terminology, structure, and emphasis (Forehand, 2005). Anderson modified the original terminology by changing Bloom's categories from nouns to verbs, renaming the knowledge category into remember, comprehension into understanding, and synthesis into create. Anderson also changed the order of synthesis and placed it at the top of the triangle under the name of Create (Taylor & Francis, 2002). Thus, Anderson and Krathwohl's (2001) revised Bloom's taxonomy became: Remember, Understand, Apply, Analyze, Evaluate, and Create.

THE COGNITIVE DOMAIN

The cognitive domain of learning involves thinking about facts, terms, concepts, ideas, relationships, patterns, conclusions, etc. A common taxonomy utilized to document learning within the cognitive domain is Bloom's Taxonomy (as revised by Krathwohl et al.). Bloom's Taxonomy organizes cognitive levels of learning into the following domains, escalating in complexity from remember to create.

In the revised taxonomy, knowledge is at the basis of these six cognitive processes, but its authors created a separate taxonomy of the types of knowledge used in cognition:

• Factual Knowledge
  - Knowledge of terminology
  - Knowledge of specific details and elements
• Conceptual Knowledge
  - Knowledge of classifications and categories
  - Knowledge of principles and generalizations
  - Knowledge of theories, models, and structures
• Procedural Knowledge
  - Knowledge of subject-specific skills and algorithms
  - Knowledge of subject-specific techniques and methods
  - Knowledge of criteria for determining when to use appropriate procedures
• Metacognitive Knowledge
  - Strategic knowledge
  - Knowledge about cognitive tasks, including appropriate contextual and conditional knowledge
  - Self-knowledge

THE AFFECTIVE DOMAIN

The affective domain (Krathwohl, Bloom, and Masia, 1973) includes the manner in which we deal with things emotionally, such as feelings, values, appreciation, enthusiasms, motivations, and attitudes. The five major categories are listed from the simplest behavior to the most complex.

"The affective domain includes the manner in which we deal with things emotionally, such as feelings, values, appreciation, enthusiasms, motivations, and attitudes." – Donald Clark

THE PSYCHOMOTOR DOMAIN

The psychomotor domain (Simpson, 1972) includes physical movement, coordination, and use of the motor-skill areas. Development of these skills requires practice and is measured in terms of speed, precision, distance, procedures, or techniques in execution. The seven major categories are listed from the simplest behavior to the most complex.

DIGITAL TAXONOMY

The purpose of Bloom's Digital Taxonomy is to inform instructors of how to use technology and digital tools to facilitate student learning experiences and outcomes. It aims "to expand upon the skills associated with each level as technology becomes a more ingrained, essential part of learning."

The use of this adapted version and the examples of tools it provides should focus "not on the tools themselves, but rather on how the tools can act as vehicles for transforming student thinking at different levels." Outlined below are the levels featured within Bloom's Revised Taxonomy. Each level is accompanied by a description of its relevance and examples of digital tools that connect with this taxonomy framework.

• Creating. To produce new or original work. Tools: animating, blogging, filming, podcasting, publishing, simulating, wiki building, video blogging, programming, directing.
• Evaluating. To justify a stand or decision; to make judgments based on criteria and standards through checking and critiquing. Tools: grading, networking, rating, testing, reflecting, reviewing, blog commenting, posting, moderating.
• Analyzing. To draw connections among ideas and concepts, or to determine how each part interrelates to an overall structure or purpose. Tools: mashing, mind mapping, surveying, linking, validating.
• Applying. To use information in new situations such as models, diagrams, or presentations. Tools: calculating, charting, editing, hacking, presenting, uploading, operating, sharing with a group.
• Understanding. To explain ideas or concepts, or to construct meaning from written material or graphics. Tools: advanced searching, annotating, blog journaling, tweeting, tagging, commenting, subscribing.
• Remembering. To recall facts and basic concepts, or to retrieve material. Tools: bookmarking, copying, googling, bullet-pointing, highlighting, group networking, searching.

IV. ASSESSING STUDENT LEARNING OUTCOMES

Assessment plays an important role in the process of learning and motivation. The types of assessment tasks that we ask our students to do determine how students will approach the learning task and what study behavior they will use. In the words of higher education scholar John Biggs, "What and how students learn depends to a major extent on how they think they will be assessed."

• The assessment of student learning starts with the learning institution's mission and core values.
• Assessment works best when the program has a clear statement of objectives aligned with the institutional mission and core values.
• Outcomes-based assessment focuses on the student activities that will still be relevant after formal schooling concludes.
• Design assessment activities that are observable and less abstract, such as determining the student's ability to write a paragraph, which is more observable than determining the student's verbal ability.
• Assessment requires attention not only to outcomes but also, and equally, to the activities and experiences that lead to the attainment of learning outcomes.
• Assessment works best when it is continuous and ongoing, not episodic.
• Assessment should be cumulative, because improvement is best achieved through a linked series of activities done over time in an instructional cycle.

9 PRINCIPLES OF GOOD PRACTICE FOR ASSESSING STUDENT LEARNING

• The assessment of student learning begins with educational values. Assessment is not an end in itself but a vehicle for educational improvement. Its effective practice, then, begins with and enacts a vision of the kinds of learning we most value for students and strive to help them achieve. Educational values should drive not only what we choose to assess but also how we do so. Where questions about educational mission and values are skipped over, assessment threatens to be an exercise in measuring what's easy, rather than a process of improving what we really care about.
• Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time.


Learning is a complex process. It entails not only what students know but what they can do with what they know; it involves not only knowledge and abilities but values, attitudes, and habits of mind that affect both academic success and performance beyond the classroom. Assessment should reflect these understandings by employing a diverse array of methods, including those that call for actual performance, using them over time so as to reveal change, growth, and increasing degrees of integration. Such an approach aims for a more complete and accurate picture of learning, and therefore firmer bases for improving our students' educational experience.
• Assessment works best when the programs it seeks to improve have clear, explicitly stated purposes. Assessment is a goal-oriented process. It entails comparing educational performance with educational purposes and expectations -- those derived from the institution's mission, from faculty intentions in program and course design, and from knowledge of students' own goals. Where program purposes lack specificity or agreement, assessment as a process pushes a campus toward clarity about where to aim and what standards to apply; assessment also prompts attention to where and how program goals will be taught and learned. Clear, shared, implementable goals are the cornerstone for assessment that is focused and useful.
• Assessment requires attention to outcomes but also and equally to the experiences that lead to those outcomes. Information about outcomes is of high importance; where students "end up" matters greatly. But to improve outcomes, we need to know about student experience along the way -- about the curricula, teaching, and kind of student effort that led to particular outcomes. Assessment can help us understand which students learn best under what conditions; with such knowledge comes the capacity to improve the whole of their learning.
• Assessment works best when it is ongoing, not episodic. Assessment is a process whose power is cumulative. Though isolated, "one-shot" assessment can be better than none, improvement is best fostered when assessment entails a linked series of activities undertaken over time. This may mean tracking the progress of individual students, or of cohorts of students; it may mean collecting the same examples of student performance or using the same instrument semester after semester. The point is to monitor progress toward intended goals in a spirit of continuous improvement. Along the way, the assessment process itself should be evaluated and refined in light of emerging insights.
• Assessment fosters wider improvement when representatives from across the educational community are involved. Student learning is a campus-wide responsibility, and assessment is a way of enacting that responsibility. Thus, while assessment efforts may start small, the aim over time is to involve people from across the educational community. Faculty play an especially important role, but assessment's questions can't be fully addressed without participation by student-affairs educators, librarians, administrators, and students. Assessment may also involve individuals from beyond the campus (alumni/ae, trustees, employers) whose experience can enrich the sense of appropriate aims and standards for learning. Thus understood, assessment is not a task for small groups of experts but a collaborative activity; its aim is wider, better-informed attention to student learning by all parties with a stake in its improvement.
• Assessment makes a difference when it begins with issues of use and illuminates questions that people really care about. Assessment recognizes the value of information in the process of improvement. But to be useful, information must be connected to issues or questions that people really care about. This implies assessment approaches that produce evidence that relevant parties will find credible, suggestive, and applicable to decisions that need to be made. It means thinking in advance about how the information will be used, and by whom. The point of assessment is not to gather data and return "results"; it is a process that starts with the questions of decision-makers, that involves them in the gathering and interpreting of data, and that informs and helps guide continuous improvement.
• Assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change. Assessment alone changes little.


Its greatest contribution comes on campuses where the quality of teaching and learning is visibly valued and worked at. On such campuses, the push to improve educational performance is a visible and primary goal of leadership; improving the quality of undergraduate education is central to the institution's planning, budgeting, and personnel decisions. On such campuses, information about learning outcomes is seen as an integral part of decision making, and avidly sought.
• Through assessment, educators meet responsibilities to students and to the public. There is a compelling public stake in education. As educators, we have a responsibility to the publics that support or depend on us to provide information about the ways in which our students meet goals and expectations. But that responsibility goes beyond the reporting of such information; our deeper obligation -- to ourselves, our students, and society -- is to improve. Those to whom educators are accountable have a corresponding obligation to support such attempts at improvement.

OTHER SOURCES: PRINCIPLES OF GOOD PRACTICE IN ASSESSING LEARNING OUTCOMES

1. The assessment of student learning starts with the institution's vision, mission, and core values. There should be a clear statement on the kinds of learning that the institution values most for its students.

2. Assessment works best when the program has a clear statement of objectives aligned with the institutional vision, mission, and core values. Such alignment ensures clear, shared, and implementable objectives.

3. Outcome-based assessment focuses on the student activities that will still be relevant after formal schooling concludes. The approach is to design assessment activities which are observable and less abstract, such as "to determine the student's ability to write a paragraph," which is more observable than "to determine the student's verbal ability."

4. Assessment requires attention not only to outcomes but also and equally to the activities and experiences that lead to the attainment of learning outcomes.

5. Assessment works best when it is continuous, ongoing, and not episodic. Assessment should be cumulative because improvement is best achieved through a linked series of activities done over time in an instructional cycle.

6. Begin assessment by specifying clearly and exactly what you want to assess. What you want to assess is stated in your learning outcomes/lesson objectives.

7. The intended learning outcome/lesson objective, NOT CONTENT, is the basis of the assessment task. You use content in the development of the assessment tool and task, but it is the attainment of your learning outcome, NOT content, that you want to assess. This is Outcome-based Teaching and Learning.

8. Set your criterion of success or acceptable standard of success. It is against this established standard that you will interpret your assessment results. (Example: Is a score of 7 out of 10, the highest possible score, acceptable or considered a success?)

9. Make use of varied tools for assessment data-gathering and multiple sources of assessment data. It is not pedagogically sound to rely on just one source of data gathered by only one assessment tool. Consider multiple intelligences and learning styles. DepEd Order No. 73, s. 2012 cites the use of multiple measures as one assessment guideline.

10. Learners must be given feedback about their performance. Feedback must be specific. "Good work!" is positive feedback and is welcome, but it is actually not very good feedback since it is not specific. A more specific, better feedback is: "You observed rules on subject-verb agreement and variety of sentences. Three of your commas were misplaced."

11. Assessment should be on real-world application and not on out-of-context drills.

12. Emphasize the assessment of higher-order thinking.

13. Provide opportunities for self-assessment.

SAMPLE OF SUPPORTING STUDENT ACTIVITIES

Critical Thinking/Problem Solving


Definition: Exercise sound reasoning and logic to analyze issues, make decisions, and overcome problems. The individual is able to obtain, interpret, and use knowledge and experience, facts, and data in the process, and may demonstrate creativity and inventiveness.
Outcomes:
• Ability to understand the value of making decisions based on the conditions at hand, whether individually or collaboratively.
• Understands how to solve problems effectively and efficiently.
• Gains knowledge and experience through researching and analyzing information for more thorough understanding.

Oral/Written Communication
Definition: Articulate thoughts and ideas clearly and effectively in written and oral forms to persons inside and outside of the organization. The individual has public speaking skills; is able to express ideas to others; and can write/edit emails, letters, and complex technical reports clearly and effectively.
Outcomes:
• Ability to effectively communicate verbally with others one on one, in groups, and/or in front of large audiences.
• Capable of communicating in written format, including demonstrating a clear organization of one's thoughts, using words that reflect one's intended meaning, and delivering the information in a readable, clear, and concise manner.
• Ability to effectively communicate non-verbally.

Teamwork/Collaboration
Definition: Build collaborative relationships with colleagues and customers; is able to work within a team structure; and can negotiate and manage conflict.
Outcomes:
• Ability to collaborate successfully.
• Capable of negotiating conflict effectively.
• Understands the process of group development.

Digital Technology
Definition: Select and use appropriate technology to accomplish a given task. The individual is also able to apply computing skills to solve problems.
Outcomes:
• Ability to select and use appropriate technology to accomplish a given task.
• Capable of solving problems through computing skills.
• Develops professionally through the use of online/computer-based tutorials and trainings.

Leadership
Definition: Leverage the strengths of others to achieve common goals, and use interpersonal skills to coach and develop others. The individual is able to assess and manage his/her emotions and those of others; use empathetic skills to guide and motivate; and organize, prioritize, and delegate work.
Outcomes:
• Ability to motivate and empower others.
• Utilizes others' gifts effectively.

Professionalism/Work Ethics
Definition: Demonstrate personal accountability and effective work habits. Demonstrates integrity and ethical behavior, acts responsibly with the interests of the larger community in mind, and is able to learn from his/her mistakes.
Outcomes:
• Takes responsibility for personal behavior and acts in an ethical manner.
• Places a value on taking initiative.
• Motivated to follow through on responsibilities.
• Articulates the value of reflecting on experiences to apply learning in the future.
• Punctuality, working productively with others, time and workload management, and understanding the impact of non-verbal communication on one's professional work image.

Career Management
Definition: Identify and articulate one's skills, strengths, knowledge, and experiences relevant to career goals; identify areas necessary for professional growth; and utilize campus resources to develop post-graduation employment skills.
Outcomes:
• Completion of the Career Services and Talent Development FOCUS test.
• Participation in departmental opportunities.

PHASES OF OUTCOMES CONSTRUCTIVE ALIGNMENT

"Constructive alignment is an outcomes-based approach to teaching in which the learning outcomes that students are intended to achieve are defined before teaching takes place. Teaching and assessment methods are then designed to best achieve those outcomes and to assess the standard at which they have been achieved." (Biggs, 2014)

Outcomes-based teaching and learning is based on meeting set standards of teaching and learning to ensure students meet the requirements for a degree. Assessment is marked against criteria referenced to the outcome.

The Outcomes Assessment phase of the instructional cycle is focused specifically on identifying the most appropriate types of assessments for our own students as well as for use within our building or school. This phase "includes decisions about the goals to be pursued, the assessment tools that will best achieve those goals, and how the data from those assessments will be used" (McCormick & Jones, 2012, p. 123).

There are many different types of assessments that can be used; they each provide different types of information that can benefit a variety of learners and grade levels. The types of assessments that we should be using in order to provide optimal instruction and intervention can be broken down into three broad categories: diagnostic assessments, summative assessments, and formative assessments.

Diagnostic Assessment Tools

Diagnostic assessments, such as standardized tests and teacher-created tests (TCT), provide information about students' knowledge, skills, or abilities. They are focused on determining whether a student has mastered all of his or her subject matter. For example, there are state-mandated tests in the 8th and 9th grade that allow teachers to determine if their students have mastered the content standards expected for that grade level.

Summative Assessment Tools

Summative assessments are similar to diagnostic assessments, except that they are focused on determining whether a student has mastered the skills and knowledge expected for a specific grade level. Summative assessments include tests, portfolios, and other forms of assessment.

VARIETY OF ASSESSMENT METHODS, TOOLS, AND TASKS

There are several good reasons to consider offering a variety of assessment methods, beyond the typical quiz/test/exam:

Students need to understand concepts deeply, as opposed to memorizing information and reproducing it on an exam, so they can handle advanced course work and later work effectively in their chosen field.

Students need to be able to apply knowledge in authentic learning and assessment activities to develop the skills necessary for work in their chosen field.

Students have diverse abilities, backgrounds, interests, and learning styles, so assessment variety puts all students on a level playing field in terms of demonstrating what they know and can do.

This statement is typical: "In addition to knowledge and technical proficiency in core content areas, … professionals need well-developed oral and written communication skills, need to be able to work well in interdisciplinary teams (either as leaders or team members), need 'people' skills to successfully interact with a diverse set of colleagues and stakeholders, and need a well-developed appreciation of professionalism and ethics" (Abbott, 34).

To develop these types of abilities, students need to engage in authentic learning assessment activities. Authentic assessments:

Require application of knowledge and skills in a "real world" context (realistic, even if it's an artificial learning environment).


• Involve unstructured, complex problems that may have multiple solutions. Such problems contain both relevant and irrelevant factors, unlabelled, just like real life, and students need to decide what's relevant and to develop a solution they can explain and defend.
• Require students to "perform" discipline-specific activities or procedures, drawing on a wide range of knowledge and skills.
• Provide feedback, practice, and opportunities to revise and resubmit solutions, so they can refine their skills, rather like an apprenticeship between the instructor/TA experts and students.

Authentic assessments include such things as performance demonstrations of specific skills, use and manipulation of tools and instruments, oral and/or poster presentations, debates, panel discussions, role plays, teaching others, conducting experiments, and conducting interviews. Also included are "product assessments" such as essays, research reports, annotated bibliographies, data analysis and interpretation, argument construction and analysis, reviews, critiques and analysis of written work, problem analysis, planning, mapping, budget development, experimental design, peer editing, portfolios, posters, games, and podcast, video, and multimedia productions (Abbott, 37).

Begin assignment and assessment design by focusing on learning outcomes: what do you want students to remember, understand, apply, analyze, evaluate, or create (Davis, 362)? The table below outlines a variety of assignment and assessment options with rationales for using them and implementation details.

SCORING RUBRICS

HOW TO CREATE A GRADING RUBRIC
1. Define the purpose of the assignment/assessment for which you are creating a rubric.
Consider the following:
• What exactly is the assigned task? Does it break down into a variety of different tasks? Are these tasks equally important?
• What are the learning objectives for this assignment/task?
• What do you want students to demonstrate in their completed assignments/performances?
• What might an exemplary student product/performance look like? How might you describe an acceptable student product/performance? How might you describe work that falls below expectations?
• What kind of feedback do you want to give students on their work/performance? Do you want/need to give them a grade? Do you want to give them a single overall grade? Do you want to give them detailed feedback on a variety of criteria? Do you want to give them specific feedback that will help them improve their future work?

2. Decide what kind of rubric you will use: a holistic rubric or an analytic rubric? Holistic and analytic rubrics use a combination of descriptive rating scales (e.g., weak, satisfactory, strong) and assessment criteria to guide the assessment process.

Holistic rubric
A holistic rubric uses rating scales that include the criteria. For example:
Weak: thesis is unclear due to writing style, organization of ideas, and/or grammatical errors.
Satisfactory: overall thesis is clear; writing style and organization mostly support the thesis.
Strong: introduction includes a thesis statement; writing style and organization offer ample evidence to support the overall thesis.

Advantages:
• Emphasis on what the learner can demonstrate (rather than what she cannot)
• Saves time by minimizing the number of decisions made
• Can be used consistently across raters, provided there has been training


Disadvantages:
• Does not provide specific feedback for improvement
• Can be difficult to choose a score when student work is at varying levels across the criteria
• Criteria cannot be weighted
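To make the holistic/analytic distinction concrete, here is a minimal Python sketch. The levels, criteria, and weights are invented placeholders for illustration only; an actual rubric would use the descriptors and criteria agreed upon for the assignment.

```python
# Minimal sketch: holistic vs. analytic rubric scoring.
# The levels, criteria, and weights below are illustrative placeholders,
# not an official rubric.

HOLISTIC_LEVELS = {"weak": 1, "satisfactory": 2, "strong": 3}

def holistic_score(overall_level: str) -> int:
    """A holistic rubric yields one overall judgment of the work."""
    return HOLISTIC_LEVELS[overall_level]

def analytic_score(ratings: dict, weights: dict) -> float:
    """An analytic rubric rates each criterion separately, then combines them.
    `ratings` holds a level (1-3) per criterion; `weights` sum to 1.0."""
    return sum(ratings[c] * weights[c] for c in ratings)

if __name__ == "__main__":
    print(holistic_score("satisfactory"))               # -> 2
    ratings = {"thesis": 3, "organization": 2, "mechanics": 2}
    weights = {"thesis": 0.5, "organization": 0.3, "mechanics": 0.2}
    print(analytic_score(ratings, weights))              # -> 2.5
```

The point of the sketch is only the design difference: a holistic rubric produces one overall judgment, while an analytic rubric rates each criterion and then combines the ratings.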


V. DISTINGUISHING AND CONSTRUCTING VARIOUS PAPER AND PENCIL TESTS
5.1 PLANNING A TEST
5.2 TYPES OF PAPER AND PENCIL TESTS
5.3 CONSTRUCTING SELECTED RESPONSE TYPE
• TRUE-FALSE TEST
• MULTIPLE CHOICE
• MATCHING TYPE
5.4 CONSTRUCTING SUPPLY TYPE OR CONSTRUCTED RESPONSE TYPE
• COMPLETION TYPE OF TESTS
• ESSAYS
• TYPES OF ESSAYS

PLANNING A TEST

The important steps in planning a test are:

1) Identify Test Objectives or Lesson Outcomes

2) Deciding on the type of objective test to be prepared

3) Preparing a Table of Specification (TOS)

4) Constructing the draft test items

5) Try-out and validation

1. IDENTIFY TEST OBJECTIVES OR LESSON OUTCOMES

An objective test, if it is to be comprehensive, must cover the various levels of Bloom's Taxonomy. Each objective consists of a statement of what is to be achieved, preferably by the students.
Example:

Topic: Subject-Verb Agreement in English


(With both old and revised versions of Bloom's Taxonomy)
• Knowledge/Remembering: Identify the subject and the verb in
the sentence.
• Comprehension/Understanding: Determine the appropriate
form of a verb in a given sentence.
• Application/Applying: Write sentences observing rules on
subject-verb agreement.
• Analysis/Analyzing: Break down a given sentence into its
subject and predicate.
• Evaluation/Evaluating: Evaluate whether or not a sentence
observes rules on subject-verb agreement.
• Synthesis/Creating: Formulate rules to be followed regarding
the subject-verb agreement.
2) DECIDING ON THE TYPE OF OBJECTIVE TEST TO BE
PREPARED

• The test objectives guide the kind of objective tests that will
be designed and constructed by the teacher. This means
aligning the test with the lesson objective or outcome.


• At all times, the test formulated must be aligned with the learning outcome. This is the principle of constructive alignment.

3) PREPARING A TABLE OF SPECIFICATION (TOS)
Table of Specification or TOS - a test map that guides the teacher in constructing a test. This ensures that there is a balance between items that test lower-level thinking skills and those which test higher-order thinking skills (or alternatively, a balance between easy and difficult items) in the test.
The simplest TOS consists of four columns:
• Level of the objective to be tested
• Statement of the objective
• Item numbers where such an objective is being tested
• Number of items and percentage out of the total for the particular objective
• Total number of items
A minimal sketch of such a TOS is shown below.
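The sketch below is a minimal Python illustration of the idea of a TOS as a test map. The objectives, their percentage shares, and the 20-item total are hypothetical examples built around the subject-verb agreement topic used earlier; they are not taken from an official TOS.

```python
# Minimal sketch of a simple Table of Specification (TOS).
# Objectives and their percentage weights are hypothetical examples.

def build_tos(objectives, total_items):
    """Allocate consecutive item numbers to each objective
    according to its share of the total test."""
    tos, next_item = [], 1
    for level, statement, share in objectives:
        n_items = round(total_items * share)
        item_numbers = list(range(next_item, next_item + n_items))
        tos.append({
            "level": level,
            "objective": statement,
            "item_numbers": item_numbers,
            "no_of_items": n_items,
            "percentage": share * 100,
        })
        next_item += n_items
    return tos

objectives = [
    ("Remembering", "Identify the subject and the verb", 0.30),
    ("Understanding", "Determine the appropriate verb form", 0.40),
    ("Applying", "Write sentences observing S-V agreement", 0.30),
]

for row in build_tos(objectives, total_items=20):
    print(row)
```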
4) CONSTRUCTING THE DRAFT TEST ITEMS
• The actual construction of the test items follows the TOS.
• As a general rule, it is advised that the actual number of items to be constructed in the draft should be DOUBLE THE DESIRED NUMBER OF ITEMS.

5) ITEM ANALYSIS/TRY-OUT AND VALIDATION
• The test draft is tried out with a group of pupils or students.
• Its purposes are to determine:
a. Item characteristics through item analysis
b. Characteristics of the test itself - validity, reliability, and practicability

THINGS TO CONSIDER IN WRITING TEST ITEMS:
1. Use the Table of Specifications (TOS) as a guide to item writing.
2. Construct more items than needed.
3. Write the items ahead of the testing date.
4. Write each test item at an appropriate reading level and difficulty.
5. Write each test item in a way that does not provide help in answering other test items.
6. Write each test item so that the task to be done is clearly defined.
7. Write test items whose answers would be agreed upon by experts.
8. Whenever a test is revised, recheck its relevance.

TYPES OF PAPER AND PENCIL TESTS:
1) Selected Response Type - "choices"
a. True-False Items or Correct-Wrong
b. Multiple-choice Items (more than two, or 3+ choices)
c. Matching Type
2) Supply or Constructed Response Type - "self-construct"
a. Enumeration
b. Completion
c. Essay

CONSTRUCTING SELECTED RESPONSE TYPE
1. TRUE OR FALSE
- also known as a binomial-choice or alternate-response test
- always has two options: True or False, Right or Wrong, Yes or No, Fact or Bluff
• Ensure that a true-false item is able to discriminate properly between those who know and those who are guessing.
• A modified true-false test can offset the effect of guessing by requiring the students to explain their answer and to disregard a correct answer if the explanation is incorrect.

GUIDELINES FOR CONSTRUCTING A TRUE OR FALSE TEST
1. Do not give a hint (inadvertently) in the body of the question.
Example: The Philippines gained its independence in 1898 and therefore celebrated its centennial year in 2000.
Answer: False, because 100 years from 1898 is not 2000 but 1998.
2. Avoid using the words "always", "never", "often", and other words that tend to be either always true or always false.
Example: Christmas always falls on a Sunday because it is a Sabbath Day.
• Statements that use the word "always" are almost always false.
3. Avoid long sentences, as these tend to be "true". Keep sentences short.
4. Avoid trick statements with some minor misleading word or spelling anomaly, misplaced phrases, etc. A wise student who does not know the subject matter may detect this strategy and thus get the answer correctly.
5. Avoid quoting verbatim from reference materials or textbooks. This practice sends the wrong signal to the students that it is necessary to memorize the textbook word for word, and, thus, acquisition of higher-level thinking skills is not given due importance.
6. Avoid specific determiners or give-away qualifiers.


Example: Executives usually suffer from hyperacidity.
• The statement tends to be correct. The word "usually" leads to the answer.
7. Avoid a grossly disproportionate number of either true or false statements, or even patterns in the occurrence of true and false statements.
8. Avoid double negatives. These make test items unclear and will confuse the students.
Example: The changes that take place in early childhood are NOT unchangeable.
• The test item simply means "The changes in early childhood are changeable."

2. MATCHING TYPE
1. Use only homogeneous material in a single matching exercise.
2. Include an unequal number of responses and premises, and instruct the students that responses may be used once, more than once, or not at all. Use a joker or jokers.
3. Keep the list of items to be matched brief, and place the shorter responses at the right.
4. Arrange the list of responses in logical order.
5. Indicate in the directions the basis for matching the responses and premises.
6. Place all the items for one matching exercise on the same page.
7. Limit a matching exercise to not more than 10 to 15 items.

Varieties of Matching Type Test
1. Balanced Variety - the number of items is equal to the number of options.
2. Unbalanced Variety - there are unequal numbers in the two columns; characterized by the use of distracters, which usually leaves a number of unused options.
3. Classification Variety - requires the sorting of words or other types of materials into their proper categories. The students are asked to classify each specific word used in the context. Distracters are seldom used because all terms are properly classified.

3. MULTIPLE CHOICE
Writing Multiple-choice Items
1. The stem of the item should be meaningful by itself and should present a definite problem.
2. The item stem should include as much of the item as possible and should be free of irrelevant material.
3. Use a negatively stated stem only when significant learning outcomes require it, and stress/highlight the negative words for emphasis.
4. All the alternatives should be grammatically consistent with the stem of the item.
5. An item should contain only one correct or clearly best answer.
6. Items used to measure understanding should contain some novelty, but not too much.
7. All distracters should be plausible/attractive.
8. Verbal associations between the stem and the correct answer should be avoided.
9. The relative length of the alternatives/options should not provide a clue to the answer.
10. The alternatives should be arranged logically.
11. The correct answer should appear in each of the alternative positions approximately an equal number of times, but in random order.
12. Always have the stem and the alternatives on the same page.
13. Use special alternatives such as "none of the above" or "all of the above" sparingly.
14. Do not use multiple-choice items when other types are more appropriate.
15. Provide at least four (4) options.

Other Reminders:
1. Do not use unfamiliar words, terms, and phrases.
2. Do not use modifiers that are vague and whose meanings can differ from one person to the next, such as much, often, usually, etc.
3. Avoid complex or awkward word arrangements. Also, avoid the use of negatives in the stem as this may add unnecessary comprehension difficulties.
4. Do not use negatives or double negatives, as such statements tend to be confusing. It is best to use simpler sentences rather than sentences that would require expertise in grammatical construction.
5. Distracters should be equally plausible and attractive.
6. The length, explicitness, or degree of technicality of the alternatives should not be determinants of the correctness of the answer.
7. Avoid alternatives that are synonymous with others or those that include or overlap others.
8. Avoid the use of unnecessary words or phrases which are not relevant to the problem at hand. Such items test the student's reading comprehension rather than knowledge of the subject matter.
9. The difficulty of a multiple-choice item may be controlled by varying the homogeneity or degree of similarity of the responses. The more homogeneous, the more difficult the item.

CONSTRUCTING SUPPLY OR CONSTRUCTED RESPONSE TYPE
1. SUPPLY TYPE/ENUMERATION


Guidelines in Writing Supply Type of Test
1. Word the item/s so that the required answer is both brief and specific.
2. Do not take statements directly from textbooks.
3. A direct question is generally more desirable than an incomplete statement.
4. If the item is to be expressed in numerical units, indicate the type of answer wanted.
5. Blanks for answers should be equal in length and, as much as possible, at or near the end of the statement.
6. When completion items are to be used, do not include too many blanks.

Other reminders:
1. Word the items so that the required answer is both brief and definite.
2. Avoid over-mutilated items.
3. Do not lift statements directly from the book. When taken out of context, textbook statements are too general and ambiguous.
4. Where the answer is to be expressed in numerical units, indicate the type of answer wanted.

2. ESSAY
Writing Essay Type of Test
1. Restrict the use of essay questions to those learning outcomes that cannot be satisfactorily measured by objective items.
2. Construct questions that will call forth the skills specified in the learning standards.
3. Phrase each question so that the students' task is clearly defined or indicated.
4. Avoid the use of optional questions.
5. Indicate the approximate time limit or the number of points for each question.
6. Prepare an outline of the expected answer in advance, or a scoring rubric.

The Learning Outcomes Measurable by the Essay Type of Test:
1. Comparison between two or more things
2. The development and defense of an opinion
3. Questions of cause and effect
4. Explanation of meaning
5. Summarizing of information in a designated area
6. Analysis
7. Knowledge of relationships
8. Illustration of rules, principles, procedures, and applications
9. Application of rules, laws, and principles to new situations
10. Criticism of the adequacy, relevance, or correctness of a concept, idea, or information
11. Formulation of new questions and problems
12. Reorganization of facts
13. Discrimination between objects, concepts, or events
14. Inferential thinking

The Rules which Facilitate Grading of Essay Papers:
1. Phrase the directions in such a way that the students are guided on the concepts to be included.
Example: Write an essay on the topic "Plant Photosynthesis" using the following keywords and phrases: chlorophyll, sunlight, water, carbon dioxide, oxygen, by-products.
2. Inform the students of the criteria to be used for grading their essays. This rule allows the students to focus on relevant and substantive materials rather than on peripheral and unnecessary facts and bits of information.
Example: Write an essay on the topic "Plant Photosynthesis" using the following criteria: (a) coherence, (b) accuracy of the statements, (c) use of the keywords, (d) clarity, and (e) extra points for innovative presentation of ideas.
3. Put a time limit on the essay test.
4. Decide on your essay grading system prior to getting the essays of your students.
5. Evaluate all of the students' answers to one question before proceeding to the next question.
6. Evaluate answers to essay questions without knowing the identity of the writer.
7. Whenever possible, have two or more persons grade each answer.

TYPES OF ESSAYS
1. Narrative Essays
A narrative essay is one which details a story, oftentimes from a particular point of view. When writing a narrative essay, you should include a set of characters, a location, a good plot, and a climax to the story.
2. Descriptive Essay
A descriptive essay describes something in great detail. The subject can be anything from people and places to objects and events, but the main point is to go into depth. You might describe the item's colour, where it came from, what it looks like, smells like, tastes like, or how it feels.
3. Expository Essay
An expository essay is used as a way to look into a problem and therefore compare and explore it. For the expository essay there is a little bit of storytelling involved, but this type of essay goes beyond that. The main idea is that it should explain an idea, giving information and explanation.
4. Argumentative Essay
When writing an argumentative essay, you will be attempting to convince your reader about an opinion or point of view. The idea is to show the reader whether the topic is true or false along with giving your own opinion. It is very important that you use facts and data to back up any claims made within the essay.

OTHER TYPES OF ESSAYS:
Definition Essays
This is a type of essay which is used to define an idea, thing, or concept.
Simple Essays
This is, as its name would suggest, a simple essay which is made up of five paragraphs and can be written on any subject.
Persuasive Essays
The persuasive essay is one which can be used as a way of convincing the readers of an idea. It might also be used in


order to convince the reader not to do a particular thing, or indeed to do it.
Rhetorical Analysis Essays
This type of essay is used as a way of analysing a piece of rhetoric or a speech and looks at any rhetorical devices which have been used.
Analytical Essays
As the name of this type of essay might suggest, it is an essay which is used to analyse something. This could be a piece of writing, a movie, or anything else. The idea is that the analytical essay will look at what it is analysing from various viewpoints, allowing the reader to form their own opinion.
Compare and Contrast Essays
When writing a compare and contrast essay, the author will be using it as a way of creating a comparison between two things or finding a contrast between them. But it is not limited to one or the other; you can also write a compare and contrast essay to do both of these things in one.
Cause and Effect Essays
This is a type of essay which allows the author to explain the cause of a certain thing as well as being able to explain the effects of it.
Critical Essays
When writing a critical essay, the author will be writing about a piece of literature and evaluating it. They will use the good and bad points of the piece in order to do this.
Process Essays
The process essay is a way of outlining or detailing a process. This is done by breaking down the process so that the readers are able to understand it and even perform the process themselves once they have read the essay.
Synthesis Essays
This is a type of essay which is used as a way to synthesise various concepts in order to create a judgement on their good and bad points.
Review Essays
The review essay is one which looks at a piece of literature and gives a review on it based around the good and bad points within it.
Research Essays
The research essay is one which is written based on a research question and aims to give a specific answer to it. The author will research the subject as a way of providing an answer to the question that was posed.
Explanatory Essays
This type of essay is used as a way to explain any given piece of written work or literature. Explanatory essays can be written on a variety of types of literature, such as poetry, novels, or short stories.

3. IDENTIFICATION
Writing Identification Type of Test
1. The definition, description, or explanation may be given by means of a phrase or incomplete statement. If not, indicate a picture or diagram.
Examples:
____ 1. Hero of the Battle of Mactan
____ 2. Longest Filipino revolt
2. The statement should be phrased so that there is only one response.
Examples:
(Poor)   ____ 1. Manuel L. Quezon
         ____ 2. The City of Baguio
(Better) ____ 1. The first President of the Philippines
         ____ 2. The summer capital of the Philippines

Arrangement of Elements
Ordering - measures memory of relationships and concepts of organization in many subjects.
- Rearrangement of elements consists of ordering or assembling items on some basis.
1. Chronological Order
Example: Arrange the following Presidents in chronological order from most recent to least recent. Write your answers on the spaces at the right.
Garcia 1. ____
Marcos 2. ____
Magsaysay 3. ____
Macapagal 4. ____
Osmena 5. ____
Quezon 6. ____
Quirino 7. ____
Roxas 8. ____
2. Geographical Order - arrangement according to geographical location.
Example: Arrange the following provinces from north to south. Write your answers on the spaces at the right.
Bulacan 1. __________
Cagayan 2. __________
Nueva Ecija 3. __________
Sorsogon 4. __________
3. Arrangement according to Magnitude - the basis of this arrangement is size, which may be height, width, distance, etc.
4. Alphabetical Order - arrangement of words according to their appearance in the dictionary.
5. Arrangement according to Importance, Quality, etc.


VI. ITEM ANALYSIS AND VALIDATION

INTRODUCTION
The teacher normally prepares a draft of the test. Such a draft is subjected to item analysis and validation to ensure that the final version of the test would be useful and functional.

Phases of preparing a test:
• Try-out phase
• Item analysis phase
• Item revision phase

ITEM ANALYSIS: DIFFICULTY INDEX AND DISCRIMINATION INDEX
Item analysis is the act of analyzing student responses to individual exam questions with the intention of evaluating exam quality. It is an important tool to uphold test effectiveness and fairness. Item analysis is likely something educators do both consciously and unconsciously on a regular basis.

TWO IMPORTANT CHARACTERISTICS
• Item Difficulty
• Discrimination Index

ITEM DIFFICULTY
Item Difficulty or the difficulty of an item is defined as the
number of students who are able to answer the item correctly
divided by the total number of students. Thus:

Item difficulty = (number of students who answered the item correctly) ÷ (total number of students)   (Figure 1. Item Difficulty)

The item difficulty is usually expressed in percentage.


Example: What is the item difficulty index of an item if 25 students are unable to answer it correctly while 75 answered it correctly?
Here the total number of students is 100; hence, the item difficulty index is 75/100 or 75%.
One problem with this type of difficulty index is that it may not actually indicate that the item is difficult or easy. A student who does not know the subject matter will naturally be unable to answer the item correctly even if the question is easy. How do we decide, on the basis of this index, whether the item is too difficult or too easy?

Table 1. Range of difficulty index
RANGE OF DIFFICULTY INDEX | INTERPRETATION   | ACTION
0 - 0.25                  | Difficult        | Revise or discard
0.26 - 0.75               | Right difficulty | Retain
0.76 and above            | Easy             | Revise or discard

• Difficult items tend to discriminate between those who know and those who do not know the answer.
• Easy items cannot discriminate between these two groups of students.
• We are therefore interested in deriving a measure that will tell us whether an item can discriminate between these two groups of students. Such a measure is called an index of discrimination.

DISCRIMINATION INDEX
An easy way to derive such a measure is to measure how difficult an item is with respect to those in the upper 25% of the class and how difficult it is with respect to those in the lower 25% of the class. If the upper 25% of the class found the item easy yet the lower 25% found it difficult, then the item can discriminate properly between these two groups. Thus:

Index of discrimination = DU - DL

Example: Obtain the index of discrimination of an item if the upper 25% of the class had a difficulty index of 0.60 (i.e., 60% of the upper 25% got the correct answer) while the lower 25% of the class had a difficulty index of 0.20.
DU = 0.60 while DL = 0.20, thus index of discrimination = 0.60 - 0.20 = 0.40.

Theoretically, the index of discrimination can range from -1.0 (when DU = 0 and DL = 1) to 1.0 (when DU = 1 and DL = 0).
When the index of discrimination is equal to -1, this means that all of the lower 25% of the students got the correct answer while all of the upper 25% got the wrong answer. In a sense, such an index discriminates correctly between the two groups, but the item itself is highly questionable.
On the other hand, if the index of discrimination is 1.0, this means that all of the lower 25% failed to get the correct answer while all of the upper 25% got the correct answer. This is a perfectly discriminating item and is the ideal item that should be included in the test.
As in the case of the difficulty index, we have the following rule of thumb:

Table 2. Index range
INDEX RANGE   | INTERPRETATION                                 | ACTION
-1.0 to -0.50 | Can discriminate, but the item is questionable | Discard
-0.55 to 0.45 | Non-discriminating                             | Revise
0.46 to 1.0   | Discriminating item                            | Include

Example: Consider a multiple-choice type of test for which the following data were obtained:

Table 3. Multiple choice type test
ITEM | A | B* | C  | D  |
1    | 0 | 40 | 20 | 20 | TOTAL
     | 0 | 15 | 5  | 0  | Upper 25%
     | 0 | 5  | 10 | 5  | Lower 25%

The correct response is B. Let us compute the difficulty index and the index of discrimination:

Difficulty index = number of students who answered the item correctly ÷ total number of students = 40%, within the range of a "good item" (Figure 2. Difficulty index)

The discrimination index can be similarly computed:
DU = 15/20 = 0.75 or 75%
DL = 5/20 = 0.25 or 25%
Discrimination index = DU - DL = 0.75 - 0.25 = 0.50 or 50%
Thus, the item also has a "good discriminating power".

It is also instructive to note that distracter A is not an effective distracter, since it was never selected by the students. Distracters C and D appear to have good appeal as distracters.
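A minimal Python sketch of the two computations above follows, using the counts from the worked example (the difficulty of 40% corresponds to 40 correct answers out of 100 students; 15 of the 20 students in the upper 25% and 5 of the 20 in the lower 25% answered correctly).

```python
# Minimal sketch: item difficulty and discrimination index,
# following the definitions above.

def difficulty_index(correct: int, total: int) -> float:
    """Proportion of the group that answered the item correctly."""
    return correct / total

def discrimination_index(du: float, dl: float) -> float:
    """Difference between the difficulty for the upper 25% (DU)
    and the difficulty for the lower 25% (DL)."""
    return du - dl

# Figures from the multiple-choice example (option B is correct):
difficulty = difficulty_index(correct=40, total=100)   # 0.40 -> right difficulty
du = difficulty_index(correct=15, total=20)             # upper 25%: 0.75
dl = difficulty_index(correct=5, total=20)              # lower 25%: 0.25
print(difficulty, discrimination_index(du, dl))         # 0.4 0.5
```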


MORE SOPHISTICATED DISCRIMINATION INDEX
Item Discrimination refers to the ability of an item to differentiate among students on the basis of how well they know the material being tested.
A good item is one that has good discriminating ability and has a sufficient level of difficulty (not too difficult nor too easy).
At the end of the item analysis report, test items are listed according to their degrees of difficulty (easy, medium, hard) and discrimination (good, fair, poor). These distributions provide a quick overview of the test and can be used to identify items which are not performing well and which can perhaps be improved or discarded.

THE ITEM-ANALYSIS PROCEDURE FOR NORM-REFERENCED TESTS PROVIDES THE FOLLOWING INFORMATION:
1. The difficulty of an item
2. The discriminating power of an item
3. The effectiveness of each alternative

BENEFITS DERIVED FROM ITEM ANALYSIS
1. It provides useful information for class discussion of the test.
2. It provides data which help students improve their learning.
3. It provides insights and skills that lead to the preparation of better tests in the future.

INDEX OF DIFFICULTY
P = (Ru + RL) / T × 100   (Figure 3. Index of difficulty)
Where:
• Ru - the number in the upper group who answered the item correctly
• RL - the number in the lower group who answered the item correctly
• T - the total number who tried the item
Equivalently, P = R / T × 100 (Figure 5. Percentage), where:
• P - percentage who answered the item correctly (index of difficulty)
• R - number who answered the item correctly
• T - total number who tried the item
The smaller the percentage figure, the more difficult the item.
Interpretation of the index of difficulty:
0.00 - 0.20 = very difficult
0.21 - 0.80 = moderately difficult
0.81 - 1.00 = very easy

INDEX OF ITEM DISCRIMINATING POWER
Estimate the item discriminating power using the formula below:
D = (Ru - RL) / (T/2)   (Figure 4 / Figure 6. Discriminating power)
The discriminating power of an item is reported as a decimal fraction; maximum discriminating power is indicated by an index of 1.00.
Maximum discrimination is usually found at the 50 per cent level of difficulty.

VALIDATION AND VALIDITY
Validity is important because it determines what survey questions to use, and helps ensure that researchers are using questions that truly measure the issues of importance. The validity of a survey is considered to be the degree to which it measures what it claims to measure.

VALIDATION
After performing the item analysis and revising the items which need revision, the next step is to validate the instrument.
• The purpose of validation is to determine the characteristics of the whole test itself, namely, the validity and reliability of the test.
• Validation is the process of collecting and analyzing evidence to support the meaningfulness and usefulness of the test.

VALIDITY
Validity is the extent to which a test measures what it purports to measure; it refers to the appropriateness, correctness, meaningfulness, and usefulness of the specific decisions a teacher makes based on the test results.

THERE ARE THREE MAIN TYPES OF EVIDENCE THAT MAY BE COLLECTED:
1. Content-related evidence of validity
2. Criterion-related evidence of validity
3. Construct-related evidence of validity

CONTENT-RELATED EVIDENCE OF VALIDITY
Refers to the content and format of the instrument.
• How appropriate is the content?


• How comprehensive is it?
• Does it logically get at the intended variable? How adequately does the sample of items or questions represent the content to be assessed?

CRITERION-RELATED EVIDENCE OF VALIDITY
Refers to the relationship between scores obtained using the instrument and scores obtained using one or more other tests (often called the criterion).
• How strong is this relationship?
• How well do such scores estimate present or predict future performance of a certain type?

CONSTRUCT-RELATED EVIDENCE OF VALIDITY
Refers to the nature of the psychological construct or characteristic being measured by the test.
• How well does a measure of the construct explain differences in the behavior of the individuals or their performance on a certain task?

USUAL PROCEDURE FOR DETERMINING CONTENT VALIDITY
• The teacher writes out the objectives based on the TOS.
• The teacher gives the objectives and the TOS to two experts, along with a description of the test takers.
• The experts look at the objectives, read over the items in the test, and place a check mark in front of each question or item that they feel does NOT measure one or more objectives.
• This continues until the experts approve all items and agree that all of the objectives are sufficiently covered by the test.

OBTAINING EVIDENCE FOR CRITERION-RELATED VALIDITY
• The teacher usually compares scores on the test in question with scores on some other independent criterion test which presumably already has high validity (concurrent validity).
• Another type of validity is predictive validity, wherein the test scores on the instrument are correlated with scores on a later performance.

GRONLUND'S EXPECTANCY TABLE
Table 4. Grade point average
TEST SCORE | VERY GOOD | GOOD | NEEDS IMPROVEMENT
HIGH       | 20        | 10   | 5
AVERAGE    | 10        | 25   | 5
LOW        | 1         | 10   | 14

• The expectancy table shows that there were 20 students who got high test scores and were subsequently rated very good in terms of their final grades; and, finally, 14 students obtained low test scores and were later graded as needing improvement.
• The evidence for this particular test tends to indicate that students getting high scores on it would later be rated very good; students getting average scores on it would later be rated good; and students getting low scores on the test would later be graded as needing improvement.

RELIABILITY
Reliability refers to the consistency of the scores obtained - how consistent they are for each individual from one administration of an instrument to another and from one set of items to another.
We already have formulas for computing the reliability of a test; for internal consistency, for instance, we could use the split-half method or the Kuder-Richardson formulae (KR-20 or KR-21).
• Reliability and validity are related concepts. If an instrument is unreliable, it cannot yield valid outcomes.
• As reliability improves, validity may improve (or it may not).
• However, if an instrument is shown scientifically to be valid, then it is almost certain that it is also reliable.

The following standard is followed almost universally in educational tests and measurement:

Table 5. Reliability
RELIABILITY   | INTERPRETATION
.90 and above | Excellent reliability; at the level of the best standardized tests.
.80 - .90     | Very good for a classroom test.
.70 - .80     | Good for a classroom test; in the range of most. There are probably a few items which could be improved.
.60 - .70     | Somewhat low. This test should be supplemented by other measures (e.g., more tests) for grading.
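As an illustration of the Kuder-Richardson formulas mentioned above, here is a minimal Python sketch of KR-20 and KR-21 for tests scored 0/1 per item. The small score matrix is invented for demonstration only.

```python
# Minimal sketch of the Kuder-Richardson formulas (KR-20, KR-21)
# for tests scored 0/1 per item. The score matrix is a made-up example.
from statistics import pvariance, mean

def kr20(scores):
    """scores: list of examinees, each a list of 0/1 item scores."""
    k = len(scores[0])                           # number of items
    totals = [sum(person) for person in scores]
    var_total = pvariance(totals)                # variance of total scores
    pq = 0.0
    for i in range(k):
        p = mean(person[i] for person in scores) # proportion correct on item i
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_total)

def kr21(scores):
    """Simpler approximation that uses only the test mean and variance."""
    k = len(scores[0])
    totals = [sum(person) for person in scores]
    m, var_total = mean(totals), pvariance(totals)
    return (k / (k - 1)) * (1 - m * (k - m) / (k * var_total))

data = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 0, 0],
    [0, 1, 1, 1, 1],
    [1, 1, 0, 0, 1],
    [0, 0, 1, 0, 0],
]
print(round(kr20(data), 3), round(kr21(data), 3))
```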


VII. MEASURES OF CENTRAL TENDENCY AND DISPERSION/VARIABILITY

INTRODUCTION
A measure of central tendency is a single value that attempts to describe a set of data, such as scores. As such, measures of central tendency are sometimes called measures of central location.

Measures of Central Tendency and Dispersion/Variability
• Population - refers to the totality of all the elements or persons in which one has an interest at a particular time, for example, the members of the faculty of a school, the graduating class, the male students, etc.
• Sample - is a part of a population determined by sampling procedures. It is usually denoted by n.

FREQUENCY DISTRIBUTION TABLE
In statistics, numerical information may be treated as ungrouped or grouped data. In both cases, tabular presentation is very important. This tabular presentation of data is called the frequency distribution table.

Consider the midyear test scores of 45 students in Mathematics VI:
29 27 28 27 34 29 27 27 28
25 23 35 25 29 33 23 27 33
27 22 42 27 21 29 22 25 29
25 21 20 21 23 25 30 20 28
30 29 28 30 27 27 27 19 30

Table 6. Test Scores of 45 Students in Mathematics VI

The table below shows the tabulation of the 45 scores treated as ungrouped data. The tally and frequency for each score are also indicated.
Frequency → the number of values that fall in each class

STEPS IN CONSTRUCTING A FREQUENCY DISTRIBUTION TABLE


The scores may be tabulated as grouped data. Usually, data in great numbers are presented in a frequency distribution table. Here are the steps in constructing a frequency distribution table.

1. Find the range (r). The range is the difference between the highest score and the lowest score. In the given data above, the highest score is 42 and the lowest score is 19. The range is r = 42 - 19 = 23.

2. Compute the number of classes (k). A class is a grouping or category. Statisticians suggest that the ideal number of classes is between 5 and 15.
k = 1 + 3.322 log n, where n is the total number of observations; n = 45
k = 1 + 3.322 log 45 = 6.49, rounded to a whole number = 6
Therefore the number of classes is 6.

3. Compute the class interval (Figure 7. Class interval formula). The class interval is the range divided by the number of classes, rounded up to a whole number: i = r / k = 23 / 6 = 3.83, giving a class interval of 4.

4. Determine the classes, starting with the lowest class. The lowest score is 19. The lowest class is 19 to 22: 19 + class interval (4) - 1 = 23 - 1 = 22. Therefore the lowest class is from 19 to 22, written as 19 - 22. In the class 19 - 22, 19 is the lower limit and 22 is the upper limit. The other classes are formed in the same manner. If 22 is the upper limit of the lowest class and 4 is the class interval, simply add 4 to 22: 22 + 4 = 26, then 26 + 4 = 30, 30 + 4 = 34, and 34 + 4 = 38, until you reach the highest score of 42 (38 + 4 = 42). Note that the constructed number of classes is 6.

5. Determine the class frequency (f) for each class by counting the tally.

Table 7. Tally of the Test Scores of 45 Students in Mathematics VI

The following numerical values are relevant in dealing with a frequency distribution:
1) Class mark (x). It is the middle value in a class. In the class 19 - 22, the class mark is (19 + 22) / 2 = 41 / 2 = 20.5.
Note: the sum of the lower and upper limits is always divided by 2.
2) Class boundaries. Often described as the true limits because these are more precise expressions of the class limits. The lower boundary of a class is 0.5 less than its lower limit, and its upper boundary is 0.5 more than its upper limit.


Note that 19, 23, 27, 31, 35, and 39 are the lower limits, and 22, 26, 30, 34, 38, and 42 are the upper limits.
In the class 19 - 22, to compute the lower boundary (LB): lower limit minus 0.5, 19 - 0.5 = 18.5; therefore the lower boundary of 19 is 18.5.
In the class 19 - 22, to compute the upper boundary (UB): upper limit plus 0.5, 22 + 0.5 = 22.5; therefore the upper boundary of 22 is 22.5.
∑ (sigma notation, or summation) is used to denote the sum of values.
3) Relative frequency distribution. Shows the proportion, in percent, of the frequency of each class to the total frequency.
Relative frequency (%f) = frequency (f) / n × 100
In the class 19 - 22, the corresponding frequency is 8, and the total frequency n is 45; hence, %f = 8 / 45 × 100 = 17.78%.
Note: rounded to 2 decimal places.
4) Cumulative frequency distribution. Tries to determine the partial sums from the data classified in terms of classes. This distribution answers problems like the number of students who got a passing mark, the number of employees who got an efficiency rating from 76% to 95%, and so on.
Two types of cumulative frequency distribution:
a. Less than cumulative frequency (<cf)
b. Greater than cumulative frequency (>cf)

Table 8. Frequency Distribution Table of 45 Students in Mathematics VI
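The steps and quantities described above can be reproduced with a short Python sketch. It follows the k = 1 + 3.322 log n rule, the class interval of 4, and the 45 Mathematics VI scores given earlier; the column names are informal labels for the quantities defined above.

```python
# Minimal sketch: build the frequency distribution table for the
# 45 Mathematics VI scores, following the steps described above.
import math

scores = [29,27,28,27,34,29,27,27,28, 25,23,35,25,29,33,23,27,33,
          27,22,42,27,21,29,22,25,29, 25,21,20,21,23,25,30,20,28,
          30,29,28,30,27,27,27,19,30]

n = len(scores)                                   # 45
r = max(scores) - min(scores)                     # range = 42 - 19 = 23
k = round(1 + 3.322 * math.log10(n))              # number of classes = 6
i = math.ceil(r / k)                              # class interval = 4

rows, lower, cum = [], min(scores), 0
for _ in range(k):
    upper = lower + i - 1
    f = sum(lower <= s <= upper for s in scores)  # class frequency
    cum += f
    rows.append({
        "class": f"{lower}-{upper}",
        "boundaries": (lower - 0.5, upper + 0.5),
        "class_mark": (lower + upper) / 2,
        "f": f,
        "%f": round(f / n * 100, 2),
        "<cf": cum,
    })
    lower = upper + 1

# The >cf column counts frequencies from the lowest class upward to the top.
greater_cf = n
for row in rows:
    row[">cf"] = greater_cf
    greater_cf -= row["f"]

for row in rows:
    print(row)   # e.g. class 19-22 has f = 8 and %f = 17.78
```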
VIII. THE GRADING SYSTEMS AND THE GRADING SYSTEM IN THE PHILIPPINES

INTRODUCTION
Grading in education is the process of applying standardized measurements of varying levels of achievement in a course.
Grades can be assigned as letters (for example, A, B, B+, B-, C, C-, D) in a seven-point American system, or as numerical values such as 1, 1.25, 1.50, 1.75, 2.0, 2.5, 3.0, and 4.0 (an eight-point system in which letters are replaced with numerical values) in Philippine colleges and universities. In basic education, grades are expressed as percentages (of accomplishment) such as 80% or 75%, as a number out of a possible total (for example, out of 20 or 100), or as descriptors (excellent, great, satisfactory, needs improvement).

NORM-REFERENCED GRADING
Norm-Referenced Grading System refers to a grading system wherein a student's performance is evaluated relative to the performance of the other students. Using the norm-referenced grading system, a student's performance is evaluated relative to the performance of the other students within the group.

ADVANTAGES
• It is very easy to use.
• It works well for courses with retention policies, since it limits the number of students who advance to the next level of the course.
• It is useful if the focus is the individual achievement of the students.
• It is appropriate for a large group of students, that is, more than 40.
• The teacher easily identifies the learning criteria - the percentage of students who receive the highest grade or the lowest grade.


DISADVANTAGES
• The performance of a student is determined not only by his achievement but also by the achievement of the other students.
• It promotes competition among the students rather than cooperation.
• It cannot be used when the class size is smaller than 40.
• Not all students can pass the given subject or course.

Figure 8. Norm-Referenced Grading System Example

Example: Consider the following two sets of scores in an English 1 class for two sections of ten students each:
A = { 30, 40, 50, 55, 60, 65, 70, 75, 80, 85 }
B = { 60, 65, 70, 75, 80, 85, 90, 90, 95, 100 }

Figure 9. Norm-Referenced Grading System Example 2

This example illustrates one difficulty with using a norm-referenced grading system, called the problem of equivalency: the same score can earn different grades depending on the section in which it was obtained. It is therefore known in advance what percent of the students would pass or fail a given course.
In norm-referenced grading, the students, while they may work individually, are actually in competition to achieve a standard of performance that will classify them into the desired grade range.
Example: A teacher may establish a grading policy whereby the top 15% of students will receive a mark of excellent or outstanding.

Table 9. Norm-Referenced System Scale

The objective for this is to find out the best performers in the group. Norm-referenced systems are most often used for screening selected student populations in conditions where it is known that not all students can advance due to limitations such as available places, jobs, or other controlling factors. For example, in the Philippine setting, not all high school students can actually advance to the college or university level because of financial constraints; here the norm-referenced grading system can be applied.

Example: In a class of 100 students, the mean score in a test is 70 with a standard deviation of 5. Construct a norm-referenced grading table that would have a seven-grade scale.

Table 10. Grade Scale Equivalency
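Because Table 10 itself is not reproduced in this module, the following Python sketch only illustrates one common way to build a seven-grade scale from the class mean and standard deviation: cut points at the mean plus or minus 0.5, 1.5, and 2.5 standard deviations. The cut points and the descriptive labels are assumptions for illustration, not the official table.

```python
# Minimal sketch: a seven-grade norm-referenced scale built from the
# class mean and standard deviation. The cut points (mean +/- 0.5, 1.5,
# 2.5 SD) and the labels are illustrative assumptions only.

def seven_grade_scale(mean: float, sd: float):
    cuts = [mean + z * sd for z in (-2.5, -1.5, -0.5, 0.5, 1.5, 2.5)]
    labels = ["very poor", "poor", "below average", "average",
              "above average", "good", "excellent"]   # placeholder labels
    return cuts, labels

def grade(score: float, cuts, labels) -> str:
    """Return the label of the interval the score falls into
    (labels run from the lowest- to the highest-scoring interval)."""
    for cut, label in zip(cuts, labels):
        if score < cut:
            return label
    return labels[-1]

cuts, labels = seven_grade_scale(mean=70, sd=5)
print(cuts)                     # [57.5, 62.5, 67.5, 72.5, 77.5, 82.5]
print(grade(76, cuts, labels))  # falls in the 72.5-77.5 interval
```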

CRITERION-REFERENCED GRADING
• The students' performance is evaluated against a certain criterion or standard.
• The criterion or standard is absolute in this grading system, and it is possible that all the students may receive the highest possible grade, or that all of them may pass the test.
• It is also possible that all students may receive a failing grade if they do not reach the standard set by the teacher.

ADVANTAGES
• The performance of a student will not be affected by the performance of the whole class.
• It promotes cooperation among the students.
• All students may pass the subject or course when they meet the standard set by the teacher.

DISADVANTAGES
• It is difficult to set a reasonable standard if it is not stated in the grading policies of the institution.
• All students may not pass the subject or course when they do not meet the standard set by the teacher or the institution.

Example: In a class of 100 students using the table below, no one gets a grade of excellent if no one scores 98 and above (or 85 and above, depending on the criterion used). There is no fixed percentage of students who are expected to get the various grades in the criterion-referenced grading system.


Table 11. Criterion-Referenced Grade Scale Equivalency

Criterion-referenced grading systems are often used in situations where the teachers agree on the meaning of a "standard of performance" in a subject but the quality of the students is unknown or uneven, or where the work involves student collaboration or teamwork.

What prevents teachers who use criterion-referenced grading from setting the performance criteria so low that everyone can pass with ease?
• First, the criterion should not be based on only one teacher's opinion or standard.
• Second, once the criterion is established, it must be made public.

FOUR QUESTIONS IN THE GRADING SYSTEM:
Marilla D. Svinicki (2007) of the Center for Teaching Effectiveness of the University of Texas at Austin poses four intriguing questions relative to grading:
1. Should grades reflect absolute achievement level or achievement relative to others in the same class?
2. Should grades reflect achievement only, or nonacademic components such as attitude, speed, and diligence?
3. Should grades report status achieved or amount of growth?
4. How can several grades on diverse skills combine to give a single mark?

WHAT SHOULD GO INTO A STUDENT'S GRADES?
The grading system an instructor selects reflects his or her educational philosophy. There are no right or wrong systems, only systems which accomplish different objectives. The following are questions which an instructor may want to answer when choosing what will go into a student's grade.

1. Should grades reflect absolute achievement level or achievement relative to others in the same class?
This is often referred to as the controversy between norm-referenced versus criterion-referenced grading.

2. Should grades reflect achievement only, or nonacademic components such as attitude, speed, and diligence?
It is a very common practice to incorporate such things as turning in assignments on time into the overall grade in the course, primarily because the need to motivate students to get their work done is a real problem for instructors.

3. Should grades report status achieved or amount of growth?
In many beginning classes, the background of the students is so varied that some students can achieve the end objectives with little or no trouble, while others with weak backgrounds will work twice as hard and still achieve only half as much.

4. How can several grades on diverse skills combine to give a single mark?
The basic answer is that they can't, really. The results of instruction are so varied that a single mark is really a "Rube Goldberg" as far as indicating what a student has achieved. It would complicate an already complicated task. The "halo" effect of good performance in one area could spill over into others. And finally, most outsiders are looking for only one overall classification of each person so that they can choose the "best".

STANDARDIZED TEST SCORING
Test standardization is a process by which teacher-made or researcher-made tests are validated and item analyzed. After a thorough process of validation, the test characteristics are established. These characteristics include test validity, test reliability, test difficulty level, and other characteristics as previously discussed.

CUMULATIVE AND AVERAGING SYSTEMS OF GRADING
• Averaging System - the grade of a student in a particular grading period equals the average of the grades obtained in the prior grading periods and the current grading period.
• Cumulative Grading System - the grade of a student in a grading period equals his current grading period grade, which is assumed to carry the cumulative effects of the previous grading periods.
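The difference between the two systems can be seen in a small Python sketch; the quarterly grades used are hypothetical.

```python
# Minimal sketch: averaging vs. cumulative grading systems.
# The quarterly grades below are hypothetical.

def averaging_grade(previous_grades, current_grade):
    """Averaging system: the reported grade is the average of all
    grading periods so far, including the current one."""
    grades = list(previous_grades) + [current_grade]
    return sum(grades) / len(grades)

def cumulative_grade(current_grade):
    """Cumulative system: the reported grade is simply the current
    grading-period grade, which is assumed to already carry the
    cumulative effect of the earlier periods."""
    return current_grade

previous = [82, 85, 88]          # first to third grading periods
current = 90                     # fourth grading period
print(averaging_grade(previous, current))   # 86.25
print(cumulative_grade(current))            # 90
```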
WHAT IS THE DEPED K TO 12 GRADING SYSTEM?
The K to 12 Basic Education Program uses a standards- and competency-based grading system. These are found in the curriculum guides. All grades will be based on the weighted raw scores of the learners' summative assessments. The minimum grade needed to pass a specific learning area is 60, which is transmuted to 75 in the report card. The lowest mark


that can appear on the report card is 60 for Quarterly Grades and Final Grades.
For these guidelines, the Department will use a floor grade considered as the lowest possible grade that will appear in a learner's report card.
Learners from Grades 1 to 12 are graded on Written Work, Performance Tasks, and Quarterly Assessment every quarter. These three are given specific percentage weights that vary according to the nature of the learning area.

HOW TO COMPUTE FINAL GRADES AND AVERAGES IN THE DEPED K-12 GRADING SYSTEM
Step 1: Grades from all student work are added up.
This results in the total score for each component, namely Written Work, Performance Tasks, and Quarterly Assessment. Raw scores from each component have to be converted to a Percentage Score. This is to ensure that the values are parallel to each other.

Step 2: The sum for each component is converted to the Percentage Score.
To compute the Percentage Score (PS), divide the raw score by the highest possible score, then multiply the quotient by 100%. This is shown below:
PS = (learner's total raw score ÷ highest possible score) × 100%

Step 3: Percentage Scores are then converted to Weighted Scores to show the importance of each component in promoting learning in the different subjects.
To do this, the Percentage Score is multiplied by the weight of the component found in Table 4 for Grades 1 to 10 and Table 5 for Senior High School. The product is known as the Weighted Score (WS).

Table 4. Weight of the Components for Grades 1-10
The grading system for Senior High School (SHS) follows a different set of weights for each component. Table 5 presents the weights for the core and track subjects.
Table 5. Weight of the Components for SHS

Step 4: The sum of the Weighted Scores in each component is the Initial Grade.
This Initial Grade will be transmuted using the given transmutation table to get the Quarterly Grade (QG).

Step 5: The Quarterly Grade for each learning area is written in the report card of the student.

For a better understanding of how to record the summative assessments, Table 6 presents a sample class record showing three learners for the first quarter of Grade 4 English. On the basis of this class record, Table 7 presents a step-by-step process on how to compute for the Quarterly Grade.


Table 6. Sample Class Record for English Grade 4 (First Quarter)

Table 7. Steps for Computing Grades
Steps for Computing Grades:
1. Get the total score for each component.
2. Divide the total raw score by the highest possible score, then multiply the quotient by 100%.
3. Convert Percentage Scores to Weighted Scores. Multiply the Percentage Score by the weight of the component indicated in Table 4 and Table 5.
4. Add the Weighted Scores of each component. The result will be the Initial Grade.
5. Transmute the Initial Grade using the Transmutation Table.
A minimal sketch of these steps is shown below.
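A minimal Python sketch of steps 1-4 follows. The component weights used here are placeholders standing in for the values in Table 4/Table 5, and the final transmutation is left out because the transmutation table is not reproduced in this module.

```python
# Minimal sketch of steps 1-4 of the DepEd K-12 grade computation.
# The weights are placeholders for the values in Table 4 / Table 5,
# and the transmutation table itself is not reproduced here.

def percentage_score(raw: float, highest_possible: float) -> float:
    """Step 2: PS = raw score / highest possible score x 100%."""
    return raw / highest_possible * 100

def weighted_score(ps: float, weight: float) -> float:
    """Step 3: WS = percentage score x component weight."""
    return ps * weight

def initial_grade(components: dict) -> float:
    """Step 4: the Initial Grade is the sum of the weighted scores.
    `components` maps a component name to (raw, highest, weight)."""
    return sum(
        weighted_score(percentage_score(raw, highest), weight)
        for raw, highest, weight in components.values()
    )

# Hypothetical learner totals: (raw score, highest possible score, weight)
components = {
    "Written Work":         (48, 60, 0.30),
    "Performance Tasks":    (85, 100, 0.50),
    "Quarterly Assessment": (38, 50, 0.20),
}
ig = initial_grade(components)
print(round(ig, 2))   # Initial Grade; transmute with the DepEd table to get the QG
```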

REFERENCES:

List of References Provided and other References:


Babakr, Zana H., Mohamedamin, Pakstan, and Kakamad,
Karwan. (2019), Piaget’s Cognitive Developmental Theory:
Critical Review. In: Education Quarterly Reviews, Vol.2, No.3,
517-524.

Balubayan, CD.(2015,July 11). Assessment Lecture 1.


Retrieved from https://www.slideshare.net/SircDb/assessment-
lecture1

The shift of educational focus from content to learning outcomes. (2020, May 14). ELCOMBLUS. Retrieved February 25, 2022, from https://www.elcomblus.com/the-shift-of-educational-focus-from-content-to-learning-outcomes/

Bloom’s Digital Taxonomy. (2015, January 15). Retrieved May


03, 2016, from
https://www.commonsensemedia.org/videos/blooms-digital-
taxonomy

Student learning outcomes: Student activities. Radford University. (n.d.). Retrieved February 25, 2022.

Isabella. (2021, April 2). What is an essay? Different types of essays with examples. 7ESL. Retrieved February 10, 2022, from https://7esl.com/essay/

“Item Analysis of Classroom Tests: Aims and Simplified


Procedures.” n.d. Www1.Udel.edu.
http://www1.udel.edu/educ/gottfredson/451/unit9-guidance.htm.
Tamayo, Angerica. 2015. “ITEM ANALYSIS and VALIDATION.”
Slideshare.net. September 23, 2015.
https://www.slideshare.net/ricanice16/item-analysis-and-
validation-53094605.

“What Is Validity and Why Is It Important for Survey Results?”


2015. NBRI. 2015. https://www.nbrii.com/faqs/data-
analysis/validity-important/.

Buenaflor, R. C. (2012). Assessment of learning book one: the


conventional approach. Quezon City: Great Books Publishing
Calmorin, L.P. (2011). Assessment of student learning 1.
Manila: Rex Book Store, Inc.

Garcia, C.D. (2013). Measuring and evaluating learning


outcomes: a textbook in educational assessment 1&2. Second
Ed. Mandaluyong City: Books Atbp. Publishing Corp.

Navarro, R.L., Santos R.G. (2012). Assessment of learning


outcomes (assessment 1). Quezon City: Lorimar Publishing Inc.

Navarro,R,L.et al.(2017). Assessment of Learning 1. Quezon


City: Lorimar Publishing Inc.
