Cavite State University-Indang Education Circle 2021-2022: The Teacher's Archive
Assessment in Learning 1
EDUC 75
Project Heads:
Acknowledgement of Responsibility
DISCLAIMER
This document is strictly confidential. Any review, retransmission, dissemination, or other use of this document without the prior written consent of the Education Circle of Cavite State University-Indang is prohibited.
Informed Consent
I understand that my participation is voluntary and that I am free to withdraw at any time, without cost. I understand that I
will be given a copy of this consent form.
Prepared by:
Conformed by:
Approved by:
SYLLABUS COPY
COLLEGE OF EDUCATION
TEACHER EDUCATION DEPARTMENT
COURSE SYLLABUS
First Semester, AY 2021-2022
Course Code: EDUC 75
Course Title: Assessment in Learning 1
Type: Lecture
Credit Units: 3
Course Description: This course discusses the principles, development, and utilization of conventional assessment tools to improve the teaching-learning process. It emphasizes the use of assessment of, as, and for learning in measuring meaningful knowledge, comprehension, and other thinking skills in the cognitive, psychomotor, and affective domains. It allows students to go through the standard steps in test construction and development and their application in grading systems.
Pre-requisites: None
Course Schedule: Wed 7:00-10:00
Lecture: 3 hours
Laboratory: none
Core Values
Students are expected to live by and stand for the following University tenets:
TRUTH is demonstrated by the student's objectivity and honesty during examinations, class activities and in the development of projects.
EXCELLENCE is exhibited by the students' self-confidence, punctuality, diligence and commitment to the assigned tasks, class performance and other course requirements.
SERVICE is manifested by the students' respect, rapport, fairness, and cooperation in dealing with their peers and members of the community.
In addition, they should exhibit love and respect for nature and support for the cause of humanity.
Goals of the College/Campus
1. Offering of varied undergraduate and graduate degree courses leading to various professions that will cater to the needs of society;
2. Offering short-term courses that will directly benefit the client system;
3. Improvement of student performance;
4. Improvement of facilities for both students and facilitators of learning;
5. Strengthening of the linkage between research and the client system; and
6. Conduct of community development services for the different clienteles.
Objectives of the Department
The department shall endeavor to:
1. provide relevant and quality course offerings at the graduate and undergraduate levels to improve student performance; and
2. conduct relevant research in the different areas of education to enrich the learning process.
1. Articulate and discuss the latest developments in the specific field of practice.
2. Effectively communicate in English and Filipino, both orally and in writing.
3. Work effectively and collaboratively with a substantial degree of independence in multi-disciplinary and multi-
cultural teams.
4. Act in recognition of professional, social, and ethical responsibility.
5. Preserve and promote Filipino historical and cultural heritage.
6. Articulate the rootedness of education in philosophical, socio-cultural, historical, psychological and political
contexts.
7. Demonstrate mastery of subject matter/discipline.
8. Facilitate learning using a wide range of teaching methodologies and delivery modes appropriate to specific
learners and their environments.
9. Develop innovative curricula, instructional plans, teaching approaches, and resources for diverse learners.
10. Apply skills in the development and utilization of ICT to promote quality, relevant and sustainable educational
practices.
11. Demonstrate a variety of thinking skills in planning, monitoring, assessing and reporting learning processes
and outcomes.
12. Practice professional and ethical teaching standards sensitive to the local, national and global realities.
13. Pursue lifelong learning for personal and professional growth through varied experiential and field-based
opportunities.
COURSE COVERAGE

Week 1
Intended Learning Outcomes (ILO): After the completion of the chapter, students will be able to:
1. inculcate in their minds and hearts the mission and vision of the university;
2. get oriented with the course requirements and proper classroom decorum; and
3. prepare and be equipped for the distance learning discussion.
Topic: Overview of the course: A. CvSU VGMO; B. College Goals; C. Program Objectives; D. Course Content; E. GAD; F. Classroom Rules/Netiquette; G. Course Requirements
Teaching and Learning Activities (TLA): Presentations; surveying students' online access data; creating an FB group
Mode of Delivery: Distance mode
Resources Needed: Student handbook; course syllabus; electronic device; network connection
Outcomes-based Assessment (OBA): Survey
Due Date of Submission of Output: Week 1

Week 2
Intended Learning Outcomes (ILO): After the completion of the chapter, students should be able to:
1. state the importance of OBE;
2. compute grades based on the grading system of DepEd; and
3. interpret results based on statistical concepts.
Topic: I. Measurement, Assessment and Evaluation: A. Measurement; B. Assessment; C. Assessment Principles
Teaching and Learning Activities (TLA): Brainstorming; KWL (dimensional question approach) chart
Mode of Delivery: Distance mode
Resources Needed: Research assignment; KWL chart; infographic
Outcomes-based Assessment (OBA): Research assignment; portfolio (infographic); KWL chart; quiz challenge
Due Date of Submission of Output: Week 2
Week 18: FINAL EXAMINATION
COURSE REQUIREMENTS
*All exams must follow a Table of Specifications (TOS), and rubrics must be used for the evaluation of students' performance or projects.
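A Table of Specifications commonly apportions test items across topics in proportion to the instructional time spent on each. The Python sketch below illustrates that arithmetic; the topics, hours, and 50-item total are hypothetical, not taken from this syllabus:

# Sketch: apportioning test items for a Table of Specifications (TOS).
# Assumption: items are allocated in proportion to instructional hours.
topics = {"Measurement, Assessment and Evaluation": 6,
          "Test Construction": 9,
          "Item Analysis and Validation": 6,
          "Grading Systems": 9}  # hours taught (hypothetical)

total_items = 50  # desired test length (hypothetical)
total_hours = sum(topics.values())

for topic, hours in topics.items():
    # round() may need a manual adjustment so the items sum exactly
    items = round(total_items * hours / total_hours)
    print(f"{topic}: {items} items")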
GRADING SYSTEM
A. Grading system for 2 units lecture and 1 unit laboratory (e.g., DCIT 21; 3 units; Lec - 2 hrs & Lab - 3 hrs):
Lecture - 60%
Laboratory - 40%
B. Grading system for 1 unit lecture and 2 units laboratory (e.g., DCIT 22; 3 units; Lec - 1 hr & Lab - 6 hrs):
Lecture - 40%
Laboratory - 60%
C. Grading system for 2 units lecture and 3 units laboratory (e.g., ELEX 50; 5 units; Lec - 2 hrs & Lab - 9 hrs):
Lecture - 30%
Laboratory - 70%
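With the weights above, a final grade is simply a weighted average of the lecture and laboratory grades. A minimal sketch, assuming both grades are already on a 0-100 scale (the helper name is illustrative):

# Sketch: combining lecture and laboratory grades using syllabus weights.
def final_grade(lecture, laboratory, lecture_weight):
    """Weighted average of lecture and laboratory grades (0-100 scale)."""
    return lecture * lecture_weight + laboratory * (1 - lecture_weight)

# Scheme A (2 units lecture, 1 unit laboratory): lecture 60%, laboratory 40%
print(final_grade(85, 90, 0.60))  # -> 87.0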
CLASS POLICIES
A. Attendance
Students are not allowed to incur unexcused absences amounting to 20% or more of the total face-to-face class hours; otherwise, they will be graded as "DROPPED".
B. Classroom Decorum
C. Examination/ Evaluation
1. Quizzes may be announced or unannounced.
2. Mid-term and Final Examinations are scheduled.
3. Cheating is strictly prohibited. A student caught cheating will be given a score of "0" for the first offense. For the second offense, the student will automatically be given a failing grade in the subject.
4. Students who miss a mid-term or final examination, a laboratory exercise, or a class project may be excused and allowed to take a special exam, conduct the laboratory exercise, or submit the class project for any of the following reasons:
a. participation in a University/College-approved field trip or activity;
b. due to illness or death in the family; and
c. due to force majeure or natural calamities.
DepEd Order. (2016). Guidelines on the request and transfer of learner's school records. Retrieved from http://www.deped.gov.ph/2016-guidelines-on-the-request-and-transfer-of-learners-school-records/
REVISION HISTORY
Revision Number: 1
Date of Revision: June 2020
Date of Implementation: 1st Semester, AY 2020-2021
Highlights of Revision: Format and flexible learning mode of delivery
ASSESSMENT IN LEARNING 1
Table of Contents
I. Shift of Education Focus (Content to Learning Outcomes)
a) Outcome-Based Education
b) Outcomes of Education
c) Institutional, Program, Course and Learning Outcomes
d) Sample Educational Objectives
II. Measurement, Assessment, and Evaluation
a) Measurement
b) Assessment
c) Assessment Principles
d) Approaches to Assessment
e) Evaluation
Bloom's Taxonomy
Awards and recognition
The outcome in Outcome-based Education
II. DETERMINING THE PROGRESS TOWARDS THE ATTAINMENT OF LEARNING OUTCOMES

INTRODUCTION
● Measurement, assessment, and evaluation play a major role in the attainment of learning outcomes.
● There are different types of measurements, as well as assessments, that an educator can use to attain a specific learning outcome.
● There are three approaches to assessment: Assessment AS learning, Assessment FOR learning, and Assessment OF learning.

ASSESSMENT
• From the Latin word ASSIDERE, meaning "to sit beside".
• The process of gathering evidence on the performance of a student over a period of time to identify learning and mastery of skills.

There are two kinds of assessment:
1. ASSESSMENT OF UNDERSTANDING - A test is a common kind of assessment; it measures the students' understanding of a certain topic.
2. ASSESSMENT OF SKILL - The students' skills are tested, typically through performance-based tasks.
MEASUREMENT
• Measurement is the process of describing as well as determining the attributes or characteristics of an object in terms of quantity.
• Aside from physical objects, we can also measure abstract ones. Abstract attributes can be measured through numbers (e.g., grades).

There are two TYPES of measurement:
1. OBJECTIVE - standardized tests; these have more or less the same outcome.
Ex. height, weight, the answer in a multiple-choice test
2. SUBJECTIVE - dependent on the assessor, or subject to the assessor's bias; outcomes vary.
Ex. scores of each group during a performance task

IMPORTANCE OF ASSESSMENT
We assess to IMPROVE, to INFORM, and to PROVE. Aside from those three, here are further reasons for doing assessments:
1. To find out what the students know (KNOWLEDGE)
2. To determine what students can do and how they can do it (SKILL; PERFORMANCE)
3. To find out how students will do the task (PROCESS)
4. To determine how students feel about their work (MOTIVATION; EFFORT)
MEASUREMENT INDICATORS, VARIABLES AND FACTORS

INDICATORS
• The building blocks of educational measurement upon which other forms of measurement are built.
• An indicator (I) denotes the presence or absence of a measured characteristic.

VARIABLES
• A group of indicators constitutes a variable.

FACTORS
• A group of related variables constitutes a factor.
THE THREE APPROACHES TO ASSESSMENT
1. ASSESSMENT AS LEARNING
2. ASSESSMENT FOR LEARNING
3. ASSESSMENT OF LEARNING

PRINCIPLES OF ASSESSMENT
THE THREE TYPES OF LEARNING

There is more than one type of learning. A committee of colleges, led by Benjamin Bloom (1956), identified three domains of educational activities:
"...appropriate sub-divisions and levels of education, with possible new categories, combinations of categories and omitting categories as appropriate."

Anderson's Revised Taxonomy as a match to Bloom's taxonomy
Anderson (1990), a former student of Bloom, updated and revised the taxonomy to reflect relevance to 21st-century work for both students and teachers (Anderson & Krathwohl, 2001). Anderson changed the taxonomy in three broad categories: terminology, structure and emphasis (Forehand, 2005). Anderson modified the original terminology by changing Bloom's categories from nouns to verbs, renaming the knowledge category into remember, comprehension into understanding, and synthesis into create. Anderson also changed the order of synthesis and placed it at the top of the triangle under the name of Create (Taylor & Francis, 2002). Thus, Anderson and Krathwohl's (2001) revised Bloom's taxonomy became: Remember, Understand, Apply, Analyze, Evaluate and Create.

THE AFFECTIVE DOMAIN

The affective domain (Krathwohl, Bloom, Masia, 1973) includes the manner in which we deal with things emotionally, such as feelings, values, appreciation, enthusiasms, motivations, and attitudes. The five major categories are listed from the simplest behavior to the most complex.

"The affective domain includes the manner in which we deal with things emotionally, such as feelings, values, appreciation, enthusiasms, motivations, and attitudes." - Donald Clark
• Factual Knowledge
  Knowledge of terminology
  Knowledge of specific details and elements
• Conceptual Knowledge
  Knowledge of classifications and categories
  Knowledge of principles and generalizations
  Knowledge of theories, models, and structures
• Procedural Knowledge
  Knowledge of subject-specific skills and algorithms
  Knowledge of subject-specific techniques and methods
  Knowledge of criteria for determining when to use appropriate procedures
• Metacognitive Knowledge
  Strategic knowledge
  Knowledge about cognitive tasks, including appropriate contextual and conditional knowledge
  Self-knowledge

THE PSYCHOMOTOR DOMAIN

The psychomotor domain (Simpson, 1972) includes physical movement, coordination, and use of the motor-skill areas. Development of these skills requires practice and is measured in terms of speed, precision, distance, procedures, or techniques in execution. The seven major categories are listed from the simplest behavior to the most complex.
DIGITAL TAXONOMY
STUDENT LEARNING
Learning is a complex process. It entails not only what students know but what they can do with what they know; it involves not only knowledge and abilities but values, attitudes, and habits of mind that affect both academic success and performance beyond the classroom. Assessment should reflect these understandings by employing a diverse array of methods, including those that call for actual performance, using them over time so as to reveal change, growth, and increasing degrees of integration. Such an approach aims for a more complete and accurate picture of learning, and therefore firmer bases for improving our students' educational experience.

Assessment works best when the programs it seeks to improve have clear, explicitly stated purposes. Assessment is a goal-oriented process. It entails comparing educational performance with educational purposes and expectations: those derived from the institution's mission, from faculty intentions in program and course design, and from knowledge of students' own goals. Where program purposes lack specificity or agreement, assessment as a process pushes a campus toward clarity about where to aim and what standards to apply; assessment also prompts attention to where and how program goals will be taught and learned. Clear, shared, implementable goals are the cornerstone for assessment that is focused and useful.

Assessment requires attention to outcomes but also and equally to the experiences that lead to those outcomes. Information about outcomes is of high importance; where students "end up" matters greatly. But to improve outcomes, we need to know about student experience along the way: about the curricula, teaching, and kind of student effort that led to particular outcomes. Assessment can help us understand which students learn best under what conditions; with such knowledge comes the capacity to improve the whole of their learning.

Assessment works best when it is ongoing, not episodic. Assessment is a process whose power is cumulative. Though an isolated, "one-shot" assessment can be better than none, improvement is best fostered when assessment entails a linked series of activities undertaken over time. This may mean tracking the progress of individual students, or of cohorts of students; it may mean collecting the same examples of student performance or using the same instrument semester after semester. The point is to monitor progress toward intended goals in a spirit of continuous improvement. Along the way, the assessment process itself should be evaluated and refined in light of emerging insights.

Assessment fosters wider improvement when representatives from across the educational community are involved. Student learning is a campus-wide responsibility, and assessment is a way of enacting that responsibility. Thus, while assessment efforts may start small, the aim over time is to involve people from across the educational community. Faculty play an especially important role, but assessment's questions can't be fully addressed without participation by student-affairs educators, librarians, administrators, and students. Assessment may also involve individuals from beyond the campus (alumni/ae, trustees, employers) whose experience can enrich the sense of appropriate aims and standards for learning. Thus understood, assessment is not a task for small groups of experts but a collaborative activity; its aim is wider, better-informed attention to student learning by all parties with a stake in its improvement.

Assessment makes a difference when it begins with issues of use and illuminates questions that people really care about. Assessment recognizes the value of information in the process of improvement. But to be useful, information must be connected to issues or questions that people really care about. This implies assessment approaches that produce evidence that relevant parties will find credible, suggestive, and applicable to decisions that need to be made. It means thinking in advance about how the information will be used, and by whom. The point of assessment is not to gather data and return "results"; it is a process that starts with the questions of decision-makers, that involves them in the gathering and interpreting of data, and that informs and helps guide continuous improvement.

Assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change. Assessment alone changes little. Its greatest contribution comes on campuses where the quality of teaching and learning is visibly valued and worked at. On such campuses, the push to improve educational performance is a visible and primary goal of leadership; improving the quality of undergraduate education is central to the institution's planning, budgeting, and personnel decisions. On such campuses, information about learning outcomes is seen as an integral part of decision making, and avidly sought.

Through assessment, educators meet responsibilities to students and to the public. There is a compelling public stake in education. As educators, we have a responsibility to the publics that support or depend on us to provide information about the ways in which our students meet goals and expectations. But that responsibility goes beyond the reporting of such information; our deeper obligation, to ourselves, our students, and society, is to improve. Those to whom educators are accountable have a corresponding obligation to support such attempts at improvement.

OTHER SOURCES:

PRINCIPLES OF GOOD PRACTICE IN ASSESSING LEARNING OUTCOMES

1. The assessment of student learning starts with the institution's vision, mission and core values. There should be a clear statement on the kinds of learning that the institution values most for its students.
2. Assessment works best when the program has a clear statement of objectives aligned with the institutional vision, mission and core values. Such alignment ensures clear, shared and implementable objectives.
3. Outcome-based assessment focuses on the student activities that will still be relevant after formal schooling concludes. The approach is to design assessment activities which are observable and less abstract, such as "to determine the student's ability to write a paragraph," which is more observable than "to determine the student's verbal ability."
4. Assessment requires attention not only to outcomes but also and equally to the activities and experiences that lead to the attainment of learning outcomes.
5. Assessment works best when it is continuous and ongoing, not episodic. Assessment should be cumulative because improvement is best achieved through a linked series of activities done over time in an instructional cycle.
6. Begin assessment by specifying clearly and exactly what you want to assess. What you want to assess is stated in your learning outcomes/lesson objectives.
7. The intended learning outcome/lesson objective, NOT CONTENT, is the basis of the assessment task. You use content in the development of the assessment tool and task, but it is the attainment of your learning outcome, NOT the content, that you want to assess. This is Outcome-based Teaching and Learning.
8. Set your criterion of success or acceptable standard of success. It is against this established standard that you will interpret your assessment results. (Example: Is a score of 7 out of 10, where 10 is the highest possible score, acceptable or considered a success?)
9. Make use of varied tools for assessment data-gathering and multiple sources of assessment data. It is not pedagogically sound to rely on just one source of data gathered by only one assessment tool. Consider multiple intelligences and learning styles. DepEd Order No. 73, s. 2012 cites the use of multiple measures as one assessment guideline.
10. Learners must be given feedback about their performance. Feedback must be specific. "Good work!" is positive feedback and is welcome, but it is actually not very good feedback since it is not specific. More specific, better feedback is: "You observed rules on subject-verb agreement and variety of sentences. Three of your commas were misplaced."
11. Assessment should be on real-world application and not on out-of-context drills.
12. Emphasize the assessment of higher-order thinking.
13. Provide opportunities for self-assessment.
• Understands how to solve problems effectively and efficiently.
• Gains knowledge and experience through researching and analyzing information for more thorough understanding.

Oral/Written Communication
• Ability to collaborate successfully.
• Capable of negotiating conflict effectively.
• Understands the process of group development.

Leadership
Definition: Leverage the strengths of others to achieve common goals, and use interpersonal skills to coach and develop others. The individual is able to assess and manage his/her emotions and those of others; use empathetic skills to guide and motivate; and organize, prioritize, and delegate work.
• Punctuality, working productively with others, time and workload management, and understanding the impact of non-verbal communication on one's professional work image.

Digital Technology
Definition: Select and use appropriate technology to accomplish a given task. The individual is also able to apply computing skills to solve problems.

Career Management
Definition: Identify and articulate one's skills, strengths, knowledge, and experiences relevant to career goals; identify areas necessary for professional growth; and utilize campus resources to develop post-graduation employment skills.
• Completion of the Career Services and Talent Development FOCUS test.
• Participation in departmental opportunities.

...students have mastered the content standards expected for that grade level. Summative assessments are similar to diagnostic assessments, except that they are focused on determining whether a student has mastered the skills and knowledge expected for a specific grade level. Summative assessments include tests, portfolios, and other forms of assessment.
PHASES OF OUTCOMES ASSESSMENT

CONSTRUCTIVE ALIGNMENT

Authentic assessments:
• Involve unstructured, complex problems that may have multiple solutions. Such problems contain both relevant and irrelevant factors, unlabelled, just like real life, and students need to decide what is relevant and to develop a solution they can explain and defend.
• Require students to "perform" discipline-specific activities or procedures, drawing on a wide range of knowledge and skills.
• Provide feedback, practice, and opportunities to revise and resubmit solutions, so students can refine their skills, rather like an apprenticeship between the instructor/TA experts and the students.

Authentic assessments include such things as performance demonstrations of specific skills, use and manipulation of tools and instruments, oral and/or poster presentations, debates, panel discussions, role plays, teaching others, conducting experiments, and conducting interviews. Also included are "product assessments" such as essays, research reports, annotated bibliographies, data analysis and interpretation, argument construction and analysis, reviews, critiques and analysis of written work, problem analysis, planning, mapping, budget development, experimental design, peer editing, portfolios, posters, games, and podcast, video, and multimedia productions (Abbott, 37).

Begin assignment and assessment design by focusing on learning outcomes: what do you want students to remember, understand, apply, analyze, evaluate, or create? (Davis, 362) The table below outlines a variety of assignment and assessment options, with rationales for using them and implementation details.

Creating a rubric:

1. Define the purpose of the assignment/assessment for which you are creating a rubric. Consider the following:
• What exactly is the assigned task? Does it break down into a variety of different tasks? Are these tasks equally important?
• What are the learning objectives for this assignment/task? What do you want students to demonstrate in their completed assignments/performances?
• What might an exemplary student product/performance look like? How might you describe an acceptable student product/performance? How might you describe work that falls below expectations?
• What kind of feedback do you want to give students on their work/performance? Do you want/need to give them a grade? Do you want to give them a single overall grade? Do you want to give them detailed feedback on a variety of criteria? Do you want to give them specific feedback that will help them improve their future work?

2. Decide what kind of rubric you will use: a holistic rubric or an analytic rubric. Holistic and analytic rubrics use a combination of descriptive rating scales (e.g., weak, satisfactory, strong) and assessment criteria to guide the assessment process.

Holistic rubric
A holistic rubric uses rating scales that include the criteria. For example:
Weak: thesis is unclear due to writing style, organization of ideas, and/or grammatical errors.

Advantages:
Disadvantages:
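To make the analytic/holistic distinction concrete, here is a minimal Python sketch of an analytic rubric represented as weighted criteria, with per-criterion ratings combined into one percentage; the criteria, weights, and 0-4 scale are hypothetical, not taken from the text:

# Sketch: an analytic rubric scores each criterion separately,
# unlike a holistic rubric, which assigns one overall rating.
rubric = {              # criterion -> weight (hypothetical values)
    "coherence": 0.30,
    "accuracy": 0.30,
    "clarity": 0.25,
    "originality": 0.15,
}

def score_work(ratings, rubric, scale_max=4):
    """ratings: criterion -> rating on a 0..scale_max scale."""
    weighted = sum(rubric[c] * ratings[c] / scale_max for c in rubric)
    return round(100 * weighted, 1)  # percentage score

print(score_work({"coherence": 3, "accuracy": 4,
                  "clarity": 3, "originality": 2}, rubric))  # -> 78.8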
PLANNING A TEST

• The test objectives guide the kind of objective tests that will be designed and constructed by the teacher. This means aligning the test with the lesson objective or outcome.
• At all times, the test formulated must be aligned with the learning outcome. This is the principle of constructive alignment.

5) ITEM ANALYSIS/TRY-OUT AND VALIDATION
• The test draft is tried out on a group of pupils or students.
• Its purposes:
a. to determine item characteristics through item analysis; and
b. to determine the characteristics of the test itself: validity, reliability, and practicability.

THINGS TO CONSIDER IN WRITING TEST ITEMS:
1. Use the Table of Specifications (TOS) as a guide to item writing.
2. Construct more items than needed.
3. Write the items ahead of the testing date.
4. Write each test item at an appropriate reading level and difficulty.
5. Write each test item in a way that does not provide help in answering other test items.
6. Write each test item so that the task to be done is clearly defined.
7. Write test items whose answers would be agreed upon by experts.
8. Whenever a test is revised, recheck its relevance.

TYPES OF PAPER-AND-PENCIL TESTS:

1) Selected Response Type - "choices"
a. True-False or Correct-Wrong Items
b. Multiple-choice Items (more than two, or 3+, choices)

• A modified true-false test can offset the effect of guessing by requiring the students to explain their answer and to disregard a correct answer if the explanation is incorrect.

GUIDELINES FOR CONSTRUCTING TRUE-OR-FALSE TESTS
1. Do not give a hint (inadvertently) in the body of the question.
Example: The Philippines gained its independence in 1898 and therefore celebrated its centennial year in 2000.
Answer: False, because 100 years from 1898 is not 2000 but 1998.
2. Avoid using the words "always", "never", "often", and other words that tend to be either always true or always false.
Example: Christmas always falls on a Sunday because it is a Sabbath Day.
• Statements that use the word "always" are almost always false.
3. Avoid long sentences, as these tend to be "true". Keep sentences short.
4. Avoid trick statements with some minor misleading word or spelling anomaly, misplaced phrases, etc. A wise student who does not know the subject matter may detect this strategy and thus get the answer correctly.
5. Avoid quoting verbatim from reference materials or textbooks. This practice sends the wrong signal to the students that it is necessary to memorize the textbook word for word and, thus, that acquisition of higher-level thinking skills is not given due importance.
6. Avoid specific determiners or give-away qualifiers.
Example: Executives usually suffer from hyperacidity.
• The statement tends to be correct. The word "usually" leads to the answer.
7. Avoid a grossly disproportionate number of either true or false statements, or patterns in the occurrence of true and false statements.
8. Avoid double negatives, as these make test items unclear and will confuse the students.
Example: The changes that take place in early childhood are NOT unchangeable.
• The test item simply means "The changes in early childhood are changeable."

2. MATCHING TYPE
1. Use only homogeneous material in a single matching exercise.
2. Include an unequal number of responses and premises, and instruct the students that responses may be used once, more than once, or not at all. Use a joker or jokers.
3. Keep the list of items to be matched brief, and place the shorter responses at the right.
4. Arrange the list of responses in logical order.
5. Indicate in the directions the basis for matching the responses and premises.
6. Place all the items for one matching exercise on the same page.
7. Limit a matching exercise to not more than 10 to 15 items.

Varieties of Matching Type Test
1. Balanced Variety - the number of items is equal to the number of options.
2. Unbalanced Variety - there are unequal numbers in the two columns; this variety is characterized by the use of distracters, which usually leaves a number of unused options.
3. Classification Variety - requires the sorting of words or other types of materials into their proper categories. The students are asked to classify each specific word used in the context. Distracters are seldom used because all terms are properly classified.

4. MULTIPLE CHOICE

Writing Multiple-choice Items
1. The stem of the item should be meaningful by itself and should present a definite problem.
2. The item stem should include as much of the item as possible and should be free of irrelevant material.
3. Use a negatively stated stem only when significant learning outcomes require it, and stress/highlight the negative words for emphasis.
4. All the alternatives should be grammatically consistent with the stem of the item.
5. An item should contain only one correct or clearly best answer.
6. Items used to measure understanding should contain some novelty, but not too much.
7. All distracters should be plausible/attractive.
8. Verbal associations between the stem and the correct answer should be avoided.
9. The relative length of the alternatives/options should not provide a clue to the answer.
10. The alternatives should be arranged logically.
11. The correct answer should appear in each of the alternative positions approximately an equal number of times, but in random order.
12. Always have the stem and the alternatives on the same page.
13. Special alternatives such as "none of the above" or "all of the above" should be used sparingly.
14. Do not use multiple-choice items when other types are more appropriate.
15. Provide at least four (4) options.

Other Reminders:
1. Do not use unfamiliar words, terms, and phrases.
2. Do not use modifiers that are vague and whose meanings can differ from one person to the next, such as "much", "often", "usually", etc.
3. Avoid complex or awkward word arrangements. Also, avoid the use of negatives in the stem, as this may add unnecessary comprehension difficulties.
4. Do not use negatives or double negatives, as such statements tend to be confusing. It is best to use simpler sentences rather than sentences that would require expertise in grammatical construction.
5. Distracters should be equally plausible and attractive.
6. The length, explicitness, or degree of technicality of the alternatives should not be determinants of the correctness of the answer.
7. Avoid alternatives that are synonymous with others or that include or overlap others.
8. Avoid the use of unnecessary words or phrases which are not relevant to the problem at hand. Such items test the student's reading comprehension rather than knowledge of the subject matter.
9. The difficulty of a multiple-choice item may be controlled by varying the homogeneity or degree of similarity of the responses. The more homogeneous, the more difficult the item.

CONSTRUCTING SUPPLY OR CONSTRUCTED-RESPONSE TYPES
1. SUPPLY TYPE/ENUMERATION

Guidelines in Writing the Supply Type of Test
1. Word the item/s so that the required answer is both brief and specific.
2. Do not take statements directly from textbooks.
3. A direct question is generally more desirable than an incomplete statement.
4. If the item is to be expressed in numerical units, indicate the type of answer wanted.
5. Blanks for answers should be equal in length and, as much as possible, at the end or near the end of the statement.
6. When completion items are to be used, do not indicate too many blanks.

Other reminders:
1. Word the items so that the required answer is both brief and definite.
2. Avoid over-mutilated items.
3. Do not lift statements directly from the book. When taken out of context, textbook statements are too general and ambiguous.
4. Where the answer is to be expressed in numerical units, indicate the type of answer wanted.

2. ESSAY

Writing the Essay Type of Test
1. Restrict the use of essay questions to those learning outcomes that cannot be satisfactorily measured by objective items.
2. Construct questions that will call forth the skills specified in the learning standards.
3. Phrase each question so that the students' task is clearly defined or indicated.
4. Avoid the use of optional questions.
5. Indicate the approximate time limit or the number of points for each question.
6. Prepare an outline of the expected answer in advance, or a scoring rubric.

The Learning Outcomes Measurable by the Essay Type of Test:
1. Comparison between two or more things
2. The development and defense of an opinion
3. Questions of cause and effect
4. Explanation of meaning
5. Summarizing of information in a designated area
6. Analysis
7. Knowledge of relationships
8. Illustration of rules, principles, procedures, and application
9. Application of rules, laws, and principles to new situations
10. Criticism of the adequacy, relevance, or correctness of a concept, idea, or information
11. Formulation of new questions and problems
12. Reorganization of facts
13. Discrimination between objects, concepts, or events
14. Inferential thinking

The rules which facilitate grading of essay papers:
1. Phrase the directions in such a way that the students are guided on the concepts to be included.
2. Inform the students of the criteria to be used for grading their essays. This rule allows the students to focus on relevant and substantive materials rather than on peripheral and unnecessary facts and bits of information.
Example: Write an essay on the topic "Plant Photosynthesis" following these criteria: (a) coherence, (b) accuracy of statements, (c) use of keywords, (d) clarity, and (e) extra points for innovative presentation of ideas.
3. Put a limit on the essay test.
4. Decide on your essay grading system prior to getting the essays of your students.
5. Evaluate all of the students' answers to one question before proceeding to the next question.
6. Evaluate answers to essay questions without knowing the identity of the writer.
7. Whenever possible, have two or more persons grade each answer.

TYPES OF ESSAYS

1. Narrative Essays
A narrative essay is one which details a story, oftentimes from a particular point of view. When writing a narrative essay, you should include a set of characters, a location, a good plot and a climax to the story.

2. Descriptive Essays
A descriptive essay describes something in great detail. The subject can be anything from people and places to objects and events, but the main point is to go into depth. You might describe the item's colour, where it came from, what it looks like, smells like, tastes like or how it feels.

3. Expository Essays
An expository essay is used as a way to look into a problem and therefore compare it and explore it. For the expository essay there is a little bit of storytelling involved, but this type of essay goes beyond that. The main idea is that it should explain an idea, giving information and explanation.

4. Argumentative Essays
When writing an argumentative essay, you will be attempting to convince your reader about an opinion or point of view. The idea is to show the reader whether the topic is true or false along with giving your own opinion. It is very important that you use facts and data to back up any claims made within the essay.

OTHER TYPES OF ESSAYS:

Definition Essays
This is a type of essay which is used to define an idea, thing or concept.

Simple Essays
This is, as its name would suggest, a simple essay which is made up of five paragraphs and can be written on any subject.
Persuasive Essays
...in order to convince the reader not to do a particular thing, or indeed to do it.

Rhetorical Analysis Essays
This type of essay is used as a way of analysing a piece of rhetoric or a speech and looks at any rhetorical devices which have been used.

Analytical Essays
As the name of this type of essay might suggest, it is an essay which is used to analyse something. This could be a piece of writing, a movie or anything else. The idea is that the analytical essay will look at what it is analysing from various viewpoints, allowing the reader to form their own opinion.

Compare and Contrast Essays
When writing a compare and contrast essay, the author will be using it as a way of creating a comparison between two things or finding a contrast between them. But it is not limited to one or the other; you can also write a compare and contrast essay to do both of these things in one.

Cause and Effect Essays
This is a type of essay which allows the author to explain the cause of a certain thing as well as being able to explain the effects of it.

Critical Essays
When writing a critical essay, the author will be writing about a piece of literature and evaluating it. They will use the good and bad points of the piece in order to do this.

Process Essays
The process essay is a way of outlining or detailing a process. This is done by breaking down the process so that the readers are able to understand it and even perform the process themselves once they have read the essay.

Synthesis Essays
This is a type of essay which is used as a way to synthesise various concepts in order to create a judgement on their good and bad points.

Review Essays
The review essay is one which looks at a piece of literature and gives a review of it based around the good and bad points within it.

Research Essays
The research essay is one which is written based on a research question and aims to give a specific answer to it. The author will research the subject as a way of providing an answer to the question that was posed.

Explanatory Essays
This type of essay is used as a way to explain any given piece of written work or literature. It can be written on a variety of types of literature, such as poetry, novels or a short story.

3. IDENTIFICATION

Examples:
____ 1. Hero of the Battle of Mactan
____ 2. Longest Filipino Revolt

2. The statement should be phrased so that there is only one response.
Examples:
(Poor)   ____ 1. Manuel L. Quezon
         ____ 2. The City of Baguio
(Better) ____ 1. The first President of the Philippines
         ____ 2. The summer capital of the Philippines

Arrangement of Elements

Ordering measures memory of relationships and concepts of organization in many subjects. It is a rearrangement of elements that consists of ordering or assembling items on some basis.

1. Chronological Order
Example: Arrange the following Presidents in chronological order from most recent to least recent. Write your answers on the spaces at the right.
Garcia 1. ____
Marcos 2. ____
Magsaysay 3. ____
Macapagal 4. ____
Osmena 5. ____
Quezon 6. ____
Quirino 7. ____
Roxas 8. ____

2. Geographical Order - arrangement according to geographical location.
Example: Arrange the following provinces from north to south. Write your answers on the spaces at the right.
Bulacan 1. __________
Cagayan 2. __________
Nueva Ecija 3. __________
Sorsogon 4. __________

3. Arrangement according to Magnitude - the basis of this arrangement is size, which may be height, width, distance, etc.

4. Alphabetical Order - arrangement of words according to their appearance in the dictionary.

5. Arrangement according to Importance, Quality, etc.
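Arrangement items like the presidents exercise above can also be machine-scored by comparing the student's sequence with the answer key. The Python sketch below awards one point per correctly placed element; this is one possible scoring scheme, not one prescribed by the text:

# Sketch: scoring an arrangement/ordering item position by position.
key = ["Marcos", "Macapagal", "Garcia", "Magsaysay",
       "Quirino", "Roxas", "Osmena", "Quezon"]  # most recent to least recent

def score_ordering(student, key):
    """One point for each element placed in its correct position."""
    return sum(s == k for s, k in zip(student, key))

student = ["Marcos", "Garcia", "Macapagal", "Magsaysay",
           "Quirino", "Roxas", "Osmena", "Quezon"]
print(score_ordering(student, key), "/", len(key))  # -> 6 / 8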
ITEM DIFFICULTY

Item difficulty, or the difficulty of an item, is defined as the number of students who are able to answer the item correctly divided by the total number of students. Thus:

Item difficulty = (number of students who answered the item correctly) / (total number of students)

Example: What is the item difficulty index of an item if 25 students are unable to answer it correctly while 75 answered it correctly? Here the total number of students is 100; hence, the item difficulty index is 75/100, or 75%.

One problem with this type of difficulty index is that it may not actually indicate that the item is difficult or easy. A student who does not know the subject matter will naturally be unable to answer the item correctly even if the question is easy. How do we decide, on the basis of this index, whether the item is too difficult or too easy?

Table 1. Range of difficulty index
RANGE OF DIFFICULTY INDEX | INTERPRETATION | ACTION
0 - 0.25 | Difficult | Revise or discard
0.26 - 0.75 | Right difficulty | Retain
0.76 and above | Easy | Revise or discard

Difficult items tend to discriminate between those who know and those who do not know the answer. Easy items cannot discriminate between those two groups of students. We are therefore interested in deriving a measure that will tell us whether an item can discriminate between these two groups of students. Such a measure is called an index of discrimination.

DISCRIMINATION INDEX

An easy way to derive such a measure is to measure how difficult an item is with respect to those in the upper 25% of the class and how difficult it is with respect to those in the lower 25% of the class. If the upper 25% of the class found the item easy yet the lower 25% found it difficult, then the item can discriminate properly between these two groups. Thus:

Index of discrimination = DU - DL

Example: Obtain the index of discrimination of an item if the upper 25% of the class had a difficulty index of 0.60 (i.e., 60% of the upper 25% got the correct answer) while the lower 25% of the class had a difficulty index of 0.20. DU = 0.60 and DL = 0.20; thus, index of discrimination = 0.60 - 0.20 = 0.40.

Theoretically, the index of discrimination can range from -1.0 (when DU = 0 and DL = 1) to 1.0 (when DU = 1 and DL = 0). When the index of discrimination is equal to -1, all of the lower 25% of the students got the correct answer while all of the upper 25% got the wrong answer. In a sense, such an index discriminates correctly between the two groups, but the item itself is highly questionable. On the other hand, if the index of discrimination is 1.0, all of the lower 25% failed to get the correct answer while all of the upper 25% got the correct answer. This is a perfectly discriminating item and is the ideal item that should be included in the test.

As in the case of the index of difficulty, we have the following rule of thumb:

Table 2. Index range
INDEX RANGE | INTERPRETATION | ACTION
-1.0 to -0.50 | Can discriminate, but the item is questionable | Discard
-0.55 to 0.45 | Non-discriminating | Revise
0.46 to 1.0 | Discriminating item | Include

Example: Consider a multiple-choice type of test for which the following data were obtained:

Table 3. Multiple-choice test item
ITEM 1 | A | B* | C | D
TOTAL | 0 | 40 | 20 | 20
Upper 25% | 0 | 15 | 5 | 0
Lower 25% | 0 | 5 | 10 | 5
(* correct response)

The correct response is B. Let us compute the difficulty index and the index of discrimination:

Figure 2. Difficulty index:
Difficulty index = (number of students who chose the correct option B) / (total number of students) = 40%, within the range of a "good item"

The discrimination index can be similarly computed:
DU = 15/20 = 0.75, or 75%
DL = 5/20 = 0.25, or 25%
Discrimination index = DU - DL = 0.75 - 0.25 = 0.50, or 50%

Thus, the item also has a "good discriminating power."

It is also instructive to note that distracter A is not an effective distracter, since it was never selected by the students. Distracters C and D appear to have good appeal as distracters.
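The worked example's arithmetic can be scripted directly; a minimal Python sketch follows (the function names are illustrative, not from the text):

# Sketch: item analysis from upper- and lower-group response counts.
def difficulty_index(correct, total):
    """Proportion of all examinees answering the item correctly."""
    return correct / total

def discrimination_index(upper_correct, lower_correct, group_size):
    """DU - DL, using the upper 25% and lower 25% of the class."""
    du = upper_correct / group_size
    dl = lower_correct / group_size
    return du - dl

# Numbers from the worked example: 15 of the upper 20 and
# 5 of the lower 20 chose the correct option B.
print(discrimination_index(15, 5, 20))  # -> 0.5, a discriminating item
print(difficulty_index(75, 100))        # -> 0.75, from the earlier example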
MORE SOPHISTICATED DISCRIMINATION INDEX

Item discrimination refers to the ability of an item to differentiate among students on the basis of how well they know the material being tested. A good item is one that has good discriminating ability and a sufficient level of difficulty (not too difficult nor too easy).

Figure 3. Index of difficulty: P = (Ru + RL) / T x 100
Where:
• Ru - the number in the upper group who answered the item correctly
• RL - the number in the lower group who answered the item correctly
• T - the total number who tried the item

INDEX OF ITEM DISCRIMINATING POWER

Figure 5. Percentage: P = R / T x 100
Where:
• P - the percentage who answered the item correctly (index of difficulty)
• R - the number who answered the item correctly
• T - the total number who tried the item

VALIDATION

After performing the item analysis and revising the items which need revision, the next step is to validate the instrument. The purpose of validation is to determine the characteristics of the whole test itself, namely, the validity and reliability of the test. Validation is the process of collecting and analyzing evidence to support the meaningfulness and usefulness of the test.

VALIDITY

Validity is the extent to which a test measures what it purports to measure; it refers to the appropriateness, correctness, meaningfulness, and usefulness of the specific decisions a teacher makes based on the test results.

CONTENT-RELATED EVIDENCE OF VALIDITY
• How comprehensive is the test?
• Does it logically get at the intended variable? How adequately does the sample of items or questions represent the content to be assessed?

CRITERION-RELATED EVIDENCE OF VALIDITY
Refers to the relationship between scores obtained using the instrument and scores obtained using one or more other tests (often called the criterion).
• How strong is this relationship?
• How well do such scores estimate present performance or predict future performance of a certain type?

CONSTRUCT-RELATED EVIDENCE OF VALIDITY
Refers to the nature of the psychological construct or characteristic being measured by the test.
• How well does a measure of the construct explain differences in the behavior of individuals or their performance on a certain task?

USUAL PROCEDURE FOR DETERMINING CONTENT VALIDITY
1. The teacher writes out the objectives based on the Table of Specifications (TOS).
2. The teacher gives the objectives and the TOS to two experts, along with a description of the test takers.
3. The experts look at the objectives, read over the items in the test, and place a check mark in front of each question or item that they feel does NOT measure one or more objectives.
4. This continues until the experts approve all items and agree that all of the objectives are sufficiently covered by the test.

OBTAINING EVIDENCE FOR CRITERION-RELATED VALIDITY
The teacher usually compares scores on the test in question with scores on some other independent criterion test which presumably already has high validity (concurrent validity). Another type of validity is predictive validity, wherein the test scores on the instrument are correlated with scores on the later performance of the students.

RELIABILITY

Reliability refers to the consistency of the scores obtained: how consistent they are for each individual from one administration of an instrument to another and from one set of items to another. We already have formulas for computing the reliability of a test; for internal consistency, for instance, we could use the split-half method or the Kuder-Richardson formulae: KR-20 or KR-21.
• Reliability and validity are related concepts. If an instrument is unreliable, it cannot yield valid outcomes. As reliability improves, validity may improve (or may not). However, if an instrument is shown scientifically to be valid, then it is almost certain that it is also reliable.

The following standard is followed almost universally in educational tests and measurement:

Table 5. Reliability
.90 and above | Excellent reliability; at the level of the best standardized tests.
.80 - .90 | Very good for a classroom test.
.70 - .80 | Good for a classroom test; in the range of most. There are probably a few items which could be improved.
.60 - .70 | Somewhat low. This test should be supplemented by other measures (e.g., more tests) for grading.
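As a concrete illustration of the Kuder-Richardson approach named above, here is a sketch of KR-20 computed from a small 0/1 item-response matrix; the sample data are made up and the variable names are assumptions:

# Sketch: KR-20 internal-consistency reliability for dichotomous items.
# Rows = students, columns = items (1 = correct, 0 = wrong); data made up.
scores = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 1],
    [0, 1, 0, 0, 1],
    [1, 1, 1, 1, 0],
    [0, 0, 1, 0, 0],
]

k = len(scores[0])                     # number of items
n = len(scores)                        # number of students
totals = [sum(row) for row in scores]  # each student's total score
mean = sum(totals) / n
var = sum((t - mean) ** 2 for t in totals) / n  # variance of total scores

# Sum of p*q over items, where p = proportion answering the item correctly
pq = 0.0
for j in range(k):
    p = sum(row[j] for row in scores) / n
    pq += p * (1 - p)

kr20 = (k / (k - 1)) * (1 - pq / var)
print(round(kr20, 3))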
Note that 19, 23, 27, 31, 35, and 39 are the lower limits, and 22, 26, 30, 34, 38, and 42 are the upper limits.

∑ (sigma notation, or summation) is used to denote the sum of values.

In the class 19 - 22, to compute the lower boundary (LB), subtract 0.5 from the lower limit: 19 - 0.5 = 18.5; therefore, the lower boundary of 19 is 18.5.

In the class 19 - 22, to compute the upper boundary (UB), add 0.5 to the upper limit: 22 + 0.5 = 22.5; therefore, the upper boundary of 22 is 22.5.

3) Relative frequency distribution. Shows the proportion, in percent, of the frequency of each class to the total frequency:

Relative frequency (%f) = f / n x 100
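This arithmetic is easy to script. A minimal Python sketch follows; the class list mirrors the text, but every frequency except the 8 for class 19-22 is an invented placeholder, since Table 8 itself did not survive:

# Sketch: class boundaries and relative frequencies.
classes = [(19, 22, 8), (23, 26, 10), (27, 30, 12),
           (31, 34, 8), (35, 38, 4), (39, 42, 3)]  # (lower, upper, f)

n = sum(f for _, _, f in classes)  # total frequency (45 here)

for lower, upper, f in classes:
    lb = lower - 0.5                 # lower boundary
    ub = upper + 0.5                 # upper boundary
    rel = round(100 * f / n, 2)      # relative frequency in percent
    print(f"{lower}-{upper}: LB={lb}, UB={ub}, %f={rel}")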
In the class 19 - 22, the corresponding frequency is 8, and the total frequency (n) is 45; hence:

%f = 8 / 45 x 100 = 17.78%
(Note: rounded to 2 decimal places.)

4) Cumulative frequency distribution. Tries to determine the partial sums from the data classified in terms of classes. This distribution answers problems like the number of students who got a passing mark, the number of employees who got an efficiency rating from 76% to 95%, and so on.

Two types of cumulative frequency distribution:
a. Less than cumulative frequency (<cf)
b. Greater than cumulative frequency (>cf)

Table 8. Frequency Distribution Table of 45 Students in Mathematics VI

VIII. THE GRADING SYSTEMS AND THE GRADING SYSTEM IN THE PHILIPPINES

INTRODUCTION

Grading in education is the process of applying standardized measurements of varying levels of achievement in a course. Grades can be assigned as letters (for example, A, B, B+, B-, C, C-, D), as in the American system; as a seven- or eight-point numerical scale (1, 1.25, 1.50, 1.75, 2.0, 2.5, 3.0 and 4.0), where letters are replaced with numerical values, as in Philippine colleges and universities; as percentages (of accomplishment) such as 80% or 75%, as in basic education; as a number out of a possible total (for example, out of 20 or 100); or as descriptors (excellent, great, satisfactory, needs improvement).

NORM-REFERENCED GRADING

The norm-referenced grading system refers to a grading system wherein a student's performance is evaluated relative to the performance of the other students. Using the norm-referenced grading system, a student's performance is evaluated relative to the performance of the other students within the group.

ADVANTAGES
• It is very easy to use.
• It works well for courses with retention policies, since it limits the number of students who advance to the next level of the course.
• It is useful if the focus is the individual achievement of the students.
• It is appropriate for a large group of students, that is, more than 40.
• The teacher easily identifies the learning criteria: the percentage of students who receive the highest grade or lowest grade.

DISADVANTAGES
• The performance of a student is determined not only by his achievement, but also by the achievement of the other students.
• It promotes competition among the students rather than cooperation.
• It cannot be used when the class size is smaller than 40.
• Not all students can pass the given subject or course.

For example, in the Philippine setting, where not all high school students can actually advance to the college or university level because of financial constraints, the norm-referenced grading system can be applied.

Example: In a class of 100 students, the mean score in a test is 70 with a standard deviation of 5.
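The example stops after giving the class mean and standard deviation. One common norm-referenced device is to band grades by distance from the mean in standard-deviation units; the sketch below uses the mean and SD from the example, but the cutoffs and grade labels are assumptions for illustration, not the document's:

# Sketch: norm-referenced grading by standard-deviation bands.
MEAN, SD = 70, 5  # from the example: class of 100, mean 70, SD 5

def norm_referenced_grade(score):
    """Grade by distance from the class mean (cutoffs are assumptions)."""
    z = (score - MEAN) / SD
    if z >= 1.5:   return "1.00"   # top of the distribution
    if z >= 0.5:   return "1.75"
    if z >= -0.5:  return "2.50"
    if z >= -1.5:  return "3.00"
    return "4.00"                  # bottom of the distribution

for score in (80, 72, 65, 58):
    print(score, "->", norm_referenced_grade(score))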
Test standardization is a process by which teacher- or researcher-made tests are validated and item-analyzed. After a thorough process of validation, the test characteristics are established. These characteristics include test validity, test reliability, test difficulty level, and the other characteristics previously discussed.

FOUR QUESTIONS IN THE GRADING SYSTEM:
Marilla D. Svinicki (2007) of the Center for Teaching Effectiveness of the University of Texas at Austin poses four intriguing questions relative to grading:
1. Should grades reflect absolute achievement level or achievement relative to others in the same class?
2. Should grades reflect achievement only, or nonacademic components such as attitude, speed and diligence?
3. Should grades report the status achieved or the amount of growth?
4. How can several grades on diverse skills combine to give a single mark?

CUMULATIVE AND AVERAGE SYSTEMS OF GRADING

Averaging System: the grade of a student in a particular grading period equals the average of the grades obtained in the prior grading periods and the current grading period.

Cumulative Grading System: the grade of a student in a grading period equals his current grading period grade, which is assumed to have the cumulative effects of the previous grading periods.
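A side-by-side sketch makes the two systems concrete (the grading-period values are hypothetical):

# Sketch: averaging vs. cumulative grading over grading periods.
grades = [82, 86, 90]  # grading-period grades (hypothetical)

# Averaging system: the reported grade each period is the average of
# the current and all prior grading-period grades.
averaging = [round(sum(grades[:i + 1]) / (i + 1), 2) for i in range(len(grades))]

# Cumulative system: the reported grade is simply the current period's
# grade, taken to already carry the effect of earlier periods.
cumulative = grades[:]

print("Averaging: ", averaging)   # [82, 84.0, 86.0]
print("Cumulative:", cumulative)  # [82, 86, 90]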
WHAT SHOULD GO INTO A STUDENT’S GRADES?
ELCOMBLUS. (2020, May 14). The shift of educational focus from content to learning outcomes. Retrieved February 25, 2022, from https://www.elcomblus.com/the-shift-of-educational-focus-from-content-to-learning-outcomes/