
Chapter 1: Review of Principles of

High Quality Assessment

Reporters:
Apolinario, Leah
Abdul, Johaira
Villa, Liezl
Villaganas, Dave
Vista, Clint
Clarity of Learning Targets
• Assessment can be made precise, accurate, and dependable only if what is to be achieved is clearly stated and feasible.
• We consider learning targets involving knowledge, reasoning skills, products and effects.
• Learning targets need to be stated in behavioral terms, i.e., terms that denote something which can be observed through the behavior of the student.

1. Cognitive Targets
2. Skills, Competencies and Abilities Targets
3. Products, Outputs and Project Targets
1. COGNITIVE TARGETS
Benjamin S. Bloom
• As early as the 1950s, Bloom (1956) proposed a hierarchy of educational objectives at the cognitive level.
Level 1: Remembering
• Recall or retrieve previously learned information.
• Example Key Words: defines, describes, identifies, labels, lists, outlines, selects, states.
Level 2: Understanding
• Comprehend meaning, translate, state a problem in one's own words, make meaning.
• Example Key Words: comprehends, explains, distinguishes, estimates, gives examples, interprets, predicts, rewrites, summarizes.
Level 3: Applying
• Use a concept in a new situation; apply what has been learned in a novel situation.
• Example Key Words: applies, changes, computes, operates, constructs, modifies, uses, manipulates, prepares, shows, solves.
Level 4: Analyzing
• Separate material or a concept into its component parts so that its organization is clear. Distinguish between facts and inferences.
• Example Key Words: breaks down, compares, contrasts, diagrams, differentiates, discriminates, identifies, infers, outlines, relates, selects, separates.
Level 5: Evaluating
• Make judgments about the value of ideas or materials.
• Example Key Words: appraises, compares, criticizes, defends, describes, discriminates, evaluates, interprets, justifies, summarizes.
Level 6: Creating
• Build a structure or pattern from various elements; put parts together to create a whole, to make new meaning and structure.
• Example Key Words: composes, compiles, designs, generates, modifies, organizes, rearranges, reorganizes, revises, rewrites, summarizes, creates.
Students make judgments about the value of ideas, items, materials, and more. They are expected to bring in all they have learned to make informed and sound evaluations of material.
Key Words for the Evaluation Category: evaluate, appraise, conclude, criticize, critique
Example: Watch a stage play and write a critique of the actors' performance.
• Evaluate: Were the actors professionals, amateurs, or students?
• Criticize: Were the actors capable of dealing with the script's requirements?
• (Be fair to the actors in your assessment of their talents and the level of their "craftsmanship.")
2. SKILLS, COMPETENCIES AND ABILITIES TARGETS
• Skills refer to specific activities or tasks that a student can proficiently do, e.g., skills in coloring, language skills.
• Skills can be clustered together to form specific competencies, e.g., birthday card making.
• Related competencies characterize a student's ability. (DACUM, 2000)
• Abilities can be roughly categorized into cognitive, psychomotor and affective abilities.

• The ability to work well with others and to be trusted by every classmate (an affective ability) is an indication that the student can most likely succeed in work that requires leadership.
• Other students are better at doing things alone, like programming and web designing (a cognitive ability), and would therefore be good at highly technical, individualized work.
3. PRODUCTS, OUTPUTS AND PROJECTS TARGETS
• Tangible and concrete evidence of a student's ability.
• A clear target for products and projects needs to clearly specify the level of workmanship of such projects, e.g., expert level, skilled level, or novice level.
Once the learning targets are clearly set, it is necessary to determine an appropriate assessment procedure or method.
Educational Measurement
• Written Response Instruments
• Product Rating Scales
• Performance Tests
• Oral Questioning
• Observation and Self Reports
1. Written Response Instruments
• Objective Tests
• Essays
• Checklists

Objective Test
• Appropriate for assessing the various levels of the hierarchy of educational objectives.
• Requires a user to choose or provide a response to a question whose answer is predetermined. Such a question might require a student to:
a. select a solution from a set of choices (multiple choice, true or false, matching)
b. identify an object or position (graphical)
c. supply brief numeric or text responses
Essay
• Can test the students' grasp of higher-level cognitive skills, particularly in the areas of application, analysis, synthesis and judgment.
Checklist
• A list of several characteristics or activities presented to the subjects of a study, who analyze them and place a mark opposite each characteristic that applies.
Product Rating Scales
• A teacher is often tasked to rate products: book reports, maps, charts, diagrams, notebooks, essays, creative endeavors.
• Product rating scales need to be developed to assess various products over the years.
PERFORMANCE TEST
• A performance checklist consists of a list of behaviors that make up a certain type of performance (e.g., using a microscope, typing a letter, solving a mathematics problem, and so on).
• It is used to determine whether or not an individual behaves in a certain way when asked to complete a particular task.
• If a particular behavior is present when an individual is observed, the teacher places a check opposite it on the list.
Oral Questioning
• An appropriate assessment method when the objectives are:
a) to assess the student's stock knowledge and/or
b) to determine the student's ability to communicate ideas in coherent (logical and consistent) verbal sentences.
Observation and Self Reports
• Useful supplementary assessment methods when used in conjunction with oral questioning and performance tests.
Properties of
Assessment Methods
• Validity
• Reliability
• Fairness
• Practicality and Efficiency
• Ethics in Assessment
• The quality of the assessment instrument and method used in education is very important, since the evaluation and the judgment the teacher makes about the student are based on the information obtained using these instruments.
5 PROPERTIES OF
ASSESSMENT METHOD
1) Validity
2) Reliability
3) Fairness
4) Practicality and Efficiency
5) Ethics
VALIDITY
• The degree to which a test measures what it is supposed to measure.
• Defined as the appropriateness, correctness, meaningfulness and usefulness of the specific conclusions that a teacher reaches regarding the teaching-learning situation.
4 Types of Validity

I. Face Validity
II. Content Validity
III. Construct Validity
IV. Criterion-related Validity
I. Face Validity
What do students think of the test?
• The outward appearance of the test; the lowest form of test validity.
• A test has face validity if it "looks like" it is going to measure what it is supposed to measure.
II. Content Validity
Am I testing what I taught?
• The degree to which test items match some objective criterion.
• Also refers to the content and format of the instrument, which must be consistent with the definition of the variable or factor to be measured.
III. Construct Validity
Am I testing in the way I thought?
• Evaluates whether a measurement tool really represents the thing we are interested in measuring. It is central to establishing the overall validity of a method.
IV. Criterion-related Validity
How does this compare with an existing valid test?
• The test item is judged against a specific criterion. It takes two forms: concurrent and predictive validity.
Concurrent Validity
• Refers to the degree of relationship between scores on a test or scale and another measure of established validity given at about the same time.
Predictive Validity
• Refers to the degree or extent to which scores on a test can predict later behavior or test scores.
RELIABILITY
• Reliability is the degree to which an assessment tool produces stable and consistent results.
• Also refers to the instrument's consistency and stability.
a. The Split-half Method involves scoring two halves of a test separately for each person and then calculating a correlation coefficient for the two sets of scores.
The Spearman-Brown prophecy formula is a useful tool for estimating composite reliability, particularly when considering lengthening or shortening a test:

r_kk = (k × r_11) / (1 + (k − 1) × r_11)

where r_kk is the estimated reliability of the composite test (for k components) and r_11 is the reliability of a single component. For a split-half test, k = 2.
b. The Kuder-Richardson formulas are the more frequently employed for determining internal consistency, particularly KR-20 (more difficult to calculate; often requires a computer program) and KR-21.
KR-20 Scores
KR-20 scores range from 0 to 1, where 0 is no reliability and 1 is perfect reliability. The closer the score is to 1, the more reliable the test. Just what constitutes an "acceptable" KR-20 score depends on the type of test. In general, a score above .5 is usually considered reasonable.
The KR-20 formula is [n/(n − 1)] × [1 − (Σpq)/Var], where:
n = number of items on the test
Var = variance of the total test scores
p = proportion of people passing the item
q = proportion of people failing the item (q = 1 − p)
The product pq is computed once for each item and summed across all items.
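A minimal sketch of the KR-20 computation in Python, using a made-up matrix of right/wrong item responses (rows are students, columns are items):

```python
# KR-20 internal-consistency estimate on hypothetical 0/1 item data.
from statistics import pvariance

responses = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 1],
    [0, 1, 0, 0, 1],
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
]

n_items = len(responses[0])
n_students = len(responses)

totals = [sum(row) for row in responses]
var_total = pvariance(totals)  # variance of the total scores

# Sum p*q over items: p = proportion passing the item, q = 1 - p.
sum_pq = 0.0
for j in range(n_items):
    p = sum(row[j] for row in responses) / n_students
    sum_pq += p * (1 - p)

kr20 = (n_items / (n_items - 1)) * (1 - sum_pq / var_total)
print(f"KR-20 = {kr20:.3f}")
```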
KR-21
The KR-21 is similar, except it is used for a test where the items are all of about the same difficulty. The formula is [n/(n − 1)] × [1 − (M × (n − M))/(n × Var)], where:
n = number of items on the test,
Var = variance of the total test scores,
M = mean score on the test.


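KR-21 needs only the test length, the mean, and the variance of total scores; a sketch with invented numbers:

```python
# KR-21 estimate from summary statistics alone (illustrative data).
from statistics import pvariance

scores = [12, 15, 9, 18, 14, 11, 16, 13]  # total scores on a 20-item test
n = 20                                    # number of items on the test

m = sum(scores) / len(scores)  # mean total score
var = pvariance(scores)        # variance of total scores

kr21 = (n / (n - 1)) * (1 - (m * (n - m)) / (n * var))
print(f"KR-21 = {kr21:.3f}")
```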
Dr. G. Frederic Kuder (1903-2000) was one of the premier innovators of vocational assessments. His 1938 Kuder Preference Record became one of the most used career guidance instruments in schools and colleges, and was taken by more than a million people worldwide over the course of several decades.
4 Types of Reliability
I. Inter-rater
II. Alternate Forms
III. Test-Retest
IV. Internal Consistency
I. Inter-rater
• Used to assess the degree to which
different raters or observers give
consistent estimates of the same
phenomenon.
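The simplest inter-rater statistic is percent agreement; here is a sketch with two hypothetical raters marking the same ten performances pass/fail:

```python
# Percent agreement between two raters (hypothetical pass/fail marks).
rater_a = ["P", "P", "F", "P", "F", "P", "P", "F", "P", "P"]
rater_b = ["P", "P", "F", "F", "F", "P", "P", "F", "P", "F"]

agreements = sum(a == b for a, b in zip(rater_a, rater_b))
percent_agreement = agreements / len(rater_a)
print(f"raters agree on {percent_agreement:.0%} of the cases")
```

Percent agreement ignores agreement that occurs by chance; a statistic such as Cohen's kappa corrects for that.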
II. Alternate Forms
• Used to assess the consistency of the results of two tests constructed in the same way from the same content domain.
III. Test-Retest
• Used to assess the consistency of a measure from one time to another.
IV. Internal Consistency
• Used to assess the consistency of results across the items within a single test.
FAIRNESS
• An assessment procedure needs to be fair.
• Students need to know exactly what the learning targets are and what method of assessment will be used.
• Assessment has to be viewed as an opportunity to learn rather than an opportunity to weed out poor and slow learners.
• Fairness also implies freedom from teacher stereotyping.
PRACTICALITY & EFFICIENCY
• An assessment method should be practical in the sense that the teacher is familiar with it, it does not require too much time, and it is, in fact, implementable.
• A complex assessment procedure tends to be difficult to score and interpret, resulting in many misdiagnoses or too long a feedback period, which may render the test inefficient.
In short:
1) Teachers should be familiar with the test.
2) The number of items is not complicated.
3) It is implementable.
ETHICS
• The term "ethics" refers to questions of right and wrong.
• Conforming to the standards of conduct of a given profession or group.
Here are some situations in which assessment may not be called for:
1. Requiring students to answer a checklist of their sexual fantasies.
2. Asking elementary pupils to answer sensitive questions without the consent of their parents.
3. Testing the mental abilities of pupils using an instrument whose validity and reliability are unknown.
A HIGH QUALITY ASSESSMENT has clear learning targets, is appropriate in method, and is valid, reliable, fair, practical and efficient, and ethical.
