
EDU- 533 P2 Reviewer

The document outlines the process of designing a well-written test, emphasizing the importance of a Table of Specifications (TOS) to align test objectives with assessment targets. It details various test formats, guidelines for writing different types of test items, and the significance of reliability and validity in assessment. Key steps include determining test objectives, coverage, and ensuring alignment with desired learning outcomes.


EDU 533: ASSESSMENT IN LEARNING 1 (P2)

PLANNING WRITTEN TEST
SAS 10

WHAT IS THE FIRST THING THAT YOU NEED TO DO IN DESIGNING A WELL-WRITTEN TEST ITEM?
In designing a well-planned written test, first and foremost you should be able to identify the intended learning outcome in a course where a written test is an appropriate method to use.

OBJECTIVES IN TESTING
Bloom's Taxonomy of Educational Objectives is the most popular taxonomy of educational objectives.
Three Domains:
1. Knowledge-based goals (Cognitive Domain)
2. Skills-based goals (Psychomotor Domain)
3. Affective goals (Affective Domain)

Table of Specifications (TOS)
• Sometimes called a test blueprint.
• A tool used by teachers to design a test.
• A table that maps out the test objectives; the content or topics covered by the test; the levels of cognitive behavior to be measured; the distribution of items; and the test format.

Steps in developing a TOS:
1. Determine the objectives of the test. There are 3 types of objectives:
   a. Cognitive - designed to increase an individual's knowledge, understanding, and awareness.
   b. Affective - aims to change an individual's attitude towards something desirable.
   c. Psychomotor - designed to build physical or motor skills.
2. Determine the coverage of the test.
3. Calculate the weight for each topic.
4. Determine the number of items for the whole test.
5. Determine the number of items per topic.
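Steps 3-5 above can be sketched numerically: weight each topic by the instructional time spent on it, then allocate the total number of items in proportion to those weights. The topic names, class hours, and test length below are hypothetical, chosen only to illustrate the arithmetic.

```python
# Sketch of TOS item allocation (steps 3-5), assuming each topic's weight
# is its class hours divided by total instructional hours.
# Topics, hours, and test length are hypothetical examples.
topics = {"Fractions": 6, "Decimals": 4, "Percentages": 2}

total_hours = sum(topics.values())  # 12 hours of instruction in all
total_items = 60                    # step 4: length of the whole test

allocation = {}
for topic, hours in topics.items():
    weight = hours / total_hours                     # step 3: weight per topic
    allocation[topic] = round(total_items * weight)  # step 5: items per topic

print(allocation)  # {'Fractions': 30, 'Decimals': 20, 'Percentages': 10}
```

With uneven weights, simple rounding can make the allocated items sum to slightly more or less than the intended total, so teachers usually adjust the final counts by hand.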

Can teachers skip the preparation of a TOS? Why?
Answer: Anybody can skip the preparation of a TOS, but doing so may defeat its purpose of aligning the learning targets with the assessment targets. Other factors, such as time management and proper assessment planning, are also compromised.

WRITTEN TEST: GUIDELINES, CATEGORIES, AND FORMAT
SAS 11

Guidelines in choosing an appropriate test format
Ask the following questions:
1. What are the objectives or desired learning outcomes (DLOs) of the lesson being assessed?
2. What level of thinking is to be assessed (remembering, understanding, applying, analyzing, evaluating, or creating)? Does the cognitive level of the test questions match your instructional objectives or DLOs?
3. Is the test matched or aligned with the course's DLOs and the course contents or learning activities?
4. Are the items realistic to the students?
Categories of Tests

○ Selected-Response Tests require learners to choose the correct answer or best alternative from several choices.
a) Multiple-choice test. The most commonly used format in formal assessment; typically consists of a stem (the problem), one correct or best alternative (the correct answer), and three or more incorrect or inferior alternatives (distractors).
b) True-False or Alternative-Response test. Generally consists of a statement, and the learner decides whether the statement is true or false.
c) Matching-type test. Consists of 2 sets of items to be matched with each other based on a specified attribute.

○ Constructed-Response Tests require learners to supply answers to a given question or problem.
a) Short-answer test. Consists of open-ended questions or incomplete sentences that require learners to create an answer for each item.
   i. Completion. Consists of incomplete statements.
   ii. Identification. Consists of statements that require the learners to identify or recall terms or concepts.
   iii. Enumeration. Requires the learners to list down possible answers to the questions.
b) Essay test. Consists of problems or questions that require learners to compose or construct written responses.
c) Problem-solving test. Consists of problems or questions that require learners to solve problems in quantitative or non-quantitative settings using knowledge and skills in mathematical concepts and procedures.

Is it necessary to use all assessment types?
Answer: It is not necessary to use all of them at once, but varying test types is helpful to cater to the learning/examination preferences of learners.

Planning and Construction of Written Test: TEST PREPARATION
SAS 13

A. CONTENT
1. Write items that reflect only one specific content and cognitive processing skill.
2. Do not lift and use statements from the textbook or other learning materials as test questions.
3. Keep the vocabulary simple and understandable based on the level of the learners/examinees.
4. Edit and proofread the items for grammatical and spelling errors before administering them to the learners/examinees.

B. STEM
1. Write the directions in the stem in a clear and understandable manner.
2. Write stems that are consistent in form and structure. That is, present all items either in question form or in descriptive or declarative form.
3. Word the stem positively and avoid double negatives, such as NOT and EXCEPT, in a stem. If a negative word is necessary, underline or capitalize the word for emphasis.
4. Refrain from making the stem too wordy or containing too much information unless the problem or question requires the facts presented to solve the problem.

C. OPTIONS
1. Provide 3-5 options per item, with only one being the correct answer or best alternative.
2. Write options that are parallel or similar in form and length to avoid giving clues about the correct answer.
3. Place options in a logical order (e.g., alphabetically, shortest to longest).
4. Place correct responses randomly to avoid discernible patterns.
5. Use "None of the above" carefully and only when there is one absolutely correct answer, such as in spelling or in math.
6. Avoid "All of the above" as an option, especially if it is intended to be the correct answer.
7. Make all options realistic and reasonable.

GUIDELINES IN WRITING MATCHING-TYPE ITEMS
1. Clearly state in the directions the basis for matching the stimuli with the responses.
Faulty: Match the following.
Good: Column 1 is a list of Greek gods, while Column 2 presents Roman gods. Write the letter of the Roman god that corresponds to the Greek god on the blank provided in Column 1.
2. Ensure that the stimuli are longer and the responses are shorter.
3. Include only topics that are related to one another and share the same foundation of information.
4. Make the response options short, homogeneous, and arranged in a logical order.
5. Include response options that are realistic and similar in length and grammatical form.
6. Provide more response options than the number of stimuli.

GUIDELINES IN WRITING ESSAY TEST ITEMS
1. Clearly define the intended learning outcome to be assessed by the essay test.
2. Refrain from using essay tests for intended learning outcomes that are better assessed by other kinds of assessment.
3. Clearly define and situate the task within a problem situation, as well as the type of thinking required to answer the question or test.
4. Present tasks that are fair, reasonable, and realistic to the students.
5. Be specific in the prompts about the time allotment and the criteria for grading.

TYPES OF ESSAY TESTS
1. Extended response - requires longer and more complex responses.
Example: How is COVID-19 different from other coronavirus strains? Support your answer with details and information from the article provided.
2. Restricted response - requires a focused and restrained response.
Example: Student teachers are preparing for their demonstration at their cooperating schools. They need to know the teaching skills they learned throughout their college years.
A. Identify at least three actions that will improve their demonstration.
B. Explain how each action will improve the demonstration.

GUIDELINES IN WRITING TRUE OR FALSE STATEMENTS/ITEMS
1. Include statements that are completely true or completely false.
2. Use simple and easy-to-understand statements.
3. Refrain from using negatives, especially double negatives.
4. Avoid the use of absolutes such as "always" and "never".
5. Express a single idea in each test item.
6. Avoid the use of unfamiliar words or vocabulary.
7. Avoid lifting statements from the textbook and other learning materials.

VARIATIONS OF TRUE OR FALSE ITEMS:
1. T-F Correction or Modified True-or-False Question. In this format, a statement is presented with a key word or phrase that is underlined, and the learner has to supply the correct word or phrase.
Example: Butterflies are mammals.
2. Yes-No Variation. In this format, the learner has to choose yes or no rather than true or false.
3. A-B Variation. In this format, the learner has to choose A or B rather than true or false.

GUIDELINES IN WRITING SHORT ANSWER TEST ITEMS
1. Omit only significant words from the statement.
2. Do not omit too many words from the same statement such that the intended meaning is lost.
3. Avoid obvious clues to the correct response.
4. Be sure that there is only one correct response.
Faulty: The government should start using renewable energy sources for generating electricity such as __________.
Good: The government should start using renewable sources of energy by using turbines called __.
5. Avoid grammatical clues to the correct response.
Faulty: The atomic particle with a negative charge is called an ___________.
Good: The atomic particle with a negative charge is called a(n) ___________.
6. If possible, put the blank at the end of the statement rather than at the beginning.

GUIDELINES IN WRITING PROBLEM-SOLVING TEST ITEMS
1. Identify and explain the problem clearly.
2. Be specific and clear about the type of response required from the students.
3. Specify in the directions the bases for grading students' answers/procedures.

VARIATIONS IN QUANTITATIVE PROBLEM-SOLVING ITEMS
1. One-answer choice - This type of question contains four or five options, and students are required to choose the best answer.
2. All-possible-answer choices - This type of question has four or five options, and students are required to choose all of the options that are correct.
3. Type-in answer - This type of question does not provide options to choose from; instead, the learners are asked to supply the correct answer. The teacher should inform the learners at the start how their answers will be rated.

ESTABLISHING RELIABILITY AND VALIDITY: ESTABLISHING RELIABILITY, PART 1
SAS 14

RELIABILITY
Consistency of the responses to a measure under three conditions:
1. When retested on the same person;
2. When retested on the same measure; and
3. Similarity of responses across items that measure the same characteristic.

Reliability of a measure can be high or low, depending on the following factors:
1. The number of items in a test - the more items a test has, the higher the likelihood of reliability; the probability of obtaining consistent scores is high because of the large pool of items.
2. The individual differences of participants - every participant possesses characteristics that affect their performance in a test, such as fatigue, concentration, innate ability, perseverance, and motivation.
3. The external environment - includes room temperature, noise level, depth of instruction, exposure to materials, and quality of instruction, which can produce changes in the responses of examinees in a test.

WAYS TO ESTABLISH RELIABILITY

1. TEST-RETEST
How is this done? Have a test and administer it to all examinees. After not more than 6 months, administer it again to the same group of examinees. Responses on the test must be more or less the same across the two points in time. Applicable for tests that measure stable variables (aptitude/psychomotor).
What statistics is used? Correlation (a statistical procedure in which a linear relationship is expected between two variables) between the first and second administrations. The Pearson product-moment correlation (Pearson r) may be used because the data are usually in an interval scale.

2. PARALLEL FORMS
How is this done? There are two versions of a test; the items need to measure exactly the same skill. Each test version is called a form. Administer one form at one time and the other form at another time to the same group of participants. Responses on the two forms should be more or less the same. Applicable when there are two versions of a test, usually done when the test is repeatedly used for different groups (entrance exams, licensure exams).
What statistics is used? Correlate the test results for the first form and the second form. A significant and positive correlation coefficient is expected; it indicates that the responses in the two forms are the same or consistent. Pearson r is usually used for this analysis.

3. SPLIT-HALF
How is this done? Administer a test to a group of examinees. The items need to be split into halves, usually using the odd-even technique. Get the sum of the points in the odd-numbered items and correlate it with the sum of the points in the even-numbered items. Each examinee will have two scores coming from the same test; the scores on each set should be close or consistent. Split-half is applicable when the test has a large number of items.
What statistics is used? Correlate the two sets of scores using Pearson r. After correlation, use another formula called the Spearman-Brown coefficient. Both should be significant and positive to mean that the test has internal-consistency reliability.

4. TEST OF INTERNAL CONSISTENCY
How is this done? Involves determining whether the scores for each item are consistently answered by the examinees. Determine and record the scores for each item to see if the responses are consistent with each other. This works when the assessment tool has a large number of items, and for scales and inventories.
What statistics is used? Cronbach's alpha - a value of 0.60 and above indicates that the test items have internal consistency. The Kuder-Richardson formulas are likewise used to determine the consistency of the test items.

5. INTER-RATER RELIABILITY
How is this done? Used to determine the consistency of multiple raters when using rating scales and rubrics to judge performance. Here, reliability refers to the similar or consistent ratings provided by more than one rater or judge when they use an assessment tool. Applicable when the assessment requires the use of multiple raters.
What statistics is used? Kendall's tau coefficient of concordance is used to determine whether the ratings provided by multiple raters agree with each other; a significant value indicates that the raters concur or agree with each other in their ratings.

ESTABLISHING RELIABILITY AND VALIDITY: ESTABLISHING VALIDITY, PART 2
SAS 15

VALIDITY
How well a test measures what it is purported to measure.

Why is it necessary?
While reliability is necessary, it alone is not sufficient: a test can yield consistent scores without actually measuring what it claims to measure. For a test to be valid, it also needs to be reliable.

TYPES OF VALIDITY
1. FACE VALIDITY ascertains that the measure appears to be assessing the intended construct under study.
2. CONSTRUCT VALIDITY is used to ensure that the measure is actually measuring what it is intended to measure (i.e., the construct), and not other variables.
3. CRITERION-RELATED VALIDITY (PREDICTIVE VALIDITY) is used to predict future or current performance; it correlates test results with another criterion of interest.
4. FORMATIVE VALIDITY, when applied to outcomes assessment, is used to assess how well a measure is able to provide information to help improve the program under study.
5. SAMPLING VALIDITY (similar to content validity) ensures that the measure covers the broad range of areas within the concept under study.
6. CONCURRENT VALIDITY is used when two or more measures are present for each examinee that measure the same characteristic. The scores on the measures should be correlated.
7. CONVERGENT VALIDITY is used when the components or factors of a test are hypothesized to have a positive correlation. Correlation is done for the factors of the test.
8. DIVERGENT VALIDITY is used when the components or factors of a test are hypothesized to have a negative correlation.
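The split-half procedure described under reliability (Pearson r on the odd and even half-scores, then the Spearman-Brown correction) and the Cronbach's alpha check can be sketched with the standard textbook formulas. The 5-examinee, 4-item score matrix below is hypothetical, made up only to show the calculations.

```python
# Sketch of two reliability estimates: split-half with the Spearman-Brown
# correction, and Cronbach's alpha. Scores are hypothetical (5 examinees,
# 4 items, 1 = correct, 0 = wrong); population variance is used throughout.
from statistics import mean, pstdev

scores = [  # each row = one examinee's scores on items 1..4
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 0, 0, 1],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (len(x) * pstdev(x) * pstdev(y))

# Split-half: total of odd-numbered items vs. total of even-numbered items.
odd = [row[0] + row[2] for row in scores]   # items 1 and 3
even = [row[1] + row[3] for row in scores]  # items 2 and 4
r_half = pearson_r(odd, even)               # correlation of the two halves
spearman_brown = 2 * r_half / (1 + r_half)  # full-length reliability estimate

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance).
k = len(scores[0])
item_vars = [pstdev([row[i] for row in scores]) ** 2 for i in range(k)]
total_var = pstdev([sum(row) for row in scores]) ** 2
alpha = k / (k - 1) * (1 - sum(item_vars) / total_var)

print(round(spearman_brown, 2), round(alpha, 2))  # 0.83 0.69
```

With this made-up data, alpha clears the 0.60 threshold mentioned above; in practice both coefficients are computed over far more examinees and items, usually with statistical software rather than by hand.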
