LANGUAGE ASSESSMENT GROUP 6
“Constructing a Test”
Created to fulfill an assignment for the Language Assessment course
Written by Group 6:
1. RAHMADHANI (2322055)
2. SYAHRI AMALIAWATI (2322063)
3. NURUL FADILA (2322065)
4. AFIFAH YASRI (2322069)
LECTURER:
LENA SASTRI, M.Pd.
2024/2025
PREFACE
All praise is due to Allah SWT, whose guidance and blessings have enabled
the writers to complete this paper, titled “Constructing a Test”, on time. This paper is
written to fulfill the assignment for the Language Assessment course under the
guidance of Mrs. Lena Sastri, M.Pd.
Writing this paper was not without challenges. However, with the support and
assistance of others, the writers were able to overcome those challenges. We
acknowledge that this paper is not perfect and welcome any constructive feedback to
improve future works.
We sincerely thank everyone who helped in the completion of this paper. May
Allah SWT bless them for their support. We hope this paper will be useful and provide
insights for readers, especially in understanding topics related to Language
Assessment.
The Writers
TABLE OF CONTENTS
PREFACE
TABLE OF CONTENTS
CHAPTER I
INTRODUCTION
A. Background
B. Problems
C. Purposes
CHAPTER II
DISCUSSION
CHAPTER III
CLOSING
A. Conclusion
B. Suggestion
REFERENCE
CHAPTER I
INTRODUCTION
A. Background
The preparation of test items and directions is a fundamental aspect of
creating a high-quality language assessment. A well-constructed test not only
evaluates students’ abilities and knowledge but also plays a crucial role in shaping
the learning process. In any educational setting, the goal of assessment is to measure
whether students have achieved the learning outcomes outlined in the course
objectives. The process of preparing test items involves selecting appropriate
content, formulating questions that reflect the key skills taught, and ensuring that
each item aligns with the goals of instruction. If the instructions are ambiguous or
difficult to follow, they can hinder performance and affect the reliability of the
results.
CHAPTER II
DISCUSSION
[1] David P. Harris, Testing English as a Second Language (New York: McGraw-Hill, 1969), p. 94.
[2] Harris, p. 95.
4. Writing simple descriptions and expositions
The test will measure how much students have developed these skills.
2. Breaking Down the Course Objectives
The general objectives need to be divided into specific components to
determine what will be tested. These include:
1. Phonology/Orthography: The sound system (listening/speaking) and
written form (reading/writing).
2. Grammatical Structure: The rules and patterns used in speaking and
writing.
3. Vocabulary: The words needed for effective communication.
4. Fluency: The speed and ease of understanding and producing language.
3. Establishing the General Design of the Test
With the objectives outlined, the next step is designing the test. Key factors
include the time allocated and the test's speed requirements. If the test is 120 minutes
long, 10 minutes should be reserved for administrative tasks, leaving 110 minutes
for testing. The test will cover structure and vocabulary, with 70 fill-in-the-blank
items (45 minutes) and 40 vocabulary paraphrase items (45 minutes). Additionally,
the phonology section will have:
• Sound Discrimination: 20 items where students identify matching
words.
• Auditory Comprehension: 20 items where students choose the correct
answers to questions.
This design allows the test to be completed in 120 minutes.
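The arithmetic of this design can be checked with a short sketch. Note that the 20 minutes shown for the phonology section is not stated explicitly in the plan above; it is inferred as the remainder of the 110 testing minutes.

```python
# Check the test blueprint's time budget (all values in minutes).
sections = {
    "structure: 70 fill-in-the-blank items": 45,
    "vocabulary: 40 paraphrase items": 45,
    "phonology: 20 sound discrimination + 20 auditory comprehension": 20,  # inferred
}
admin = 10  # distributing papers, reading directions, collecting materials

testing_time = sum(sections.values())
print(testing_time)          # 110 minutes of actual testing
print(testing_time + admin)  # 120 minutes in total
```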
We should now compare our test content with the detailed course outline
we created earlier. The Figure shows a grid mapping the main components of the
course’s four skill areas. The checkmarks indicate the components our test will
measure. As seen in the chart, our test covers most of the course goals, though we
recognize that no single test can cover all the objectives the teacher aims to achieve.
The teacher may include other factors in the final evaluation, such as weekly
compositions or general progress in speaking, based on daily class performance.
The next step in creating our test would be selecting the specific content for
the test items. As we’ve noted, the test content should reflect the course material.
We would begin by listing the phonological, grammatical, and lexical points
covered in the textbook, noting the emphasis given to each. We may find that our
list contains more material than can fit into a two-hour exam, so some content might
need to be omitted. For example, we might exclude material from earlier chapters
that serves as review. We could focus on new material learned during the course.
Even then, we might need to narrow down the list based on the following questions:
a. Which phonemic contrasts were most difficult for the students?
b. Which grammatical structures were emphasized and need further
review?
c. Which vocabulary items are most useful to the students?
If deciding on test content becomes challenging, we can remember that all
educational testing is a sampling process. Our final exam would likely be just one
of several evaluations used to assess student progress.
4. Preparing the Test Items and Directions
1. Additional Notes on the Preparation of Items: When preparing test items, it
is essential to write more items than needed, because some may have flaws
that are not obvious during initial creation. A common guideline is to start
with 30-40% more items than will be required. This extra material allows
for a careful review to discard ineffective items. Additionally, pretesting
will reveal further issues with items, and more reductions will be necessary.
To ensure important topics are still covered, it is advisable to begin with
multiple items testing the same concept, as one might be effective while
another is not. Each test item should be typed on a separate slip of paper
(recommended size: 5 by 8 inches) with the answer on the back. This method
allows easy rearranging, adding, or removing of items without needing to
rewrite the whole test. It is crucial to hide the answer so that the reviewer
makes an independent judgment on the correct response.
2. Writing Test Directions: Test directions should be clear, brief, and
straightforward, accompanied by examples to ensure all examinees,
including slower learners, understand the task. Directions should also
clarify whether guessing is allowed and indicate the time allowed for the
test. If the test is time-sensitive, it is helpful to prepare examinees by
noting the speed requirement in the instructions to prevent anxiety over
incomplete sections.
Items the reviewer judges to be satisfactory can be marked “O.K.” with the
reviewer’s initials. This review often identifies items that might later cause issues
in pretesting or with subject-matter specialists. Items with minor defects can
usually be corrected and kept.
Pretesting is a crucial step for objective tests. It involves trying the items on
a group of subjects similar to those for whom the test is designed. Items are only
included in the final test if they meet two criteria:
1. they must be at an appropriate difficulty level and
2. must differentiate between those who know the material and those
who do not.
Pretesting also provides insight into the effectiveness of the test directions
and the estimated time needed for examinees to complete the test. If the directions
are unclear or too difficult, adjustments can be made.
For pretesting, it’s important that the pretest subjects resemble the target test
population. This makes the pretest data more meaningful. Some test developers may
also use native speakers of the language for validation, as items that are difficult for
native speakers may not be suitable for non-native learners. In some cases, such as
with advanced tests, this approach may not be necessary.
Pretests often require more time than the final version of the test to ensure
that most examinees can attempt all items. If many examinees do not finish the
pretest, there will be insufficient data on the last items. A common practice is to
allow more time for pretesting, then mark the items the examinees have reached
when time is called.
In informal classroom tests, pretesting may not be feasible, so this step and
subsequent item analysis may be skipped. However, for formal tests, pretesting is
essential to ensure the items’ validity and clarity before finalizing the test.
1. Determining Item Difficulty: Items that are too easy (answered correctly
by over 92% of examinees) or too difficult (answered correctly by less than
30%) should be discarded. Remaining items should then be checked for
discrimination.
2. Determining Item Discrimination: To check how well each item
distinguishes between high and low performers, separate the highest and
lowest 25% of the scores. For each item, subtract the number of low scorers
who answered correctly from the number of high scorers who did. This
result is then divided by the number of examinees in each group to calculate
the item discrimination index. A positive value indicates good
discrimination, while negative or low values (below .30) suggest the item
should be revised or discarded.
3. Determining the Effectiveness of Distracters: For multiple-choice items,
analyze the distracters (incorrect options). If a distracter attracts no
examinees or more high scorers than low scorers, it is nonfunctional or
malfunctioning, and should be replaced. Revised items must be pretested
again, as changes can affect the original statistics.
4. Recording Item Analysis Data: Record the results of the item analysis on an
"item analysis slip," which includes the full item, pretest identification, item
position, difficulty and discrimination indices, and responses from high and
low groups. This helps in organizing and revising the test efficiently.
Using item analysis slips simplifies test revision and preparation, ensuring that the
final test is both effective and fair.
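The statistics described above can be illustrated with a short sketch. The response data, the correct option "B", and the helper names are invented for this example; the top/bottom 25% groups, the .30-.92 difficulty band, and the .30 discrimination cutoff come from the text.

```python
from collections import Counter

def item_difficulty(responses, correct):
    """Proportion of all examinees answering the item correctly."""
    return sum(1 for choice in responses if choice == correct) / len(responses)

def discrimination_index(responses, correct):
    """(high-group correct - low-group correct) / group size, where the groups
    are the top and bottom 25% of scorers. Assumes responses are ordered
    from highest to lowest total test score."""
    n = len(responses) // 4
    high, low = responses[:n], responses[-n:]
    return (sum(c == correct for c in high) - sum(c == correct for c in low)) / n

# 20 examinees, ranked best to worst; their choices on one item (invented data).
responses = ["B"] * 12 + ["A"] * 4 + ["C"] * 4

p = item_difficulty(responses, "B")       # 0.60: inside the .30-.92 band, keep
d = discrimination_index(responses, "B")  # 1.00: well above the .30 cutoff

# Distracter check: a distracter chosen by no examinee (here "D"), or chosen
# by more high scorers than low scorers, is nonfunctional or malfunctioning
# and should be replaced.
high_picks = Counter(responses[:5])
low_picks = Counter(responses[-5:])
print(p, d, dict(high_picks), dict(low_picks))
```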
When assembling the final form of the test, the following guidelines apply:
1. Arrange the items in order of increasing difficulty.
2. Ensure answer choices (A, B, C, D) are used roughly the same number of
times.
3. Avoid creating a discernible pattern in the answers.
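Guidelines 2 and 3 can be satisfied together by shuffling an evenly divided pool of answer letters, as in this sketch. The 40-item, four-option test and the helper name are assumptions for illustration.

```python
import random
from collections import Counter

def balanced_answer_key(n_items=40, options="ABCD", seed=7):
    """Each option appears equally often; shuffling removes any discernible pattern."""
    assert n_items % len(options) == 0
    key = list(options) * (n_items // len(options))
    random.Random(seed).shuffle(key)
    return key

key = balanced_answer_key()
print(Counter(key))  # each of A, B, C, D appears 10 times, in random order
```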
Using numbers to indicate choices can also reduce confusion. Additionally,
alternating letter sets for odd- and even-numbered items can further reduce
miskeying.
• Homemade Answer Sheets
For local testing, homemade answer sheets can be used. A stencil key can
be made by punching out correct answers, allowing scorers to quickly
assess the responses by matching them to the answer sheet.
• Preparing Equivalent Forms of a Test
It is often beneficial to prepare multiple forms of a test, especially when
the test is going to be administered repeatedly. Equivalent or parallel
forms serve several purposes, the most common of which are:
1. Pre- and Post-Testing:
One form of the test can be used at the beginning of a course or
training program, and another form at the end to assess
improvement. Using two equivalent forms helps prevent the
examinee from remembering specific items and repeating them,
which may occur if the same test is used both before and after the
course.
2. Decreasing Test Compromise:
When two or more forms of a test are administered alternately and
according to an irregular schedule, it reduces the temptation for
examinees to memorize and share answers. This method minimizes
the risk of "test compromise," where one examinee might share
content with others.
• Constructing Equivalent Forms
The simplest way to construct equivalent forms is to pretest enough
material to create two forms based on a single item analysis. Both forms
should have:
1. Similar item difficulty distributions: Both forms should have a
comparable difficulty level to ensure fairness.
2. Same general content-area specifications: For example, if the
test assesses word order and verb tense, both forms should cover
the same number of items on these topics. The specific content,
however, should differ between the forms to ensure variety.
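One way to satisfy both conditions can be sketched as follows: sort the pretested items by difficulty within each content area and deal them alternately to the two forms. The item data and the alternating-assignment approach are illustrations, not a procedure prescribed by the text.

```python
from collections import defaultdict

def make_equivalent_forms(items):
    """items: list of (content_area, difficulty_index) pairs from pretesting.
    Returns two forms with the same content-area counts and comparable
    difficulty distributions, but different specific items."""
    by_area = defaultdict(list)
    for area, diff in items:
        by_area[area].append((area, diff))
    form_a, form_b = [], []
    for group in by_area.values():
        group.sort(key=lambda item: item[1])  # easiest to hardest
        for i, item in enumerate(group):
            (form_a if i % 2 == 0 else form_b).append(item)
    return form_a, form_b

items = [("word order", p) for p in (0.35, 0.45, 0.55, 0.65, 0.75, 0.85)] + \
        [("verb tense", p) for p in (0.30, 0.40, 0.50, 0.60, 0.70, 0.80)]
form_a, form_b = make_equivalent_forms(items)
# Each form receives three word-order and three verb-tense items,
# drawn from the easy, middle, and hard regions of the difficulty scale.
```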
CHAPTER III
CLOSING
A. Conclusion
In conclusion, the process of preparing test items and directions is an
essential aspect of creating an effective and reliable language assessment.
Throughout this paper, we have highlighted the importance of aligning test items
with course objectives, ensuring clarity in test directions, and considering factors
such as item difficulty and fairness. Additionally, pretesting the items and
reviewing them with experts or colleagues are crucial steps in refining the test to
ensure its accuracy and validity.
B. Suggestion
Future writers of papers on the purposes and methods of language
assessment are advised to consult more reference resources when writing their
papers. In reality, this paper is still very simple and straightforward, and
criticism and suggestions are still welcome to improve the discussion of the
material.
REFERENCE
Harris, David P. 1969. Testing English as a Second Language. New York:
McGraw-Hill.