Lecture 1 - Introduction To Language Assessment

This document discusses language assessment. It defines assessment as an ongoing process to ensure course objectives are met, noting that a test is one form of assessment. Informal assessments include unplanned feedback, while formal assessments are systematic exercises like tests. The document outlines various types of assessments, including norm-referenced tests, criterion-referenced tests, and authentic assessments like performances and portfolios. It discusses principles of assessment such as practicality, reliability, validity, authenticity, and washback effect. Alternative assessment options like self-assessment and journals are also covered.


Introduction

Ms. Syeda Tuba Javaid
MS TESOL and Applied Linguistics, UK
CELTA, University of Cambridge, UK
MA English Literature, Pakistan

Introduction to Language Assessment
Ms. Syeda Tuba Javaid
What is assessment?

• Not the same as testing!
• An ongoing process to ensure that the course/class objectives and goals are met.
• A process, not a product.
• A test is a form of assessment (Brown, 2004, p. 5).
Informal and Formal Assessment

• Informal assessment can take a number of forms:
  • unplanned comments, verbal feedback to students, observing students perform a task or work in small groups, and so on.
• Formal assessments are exercises or procedures, such as tests, that are:
  • systematic
  • designed to give students and teachers an appraisal of students' achievement.
Traditional Assessment

• Multiple-choice
• True-false
• Matching
• Norm-referenced and criterion-referenced tests
Norm- and Criterion-Referenced Tests

• Norm-referenced tests
  • standardized tests (College Board, TOEFL, GRE)
  • place test-takers on a mathematical continuum in rank order
• Criterion-referenced tests
  • give test-takers feedback on specific objectives ("criteria")
  • test the objectives of a course
  • said to have "instructional value"
Authentic Assessment
• Authentic assessment reflects student learning, achievement, motivation, and attitudes on instructionally relevant classroom activities (O'Malley & Valdez, 1996).
• Examples:
  • performance assessment
  • portfolios
  • self-assessment
Purposes for Assessment

• Diagnose students' strengths and needs
• Provide feedback on student learning
• Provide a basis for instructional placement
• Inform and guide instruction
• Communicate learning expectations
• Motivate and focus students' attention and effort
• Provide practice applying knowledge and skills
Purposes continued

• Provide a basis for evaluation for the purpose of:
  • Grading
  • Promotion/graduation
  • Program admission/selection
  • Accountability
• Gauge program effectiveness
Assessment Instruments
• Pre-assessment (diagnostic): pretests, observations, discussions, questionnaires, journals/logs, standardized tests
• Formative (ongoing): quizzes, discussions, journals/logs, projects, observations, interviews
• Summative (final): teacher-made tests, portfolios, assignments, projects, standardized tests
Discussion

• How would you document a student's performance during a discussion?
• Which types of assessments noted in the chart could be considered authentic assessment?
Principles of Language Assessment

• Practicality
• Reliability
• Validity
• Authenticity
• Washback
Practicality

• An effective test is practical:
  • is not excessively expensive
  • stays within appropriate time constraints
  • is relatively easy to administer
  • has a scoring/evaluation procedure that is specific and time-efficient
Reliability

• A reliable test is consistent and dependable. If you give the same test to the same students on two different occasions, the test should yield similar results.
• Student-related reliability
• Rater reliability
• Test administration reliability
• Test reliability
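The slides state this idea qualitatively; in testing practice the consistency across two administrations of the same test is often quantified as a correlation between the two sets of scores (a test-retest reliability coefficient). The sketch below is illustrative only and is not from the lecture; the score lists are invented.

```python
# Minimal sketch: test-retest reliability estimated as the Pearson
# correlation between two administrations of the same test.
# The scores below are invented for illustration.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

first_sitting = [72, 85, 64, 90, 78]    # same five students,
second_sitting = [70, 88, 66, 89, 75]   # same test, two occasions

# A value close to 1.0 suggests the test ranks students consistently.
print(round(pearson_r(first_sitting, second_sitting), 2))  # ≈ 0.97
```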
Student-Related Reliability

• The most common issues in student-related reliability are temporary illness, fatigue, a bad day, anxiety, and other physical and psychological factors, which may make an "observed" score deviate from a "true" score.
Rater Reliability

• Human error, subjectivity, and bias may enter into the scoring process.
• Inter-rater reliability is low when two or more scorers yield inconsistent scores on the same test, possibly because of a lack of attention to scoring criteria, inexperience, inattention, or even preconceived biases toward particular "good" and "bad" students.
Test Administration Reliability
• Test administration reliability deals with the conditions in which the test is administered:
  • street noise outside the building
  • bad equipment
  • room temperature
  • the conditions of chairs and tables
  • photocopying variation
Test Reliability

• The test itself can be a source of unreliability:
  • the test is too long
  • test items are poorly written or ambiguous
Validity

• A test is valid if it actually assesses the objectives and what has been taught.
• Content validity
• Criterion validity (tests the objectives)
• Construct validity
• Consequential validity
• Face validity
Content Validity
• A test has content validity if the teacher can clearly define the achievement that he or she is measuring.
• A test of tennis competency that asks someone to run a 100-yard dash lacks content validity.
• If a teacher uses the communicative approach to teach speaking and then uses the audiolingual method to design the test items, the test is going to lack content validity.
Criterion-related Validity

• The extent to which the objectives of the test have been measured or assessed. For instance, if you are assessing reading skills such as scanning and skimming, how are the exercises designed to test these objectives?
• In other words, the test is valid if the objectives taught are the objectives tested and the items are actually testing these objectives.
Construct Validity
• A construct is an explanation or theory that attempts to explain observed phenomena.
• If you are testing vocabulary and the lexical objective is to use the lexical items for communication, asking students to write definitions on the test will not match the construct of communicative language use.
Consequential Validity

• Accuracy in measuring intended criteria
• Its impact on the preparation of test-takers
• Its effect on the learner
• Social consequences of a test interpretation (an exit exam for pre-basic students at El Colegio, the College Board)
Face Validity
• Face validity refers to the degree to which a test looks right and appears to measure the knowledge or ability it claims to measure:
  • A well-constructed, expected format with familiar tasks
  • A test that is clearly doable within the allotted time limit
  • Directions that are crystal clear
  • Tasks that relate to the course (content validity)
  • A difficulty level that presents a reasonable challenge
Authenticity

• The language in the test is as natural as possible
• Items are contextualized rather than isolated
• Topics are relevant and meaningful for learners
• Some thematic organization to items is provided
• Tasks represent, or closely approximate, real-world tasks
Washback

• Washback refers to the effects tests have on instruction in terms of how students prepare for the test. "Cram" courses and "teaching to the test" are examples of such washback.
• In some cases the student may learn while working on a test or assessment.
• Washback can be positive or negative.
Alternative Assessment Options

• Self- and peer-assessments:
  • Oral production: student self-checklist, peer checklist, offering and receiving a holistic rating of an oral presentation
  • Listening comprehension: listening to TV or radio broadcasts and checking comprehension with a partner
  • Writing: revising work on your own, peer-editing
  • Reading: reading textbook passages followed by self-check comprehension questions, self-assessment of reading habits (page 416, Brown, 2001)
Authentic Assessment

• Performance assessment: any form of assessment in which the student constructs a response orally or in writing.
• It requires the learner to accomplish a complex and significant task, while bringing to bear prior knowledge, recent learning, and relevant skills to solve realistic or authentic problems (O'Malley & Valdez, 1996; Herman et al., 1992).
Examples of Authentic Assessment

• Portfolio assessment
• Student self-assessment
• Peer assessment
• Student-teacher conferences
• Oral interviews
• Writing samples
• Projects or exhibitions
• Experiments or demonstrations
Journals
• Specify to students the purpose of the journal
• Give clear directions to students on how to get started (for instance, a prompt such as "I was very happy when…")
• Give guidelines on the length of each entry
• Be clear yourself on the principal purpose of the journal
• Help students to process your feedback, and show them how to respond to your responses
Conferences
• Commonly used when teaching writing
• One-on-one interaction between teacher and student
• Conferences are formative assessment, as opposed to offering a final grade or a summative assessment. In other words, they are meant to provide guidance and feedback.
Portfolios
• Commonly used with the communicative language teaching (CLT) approach
• A portfolio is a collection of students' work that demonstrates to students and others their efforts, progress, and achievements in a given area. You can have a reading portfolio or a writing portfolio, for instance.
• You can also have a reflective or assessment portfolio, as opposed to collecting every piece of evidence for each objective achieved in the course.
Portfolio Guidelines
• Specify the purpose of the portfolio
• Give clear directions to students on how to get started
• Give guidelines on acceptable materials or artifacts
• Collect portfolios on pre-announced dates and return them promptly
• Help students to process your feedback
• Establish a rubric to evaluate the portfolio and discuss it with your students
Cooperative Test Construction

• Cooperative test construction involves students' contributions to the design of test items. It is based on the concept of collaborative and cooperative learning, in which students are involved in the process (Brown, 2001, p. 420).
Any questions?
Thank you

Ms. Syeda Tuba Javaid
