
OSCE

Kathy Boursicot
Train the Trainer Assessment Workshop
October 29, 2003

Hong Kong International Consortium

OSCE

Format

Purpose

Advantages

Writing principles

Training observers

Scoring considerations

What is an OSCE?

Objective

Structured

Clinical

Examination

Harden RM and Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education, 1979; 13: 41-54.

OSCE test design

Observed stations: clinician examiners

Varieties of OSCEs

Patient-based: traditional OSCE, SP-based test

Station couplets: a clinical task followed by a linked written task

Integral consultations

OSCE

Format

Purpose

Advantages

Writing principles

Training observers

Scoring considerations

Professional authenticity

Simple model of competence

Does
Shows how
Knows how
Knows
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement), 1990; 65: S63-S67.

Testing formats

Behaviour ~ attitude/skills (professional practice):

Does

Shows how → OSCEs

Cognition ~ knowledge:

Knows how → EMQs, SEQs

Knows → MCQs

OSCE - Objective

All candidates are presented with the same test

Specific skill modalities are tested at each station

History taking

Explanation

Clinical examination

Procedures

OSCE - Structured

The marking scheme for each station is structured

Structured interaction between examiner and student

OSCE - Clinical Examination

Test of performance of clinical skills

Candidates have to demonstrate their skills, not just describe the theory

OSCE

Format

Purpose

Advantages

Writing principles

Training observers

Scoring considerations

Characteristics of assessment instruments

Utility = Reliability × Validity × Educational impact × Acceptability × Feasibility

Van der Vleuten C. The assessment of professional competence: developments, research and practical implications. Advances in Health Sciences Education, 1996; 1: 41-67.

Test characteristics

Reliability of a test / measure:

reproducibility of scores across raters, questions, cases, occasions

capability to differentiate consistently between good & poor students

Sampling

[Diagram: two different test samples drawn from the domain of interest]

Reliability

Competencies are highly domain-specific

Broad sampling is required to obtain adequate reliability (illustrated in the sketch below):

across content, i.e. the range of cases/situations

across other potential factors that cause error variance, i.e. testing time, number of cases, examiners, patients, settings, facilities
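To make the effect of broad sampling concrete, here is a minimal sketch in Python of estimating reliability across stations with Cronbach's alpha; the candidate-by-station score matrix is invented for illustration, not real OSCE data.

    # Minimal sketch: Cronbach's alpha across OSCE stations.
    # Rows = candidates, columns = stations (invented scores out of 10).
    scores = [
        [7, 8, 6, 9, 7],
        [5, 6, 5, 6, 4],
        [9, 9, 8, 9, 8],
        [4, 5, 3, 6, 5],
        [6, 7, 7, 8, 6],
    ]

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    n_stations = len(scores[0])
    station_vars = [variance([row[j] for row in scores]) for j in range(n_stations)]
    total_var = variance([sum(row) for row in scores])

    # Alpha rises as more (consistent) stations broaden the sample.
    alpha = (n_stations / (n_stations - 1)) * (1 - sum(station_vars) / total_var)
    print(f"Cronbach's alpha = {alpha:.2f}")

Adding stations broadens the sample and generally raises alpha, which is why very short OSCEs are rarely reliable.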

Test characteristics

Validity of a test / measure:

the content is deemed appropriate by the relevant experts

the test measures the characteristic (e.g. knowledge, skills) that it is intended to measure

performance of a particular task predicts future performance


OSCE

Format

Purpose

Advantages

Writing principles

Training observers

Scoring considerations

Advantages of using OSCEs in clinical assessment

Careful specification of content = Validity

Observation of a wider sample of activities = Reliability

Structured interaction between examiner & student

Structured marking schedule

Each student has to perform the same tasks = Acceptability

OSCE

Format

Purpose

Advantages

Writing principles

Training observers

Scoring considerations

OSCE Station Writing

How to start

Decide what tasks you want to / can / should test in an OSCE format

OSCEs test performance, not knowledge

Constructive alignment

Need to know the learning objectives of your course / programme

Map these across:

Subject areas

Knowledge areas

Skill areas

Blueprinting

Content of the assessment should align with the learning objectives of the course

Blueprinting:

allows mapping of test items to specific learning outcomes

ensures adequate sampling across subject areas and skill domains

OSCE blueprint: systems-based

                   Hx taking      Phys exam      Procedures   Counseling/   Ordering
                   (incl. diag)   (incl. diag)                Education     investigations
CVS
Endocrine
Gastro
H&N
Haem & LN
Musculoskeletal
etc.

OSCE blueprint: discipline-based

                   Hx taking      Phys exam      Procedures   Counseling/   Ordering
                   (incl. diag)   (incl. diag)                Education     investigations
Anaes & CC
Clin Pharm
Comm Health
Emergency med
Family med
Musculoskeletal
etc.

(Each station maps to one cell of the grid.)
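As a practical aid, a blueprint like the grids above can be held as data and checked for sampling gaps before the exam is finalised. A minimal sketch in Python; the station list is a hypothetical example:

    # Minimal sketch: flag blueprint cells with no station (sampling gaps).
    systems = ["CVS", "Endocrine", "Gastro", "H&N", "Haem & LN", "Musculoskeletal"]
    skills = ["Hx taking", "Phys exam", "Procedures",
              "Counseling/Education", "Ordering investigations"]

    # Hypothetical draft exam: each station mapped to one (system, skill) cell.
    stations = [
        ("CVS", "Phys exam"),
        ("Endocrine", "Hx taking"),
        ("Gastro", "Procedures"),
        ("CVS", "Counseling/Education"),
    ]

    covered = set(stations)
    gaps = [(sy, sk) for sy in systems for sk in skills if (sy, sk) not in covered]
    print(f"{len(gaps)} uncovered blueprint cells, e.g. {gaps[:3]}")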

Key features of success in designing OSCEs

Feasibility

Congruence

Feasibility

Is it a reasonable task to expect the candidates to perform?

Can the task be examined at an OSCE station?

Can the task be performed in the time allowed?

Feasibility

Is it a reasonable task to expect the candidates to perform? Is it authentic?

Can the task be examined at an OSCE station?

Match clinical situations as closely as possible

Some tasks may require simulated patients

Some tasks may require manikins

Some tasks simply cannot be examined in this format

Feasibility

Can the task be performed in the time allowed?

Pilot the stations to see if they are feasible

Check equipment / helpers / practicalities

Congruence

Is it testing what you want it to test?

Station construct: describe what the station is testing

Congruence

Ensure that all parts of the station coordinate:

Candidate instructions

Marking schedule

Examiner instructions

Simulated patient instructions

Equipment

Station construct

This station tests the candidate's ability to ...

Candidate instructions

State the circumstances: e.g. outpatient clinic, ward, A & E, GP surgery

Specify the task required of the candidate: e.g. take a history, perform a neurological examination of the legs, explain a diagnosis

Specify tasks NOT required

Instruct on summing up: e.g. tell the patient, tell the examiner

Examiner instructions

Copy of the candidate instructions

Specific instructions appropriate to the task, e.g. do not prompt, give explicit prompts, manage the equipment

Simulated patient instructions

Give as much detail as possible so they can be consistent

try to leave as little as possible for them to ad lib!

Give enough information to enable them to answer questions consistently

Be specific about affect in each role

Specify patient demographics, i.e. gender, age, ethnicity, social class, etc.

Marking schedule

Ensure marks are allocated for the tasks the candidates are asked to perform

Decide the relative importance of diagnosis vs process (history taking, examination)

Separate checklist for process skills

Equipment

Be detailed. Think of:

Chairs + table / couch / bench

Manikins - specify which

Medical equipment: stethoscope, ophthalmoscope, sphygmomanometer, suturing materials, etc.

Designing stations

Use your blueprint

Be clear what you are testing: define the construct

Check for congruence

Pilot for feasibility

OSCE

Format

Purpose

Advantages

Writing principles

Training observers

Scoring considerations

Training observers

Understand the principles of OSCEs

Enhance inter-rater consistency

Techniques:

Examiners must train together

Videos / live stations

Discussion of marking inconsistencies

Training observers

General training

Station-specific training

OSCE

Format

Purpose

Advantages

Writing principles

Training observers

Scoring considerations

Scoring considerations

Global vs checklist scoring

Weighting

Standard setting

Checklist scoring

Advantages

Helps the examiner know what the station setters are looking for

Helps the examiner be objective

Facilitates the use of non-expert examiners

Disadvantages

Can just reward process/thoroughness

May not sufficiently reward the excellent candidate

Ignores the examiner's expertise

Global scoring

Advantages

Utilises the expertise of the examiners

They are in a position to make a (global) judgement about the performance

Disadvantages

Examiners have to be expert examiners, i.e. trained

Examiners must be familiar with the expected standards for the level of the test

Weighting

In a checklist, some items may be weighted more than others

More complicated scoring system

Makes no difference to very good & very bad candidates

Can enhance discrimination at the cut score (see the sketch below)
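A minimal sketch of weighted checklist scoring in Python; the items, weights, and observed actions are invented for illustration, not a recommended scheme:

    # Minimal sketch: weighted checklist scoring for one station.
    # Each tuple: (item, weight, performed?) - invented example data.
    checklist = [
        ("Washes hands",             1, True),
        ("Checks patient identity",  2, True),
        ("Performs key manoeuvre",   3, False),  # critical item, weighted highest
        ("Explains findings",        2, True),
    ]

    score = sum(weight for _, weight, done in checklist if done)
    maximum = sum(weight for _, weight, _ in checklist)
    print(f"Station score: {score}/{maximum}")  # 5/8 for this candidate

Near the cut score, the heavier weights on critical items separate candidates who differ on the actions that matter most.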

Standard setting

No perfect method!

Should be a criterion-referenced method, e.g. Angoff, Ebel, etc.

But ... are these suitable for performance-based tests? (A sketch of Angoff follows for comparison.)
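For comparison with the performance-based methods that follow, here is a minimal sketch of the item-based Angoff procedure in Python; the judges' probability estimates are invented:

    # Minimal sketch: Angoff standard setting.
    # judge_estimates[j][i] = judge j's estimated probability that a
    # borderline candidate gets item i correct (invented numbers).
    judge_estimates = [
        [0.8, 0.6, 0.4, 0.7],
        [0.7, 0.5, 0.5, 0.6],
        [0.9, 0.6, 0.3, 0.8],
    ]

    n_judges = len(judge_estimates)
    n_items = len(judge_estimates[0])
    item_means = [sum(j[i] for j in judge_estimates) / n_judges
                  for i in range(n_items)]
    cut_score = sum(item_means)  # expected score of a borderline candidate
    print(f"Angoff cut score: {cut_score:.2f} out of {n_items}")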

Performance-based standard setting methods

Borderline group method

Contrasting groups method

Regression-based standard method

Kramer A, Muijtjens A, Jansen K, Düsman H, Tan L, van der Vleuten C. Comparison of a rational and an empirical standard setting procedure for an OSCE. Medical Education, 2003; 37(2): 132-139.

Borderline group method

[Diagram: each examiner completes the station checklist (items 1-7, TOTAL) and also gives a global rating of Pass / Borderline / Fail. The checklist score distribution of the candidates rated Borderline determines the passing score, typically taken as the mean of the borderline group.]
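A minimal sketch of the borderline group computation in Python; the candidate scores and global ratings are invented:

    # Minimal sketch: borderline group method for one station.
    # Each tuple: (checklist score, examiner's global rating P/B/F).
    results = [
        (18, "P"), (12, "B"), (9, "F"), (14, "B"),
        (20, "P"), (11, "B"), (7, "F"), (16, "P"),
    ]

    borderline = [score for score, rating in results if rating == "B"]
    passing_score = sum(borderline) / len(borderline)  # mean of borderline group
    print(f"Passing score = {passing_score:.1f}")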

Contrasting groups method

[Diagram: examiners complete the checklist (items 1-7, TOTAL) and give each candidate a global Pass / Borderline / Fail judgement. The checklist score distributions of the failing and passing groups are plotted, and the passing score is set where the two distributions intersect.]
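A minimal sketch of the contrasting groups computation in Python, approximating the intersection of the two distributions by choosing the cut that misclassifies the fewest candidates; the scores are invented:

    # Minimal sketch: contrasting groups method.
    # Checklist scores of candidates globally judged Pass vs Fail (invented).
    pass_scores = [15, 17, 18, 20, 16, 19]
    fail_scores = [8, 10, 12, 9, 13, 11]

    def misclassified(cut):
        # Failing candidates at/above the cut + passing candidates below it.
        return sum(s >= cut for s in fail_scores) + sum(s < cut for s in pass_scores)

    cut_score = min(sorted(set(pass_scores + fail_scores)), key=misclassified)
    print(f"Cut score minimising misclassification: {cut_score}")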

Regression-based standard

[Diagram: checklist score plotted against the examiner's overall rating on a 5-point scale (1 = clear fail, 2 = borderline, 3 = clear pass, 4 = very good pass, 5 = excellent pass). A regression line through the points gives the passing score X as the predicted checklist score at the borderline rating of 2.]
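A minimal sketch of the regression-based standard in Python, fitting ordinary least squares by hand; the (rating, checklist score) pairs are invented:

    # Minimal sketch: regression-based standard setting.
    # Each pair: (overall rating 1-5, checklist score) - invented data.
    data = [(1, 6), (2, 11), (2, 13), (3, 16), (4, 19), (5, 22), (3, 15)]

    xs = [x for x, _ in data]
    ys = [y for _, y in data]
    n = len(data)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n

    # Ordinary least squares slope and intercept.
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in data)
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x

    # Passing score X = predicted checklist score at the borderline rating (2).
    passing_score = intercept + slope * 2
    print(f"Passing score X = {passing_score:.1f}")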

Borderline / contrasting groups / regression-based methods

The panel of judges is the examiners themselves

Reliable: the cut-off score is based on a large sample of judgements (no. of stations × no. of candidates)

Credible: based on expert judgement during direct observation

Passing score not known in advance (as with all examinee-centred methods)

Judgements are not independent of checklist scoring
