
OSPE [Objective Structured Practical Examination]

Abstract

Background: Practical assessments hold a critical role in evaluating medical education. However, achieving objectivity, consistency, authenticity, reliability, and
practical usefulness in student evaluations can be a formidable challenge. The
Objective Structured Practical Examination (OSPE) stands out as a promising
technique tailored to assess performance in a realistic educational setting. OSPE
offers a unique approach to aligning assessment methods with the educational
objectives of a given activity, making it possible to comprehensively gauge the
attainment of pedagogical goals.

Objective: This study aimed to overcome the limitations associated with traditional
practical tests and explore the potential advantages of OSPE in improving the
objectivity, consistency, authenticity, and reliability of student evaluations in the
context of medical education. Through a comparative analysis, this research
endeavors to illuminate the practical applicability of OSPE. The primary goal of this
research was to introduce and assess the feasibility of employing the OSPE as a
formative assessment tool for appraising the practical capabilities of Physiology
students.

Methodology: Fifty first-year MBBS students were included in this study after giving written consent. They were divided into two groups of 25 students each and assessed on two practical procedures: (a) hemoglobin estimation and (b) blood group determination. Students were assessed in two separate sessions: each group was assessed on the same practical by the conventional method in one session and by OSPE in the other, and vice versa. At the completion of the assessment process, both students and teachers rated the two assessment techniques on a Likert scale. Student test results and instructor and student opinions were statistically examined using the paired t-test. A significance level of 0.05 was used.

Results: When evaluated using the OSPE method, students obtained significantly
higher mean marks (12.58±2.74) compared to the conventional assessment method
(8.44±2.13). A paired t-test confirmed the statistical significance of the improvement
in student performance with OSPE (p<0.0001). Student feedback indicated strong
agreement (92%) that OSPE encourages greater focus on practical examinations
and is an effective assessment and learning method. Teachers unanimously agreed that OSPE is a more comprehensive evaluation tool (100%), and most (75%) felt that it better highlights student strengths and weaknesses. The majority of teachers (75%) also believed that OSPE should be incorporated into future examinations.
Conclusion: The study demonstrates that OSPE significantly enhances student
performance and is well-received by both students and teachers as a more effective
and comprehensive assessment method.
Keywords: conventional method, reliability, validity, students, objective structured
practical examination

Introduction

The purpose of any evaluation procedure is to determine how well students have learned the intended content. This means that every kind of assessment must have a clear connection to the aims of the course. Current methods of evaluating students' performance in physiology laboratories are outdated, as they focus on knowledge that is not essential to the development of a competent physician. The practical test is a crucial part of the medical education assessment process. However, if requirements such as objectivity, consistency, validity, reliability, and practicability are to be satisfied, student assessment becomes a difficult task. Presently, at most medical institutions in India, practical exercises in physiology are conducted and assessed in the traditional manner: a student is given an experiment to execute, a viva is held after the practical exercise is completed, and the candidate is then evaluated.

There are a number of methods for making this assessment more uniform, but the Objective Structured Practical Examination (OSPE) is often used. Harden and Gleeson's Objective Structured Clinical Examination (OSCE) is the inspiration for this approach. The OSPE appears to be a valid tool with a high capacity for distinguishing between students of different ability, and in this respect it improves on the standard practical test. In addition, it can be organized so that all laboratory teaching goals are assessed while giving due consideration to all relevant factors.

The traditional clinical examination (TCE) in human physiology consists of a bedside viva and an evaluation of the candidate's overall performance, rather than a test of specific clinical abilities after the candidate has completed a predetermined clinical procedure. In terms of Miller's hierarchy of skills, TCE prioritizes the "knows" and "knows how" levels, whereas OSPE is concerned with the apex of the pyramid.

Traditional approaches to evaluating competence are not only more open to bias than objective measures, but they also prevent the assessor from seeing the candidate in action, and there may be restrictions on what is actually covered. Sensitization to a new evaluation system such as the OSPE is therefore required, along with a more objective and organized assessment approach, feedback from the students, and feedback to the students so they can recognize their deficiencies and enhance their clinical abilities.

Many attempts have therefore been made to develop new methods that address these problems, and the OSPE is one such approach. Because of how the assessment is structured, all of an activity's pedagogical goals can be gauged. This research aimed to introduce OSPE in the physiology department as a novel evaluation tool.

Materials and methods

The research design was a comparative cross-sectional analysis. It took place in Junagadh, India, in the Physiology Laboratory of the Gujarat Medical Education and Research Society (GMERS) Medical College, and was conducted between May and July of 2016. Fifty first-year MBBS students served as the study population. Students who agreed to take part were included in the research, whereas those who declined were excluded.

Written informed consent from the students was obtained before the research, and
approval was granted by the institution's ethics committee (IRB number:
IEC/GMERS/2016/11). Then, a month before the OSPE assessment, a brief lecture
and role play were used to familiarize staff and students with the assessment
technique. Fifty first-year medical school students agreed to take part in the
research. The period just before the final formative evaluation was chosen for the assessment. Two practical exercises were evaluated: hemoglobin estimation and blood group determination. A subject matter expert was consulted to develop an outline of the course material and a systematic checklist for both observed and unobserved stations, in accordance with Bloom's taxonomy. Group A consisted of 25 students, and Group B consisted of the remaining 25. In all, four assessors were assigned to the task, split evenly between teacher Groups I and II. The examination was conducted in two sessions (Table 1).

Table 1. Examination conducted.

CPE = Conventional Practical Examination
OSPE = Objective Structured Practical Examination

Day               | Teachers Group I                               | Teachers Group II
Day 1 (Session 1) | Conducted CPE of Group A (Hb estimation)       | Conducted OSPE of Group B (blood group testing)
Day 2 (Session 2) | Conducted CPE of Group B (blood group testing) | Conducted OSPE of Group A (Hb estimation)

At the end of both sessions, Group A students had performed hemoglobin estimation and been assessed by both methods, and Group B students had performed blood group determination and been assessed by both methods. Each student faced a different set of examiners in the two examinations to reduce the chance of bias.

In the traditional assessment method, students performed the practical exercise followed by a viva. In OSPE, students were given comprehensive pre-exam training. The OSPE consisted of multiple stations, with a limited amount of time allowed at each one. The stations focused on observation, procedure, and brief questions, testing not just psychomotor skills but also higher-order thinking. Each station was designed alongside its scoring checklist.
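The paper does not reproduce its station checklists, but the idea of checklist-based station scoring can be sketched in a few lines of Python. This is an illustration only: the station name, checklist items, and mark weights below are our assumptions, not the study's actual instrument.

# Hypothetical sketch of checklist-based OSPE station scoring (Python).
# The station, items, and mark weights are illustrative assumptions;
# the study's actual checklists are not published in this paper.

from dataclasses import dataclass
from typing import List

@dataclass
class ChecklistItem:
    description: str
    marks: float  # marks awarded if the examiner ticks this step

@dataclass
class Station:
    name: str
    items: List[ChecklistItem]

    def score(self, ticks: List[bool]) -> float:
        # Sum the marks of every checklist item the examiner ticked.
        return sum(item.marks for item, done in zip(self.items, ticks) if done)

# Example: a hemoglobin-estimation procedure station (illustrative only).
hb_station = Station(
    name="Hb estimation - procedure station",
    items=[
        ChecklistItem("Fills the tube with N/10 HCl to the 20 mark", 1.0),
        ChecklistItem("Draws exactly 20 microlitres of blood", 1.0),
        ChecklistItem("Mixes and waits for acid haematin to form", 1.0),
        ChecklistItem("Dilutes drop by drop and matches the colour standard", 1.0),
        ChecklistItem("Reads and reports the value with correct units", 1.0),
    ],
)

print(hb_station.score([True, True, False, True, True]))  # -> 4.0

In such a scheme the examiner simply ticks the steps the student performs, and the station score is the sum of the marks attached to the ticked items, which is what makes the marking objective and reproducible across examiners.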

After the test, the scores from the conventional practical were compared with those from the OSPE. Using a pre-validated survey, we also collected and evaluated data on how students and teachers perceived OSPE. All of the survey questions used a 5-point Likert scale, with the points in ascending order representing the degree of agreement with the statement in question.

The data were examined using the paired t-test, with the results (marks) reported as mean and standard deviation. The feedback was summarized as percentages based on the 5-point Likert scale. A significance level of 0.05 was used. The statistical analysis was performed using the Statistical Package for the Social Sciences (SPSS), version 16 (SPSS Inc., Chicago, IL).
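For readers who wish to reproduce this type of analysis outside SPSS, the paired t-test described above can be run in Python with scipy. The marks below are invented placeholders rather than the study data; only the procedure (a paired comparison of each student's conventional and OSPE marks) mirrors the analysis reported here.

# Minimal sketch of the paired t-test described above, run in Python
# instead of SPSS. The two mark lists are invented placeholders, NOT
# the study's raw data; each index is one student under both methods.

import numpy as np
from scipy import stats

cpe_marks = np.array([7, 9, 8, 6, 10, 9, 7, 8, 11, 9], dtype=float)           # conventional method
ospe_marks = np.array([11, 13, 12, 10, 14, 13, 12, 11, 15, 13], dtype=float)  # OSPE

t_stat, p_value = stats.ttest_rel(ospe_marks, cpe_marks)

print(f"CPE  mean +/- SD: {cpe_marks.mean():.2f} +/- {cpe_marks.std(ddof=1):.2f}")
print(f"OSPE mean +/- SD: {ospe_marks.mean():.2f} +/- {ospe_marks.std(ddof=1):.2f}")
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")  # compared against alpha = 0.05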

Results

Fifty students taking the midterm test were included in the current research, and their performance was evaluated by four instructors using the two assessment modalities. During the examination, the mean marks obtained by students assessed by the conventional method were 8.44±2.13, whereas the mean with the OSPE method was 12.58±2.74 (Table 2).
Table 2. Comparative marks obtained by students assessed by different assessment methods (maximum marks: 20).

OSPE = Objective Structured Practical Examination; CPE = Conventional Practical Examination

Method       | N  | Minimum marks obtained | Maximum marks obtained | Mean  | SD      | t-value | p-value*
Conventional | 50 | 4                      | 13                     | 8.44  | 2.13006 | 11.21   | 0.0001
OSPE         | 50 | 5                      | 18                     | 12.58 | 2.74115 |         |

*paired t-test

The results of the current research demonstrated that, compared with the traditional technique of evaluation, students received considerably better marks when examined using OSPE. Students' improved performance with the OSPE approach was statistically confirmed by a paired t-test, with a p-value of 0.0001, well below the significance level of 0.05.

Students' opinions on the relative merits of the two testing procedures, as expressed by their ratings on a 5-point Likert scale, are summarized in Table 3.

Table 3. OSPE feedback form and responses from students (n = 50).

OSPE = Objective Structured Practical Examination

No. | Statement | Strongly agree, n (%) | Agree, n (%) | Neutral, n (%) | Disagree, n (%) | Strongly disagree, n (%)
1  | OSPE encourages us to pay more attention to practical examination | 19 (38%) | 27 (54%) | 0 (0%) | 2 (4%) | 2 (4%)
2  | OSPE is a good form of examination and learning process | 27 (54%) | 22 (44%) | 1 (2%) | 0 (0%) | 0 (0%)
3  | OSPE covers the relevant and important topics and is consistent with the learning objectives of the syllabus | 16 (32%) | 25 (50%) | 8 (16%) | 1 (2%) | 0 (0%)
4  | Checklists provide a fair and unbiased system of marking | 25 (50%) | 17 (34%) | 4 (8%) | 3 (6%) | 1 (2%)
5  | OSPE reduces the element of luck in the examination | 27 (54%) | 15 (30%) | 5 (10%) | 3 (6%) | 0 (0%)
6  | OSPE is easier to pass compared with the conventional method | 28 (56%) | 20 (40%) | 2 (4%) | 0 (0%) | 0 (0%)
7  | OSPE is more stressful compared with the conventional method | 4 (8%) | 12 (24%) | 6 (12%) | 5 (10%) | 23 (46%)
8  | Attitude of teacher during OSPE was better compared with the conventional method | 19 (38%) | 24 (48%) | 6 (12%) | 1 (2%) | 0 (0%)
9  | OSPE should be used as a method of assessment in future examinations | 21 (42%) | 23 (46%) | 4 (8%) | 1 (2%) | 1 (2%)
10 | OSPE is a better way to assess the different aspects of knowledge | 19 (38%) | 21 (42%) | 5 (10%) | 1 (2%) | 4 (8%)

The following information was gleaned from the student feedback surveys. Ninety-two percent of students felt that OSPE motivated them to focus more on practical examinations, while 8% disagreed. Almost all of the students (98%) agreed that OSPE is an effective method of testing and learning. Eighty-two percent of students felt that OSPE adequately covers the material and aligns with course goals, while 2% disagreed. More than half of the students (56%) disagreed that OSPE is more stressful than the conventional technique. During OSPE, 86% of students said they felt their teacher's demeanor was better. The majority of students (80%) believed that OSPE is a more accurate reflection of their knowledge in all areas, 84% believed that OSPE makes exams fairer, and 88% felt that OSPE should be included in future exams.
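The percentages quoted above are straightforward aggregations of the Likert counts in Table 3, combining "strongly agree" and "agree" (or, for the stress item, the two disagreement categories) out of 50 respondents. A small sketch of that arithmetic is shown below; the helper function is ours, not part of the study's analysis.

# How the agreement percentages quoted above follow from the Likert
# counts in Table 3 (n = 50 students). The helper function is ours.

def percent(*counts: int, n: int = 50) -> float:
    # Combine the given response counts and express them as a percentage.
    return 100.0 * sum(counts) / n

print(percent(19, 27))  # item 1: strongly agree + agree = 92%
print(percent(27, 22))  # item 2: strongly agree + agree = 98%
print(percent(5, 23))   # item 7: disagree + strongly disagree = 56%
                        # (students rejecting "OSPE is more stressful")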

Teachers' input, given on a 5-point Likert scale and comparing the two evaluation methods, is shown in Table 4.

Table 4. OSPE feedback form and responses from teachers (n = 4).

OSPE = Objective Structured Practical Examination

Statement | Strongly agree, n (%) | Agree, n (%) | Neutral, n (%) | Disagree, n (%) | Strongly disagree, n (%)
OSPE covers a wider range of knowledge as compared with the conventional examination | 3 (75%) | 1 (25%) | 0 | 0 | 0
OSPE compels the student to learn different procedures in detail | 4 (100%) | 0 | 0 | 0 | 0
OSPE specifically highlights the weak and strong parts of the subject for the student | 1 (25%) | 2 (50%) | 1 (25%) | 0 | 0
OSPE is more stressful compared with the conventional method | 0 | 3 (75%) | 1 (25%) | 0 | 0
OSPE is more exhausting compared with the conventional method | 0 | 3 (75%) | 1 (25%) | 0 | 0
OSPE is a better way to assess the different domains of knowledge of the student | 3 (75%) | 1 (25%) | 0 | 0 | 0
Checklists in OSPE provide a fair system of marking | 1 (25%) | 2 (50%) | 1 (25%) | 0 | 0
The factor of examiner variability can be removed in a better way by OSPE | 3 (75%) | 0 | 1 (25%) | 0 | 0
OSPE should be used as a method of assessment in future examinations | 2 (50%) | 1 (25%) | 1 (25%) | 0 | 0

All of the teachers in this sample (100%) felt that OSPE is more comprehensive than traditional testing methods, and all (100%) felt that OSPE compels students to learn the procedures in detail. Seventy-five percent of teachers agreed that OSPE is a more stressful evaluation approach, and 75% agreed that it highlights both the strengths and weaknesses of a student's grasp of the subject. All teachers (100%) considered OSPE a superior tool for evaluating students' proficiency across different domains of knowledge. Seventy-five percent believed that OSPE removes examiner variability more effectively, and 75% believed that OSPE should be included in future examinations.

Discussion

OSPE is currently administered as either a formative or summative test in only a small subset of India's medical schools. Although both theoretical and practical examinations are used to evaluate students, theoretical examinations are easier to administer and can be conducted with more consistency. This research set out to compare two commonly used approaches to grading students' performance on two laboratory exercises administered as part of a physiology course's practical test. Blood group testing and hemoglobin estimation were the two hands-on exercises students performed for their OSPE and conventional practical examination (CPE) grades. Students and teachers who took part in the study provided their feedback. The study's major objective was to implement the OSPE evaluation strategy at the institution. The feedback received helps familiarize teachers and students with the OSPE evaluation approach, and it will serve as the foundation for further development of the OSPE as an evaluation tool. Students assessed with OSPE in this research received considerably better marks than those receiving the traditional evaluation. Rehana et al. found similar outcomes in their research. This might be because the OSPE technique is more objective than the traditional approach, which relies heavily on the assessor's personal opinion; reduced examiner bias may be attributed in part to the objective nature of the evaluation process.

Feedback is a process through which learners are given an opportunity to share their thoughts and opinions. A variety of changes can be made to enhance the quality of instruction and evaluation based on student feedback. According to the analysis of student comments in this study, OSPE is seen as less stressful and tiring than the traditional practical examination. The results agree with those of Rehana et al., who used objective structured practical examinations (OSPE) and viva voce to evaluate students' competence in the physiology laboratory. Analysis of student responses in that study also revealed that they found the procedure simple, standardized, fair, stress-free, and objective, and they advocated for its continuance as an evaluation tool for practical examinations. Students in the present study also perceived a more positive attitude from teachers during OSPE. The vast majority of students favor using OSPE in upcoming exams, since they believe it is a valid and reliable means of evaluation, and most think OSPE accurately reflects their knowledge and influences how they study. A similar pattern of results was found in a study by Mondal et al. comparing the OSCE and the conventional examination (CE) as formative assessment methods in pediatrics semester tests for final-year MBBS students. Almost identical to the current survey, that study found that 73.8% of students thought the objective structured clinical examination was a superior formative evaluation technique compared with the conventional test, whereas 9.5% of students preferred the traditional examination. Based on these findings, it can be concluded that OSPE is generally regarded by students as a preferred form of evaluation.

Teachers' opinions were generally in agreement that OSPE is preferable because it tests students on a broader range of material than a traditional exam, requires them to learn procedures in depth, and draws attention to both the strengths and weaknesses of a student's grasp of a given subject. Most teachers agreed that OSPE is a more consistent way to evaluate students and that it should therefore be used in standardized examinations going forward. The majority of teachers, however, also saw OSPE as a more taxing and stressful way to evaluate students. This might be because of the time and effort required to create customized checklists for evaluating different types of practical activities.

A modernized testing system is required to provide an effective and accurate assessment of student performance in practical situations. Experiential learning is an important part of a doctor's professional development and continues throughout their career. The primary goal of medical school is to equip students with the knowledge necessary to understand the underlying physiological changes that manifest as illness, and the basic goal of medical education is to produce professionals who are capable of treating patients. OSPE discriminated between students who performed below and above average on tests of cognitive, psychomotor, and attitudinal skills, suggesting that it is a reliable strategy with a strong capacity to distinguish between different groups of students.

The study's limitations include limited resources, time constraints, and a small sample size. Long-term effects on student learning outcomes and retention were not explored.
A follow-up study could provide insights into the sustainability of the observed
improvements.

Conclusions

In light of our dedicated efforts, our academic institution now possesses a valuable asset in the realm of physiology education: an innovative evaluation method known as the Objective Structured Practical Examination (OSPE). Demonstrably superior to conventional assessment methods, OSPE should become the standard against which future examinations are measured. Its integration into upcoming assessments not only aligns with the evolving requirements of medical education but also meets the desire of both educators and students for more effective and comprehensive evaluation tools. OSPE offers the prospect of cultivating a deeper understanding of practical skills and subject mastery, equipping students with the competencies essential for success in their medical careers.

As we progress, it becomes imperative to continue researching and evaluating the efficacy of OSPE, its adaptability to various medical disciplines, and its long-term impact on student learning outcomes. This study marks the initiation of a transformative journey in medical education, where innovation and evidence-based assessment practices chart a course toward the development of more proficient and capable medical professionals for the future.

References

1. Guilbert JJ. Educational Handbook for Health Personnel. Geneva: World Health Organization; 1987.

2. Batmanabane G, Raveendran R, Shashindran C. Objective structured practical examination in pharmacology for medical laboratory technicians. Indian J Physiol Pharmacol. 1999;43:242-246. https://pubmed.ncbi.nlm.nih.gov/10365319/

3. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ. 1979;13:41-54. https://pubmed.ncbi.nlm.nih.gov/763183/

4. Nayar U, Malik SL, Bijlani RL. Objective structured practical examination: a new concept in assessment of laboratory exercises in preclinical sciences. Med Educ. 1986;20:204-209. doi:10.1111/j.1365-2923.1986.tb01169.x

5. Matsell DG, Wolfish NM, Hsu E. Reliability and validity of the objective structured clinical examination in paediatrics. Med Educ. 1991;25:293-299. doi:10.1111/j.1365-2923.1991.tb00069.x

6. Gupta P, Dewan P, Singh T. Objective structured clinical examination (OSCE) revisited. Indian Pediatr. 2010;47:911-920. doi:10.1007/s13312-010-0155-6

7. Kanasagara AA, Tank YP, Bhatt PS, Maru AM. Case-based practical examination as an introduction to formative assessment in pathology: undergraduate student's perception. Natl J Physiol Pharm Pharmacol. 2022;12:1291-1295.

8. Rehana R, Sadiqa S, Azhar I, Rabiya R. Perception and performance of medical students in objective structured practical examination and viva. Pak J Physiol. 2012;8:33-36. http://www.pps.org.pk/PJP/8-2/Rehana.pdf

9. McLean M, Gibbs T. Twelve tips to designing and implementing a learner-centred curriculum: prevention is better than cure. Med Teach. 2010;32:225-230. doi:10.3109/01421591003621663

10. Abraham RR, Raghavendra R, Surekha K, Asha K. A trial of the objective structured practical examination in physiology at Melaka Manipal Medical College, India. Adv Physiol Educ. 2009;33:21-23. doi:10.1152/advan.90108.2008

11. Mondal R, Sarkar S, Nandi M, Hazra A. Comparative analysis between objective structured clinical examination (OSCE) and conventional examination (CE) as a formative evaluation tool in pediatrics in semester examination for final MBBS students. Kathmandu Univ Med J (KUMJ). 2012;10:62-65. doi:10.3126/kumj.v10i1.6917

12. Robertson K. Reflection in professional practice and education. Aust Fam Physician. 2005;34:781-783. https://pubmed.ncbi.nlm.nih.gov/16184213/

13. Rehman R, Iqbal A, Syed S, Kamran A. Evaluation of integrated learning program of undergraduate medical students. Pak J Physiol. 2011;7:37-41. http://www.pps.org.pk/PJP/7-2/Rehana.pdf

14. Desai KN, Satapara VK, Rathod GB, Maru AM. Development and evaluation of the POPBL (patient-oriented problem-based learning) module in pathology: a comparative analysis of performance and perception among second-year pathology students. Cureus. 2022;14:e28885. doi:10.7759/cureus.28885

