An Online Management Information System for Objective Structured Clinical Examinations

Nov 15, 2018

A few words about this paper…

Between 2006 and 2008, David Cunningham, then an intern, and I, as a lecturer, were engaged with teaching and learning in the School of Medicine at the National University of Ireland, Galway (Medical Informatics & Medical Education in those days). Our OSCE procedures, covering both the planning and the execution of the examination, were typically laborious, as they are for this kind of exam. Planning was one thing, but what about results? We repeatedly ran into problems with paper forms and the results derived from them. On top of this, the study recorded a roughly 30% error rate in one typical OSCE exam, alongside a high cost of administration. With Cussimano’s €4.70 staff cost per student, per station and our estimate of €2.80 administration cost per submitted paper form, the total cost of an OSCE could be estimated at €7.50 per student, per station.
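To make the arithmetic explicit: the per-station figures above come from the paper, while the cohort total below is purely an illustration using the 53 students and 11 stations mentioned further on.

$$\text{cost per student, per station} \approx €4.70 + €2.80 = €7.50$$

$$\text{illustrative cohort cost} \approx 53 \times 11 \times €7.50 \approx €4{,}372$$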

Reading back on this paper today, it seems very much old-school: 53 fourth-year students, a rather moderate reliability across 11 stations, and a Standard Error of Measurement of about 12%. Taking a 95% Confidence Interval based on the SEM into account, we have to accept that the observed score can vary by as much as +/- 24%!
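For readers who want that last step spelled out: a 95% confidence interval extends roughly two SEMs (more precisely, 1.96 SEMs) either side of the observed score, so with an SEM of about 12%:

$$\text{95\% CI} \approx \text{observed score} \pm 1.96 \times 12\% \approx \text{observed score} \pm 24\%$$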

Nonetheless, this was the first publication of its kind reporting on electronic OSCEs. Much work still needed to be done, because of the large amount of variance that could be attributed to items, stations and examiners. However, with an electronic OSCE Management System on board we could run our OSCEs faster and cheaper than before, when everything was planned, executed and collated from paper forms. Now, in 2018, I don’t think OSCEs have become any cheaper with 200 students in year 4, but at least all errors are out of the system and students receive written feedback immediately after completing the exam.

Authors

Thomas JB Kropmans, Barry GG O’Donovan, David Cunningham, Andrew W Murphy, Gerard Flaherty, Debra Nestel and Fidelma P Dunne

Abstract

Objective Structured Clinical Examinations (OSCEs) are adopted for high-stakes assessment in medical education. Students pass through a series of timed stations demonstrating specific skills. Examiners observe and rate students using predetermined criteria. In most OSCEs, low-level technology is used to capture, analyse and produce results. We describe an OSCE Management Information System (OMIS) to streamline the OSCE process and improve quality assurance. OMIS captured OSCE data in real time using a Web 2.0 platform. We compared the traditional paper trail outcome with detailed real-time analyses of separate stations. Using the paper trail version, only one student failed the OSCE. However, OMIS identified nineteen possibly ‘incompetent’ students. Although there are limitations to the design of the study, the results are promising and likely to lead to defendable judgements on student performance.

Click below for the full text of the article…


An Online Management Information System for Objective Structured Clinical Examinations

Click here to provide feedback on the paper.

Free eBook:

The Use of Technology in Clinical Skills Assessment
An essential read for those introducing technology into clinical skills assessment.

Technology can:

  • Reduce error rates
  • Decrease administration time
  • Increase quality standards