Traditionally, OSCEs have been assessed with paper-based methods. However, a number of issues have been highlighted with this approach, including illegible handwriting, missing details (students' names and student numbers) and lost assessment sheets. Furthermore, the manual calculation of results and their entry into a database is time-consuming and subject to human error, and feedback on performance is rarely provided to students after paper-based assessments. Despite these issues, there is a scarcity of literature on the use of computers or dedicated software in the assessment of OSCEs. Segall et al compared the usability of a paper-and-pencil method with Personal Digital Assistant (PDA) based quizzes and found the PDA-based quiz more efficient than, and superior to, the traditional method. Similarly, Treadwell compared the conduct of a paper-based OSCE with an electronic method. The findings indicated that the electronic method was just as effective as, and more efficient (less time-consuming) than, the traditional paper-based method. In addition, the electronic system was highly rated by the assessors, who found it less intrusive and reported that it gave them more time to observe the students and permitted greater observation than the paper assessment did.
During a summer research project in 2007, students analysed the information gathered from three cohorts of undergraduate students going through our traditional paper-trail OSCEs. We found errors in 30% of cases, due to incompletely filled-out forms, incorrect calculation of final results and discrepancies between the data on the paper forms and those in the available electronic spreadsheets. You might recognise these issues. In other words, 30% of our students received results that, according to the standards of the time, might have been considered correct, but we know better now! This kind of retrospective quality assurance had not been done before within our home base, the National University of Ireland, Galway. Dr Thomas Kropmans, Senior Lecturer in Medical Informatics & Medical Education, was the new kid on the block and discussed the problem with David Cunningham, now our COO. By automating the OSCE process, with direct entry of assessment results, we could solve the problem. Our online solution is now used by prestigious universities throughout the world. We currently analyse the completed data sets using Borderline Regression Analysis for dynamic cut-off score calculations.
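The borderline regression method mentioned above can be sketched in a few lines: each station's checklist scores are regressed on the examiners' global ratings, and the cut-off score is the predicted checklist score at the "borderline" rating. The data, rating scale and variable names below are invented for illustration, not taken from our system.

```python
import numpy as np

# Borderline regression sketch (hypothetical data):
# regress checklist scores on examiners' global ratings, then read off
# the predicted score at the "borderline" rating as the station pass mark.

# Global ratings on a 1-5 scale (1 = fail, 3 = borderline, 5 = excellent)
global_ratings = np.array([1, 2, 2, 3, 3, 3, 4, 4, 5, 5])
# Corresponding checklist scores (% of checklist items achieved)
checklist_scores = np.array([35, 48, 52, 58, 61, 63, 72, 75, 84, 88])

# Least-squares fit: score = slope * rating + intercept
slope, intercept = np.polyfit(global_ratings, checklist_scores, 1)

BORDERLINE = 3  # rating representing a borderline performance
cut_off = slope * BORDERLINE + intercept
print(f"Station cut-off score: {cut_off:.1f}%")
```

Because the cut-off is recomputed from each cohort's own ratings and scores, it adapts to station difficulty, which is what makes the pass mark "dynamic" rather than a fixed percentage.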
Feedback is one of the most powerful learning tools in medical education. However, many students have to progress without it: providing immediate feedback on performance in an OSCE or MMI is simply too labour-intensive when there are only paper forms. OMIS includes an instant student feedback email system that allows module coordinators to send global or detailed feedback to students. The system offers various options for sending written feedback, with or without item details, global ratings and so on.