Traditionally, OSCEs have been assessed with paper-based methods. However, a number of issues have been highlighted with this approach, including illegible handwriting, missing details (students’ names and student numbers) and lost assessment sheets. Furthermore, the manual calculation of results and their entry into a database is time-consuming and prone to human error. In addition, feedback is rarely provided to students on their performance after paper-based assessments. Despite these issues, there is a scarcity of literature regarding the use of computers or OSCE software in the assessment of OSCEs. Segall et al. compared the usability of a paper-and-pencil method with Personal Digital Assistant (PDA) based quizzes and found the PDA-based quiz more efficient than, and superior to, the traditional method. Similarly, Treadwell compared paper-based OSCEs with an electronic method and found the electronic method just as effective as, and more efficient (less time-consuming) than, the traditional paper-based method. The electronic system was also highly rated by the assessors, who found it less intrusive and reported that it gave them more time to observe the students than the paper assessment did.
During a summer research project in 2007, students analysed the data gathered from three cohorts of undergraduate students going through our traditional paper-based OSCEs. We found an error rate of 30%, caused by incompletely filled-out forms, incorrect calculation of final results and discrepancies between the data on the paper forms and those in the available electronic spreadsheets. In other words, 30% of our students received results that, according to the standards in place at the time, may have appeared correct, but we know better now! This kind of retrospective quality assurance had been carried out previously at our home base, the National University of Ireland, Galway. Dr Thomas Kropmans, Senior Lecturer in Medical Informatics & Medical Education, discussed the problem with a software engineer, David Cunningham. By automating the OSCE process with direct entry of assessment results, we could solve the problem. Our online solution is now used by some of the most prestigious universities in the world. We currently analyse the completed data sets using borderline regression analysis to calculate dynamic cut-off scores.
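The borderline regression method mentioned above is well documented in the standard-setting literature: examiners record both a checklist score and a global rating per station, a regression line is fitted through checklist scores as a function of global ratings, and the pass mark is the predicted checklist score at the "borderline" rating. The sketch below illustrates the general technique with invented example data and a hypothetical rating scale; it is not Qpercom's actual implementation.

```python
import numpy as np

# Hypothetical station data: for each student, the examiner records a
# checklist score (0-100) and a global rating on an assumed scale
# (1 = fail, 2 = borderline, 3 = pass, 4 = good, 5 = excellent).
checklist = np.array([42.0, 55.0, 48.0, 61.0, 70.0, 66.0, 80.0, 88.0, 75.0, 93.0])
global_rating = np.array([1, 2, 2, 3, 3, 3, 4, 4, 4, 5])

# Fit an ordinary least-squares line: checklist ≈ slope * rating + intercept.
slope, intercept = np.polyfit(global_rating, checklist, 1)

# The dynamic cut-off score is the predicted checklist score at the
# borderline global rating (2 on this hypothetical scale).
BORDERLINE = 2
cutoff = slope * BORDERLINE + intercept
print(f"Pass mark for this station: {cutoff:.1f}")
```

Because the cut-off is recomputed from each cohort's own ratings, it adapts to station difficulty at every sitting rather than relying on a fixed pass mark.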
Feedback is the most powerful learning tool in medical education, yet many students have to progress without it: providing immediate feedback on performance in an OSCE or MMI is simply too labour-intensive when only paper forms are available. Qpercom Observe includes an instant student feedback email system that allows module coordinators to send global or detailed feedback to students. The feedback feature offers various options for sending written feedback, with or without item details, global ratings and so on.