Reliability and validity of OSCE checklists used to assess the communication skills of undergraduate medical students: A systematic review

Nov 15, 2018

A few words about this paper…

In 2011, Winny, from Indonesia, approached me to ask whether he could join us on a PhD track. It would be an opportunity to investigate the wide range of communication skills stations used within our School of Medicine. Data were collected using our OSCE Management Information System. A systematic review was commenced to find out where the flaws in practice lay, and it proved successful. Whenever a clinical skills trainer mentions that he or she is responsible for a communication skills station, I ask: which of the 18 domains of communication skills are you going to assess? Silence usually follows, and a low Cronbach’s alpha (the internal consistency of the assessment form) at a later stage is then very likely. To date (November 2018), Winny’s paper has been cited 17 times by other researchers.
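For readers less familiar with the statistic: Cronbach’s alpha estimates how consistently the items on a checklist measure the same underlying construct. Below is a minimal sketch of the calculation in Python; the function and the example scores are illustrative only and do not come from the study.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (candidates x items) score matrix.

    alpha = k / (k - 1) * (1 - sum(item variances) / variance of totals)
    """
    k = scores.shape[1]                          # number of checklist items
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of candidates' total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative example: 5 candidates scored on a 4-item checklist.
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

A low alpha suggests the items are not pulling in the same direction, which is exactly what one would expect when examiners cannot say which domains their station is meant to assess.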

Agreement between us reviewers on the validity and reliability of the communication skills checklists was only 0.45. We couldn’t agree on the domains either, apparently! Heterogeneity in the rubrics used in the assessment of communication skills and a lack of agreement between reviewers make comparison of student competences within and across institutions difficult. Work began in earnest to investigate whether a standardised approach to the development of OSCE communication skills stations would help. And so Winny was busy for the following four years. I am still grateful to his family for allowing him to follow his dream.
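The 0.45 figure is an inter-rater agreement statistic. Assuming it is Cohen’s kappa (a common choice, which corrects the raw agreement between two raters for the agreement expected by chance), a minimal sketch of the calculation in Python follows; the ratings below are invented for illustration and are not data from the review.

```python
from collections import Counter

def cohens_kappa(rater_a: list, rater_b: list) -> float:
    """Cohen's kappa: (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    # Observed agreement: proportion of items both raters labelled the same.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement: chance overlap of each rater's label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a.keys() | freq_b.keys()) / n**2
    return (p_o - p_e) / (1 - p_e)

# Illustrative: two reviewers judging whether each paper reports validity.
a = ["yes", "yes", "no", "no", "yes", "no", "yes", "no"]
b = ["yes", "no",  "no", "yes", "yes", "no", "yes", "yes"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```

By most conventional benchmarks, a kappa of 0.45 represents only moderate agreement, which is why the lack of consensus between reviewers became a finding in its own right.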

Authors

Winny Setyonugroho, Kieran Kennedy and Thomas JB Kropmans 

Abstract

Objectives

To explore inter-rater agreement between reviewers comparing reliability and validity of checklist forms that claim to assess the communication skills of undergraduate medical students in Objective Structured Clinical Examinations (OSCEs).

Methods

Papers explaining the rubrics of OSCE checklist forms were identified from PubMed, Embase, PsycINFO, and the ProQuest Education Databases up to 2013. Included were those studies that reported empirical validity or reliability values for the communication skills assessment checklists used; papers that did not report reliability or validity were excluded.

Results

Papers focusing on generic communication skills, history taking, physician–patient communication, interviewing, negotiating treatment, information giving, empathy and 18 other domains (intraclass correlation coefficients ranging from −0.12 to 1) were identified. Regarding the validity and reliability of the communication skills checklists, agreement between reviewers was 0.45.

Conclusions

Heterogeneity in the rubrics used in the assessment of communication skills and a lack of agreement between reviewers make comparison of student competences within and across institutions difficult.

Practice implications

Consideration should be afforded to the adoption of a standardized measurement instrument to assess communication skills in undergraduate medical education. Future research will focus upon evaluating the potential impact of adoption of a standardized measurement instrument.

Click below for the article’s full text…

 


Reliability and validity of OSCE checklists used to assess the communication skills of undergraduate medical students: A systematic review

Click here to provide feedback on the paper.
