True communication skills assessment in interdepartmental OSCE stations: Standard setting using the MAAS-Global and EduG

by | Nov 15, 2018

A few words about this paper…

In medical education it is extremely helpful to compare outcomes, yet comparing communication skills outcomes between students, years of study, or institutions is very challenging. If the measurement of a particular learning outcome is not standardised, you cannot trust the result, just as you can only trust a length measured with a standardised measuring tape. In this study we attempted to compare communication skills outcomes between groups of students.

Since communication skills assessment forms are not standardised at our School of Medicine within the College of Medicine, Nursing and Health Sciences of the National University of Ireland, Galway, we developed the MAAS-Global proportion (MG-P) in one of our previous studies. If we know how large the MG-P of an assessment form is, we may be able to compare different students, groups of students, or years of the curriculum. We therefore introduced the MAAS-Global score, followed by the MAAS-Global proportion and the section percentages.

For example, a MAAS-Global score of 65 with a MAAS-Global proportion of 75, consisting of 14% section 1, 29% section 2, and 57% section 3, is then written as MAAS-Global score 65 [MG75-14-29-57]. This is quite a complicated matter, I have to say, but we wanted to be able to compare our student outcomes while accounting for the different languages and/or educational backgrounds of our students.
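As a minimal sketch of how the notation above could be assembled, here is some illustrative Python. The `mg_notation` function and its inputs are our own invention, not from the paper; the section counts (2, 4 and 8 'true' CS items in sections 1 to 3) are hypothetical values chosen so that the percentages reproduce the example in the text.

```python
# Illustrative sketch (our own, not from the paper) of the
# "score [MGp-s1-s2-s3]" label described above.

def mg_notation(mg_score, mg_p, section_counts):
    """Format a MAAS-Global score with its proportion (MG-P) and the
    percentage split of 'true' CS items over MAAS-Global sections 1-3."""
    total = sum(section_counts)
    sections = "-".join(str(round(100 * n / total)) for n in section_counts)
    return f"MAAS-Global score {mg_score} [MG{mg_p}-{sections}]"

# Reproduces the example in the text with hypothetical counts 2/4/8:
print(mg_notation(65, 75, [2, 4, 8]))  # MAAS-Global score 65 [MG75-14-29-57]
```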

Thanks to these efforts, we are now able to compare not only students' OSCE outcomes (marks) but also their Communication Skills outcomes, and to see to what extent the various proportions of the MAAS-Global were represented in our student results. True measurement of a student's progress in Communication Skills (CS) development is only possible if a standardised tool is employed. As a result of this study, we suggest a standardisation method that uses the MAAS-Global to standardise existing OSCE checklists, enabling comparison between students and student groups within a curriculum.


Winny Setyonugroho, Thomas JB Kropmans, Ruth Murphy, Peter Hayes, Jan Van Dalen and Kieran Kennedy


Objective

Comparing outcomes of clinical skills assessments is challenging. This study proposes a reliable and valid comparison of communication skills (CS) assessment as practiced in Objective Structured Clinical Examinations (OSCEs). The aim of the present study is to compare CS assessment, standardized according to the MAAS-Global, between stations in a single undergraduate medical year.


Methods

An OSCE delivered in an Irish undergraduate curriculum was studied. We chose the MAAS-Global as an internationally recognized and validated instrument to calibrate the OSCE station items. The MAAS-Global proportion is the percentage of station checklist items that can be considered 'true' CS. The reliability of the OSCE was calculated with Generalizability Theory (G-Theory) analysis, and nested ANOVA was used to compare mean scores of all years.
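For readers unfamiliar with G-Theory reliability, the following is a rough one-facet (person × station) illustration of a relative generalizability coefficient. The design and the variance-component values are our own simplified assumptions, not the study's (which used EduG for the full analysis):

```python
# Simplified one-facet G-Theory illustration (assumed numbers, not the
# study's estimates): the relative G coefficient is true-score variance
# over true-score variance plus relative error variance.

def g_coefficient(var_person, var_interaction, n_stations):
    """Relative G coefficient for a person x station design:
    var_person / (var_person + var_interaction / n_stations)."""
    return var_person / (var_person + var_interaction / n_stations)

# Hypothetical variance components for a 10-station OSCE:
g = g_coefficient(var_person=0.40, var_interaction=0.90, n_stations=10)
print(round(g, 2))  # 0.82
```

Increasing the number of stations shrinks the relative error term, which is why longer OSCEs tend to be more reliable.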


Results

MAAS-Global scores in psychiatry stations were significantly higher than those in other disciplines (p < 0.03) and above the initial pass mark of 50%. Students' higher scores in psychiatry stations were related to higher MAAS-Global proportions when compared with the general practice stations.


Conclusion

Comparison of outcome measurements between interdisciplinary station checklists, using the MAAS-Global as a standardization instrument, was valid and reliable.

Practice implications

The MAAS-Global was used as a single validated instrument and is suggested as a gold standard.

