FEM: Revista de la Fundación Educación Médica

Online version ISSN 2014-9840 · Print version ISSN 2014-9832

FEM (print ed.) vol. 24, no. 1, Barcelona, February 2021. Epub 31 May 2021

https://dx.doi.org/10.33588/fem.241.1112 

EDITORIAL

Consenso de expertos sobre evaluación de estudiantes y residentes a partir de la Ottawa Conference 2020

Expert consensus on student and resident assessment from the 2020 Ottawa Conference

Jordi Palés-Argullós1, Amando Martín-Zurro1

1Fundación Educación Médica

In 2011, the journal Medical Teacher published a paper outlining an expert consensus on the performance assessment of medical students and residents generated at the Ottawa Conference [1], a forum dedicated primarily to discussing assessment in medical education. Nine years later, a new article has appeared in Medical Teacher, 'Performance assessment: Consensus statement and recommendations from the 2020 Ottawa Conference' [2]. It revisits the 2011 document, analyses the issues that have not been fully resolved since its publication, and focuses on two specific topics: objective structured clinical examinations (OSCEs) and workplace-based assessment. Building on the 2011 document, and in the light of developments in assessment over the last 10 years, the experts set out a number of recommendations for establishing good practice in the use of these two tools.

Given that all our medical schools now routinely use OSCEs to assess their students, and that their possible future use in the entrance examination for specialised training has occasionally been discussed, in this editorial we focus on this instrument and briefly comment on the recommendations this group of experts proposes to facilitate its proper use in practice. The article addresses three main aspects: the need for a clear definition of the intended use and purpose of the OSCE; the need for significant evidence to support the decisions taken on the basis of its results; and, finally, the need for sufficient arguments to justify and defend those decisions.

Regarding the first aspect, the purpose of the OSCE should be made explicit to all stakeholders: teachers, candidates, examiners, employers, regulators, agencies and the public.

The second aspect, the most extensively developed, stresses several points. The content of the OSCE must adequately assess what it is intended to assess, which means planning the OSCE according to the learning objectives so that it genuinely measures clinical skills, communication, technical procedures and clinical reasoning. In this regard, the authors remind us that knowledge tests in their various formats should not be included in an OSCE, as this would jeopardise the validity of the test. They also set out the requirements for the internal structure of an OSCE: an adequate number of stations (no fewer than 12) of sufficient total length (no less than 150 minutes); an adequate number of observers and a wide, diverse map of situations (remembering that competence is specific rather than generic); an adequate scoring system (checklists versus global assessment scales); pass levels established appropriately using the best available methods; and adequate psychometric analysis to ensure the reliability of the test. They also reflect on the need for security measures to prevent candidates from knowing in advance the tasks they will be asked to perform, and on the need to compare OSCE results with those of other tests by triangulating the data obtained.

The experts also advocate including OSCEs in a programmatic approach to assessment, in which decisions are made not after each assessment episode, and not by assessment instrument, but by domain or competence. In other words, the OSCE should be combined with other assessment instruments that rate competence from different perspectives. Finally, in this section, they insist that the decisions taken on the basis of exam results must be sound, fair and defensible, which requires that the criteria for passing the exam be established using the best-evidenced methods and that students receive mandatory feedback to increase motivation and learning.

In the third section, the experts consider it essential to collect all sources of validity evidence to justify any decision taken on the basis of the test results.

By way of summary, the article concludes with a number of recommendations: 1) design OSCEs in the context of a more global assessment system; 2) adhere to validity criteria; 3) define the purpose of the OSCE and make it explicit to stakeholders; 4) ensure correspondence between what is assessed and the learning outcomes; 5) plan adequate sampling, with sufficient stations and testing time; 6) use standard-setting methods based on relevant criteria; 7) carry out psychometric studies to ensure reliability at the global and station levels; 8) value examiner diversity and focus examiner training on conduct, behaviours and bias; 9) handle test security through task design and circuit design to group stations; and 10) triangulate OSCE performance data with other assessments or outcomes to inform final decision-making.

The article also discusses workplace-based assessment and the different connotations it can have depending on whether its purpose is formative or summative, essentially in terms of feedback and interaction with learning tutors. Methods for workplace-based assessment include various instruments focused on the analysis of direct or indirect encounters with one or more patients. These tools should be used carefully, striving to minimise the interpretation problems, especially psychometric ones, that may arise when they are used for summative assessment.

A careful analysis of these recommendations is, in our opinion, of great importance in Spain, especially at the present time, when significant changes are being considered in the training of students and residents. If OSCEs are to be a core instrument for assessing the undergraduate period and for helping to define the criteria for the transition to postgraduate training, all stakeholders must be decisively involved in correcting the problems of their current conception and implementation. The institutions and bodies responsible for undergraduate and postgraduate training, and our medical schools in particular, must critically analyse and, where appropriate, apply these recommendations in order to design the best possible assessment system for our students and residents.

References

1. Boursicot K, Etheridge L, Setna Z, Sturrock A, Ker J, Smee S, et al. Performance in assessment: consensus statement and recommendations from the Ottawa conference. Med Teach 2011;33:370-83.

2. Boursicot K, Kemp S, Wilkinson T, Findyartini A, Canning C, Cilliers F, et al. Performance assessment: Consensus statement and recommendations from the 2020 Ottawa Conference. Med Teach 2021;43:58-67.

This is an open-access article published under a Creative Commons license.