Educación Médica

Print version ISSN 1575-1813

Educ. Méd. vol. 13, no. 3, Sep. 2010

 

EDITORIAL

 

Bologna's challenge: the assessment of competencies

El reto de Bolonia: la evaluación de las competencias

 

 

Albert Oriol Bosch

President of the Fundación Educación Médica. E-mail: aoriolb@gmail.com

 

 

The Bologna process demands that European universities provide outcome-oriented education. Educational outcomes should be defined as measurable competencies, which means they have to be assessed. European universities are therefore expected to evaluate the degree of competence their graduates have achieved at graduation, and they will in turn be assessed by the success of their graduates.

Assessing the competencies of graduating students poses a major challenge for European universities: first, because most of them lack institutional experience in doing so; second, because it is a complex affair; and finally, because they lack the resources to carry out this sophisticated task.

Universities, and medical schools in particular, are organized around a departmental structure that impedes their migration from curricula fragmented into subject matters towards educational programs aimed at providing students with the capacity to perform. Most medical schools in Europe have not equipped themselves with interdepartmental educational units with enough expertise in assessment processes. In most countries, external evaluations are not in place, and student assessment is in the hands of the content experts responsible for teaching the different subject matters.

Taking into account that competencies are complex constructs that enable decision making and performance, both expressed as specific behaviors, they can only be assessed by expert evaluators, ideally under standardized conditions.

It has been said that, at present, the main shortcomings lie in the tools available for assessing competencies [1]. The existing ones fall short of fulfilling the required criteria (validity, reliability, educational impact, cost and acceptability) [2]. Nevertheless, the same authors identify some measurement tools that show not only general validity but also discriminatory capacity, acknowledging that portfolios allow evidence-based clinical performance to be documented, while mini clinical evaluation exercises (mini-CEX), standardized simulated patients, and video-recorded clinical interviews are psychometrically very powerful tools for assessing clinical skills. The current difficulties in evaluating professionalism can be overcome through multisource feedback from multiple observers in multiple environments [3].

Nevertheless, Green & Holmboe [4] maintain that the main limitation in competencies' assessment lays not so much in the shortage of adequate measurement tools but in the inappropriate use of those available. The assessment based on performance observation requires expert observers not only on the performances to be assessed but also in emitting synthetic judgments about what is observed [5], since significant shortcomings on both facets have been reported [6]. A good instrument in the hands of a non expert evaluator does not produce a quality assessment [4].

It is possible to develop the evaluation capabilities of educators through training and sufficient practice [7]. Evaluators need to be trained if competency-based education is to be developed as required by the Bologna process. But this is not the only problem to overcome, since there is also an insufficient understanding of the performance dimensions expected at the different phases of educational progress, which is needed to know what to look for during observation. According to Lurie et al [1], 'an explicit set of expectations that would link the ideals of the general competencies to the realities of measurement' is necessary.

The challenge posed by Bologna will not be met successfully unless medical schools equip themselves with interdepartmental units with expertise in educational and evaluation processes, units that can help develop the educational capabilities of the academic staff and bridge departmental barriers. Being unable to measure what is aimed at makes it impossible to know how far off target one is and how well Bologna's requirements have been met.

References

1. Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the Accreditation Council for Graduate Medical Education: a systematic review. Acad Med 2009; 84: 301-9.

2. Van der Vleuten CPM. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ Theory Pract 1996; 1: 41-67.

3. Lockyer J, Clyman SG. Multisource feedback. In: Holmboe ES, Hawkins RE, eds. Practical guide to the evaluation of clinical competence. Philadelphia: Mosby-Elsevier; 2008.

4. Green ML, Holmboe ES. The ACGME toolbox: half empty or half full? Acad Med 2010; 85: 787-90.

5. Norcini J, Burch V. Workplace-based assessment as an educational tool. AMEE Guide No. 31. Med Teach 2007; 29: 855-71.

6. Holmboe ES. Faculty and the observation of trainees' clinical skills: problems and opportunities. Acad Med 2004; 79: 16-22.

7. Holmboe ES, Hawkins RE, Huot SJ. Effects of training in direct observation of medical residents' clinical competence: a randomized trial. Ann Intern Med 2004; 140: 874-81.
