In this study we have explored the training experience of young European rheumatologists and trainees from the 41 countries where training is offered. Self-reported ability appears moderately high for most core competences, although no external reference was found for comparison, even in other medical specialties. Still, there is room for improvement. Moreover, differences between countries persist even after adjustment. If this reflects real differences in competence achieved during specialty training, the trans-national standard of care for patients with RMDs might be compromised.
Several educational “red flags” have been detected. There was consistently poor performance in the identification of crystals by optical microscopy: self-reported ability was low, as was the percentage of trainees receiving education and “sufficient” practical experience. Crystal identification is a mandatory technique for rheumatology trainees according to the UEMS recommendations [3] and to most national training programs. Whether the lack of competence is due to other specialists/laboratory technicians taking over the procedure is unknown; this approach, however, has low reliability [9, 10]. A second procedure, performing MSK US, also showed low self-reported ability. This might reflect the prolonged learning curve for US, as those completing training might not yet have attained sufficient proficiency. However, US is not (yet) considered a mandatory competence at the European level or in many national curricula, so the low self-reported ability could result from lack of access [11] or from only some trainees acquiring the competence. Surprisingly, ability even in what could be considered a core rheumatologic procedure, knee aspiration, was not optimal, with over 5% of respondents reporting low ability. Not being able to perform knee aspiration adequately could have a negative impact on other competences, such as managing patients with different types of arthritis or identifying crystals under the microscope. If global deficiencies are to improve, analysis of the causes by national and international organizations should be prioritized, and solutions such as further promotion of courses or educational stays at reference centers should be considered.
Self-reported ability was better for clinical competences than for generic competences. The lowest self-reported ability within the disease areas was in managing patients with CTD, especially those with systemic vasculitis. This could be due to several reasons, as these are less common diseases with potential multi-organ involvement and can therefore present in a variety of ways. Also, rheumatologists might not be the main providers of care for these patients in some countries, adding to the European diversity; this issue was not addressed in this study.
Beyond global deficiencies, country-specific strengths and weaknesses have also been detected. In order to interpret the results, some background knowledge of rheumatology training programs across Europe is needed. In all countries, a period of general internal medicine training is required, though its timing varies (within or prior to the rheumatology training program). Training length (considered as any prior internal medicine training plus the rheumatology training program) ranges from 3 to over 8 years. Given the differences in structure, this period provides a better reflection of true training time than the length of the rheumatology training program alone.
We accepted a broad definition of rheumatology for this project, but diversity in clinical rheumatology practice was documented over two decades ago [12]. This diversity affects the clinical competences and procedures that each country considers mandatory during training. The competences were chosen deliberately to cover a broad spectrum of rheumatology, but in order to maximize participation, the number of competences analyzed was necessarily limited.
Apart from the differences in self-reported ability, some countries rely more heavily on personal study (with limited formal education), while in others trainees are expected to see and manage a greater number of patients. As expected, greater experience increased the self-reported ability of trainees, as did receiving formal education. Even though didactic lectures do not modify physician behavior in continuing medical education, it must be noted that formal education can take different forms, including interactive formats, which do have behavior-modifying potential [13]. Also, postgraduate training is a critical educational period, in which vast amounts of knowledge must be acquired in a relatively short time. The effect of assessments on self-reported ability was inconsistent. However, the type of assessment was not recorded, and therefore anything from case discussions to written multiple-choice exams or objective structured clinical examinations could be considered an assessment.
Even though comprehensive discussions were held during the development and dissemination of this survey, several limitations must be noted. First, in this study we used self-reported ability, also referred to as confidence, as a surrogate marker for competence. In the absence of a pan-European examination, it is impossible to obtain a homogeneous assessment of competence, and self-reported ability could be a reasonable surrogate. However, it should be noted that this is a subjective assessment made by the young rheumatologist/trainee and is thereby influenced by many external factors, such as the referent against which it is compared (i.e., the standard the respondent considers optimal), the social context (e.g., attitudes towards medical education), and personal traits (e.g., self-esteem).
Second, we asked trainees to estimate outcomes for the end of their training and rheumatologists to recall the end of their training. This introduces several biases (such as recall bias for young rheumatologists) and imprecision. Also, the survey captured answers from around a quarter of the target population; it is plausible that trainees who answered the survey are more engaged and optimistic than the general population. The limited number of participants, particularly from some countries, poses some challenges for the interpretation of the results. Country of training consistently and significantly played a role, and the analyses are robust with respect to this conclusion. Nevertheless, because the number of participants per country was limited, and particularly low in some countries, the coefficients reflecting the magnitude of between-country differences may be unstable and must therefore be interpreted with caution.
Finally, some of the questions could have led to differences in interpretation. To address this, we piloted the survey and provided definitions and examples for all terms considered less clear. For instance, we clarified what should be considered formal education: “This does not mean that you have seen, managed or discussed patients with these complaints or that you can treat patients with these diseases. Rather we are asking if you have received formal education such as courses, lectures, etc”.
This study suggests that there are significant differences between European countries, not only in training structures and methods, but also in educational outcomes. These differences can be attributed to a number of factors; length of training does not fully explain them. Other factors not assessed in this study (e.g., type of supervision or health care organization) may confound the relationship with length of training and should be explored to provide further insight into the differences between countries and possible intervention strategies.
Homogeneous training might be considered unnecessary or even unwarranted, as national needs differ. However, harmonized training has been encouraged by all European authorities as an essential way to support doctor mobility within the European Union, improve national training, and assure trans-national standards of care for patients. Efforts towards this end have been ongoing for decades in many specialties [14–19]. In rheumatology, the Section and Board of the UEMS has produced several Charters and Educational Training Requirements, and the uptake of these, though currently limited, should be supported.
Other specialties have devised different paths to enhance harmonization, with greater or lesser success. Up to 30 UEMS specialty Boards provide a (voluntary) European assessment [20]. Even though these are not formal specialist qualifications, their quality and recognition have increased significantly and, as a result, some countries recognize European assessments as part of (or equivalent to) their national examination. Furthermore, their mere existence has led to an increase in the number of trainees and specialists taking these exams, contributing to a higher standard of knowledge and competence, which may ultimately contribute to better delivery of care. Steps towards an assessment tool that can be used across all US rheumatology training programs have also recently been taken [19]. Within the rheumatology community, further steps in this direction are needed; this can only take place with the involvement and commitment of the different countries and stakeholders.
In summary, this study shows that even though self-reported ability is generally high for most competences, significant differences in outcomes and training methods remain among countries. Given the relevance of the issue, further analyses of the extent of the differences in achieved competences are warranted. Initiatives such as the development and implementation of a consensus list of core competences or a multinational examination should be supported and encouraged. Increased knowledge about national training provides the background information necessary to plan successful harmonization attempts.