Original article
Improved education after
implementation of the Danish postgraduate
medical training reform

Troels Kodal1, Niels Kristian Kjær1 & Dorte Qvesel2

1) Postgraduate Education, Medical Faculty, University of Southern Denmark
2) Department for Postgraduate Medical Education, Southern Denmark



Introduction: A reform of educational postgraduate medical training was launched in Denmark in 2004. The reform was based on a report by the Danish Medical Specialist Commission and consisted of a number of initiatives that were all aimed at improving the quality of medical training. Since 1998, all junior doctors in Denmark have been requested to rate the quality of their training on a Danish standardized questionnaire (DSQ) comprising 24 questions. In this study, we examined how junior doctors in hospitals rated their postgraduate medical training before and six years after the reform was implemented.

Material and methods: This study is a cross-sectional register study of DSQ ratings of the postgraduate training in the region of Southern Denmark in 2002-2004 and in 2010. The ratings were extracted from the official database:

Results: For comparison, a total of 1,028 ratings from before the reform and 686 ratings from after the reform were extracted. In 2010, 70% of junior doctors filled in a DSQ. The doctors’ perceptions of the training improved from 2002-2004 to 2010 as far as educational outcome and the department’s educational effort were concerned. However, no change was evident in several questions targeting educational management.

Conclusion: Based on the junior doctors’ DSQ ratings, the quality of postgraduate training has improved in several areas from 2002-2004 to 2010. But there is still room for improvement. Developing a new, validated questionnaire should be considered in order to ensure a high credibility in future work on quality.

Funding: not relevant.

Trial registration: not relevant.

Following several years of concern about the quality of postgraduate medical training in Denmark [1], the Danish Medical Specialist Commission was established under the Health Ministry. In 2000, this commission published a report containing a number of recommendations, including changes to the contents and format of postgraduate medical training. The Danish reform of postgraduate medical training was initiated in 2004 and was based on the Commission’s recommendations.

The Commission’s recommendations comprised a number of new initiatives, among others that postgraduate training should follow a new curriculum based on the CanMEDS roles [2, 3].

New theoretical elements were introduced during the postgraduate medical training, e.g. courses in management, collaboration, skills training, supervision and research methodology.

A regional organization for postgraduate medical education was established, including a regional educational council and a region-based clinical teaching development function.

Each hospital unit appointed one consultant with educational responsibility in order to strengthen the supervision and feedback given within the department.

The Commission further recommended that the evaluation of postgraduate medical training used since 1998 should be continued [1]. Such evaluation is based on a Danish standardized questionnaire (DSQ).

The Region of Southern Denmark started using an online version of the DSQ in 2002, and this online version has gradually been implemented nationwide. Evaluations are accessible on

In addition to the Commission’s recommendations, basic postgraduate training was reduced in 2008 from 18 months to 12 months. This initiative was controversial [4].

Implementing an educational reform is a challenge that requires a significant amount of effort [3, 5]. A Danish study published 3.5 years after the reform showed only a limited impact on the clinical training practice and on educational culture [6].

This study examines how junior doctors evaluated postgraduate medical training in Denmark before and six years after the reform was launched.


This study is a cross-sectional register study of how junior doctors employed in hospitals rated postgraduate medical training in the region of Southern Denmark in 2002-2004 and in 2010. Only evaluations from the Region of Southern Denmark were used because this was the first region to start using the electronic registration form in 2002.

The source of the material used in the present study is an official Danish survey database in which evaluations from junior doctors have been collected since 1998. After each rotation, junior doctors fill in the official recommended DSQ. The questionnaire contains 24 questions, including 22 questions covering: introduction, educational programme, trainers’ qualifications, organization of work and other matters. Furthermore, two important global questions cover the trainee doctors’ perceptions of the department’s educational effort and their self-assessments of the overall learning outcome during the rotations.

For each of the 24 questions, a nine-point Likert rating scale is used (one to nine), where nine is best, except for two questions (questions 13 and 15), where 4.5 is best.

The DSQ ratings were extracted from the survey database during March 2011. Normal distribution was tested by Box-Cox regression. In cases of non-normal data distribution, differences were assessed statistically using the Kruskal-Wallis rank test. Data are given as mean values. p-values < 0.05 after Bonferroni correction were considered statistically significant.
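As an illustration of the analysis described above (this is a sketch, not the authors' code), the comparison of pre- and post-reform ratings for a single DSQ question can be expressed as a Kruskal-Wallis rank test with a Bonferroni-corrected significance threshold. The rating values below are hypothetical.

```python
# Illustrative sketch of the statistical approach described in the text:
# a Kruskal-Wallis rank test on non-normally distributed Likert ratings,
# with a Bonferroni correction for the 24 questions tested.
# All rating values are hypothetical, not taken from the study.
from scipy.stats import kruskal

# Hypothetical 1-9 Likert ratings for one DSQ question
pre_reform = [5, 6, 7, 5, 6, 4, 7, 6, 5, 6]
post_reform = [7, 8, 6, 7, 8, 7, 6, 8, 7, 7]

n_questions = 24                    # number of questions tested
alpha = 0.05 / n_questions          # Bonferroni-corrected threshold

stat, p = kruskal(pre_reform, post_reform)
print(f"H = {stat:.2f}, p = {p:.4f}, "
      f"significant after correction: {p < alpha}")
```

Comparing each p-value against 0.05/24 is equivalent to the Bonferroni correction stated in the methods; a per-question test is repeated for each of the 24 DSQ items.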

The standardized questionnaire DSQ (in Danish) and the ratings are available at

Trial registration: not relevant.


According to the survey database, 70% of junior doctors filled in a questionnaire after each hospital rotation in 2010. No data are available on the percentage of doctors who filled in the DSQ in 2002-2004.

We extracted 1,028 ratings from 2002-2004 and 686 ratings six years after the reform to compare evaluations from before and after the reform. The ratings from 2002, 2003 and 2004 showed no significant difference.

The ratings of the 24 questions were not normally distributed in the Box-Cox analysis.

As shown in Table 1, the doctors’ perceptions of the training improved from 2002-2004 to 2010 regarding the educational effort of the department (question 23) and the trainee doctors’ self-assessment of their learning outcomes during the rotations (question 24). No change was detected in most other questions, including those specific to the management of practical education. Improvements were, however, observed in questions on participation in quality assurance, administrative work, personal teaching experiences and on departmental desire to prioritize education.

Figure 1 presents the development of the ratings for questions 23 and 24 over the years. The ratings were relatively stable during this period, with small drops in 2005 and 2007. From 2008 to 2010, the ratings increased. However, non-parametric tests showed no significant change in the ratings from year to year.


The doctors’ overall evaluation of the departments’ educational effort and their self-assessed learning outcomes of the training improved from 2002-2004 to 2010. In the evaluation database, scores of seven and above are defined as acceptable. Seen in this light, these issues have achieved acceptable DSQ ratings. However, no change was evident in 18 issues specific to educational management. Consequently, there is still room for improvement. We find it especially important that efforts be made to improve the introduction (question 1), the quality of supervision (question 9) and the use of a work plan that matches the educational needs (question 14), since these ratings were below 7.0, showed no improvement during the observation period and are, in our view, essential for proper medical training.

The high utilization rate in 2010 suggests that the DSQ is widely used, and the online version seems to be well embedded.

The DSQ questionnaire has been used with the same questions since 1998 and its use ensures a large amount of comparable data. The high utilization rate of the DSQ provides credibility to the specific questions answered in the questionnaire, and these data can be used as a basis for an improvement of quality and the educational environment.

However, it is a weakness that the DSQ has never been validated. It is therefore unknown whether the DSQ ratings provide a complete and true picture of the status of postgraduate medical training.

Along with the educational reform, several changes have been introduced in the organization of health care. Thus, there are many variables that could bias the evaluations in either a positive or a negative direction. For example, a bias could lie in the reduction of basic postgraduate training from 18 months to 12 months in 2008.

The modest impact found in the present study is supported by an earlier Danish study, which found that the reform had only a limited effect on certain structural educational issues and no or little impact on daily clinical training practice and educational culture [6].

Lessons learned 11 years after the implementation of a reform in Canada show that several elements are essential for the successful implementation of large-scale reforms. These include change management, mindfulness of the educational culture and faculty development, all of which require both time and resources [3]. Our data cannot show whether the Danish reform has yet been sufficiently implemented.

Results from the questionnaires answered by trainees are currently being used in quality management at a regional level in the UK (e.g. throughout Yorkshire) [7].

A more comprehensive use of evaluation questionnaires would, however, require a properly validated survey tool. Internationally, validated questionnaires have been developed with a view to monitoring the educational environment in hospitals. The Postgraduate Hospital Educational Environment Measure is now an internationally used instrument for measuring the educational climate for doctors in training [8]. This survey instrument has been translated into Danish and validated with good internal consistency [9]. It is worth considering whether an internationally accepted questionnaire should be used or if a new, validated Danish questionnaire should be developed.

We find that despite concern over the validity of the DSQ, it is possible that a management strategy based on the DSQ may lead to improvements in some elements of the Danish educational environment specifically addressed in the questionnaire.


Despite junior doctors’ perception of some improvement in the overall quality of postgraduate training, there is still room for improvement.

In this study, we have described a publicly accessible and well-used online database with up-to-date data. This database is beneficial for educators, health authorities, politicians, etc. However, it should be considered whether a new, validated questionnaire should be developed in order to ensure greater credibility in the future work on the quality of postgraduate medical training.

Correspondence: Troels Kodal, Krebseparken 87, 6710 Esbjerg V, Denmark.

Accepted: 3 January 2012

Conflicts of interest: none

  1. The future specialist. Report by the Specialist Commission. Report No. 1384 [in Danish]. Copenhagen: Danish Ministry of Health, 2000.
  2. Frank J, Jabbour M, Tugwell P. Skills for the new millennium: report of the Societal Needs Working Group, CanMEDS 2000 Project. Annals RCPSC 1996;29:206-16.
  3. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach 2007;29:642-7.
  4. Kjær NK, Kodal T, Qvesel D. An evaluation of the 18- and 12-month basic postgraduate training programmes in Denmark. Dan Med Bull 2010;57(8):A4167.
  5. Davis MH, Amin Z, Grande JP et al. Case studies in outcome-based education. Med Teach 2007;29:717-22.
  6. Mortensen L, Malling B, Ringsted C et al. What is the impact of a national postgraduate medical specialist education reform on the daily clinical training 3.5 years after implementation? A questionnaire survey. BMC Med Educ 2010;10:46.
  7. Cooper N, Forrest K, eds. Essential guide to educational supervision in postgraduate medical education. Oxford: Wiley-Blackwell, 2009.
  8. Wall D, Clapham M, Riquelme A et al. Is PHEEM a multi-dimensional instrument? Med Teach 2009;31:e521-e527.
  9. Aspegren K, Bastholt L, Bested KM et al. Validation of the PHEEM instrument in a Danish hospital setting. Med Teach 2007;29:504-6.
