Introduction

Medical and dental training has undergone significant change since the publication of Modernising medical careers,1 and within dentistry the Department of Health's Creating the future2 firmly embedded foundation training within the career development of young dentists. A formal UK-wide curriculum for foundation training was developed, and a number of principles were integrated to underpin the delivery and assessment of the postgraduate training, reflecting the required learning outcomes.

Since 1992 foundation training (previously known as vocational training) has been mandatory for all UK graduates and is fully funded by the Department of Health. In 2011/12 the net total funding allocated for grants, salaries and services delivered by foundation dentists (DF1s) was approximately £94 million. In addition, £5.883 million was allocated for delivery of the educational programme by way of study days and for the salaries of foundation training advisers.3 Great importance is placed on the role of foundation training in the development of young practitioners, so there is a need for a clear, consistent UK-wide standard of training supported by strong management, underpinned by skilled trainers and effectively quality assured. All trainees should receive a similar quality of training experience and be able to demonstrate similar learning outcomes, as shown by the assessment process.

A recent study4 undertaken with trainers and trainees in Merseyside, UK, investigated the perceived effectiveness of workplace-based assessments (WPBAs) used in the Committee of Postgraduate Dental Deans and Directors' (COPDEND) foundation training portfolio. Both trainers and trainees highlighted the value of WPBAs in providing feedback and insight into the developmental needs of young practitioners. The study also reported positive feedback on the WPBA tools, which trainers felt were easy to use and provided a clear and comprehensive record of progress through the training year.

Anecdotally, it has been reported that the use of the COPDEND foundation training portfolio to record training and to assess progress and competence during training has not been universally accepted or comprehensively used throughout England and Wales.

The aim of this study was to evaluate the effectiveness of WPBAs in the UK-wide national scheme and to explore potential areas for development, ensuring standardised delivery of the foundation training curriculum and assessment of progress and competence.

Methods

Sample

The sample consisted of all current trainers (741) and foundation trainees (643) from the 12 deaneries in England and Northern Ireland in June 2010.

Ethics

Ethical approval was granted by the University Research Ethics Committee.

Data collection

An anonymous questionnaire was designed to evaluate the WPBAs in foundation training. The questionnaire was informed by the pilot study,4 which focused on WPBAs in one regional deanery in the North West of England. The questionnaire from the pilot study was modified, piloted by all members of the research team and 20 randomly selected trainers, and subsequently refined for the present study.

Two questionnaires were designed, one for trainees and one for trainers, containing 15 and 17 questions respectively. Each was divided into four sections: the first investigated demographic information, and subsequent sections covered the dental evaluation of performance assessment (D-EPs), case-based discussion (CBD) and patient assessment questionnaires (PAQs), focusing on the use of feedback and the extent to which trainees and trainers felt that WPBAs had improved patient care and clinical practice.

The surveys were distributed individually by e-mail in May 2011 (month ten of the training year) to all current trainers and foundation trainees in each deanery in England and Northern Ireland. Each deanery nominated a designated administrative contact who managed the distribution of the e-mails. Administrators were asked to keep a log of any e-mails that could not be delivered and to inform the research team of the total number of questionnaires successfully sent out. This number was used to calculate the total response rate.

The e-mail sent by administrators contained a link to the online questionnaire via SurveyMonkey™ and an information sheet that described the study in detail. Each trainee/trainer was given a randomly allocated study number that respondents inserted at the beginning of the survey. This allowed non-responding trainers and trainees to be followed up at two and four weeks after initial distribution; the survey was closed in August 2011. A non-clinical research assistant had sole access to the study numbers and results.5 Only the deanery administrators had access to the study numbers and associated e-mail addresses; at no time could an e-mail address be linked to a response.

Analysis

All quantitative data were entered into the PASW 18 statistics data editor (SPSS). From this database, frequencies were used to examine the distribution of all variables and describe the sample demographics, as well as to identify differences in responses across the two questionnaires. The qualitative free-text comments were analysed using a thematic analysis approach incorporating organisation, familiarisation, reduction and analysis.6,7 The qualitative data were analysed independently by members of the research team to ensure concordance of themes, thereby enhancing the internal validity of the findings. QSR NVivo 9 was used to assist this process.
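The study itself used PASW/SPSS and NVivo; purely as an illustration of the kind of frequency analysis described above, a minimal sketch in Python with pandas might look like the following. The data file and column names are hypothetical placeholders, not part of the study:

```python
# Minimal sketch of the frequency analysis described above.
# The file name and column names are hypothetical placeholders.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")

# Frequency distribution of a single variable, e.g. respondent role
print(responses["role"].value_counts())
print(responses["role"].value_counts(normalize=True) * 100)  # as percentages

# Cross-tabulate agreement with a statement by role to compare
# responses across the two questionnaires (trainer vs trainee)
print(pd.crosstab(responses["role"], responses["cbd_improved_care"],
                  normalize="index") * 100)
```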

Results

The survey was sent electronically to 643 foundation trainees and 741 trainers. Three hundred and fifty-nine (55.8%) and 559 (75.4%) responses were received respectively from the 12 deaneries (see Fig. 1 for a breakdown of the responses received from each deanery).
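As a cross-check of the arithmetic, the response rates reported above follow directly from the counts; a minimal sketch of the calculation:

```python
# Response rates derived from the counts reported above
sent = {"trainees": 643, "trainers": 741}
received = {"trainees": 359, "trainers": 559}

for group in sent:
    rate = received[group] / sent[group] * 100
    print(f"{group}: {received[group]}/{sent[group]} = {rate:.1f}%")
# trainees: 359/643 = 55.8%
# trainers: 559/741 = 75.4%
```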

Figure 1

The sample percentage representation from each deanery (trainees and trainers)

The majority of trainer respondents were male (75.4%, 416). There was a wide spread of ages among trainer respondents, ranging from under 25 to over 60 years, with 54.6% (303) aged between 36 and 50 years. The trainee respondents had a more even gender split, with 42.3% (152) male. The majority of trainee respondents, 91.6% (328), were aged between under 25 and 30 years.

Figures 2a and b demonstrate that all the WPBAs were considered by the majority of trainees and trainers to be 'useful' in training. Overall, CBDs appear to be seen by both trainers and trainees as the most useful of the WPBA tools.

Figure 2a

Percentage of trainers and trainees who found the WPBAs useful

Figure 2b

The percentage of agreement by trainees and trainers with individual statements about the WPBAs

Training of trainers

Surprisingly, over a quarter of the trainers (25.7%, 144) who responded to the survey stated that they had received no formal training in the use of WPBAs.

The dental evaluation of performance assessment (D-EPs)

As shown in Figures 2a and 2b, trainees' and trainers' feelings about D-EPs were mainly very positive. Both trainers (65%, 499) and trainees (84.4%, 282) felt that D-EPs feedback had enabled trainees to improve their patient care.

Seventy percent (503) of trainers and 85.5% (288) of trainees felt that feedback gained from D-EPs had given the trainees insight into their own developmental needs. Trainees (83.4%, 281) and trainers (66.9%, 525) felt that D-EPs feedback highlighted things that the trainees did well and improved their confidence (trainers 63.1%, 462; trainees 77.2%, 260). D-EPs feedback was felt to enable trainees to be reflective in their clinical practice by 63.8% (449) of trainers and 71.8% (239) of trainees. Indeed, 57.7% (442) of trainers and 70.1% (235) of trainees felt that D-EPs was an appropriate form of assessment, and 68.2% (229) of trainees felt that the grades that they had been awarded for D-EPs were an accurate reflection of their abilities.

The verbatim quotations presented here are exemplars of the identified themes from across all the questionnaires and are identified by questionnaire number and labelled as trainer or trainee.

Overall a very positive response was received from both trainees and trainers about the D-EPs process, highlighting the positive outcomes from the assessments. However, some trainers and trainees felt that the process of being observed was very stressful for some trainees, which may have impacted negatively on their performance.

Some of the trainees and trainers felt that the number of D-EPs performed during training should be altered, believing the process to be highly valuable at the beginning of the year but rather onerous by the end.

A number of trainees and trainers commented that the D-EPs process would benefit from being standardised to make it fairer, as demonstrated by this comment from a trainee:

'D-EPs should be standardised throughout the scheme in all training practices, it is unfair that some [foundation dentists] have intense supervision and scrutiny during D-EPs whereas other (trainees) do not have their trainer present for the procedure'. (Trainee 218)

Worryingly, a number of comments stated that the assessments were completed by staff without a dental qualification, as highlighted by this trainee:

'As this was often completed by other members of the dental team who do not have a BDS qualification rather than my trainer, I believe it is not a true reflection'. (Trainee 302)

The trainers also acknowledged the difficulty of standardising the process:

'There is an unavoidable issue of calibration between trainers. Some trainers marking more severely than others.' (Trainer 491)

Trainers felt that specific, tailored training in the process was required and commented that they would like further training in the use of D-EPs:

'Need formal training in D-EPs so assessments are consistent.' (Trainer 523)

'A useful assessment but teaching of trainers in their use is poor.' (Trainer 504)

It was also noted that some trainees and trainers would like to see the scoring system revised:

'The grading system of performance is not varied enough, need to have more options. Currently the grading options available are limited.' (Trainee 234)

'Although the criteria for scoring D-EPs is unclear I think it is a necessary assessment. Maybe a rubric for scoring of 1-5 would be a welcome addition.' (Trainer 349)

Case-based discussion (CBD)

As shown in Figures 2a and b, trainers and trainees found CBD the most beneficial of the WPBAs. Both trainers (84.4%, 505) and trainees (92.4%, 296) felt that CBD feedback had enabled trainees to improve their patient care.

Feedback gained from CBD was felt to have given trainees insight into their own developmental needs by 85.5% (512) of trainers and 93.1% (303) of trainees. Trainees (97%, 299) and trainers (83.4%, 512) felt that CBD feedback highlighted things that the trainees did well and improved their confidence (trainers 77.2%, 466; trainees 86.5%, 276). CBD feedback was felt to enable trainees to be reflective in their clinical practice by 71.8% (456) of trainers and 83.8% (258) of trainees. Indeed, 70.1% (462) of trainers and 81.7% (255) of trainees felt that CBD was an appropriate form of assessment, and 76.6% (259) of trainees felt that the grades that they had been awarded for CBD were an accurate reflection of their abilities.

There were numerous positive comments made by both trainees and trainers about the experience of CBD in training. Many commented that the CBD process encouraged discussion and reflection in a very relaxed setting, which some felt provided the best learning environment. However, both trainees and trainers felt that some improvements could be made to the process:

'Rather than focusing on cases that only the foundation dentist has seen it would be better if the remit of the possible cases that were open to discussion could include cases from the trainer and other dentists as well'. (Trainer 34)

Patient assessment questionnaires (PAQs)

As shown in Figures 2a and b, the trainee and trainer responses to the statements about the impact of PAQs on the training experience were overwhelmingly positive. Trainees (62.7%, 215) and trainers (61.1%, 332) felt that PAQs had encouraged trainee reflection in clinical practice. Trainees (75.8%, 278) and trainers (82.1%, 411) believed that PAQs had increased trainees' confidence. PAQs were also felt to have enabled trainees to make improvements to their patient care (trainees 79.4%, 431; trainers 76.7%, 417), given trainees insight into their own developmental needs (trainees 78.5%, 282; trainers 73.9%, 401), highlighted things that trainees did well (trainees 86.7%, 470; trainers 86.7%, 470), highlighted things for them to develop through their tutorials (trainees 68.2%, 245; trainers 68.6%, 372) and actually changed their clinical practice (trainees 78.4%, 283; trainers 67.3%, 365).

Fifty-nine percent (203) of trainees and 55.2% (300) of trainers believed that PAQs are an appropriate form of assessment. Trainees (68.9%, 237) also believed that the feedback given through PAQs was an accurate reflection of their abilities.

PAQs were thought by many trainers and trainees to be an essential part of training. Both trainees and trainers reported that the majority of the PAQs returned were positive, which boosted the confidence of the trainees but did not help in identifying areas for improvement. However, both trainees and trainers raised some concerns over the accuracy of the data collected through the PAQs. It was felt that some patients spent very little time on the questionnaire, and that patients may not feel able to be completely honest in their responses, calling into question the validity of the feedback from this tool.

'I always find it difficult to obtain an honest opinion from a patient, they tend to be guarded when unhappy with an aspect of care and may well complete the PAQ with answers they think you want to hear rather than how they actually feel.' (Trainer 303)

'I feel patients are reluctant to be honest on PAQs as they feel they are not anonymous. If they were posted out to the patients at home this may help.' (Trainee 251)

As illustrated in the quotations above, trainers and trainees both expressed concern about the way in which the PAQs were actually given to patients. It was highlighted that this affects whether patients feel the questionnaires are truly anonymous and could therefore influence the answers given.

'PAQ are a useful method of getting feedback from patients... I feel that it would work better if the responsibility of handing them out was placed upon the reception staff as they would be less likely to be biased as to who they were handed out to, whereas there may be more bias and, therefore, less accuracy, if handed out by the DF1.' (Trainee 250)

There were also comments indicating that it may be necessary to review the format of the instructions for the PAQs given on the electronic personal development plan (ePDP), to ensure clarity. It may also be necessary to clearly define how, when and to whom the PAQs should be given.

Trainers and trainees both felt that the format of the PAQ itself needed revising, suggesting that some of the questions were inappropriate and that there should be a space for patients to leave 'free text' comments, which would give more useful feedback.

It was also suggested that more specific training should be provided to show trainees how to use the information gained from PAQs:

'I was very disappointed by the way this patient questionnaire exercise was conducted. Instructions for the task were posted on the ePDP but were not clear. When I sought further information from my trainer he was unable to help me because he had not received any information about it.' (Trainee 244)

Discussion

This national study follows on from a pilot study carried out in one of the deaneries in the North West of England.4 The response rate was acceptable for an Internet-based questionnaire, which was chosen for its ease of administration, low cost, and the accuracy and completeness of the information collected.4

This study confirms the findings of the pilot4 in that WPBAs were acceptable and effective in providing feedback and insight into young practitioners' developmental needs. However, it also raised a number of issues with regard to their effectiveness and delivery as part of a national training scheme.

At a cost of approximately £100,000 for each foundation dentist to complete DF1, there should be a high-quality, standardised approach to the delivery of the training, in which WPBAs play a major role in directing learning and the development of young practitioners. It was therefore disappointing that over 25% of trainers reported that they had received no formal training from their deaneries in the use of WPBAs, and that non-clinicians (practice managers) were in some cases carrying out clinical assessments.

Although the overall consensus from both trainees and trainers was very positive regarding the use of D-EPs, some negative comments and concerns were raised. These related mainly to lack of standardisation of procedures assessed and the necessity for training and calibration of assessors.

In dental foundation training the current scale for D-EPs utilises construct alignment to developmental level, recorded on an ordinal categorical six-point scale with six anchors ranging from 'below expectation for DF1' to 'above expectation'. Assessor training and calibration 'group work' has not been found to be particularly effective in medical training,8,9,10 as assessors may agree on performance but interpret scales differently, or may disagree about response scales even when they agree about what they have observed.8,9,10 Making judgements without clearly defined criteria makes national calibration of trainers difficult.

A move to a tool that is well aligned to the expertise and priorities of clinical assessors is a key factor in reducing assessor variance and improving discrimination.8 In surgical training (UKISCP), procedure-based assessments (PBAs) are now used. These still provide a tool for feedback and reflective learning, but use a construct alignment to capability for independent clinical practice, recorded on a categorical four-point scale whose four anchors reflect competence and supervision requirements during training. PBAs incorporated into dental foundation training would ensure that the assessments were carried out by a trained clinician; reduce the variability of grading, as the descriptors are far more specific; facilitate designation of the procedures to be assessed, allowing a national standard set of PBAs to be completed by all trainees; and possibly reduce the overall number of assessments required.
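To make the contrast between the two scale constructs concrete, the sketch below represents each scale as a simple mapping. Only the D-EPs endpoint anchors are quoted in the text, so the intermediate D-EPs labels and all of the PBA anchor wording are illustrative assumptions, not the actual descriptors:

```python
# Illustrative contrast between the two scale constructs discussed above.
# Anchor wording other than the two quoted D-EPs endpoints is assumed.

# D-EPs: six-point ordinal scale aligned to developmental level
DEPS_SCALE = {
    1: "below expectation for DF1",  # endpoint quoted in the text
    # 2-5: intermediate developmental anchors (wording not given in the text)
    6: "above expectation",          # endpoint quoted in the text
}

# PBA: four-point scale aligned to capability for independent practice,
# with anchors reflecting competence and supervision requirements
PBA_SCALE = {
    1: "unable to perform the procedure under supervision (assumed)",
    2: "able to perform with direct supervision (assumed)",
    3: "able to perform with minimal supervision (assumed)",
    4: "competent to perform independently (assumed)",
}
```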

Case-based discussions were felt by both trainers and trainees to be the most beneficial of the WPBAs in providing feedback on the trainees' developmental needs and improving patient care. It was disappointing to note that these assessments were occasionally carried out by someone other than the trainer, for example a practice manager. This is unacceptable and confirms the finding within the study that training of trainers and trainees in the proper use of WPBAs is lacking.

In this study the patient assessment questionnaire (PAQ) was regarded as having a positive impact on the training experience. The use of a WPBA that is reliable, has an educational impact, is valid and feasible is important when making assessments of performance in the workplace. One of the difficulties with patient satisfaction questionnaires is ensuring that the questions are appropriate, unambiguous and designed for the purpose intended.

Within medical practice, the consultation and relational empathy (CARE) measure has been developed and validated in primary care and has been shown to discriminate reliably between doctors.11 In the current study, trainers and trainees felt that the design of the PAQ used in dental foundation training did not necessarily deliver valid feedback, containing inappropriate questions and giving patients little opportunity to comment rather than simply grade aspects of care such as the trainee's empathy.

It was also evident that the method of disseminating the questionnaire to patients, and of collecting and feeding back the resulting data, varied between the deaneries. This again highlighted the need to train both trainers and trainees in the use of patient assessment questionnaires.

A recent multicentre study demonstrated that multisource feedback (MSF) from professional colleagues, combined with patient feedback from consultations, is more likely to provide a reliable and feasible opinion of clinical performance.12 In addition to the formative role of providing feedback on progress and performance, it is suggested that these two tools are reliable enough to inform a high-stakes judgement on the outcome of training; the introduction of MSF should therefore be considered in the future.

Conclusions

The results from this study confirm that the experience of WPBAs is positive and that they have a role in trainees' learning during foundation training. In order to provide a nationally consistent approach to the delivery of foundation training and its learning outcomes, changes are required to the WPBA tools used. The importance of comprehensive training in their use, for both trainers and trainees, has also been highlighted.