Main

Oral surgery is arguably the most invasive of all the dental disciplines and is associated with considerable stress for both the inexperienced undergraduate and the patient. Competence in this discipline involves assessment of the students' knowledge, practical skill and attitude.1 Assessment of knowledge is usually achieved with summative techniques such as written examinations and viva voces. Assessment of practical ability, however, is subjective, dependent upon student and staff experience, and requires close monitoring by motivated and vigilant staff. In addition, practical examinations, rather than informal assessments, provide a different perspective on student abilities than do daily clinical grades.2 Recently there has been a trend towards more objective, competency-based assessment.3

Many dental institutions use objective structured clinical examinations (OSCEs) in their assessment. This form of assessment has been shown to be a valid and reliable method of assessing some clinical procedures, but it has its limitations.4 Even continuous assessment can fail to identify those students who are underperforming, allowing them to continue without developing a reasonable level of competence or self-confidence. Ideally a formative assessment that increases self-awareness and encourages self-evaluation and learning would be more beneficial and would highlight those students requiring closer supervision.5 One such assessment is the structured clinical operative test (SCOT), which uses a checklist for the assessment of a clinical task.5 The advantage of this form of assessment is that it involves authentic clinical procedures on real patients, commonplace in the dental undergraduate curriculum, thereby encouraging learning 'in context'. The student's performance is signed off as either complete or incomplete, rather than pass or fail. An incomplete performance is followed by counselling from the supervising staff, and the student may attempt the SCOT on as many further occasions as required to complete the assessment. Similar checklist-based clinical assessments are in place in other institutions, eg the checklist assessment of operative skills (CAOS). The use of such assessment methods has already been investigated in dentistry.6,7,8 Such a formative assessment recognises the great deal of variation that exists between clinical scenarios, so the criteria need to be broad enough to encompass a number of clinical variables. This is particularly true of exodontia, as it is not possible to standardise the extraction owing to variables such as patient anxiety, presence of pathology, periodontal disease, condition and position of the tooth, and root morphology, to name but a few.
Although the SCOTs are being used as a formative assessment tool, they have been devised so that after validation it would be possible to use them in summative assessment.8 However, a successful SCOT needs a checklist with clearly defined criteria that are reproducible.7 Global rating scales have been used for the assessment of the removal of third molars but as yet there are no accounts in the literature of such an assessment for simple exodontia.3,9 It is thus timely to appraise approaches to clinical assessment in dentistry.10

The aim of this study was to determine whether SCOTs could be used as a formative assessment tool in the oral surgery undergraduate course. In addition we were particularly interested to see how the SCOT was received by both students and staff.

Materials and Methods

The teachers in oral surgery devised a SCOT for the task of simple exodontia that covered each component of a complete patient episode, including cross-infection control (Table 1). The checklist covered a number of 'micro-skills', some of which required specialist knowledge, as well as communication skills. The guidelines provided to the examiners were therefore that the student had to explain the procedure effectively to the patient, confirm the tooth for extraction and provide the patient with post-operative instructions, as well as, in the examiner's opinion, establish a rapport with the patient. This checklist was validated on a small group of final year students and senior house officers by two experienced members of staff. The staff involved in the validation had designed the checklist, were therefore very clear as to the remit of the examiners, and accordingly produced a high level of agreement when validating it. Because of the constraints of the timetable, another three members of staff were recruited as examiners and underwent a brief period of training in the use of the checklist. The checklist was made available to all fourth year students in advance of the assessment. Students could discuss any further guidance or clarification about the criteria with the assessors before commencing the SCOT during a routine oral surgery clinic.

Table 1 The SCOT checklist for simple exodontia

The SCOT was student-led in that each student indicated to the staff when they felt prepared to undertake the assessment. The examiner remained in the surgery with the student throughout the assessment, and immediately afterwards the student was provided with feedback on their performance as well as an opportunity for self-reflection. The SCOT could be attempted as many times as required to secure success. To motivate the students, they were advised that the SCOT would run for only one term, and that successful completion of the SCOT was a pre-requisite for presentation for the penultimate professional examination. In this respect there was a summative element to this assessment, which was used to augment continuous assessment rather than being taken in isolation. On completion of the SCOT the students were asked to complete a questionnaire outlining their impression of the SCOT (Table 2). Feedback from the staff involved was collected by semi-structured interview.

Table 2 Student feedback questionnaire

Results

Student performance

All of the fourth year students (49) took part in the SCOT with 57% of them achieving success on the first attempt and 86% after two attempts. The remaining seven students required between three and five attempts.

The commonest cause for failure was inadequate cross-infection control, accounting for 35% of failures. Thereafter students failed the SCOT due to inappropriate patient/operator positioning, choice of anaesthetic or anaesthetic technique or poor patient management/communication skills (Fig. 1).

Figure 1: Reasons for failure at the first attempt of the SCOT.

Twenty-one of the 49 students were unsuccessful on the first attempt at the SCOT. The reasons for this, expressed as percentages, were: poor cross-infection control; inappropriate/incorrect local anaesthetic (LA) technique; patient/operator positioning; incorrect/inappropriate instrumentation; and poor patient management.

Student feedback

The anonymous questionnaire distributed to the students had an 82% response rate; the results are shown in Figure 2. Fifty-three per cent thought that the SCOT was a fair assessment, although only 40% felt that they benefited from the exercise with improved confidence. Thirty per cent thought that they changed their normal practice in order to perform for the examiners. Although 50% thought that the SCOT interfered with the running of the clinics, 80% thought that the assessment should be repeated the following year, and 23% would like to repeat the SCOT in the final year. The greatest criticism was of too much variation between examiners (100% of students), including a lack of feedback from the examiners (27% of respondents) and the examiner leaving them during the assessment (50%). Indeed, one of the three examiners had twice the failure rate of the other two. Forty-eight per cent of students claimed not to have had access to the marking schedule prior to the assessment, despite it being readily available. Another point raised by the students was that in many cases the assessment turned into a viva voce rather than an observed test, which tended to disrupt their performance. Some students would have preferred either one examiner for the whole year, or two examiners for each SCOT. Other suggestions were that the SCOT should be used on a continuous basis, allowing a number of attempts to compensate for the variability of the clinical cases and to counteract a poor performance on an 'off' day, especially as some students were assessed on periodontally involved single-rooted teeth while others had to remove teeth with bulbous, complicated multiple roots. A further problem was that some SCOTs could not be completed because of failure to achieve adequate anaesthesia, whether due to acute infection or patient anxiety.

Figure 2: The results of the SCOT feedback questionnaire.

The students who took part in the SCOT were asked to complete the questionnaire shown in Table 2. Forty students completed the feedback questionnaire, answering yes or no to 10 questions; each column indicates their replies expressed as a percentage.

Staff feedback

The staff found the SCOT very labour intensive, causing disruption of the clinics and difficulties in supervising other students; for this reason they could not remain with the student in every case, but returned to observe the extraction. They also commented that it was difficult to remain objective, especially with students who had attempted the assessment several times. They felt that the checklist was still too subjective and covered too many components. The staff found it difficult merely to observe when it became obvious that a student was struggling due to incorrect choice of forceps, inappropriate operator or patient positioning, or poor technique, and had to resist the temptation to step in. In some cases they asked questions to prompt the student into realising that they had not taken the medical history into account when choosing the local anaesthetic, or had forgotten to give a long buccal block for a lower molar extraction. This was done in good faith, to provide the student with an opportunity to redeem themselves. Often with the weaker students the staff had to discontinue the SCOT and adopt a pragmatic approach to teaching. The examiners had to ensure that patient safety and quality of care were not compromised, and it was therefore entirely appropriate for the staff to intervene in these cases.

On the whole the staff were of the opinion that the SCOT had potential, as it did highlight those students who were struggling in comparison with their peers. A major problem was that staff became fatigued performing these assessments each day.

Discussion

Assessment of clinical skills is essential for patient safety as well as for providing feedback and motivation for continued learning,10 especially in a profession that may be perceived by the general public as a practical rather than an academic speciality compared with medicine. However, the commonest assessment methods (observation and judgement, and the use of fixed schedules of clinical requirements) have been perceived by teachers in restorative dentistry as not particularly valuable.10

There are few opportunities for the same member of academic staff to observe every student completing a manual task, although in Dundee this is accomplished during an intensive surgical dentistry course run at the end of the first clinical year. This course has served to identify those students who are struggling with the more practical component of the oral surgery course, allowing them to be targeted for more intensive instruction and supervision. We encourage self-reflection with this course and use it as an opportunity for student feedback. However, in addition to assessment of surgical skills, the students would also like the same kind of feedback on simple exodontia to reassure them that they are progressing. Some students lack confidence and this may be compounded by variation in the level of support and verbal encouragement given by some staff. These students would benefit from a standard to measure themselves against. That is why we feel that it is essential to provide some form of assessment that is mutually beneficial such as the SCOT.

Problems with SCOT

As this was the first time the SCOT had been introduced to the oral surgery curriculum, we encountered a number of problems, such as examiner variability and the inevitable disruption of clinics. The fact that the most common cause of failure was inadequate cross-infection control highlights a potential area for further attention. Some might argue that cross-infection control could be examined as a separate 'micro-skill', but it is an essential aspect of clinical practice and we therefore felt that it had to be included. Feedback suggested that the students resented failure due to poor cross-infection control when the actual provision of treatment was satisfactory. Studies of dental student attitudes towards infection control have shown discrepancies between what students believe to be appropriate and what they actually do.11 Even in this SCOT, 30% of students said that they modified their normal practice to satisfy the examiner. Awareness of cross-infection control needs to be raised and monitored, although the clinical significance of washing one's hands for 10 seconds as opposed to one minute before donning gloves is perhaps debatable.

One way around this problem may be to allocate a scoring system to each component of the SCOT, weighted in favour of the more practical components of the procedure, or to divide the procedure into pre-, intra- and post-extraction phases. A pass mark could then be set on a scale in which a minimum number of marks represents a pass, rather than the all-or-nothing approach that we adopted.

The apparent inter-examiner variability may be accounted for by differences in the clinical cases and student abilities, or due to the criteria for the checklist not being objective enough despite validation by two other examiners. However, the constraints of the timetable were such that we could not specifically assess the inter-examiner variability by having the examiners assess all the students on the same patient episode. The training of examiners should be reassessed regularly, perhaps every term. Despite a period of training for the examiners involved, there appeared to be differences in their interpretation of the checklist with some examiners seizing this as an opportunity for a viva rather than merely observing clinical performance. Manogue et al.10 also found that a lack of consistency or objectivity of assessment was the commonest perceived departmental problem with practical assessment in restorative dentistry. To address this issue we intend to modify the checklist by paying careful attention to the wording of the criteria to ensure that they are unambiguous.7 It may be necessary to stipulate that only multi-rooted teeth with minimum crown breakdown be used for the assessment to try to improve standardisation. Thereafter we intend to reassess its validity using a video of a number of clinical scenarios to investigate inter-examiner variability in scoring. An alternative that we are investigating is the use of a manikin that would allow us to standardise the SCOT more effectively. In this instance we would look at a small number of micro-skills such as operator/patient positioning and choice and use of forceps. The disadvantage of this system is that it is not a real chair-side procedure but may be useful for less experienced students to introduce them to the concept of the SCOT.

The main problem with the SCOT is that it is very labour intensive and disruptive, which is a common problem of clinical assessment techniques.10 With our current staff levels we found that the exercise produced more stress for our staff and students and, inevitably, patients. The assessment should be more integrated into the course so that the students can make each patient episode a SCOT if they so wish. This would reinforce good clinical practice and build self-confidence as well as encouraging self-assessment. The problem of staffing could be eased by investigating the use of final year dental students as examiners, as has been done in teaching and examining medical undergraduates.12 To this end we intend to investigate the use not only of students as examiners but also of junior members of staff and dental nurses. If this proves possible then the SCOTs could be run throughout the academic year, allowing the students to have repeated attempts at this assessment. Each episode should allow them to reflect on their own development, providing them with a standard with which to measure themselves throughout the course.7 This should help to identify weaker students earlier, allowing them to be targeted for closer supervision. Ultimately it is hoped that the SCOT will provide a more objective assessment of the students' ability, allowing the students to develop greater confidence in what is perceived to be an invasive and stressful aspect of the undergraduate curriculum.

Conclusions

Our experience has highlighted some problems with the SCOT, the most notable being the students' perception of inter-examiner variability. The objective assessment of clinical skills is crucial to the production of graduates of a uniformly high clinical standard. This formative assessment may also be used as a summative assessment when combined with limited experience (eg a low number of extractions); in this situation, failure in the SCOT could prevent the student from attempting the professional examinations. Greater familiarity with the format of the assessment should make this a fairer, less stressful experience for the student. Ultimately the assessment is designed to benefit the student rather than being viewed as yet another examination. However, greater care in defining the role of the examiner is required. This method of assessment could be adopted by other dental institutions, allowing standardisation of practical assessment throughout the UK.