
Scope of the OSCE in the assessment of clinical skills in dentistry

Abstract

Introduction The objective structured clinical examination (OSCE) is now an accepted tool in the assessment of clinical skills in dentistry. There are, however, no strict guidelines limiting the types of scenario used in OSCE examinations, and experience and experimentation will inevitably result in the refinement of the OSCE as a tool for assessment.

Aim The aim of this study was to compare and contrast different types of clinical operative skills scenarios in multi-station OSCE examinations.

Methodology Student feedback was obtained immediately after the sitting of an OSCE examination on two different occasions (with two different cohorts of students). The same questionnaire was used to elicit the responses on both occasions.

Results The questionnaire feedback was analysed qualitatively with particular regard to student perception of the usefulness and validity of the two different kinds of OSCE scenarios.

Conclusions OSCE scenarios which involve phantom heads are perceived to lack clinical authenticity, and are inappropriate for the assessment of certain clinical operative skills. While the OSCE is useful in the examination of diagnostic, interpretation and treatment planning skills, it has apparent limitations in the examination of invasive operative procedures.

Main

The dental schools in the UK are currently considering their curricula in the light of the recent General Dental Council (GDC) publication entitled, 'The First Five Years – The Undergraduate Dental Curriculum',1 which was released in March 1997. This document states that it 'must be possible to demonstrate the presence of essential elements' (of the undergraduate curriculum) and so implies the need to define the essential knowledge, skills and attitudes to be achieved by the end of the undergraduate dental course.

Further initiatives by the General Dental Council with regards to continuing professional development, reaccreditation and recertification reflect the increasing importance of the attainment and maintenance of high clinical standards in professional dental practice. It is incumbent on dental practitioners and specialists in dentistry to attain the necessary competencies and by continuing education to keep up-to-date with the latest developments. This is important for the provision of best treatment and for maintaining high standards of care for patients.

It is equally important that appropriate methods of assessment of clinical competencies are developed so that it is possible to detect a fall in standards below an acceptable level. This paper examines some of the more recent methods being developed for the assessment of a range of clinical and operative skills in undergraduate dentistry, but which may also be applicable for postgraduate / GPT assessment.

Objective Structured Clinical Examination (OSCE)

Over the years, the traditional clinical examination in dentistry (as in medicine) has been the 'long case': a patient presents with a relevant clinical problem or condition, and the student is instructed to make a diagnosis and treatment plan under examination conditions and to present the findings, usually verbally, to an examiner.

The two major drawbacks of this system are that (a) different students are given different patients with different presenting problems, and (b) examiner subjectivity results in inter-examiner variation in the assessment of the same performance. It was primarily for these reasons that Harden et al2 introduced an alternative approach, arguing that there was a need to remove patient and examiner variation.

They devised the Objective Structured Clinical Examination (OSCE) primarily for use in undergraduate medical assessment. This examination was structured in that the questions had a well-defined marking system with predetermined answers and pass/fail criteria. It was also structured in that it comprised a series of consecutive timed stations, and was clinical in that these stations comprised scenarios to test specific clinical skills including diagnosis, interpretation and treatment planning.3

The gradual evolution and development of this methodology has led to its widespread use as an assessment tool in undergraduate and postgraduate medical and dental assessment.4,5,6,7 It has also been used for the assessment of certain clinical skills, of communication skills on simulated patients and even of clinical decision-making skills.8,9,10,11,12

One of the main strengths of the OSCE examination is its inherent objectivity whereby the aim is to remove patient and examiner variation so that the only variable being examined is the ability of the candidate. Other advantages of the OSCE system include the flexibility and versatility made possible by the multiple station design. This means that it is possible to examine a range of skills and disciplines and even to incorporate more than one skill or discipline simultaneously in the design of a particular station.

Examples of generic skills applicable to a range of disciplines include communication skills, dental charting, aspects of history taking, impression taking and cross-infection control. The application of the Objective Structured Clinical Examination to the assessment of undergraduate dental skills has recently been described by Mossey et al13 and Davenport et al.14

Student perception of the Objective Structured Clinical Examination (OSCE)

While the OSCE has been used in undergraduate medical assessment for almost 20 years, it is only in the past four to five years that it has become an assessment tool in undergraduate dental education. Its interpretation and application in the assessment of dental competencies vary quite widely, and in the early days of its introduction to dentistry it was perceived that the OSCE would be used for clinical operative skills testing. There is, however, a fundamental difference between medicine and dentistry in the clinical skills required of the graduate. It is sufficient for the medical graduate to have attained an adequate level of competence in examination, diagnosis and interpretation in the clinical situation, but the dental graduate must in addition be a competent surgeon.

Aim of the study

The primary aim of the study was to critically analyse the scope of the OSCE in dentistry and to answer the following questions: is it possible to use the OSCE for the assessment of all clinical skills including operative procedures? If so, how would these assessments be designed, and if not, what alternative methods of assessment would be used?

In the experience of the authors, various OSCE scenarios had been tried and tested in the preceding years, and in considering the above questions it was agreed that a retrospective examination of the student feedback from different diets of the OSCE examination could be carried out. Two OSCE circuits with perceived philosophical differences in design were chosen for comparison: one (OSCE circuit A) with mainly 'operative' scenarios and the other (OSCE circuit B) with 'diagnostic' scenarios.

Methodology

Student feedback on two Objective Structured Clinical Examinations (OSCEs), obtained by means of identical questionnaires, was compared. These were run as class examinations at different times with different cohorts of 4th year students: May 1994 (circuit A) and December 1996 (circuit B). For each circuit, feedback questionnaires (Table 1) were administered immediately after the examination, resulting in a 100% response rate: 56 students for circuit A (29 female, 27 male) and 45 students for circuit B (25 female, 20 male).

Table 1

There was therefore a similar male to female ratio in both groups, and there was no reason to suspect any differences in the academic profile (background training or knowledge) of the two cohorts. In addition, there was no difference in the students' familiarity with the OSCE, as the two cohorts in 1994 and 1996 were both being introduced to clinical OSCEs for the first time when the assessment was done. The questions were worded identically in the feedback questionnaires for both circuits, and the four-point response scale was as follows: Strongly Agree (SA), Agree (A), Disagree (D), Strongly Disagree (SD).

In addition to circling what they felt was the appropriate response to each question, the students were encouraged to record comments or observations.

One circuit, described as circuit A in Table 1, was mainly phantom-head orientated, with an emphasis on the assessment of clinical operative skills, whereas circuit B deliberately avoided the use of phantom heads and concentrated more on the assessment of diagnostic and interpretation skills in clinical dental scenarios.

Other differences were that circuit A consisted of ten six-minute stations, four of which were phantom head stations, and the students were advised that the assessment of 'operative' skills was among the aims and objectives of the examination. Circuit B contained ten 10-minute stations consisting of clinical dental scenarios without phantom heads, and the students were advised that certain clinical skills were being assessed, but there was no mention of operative skills being an objective of this OSCE examination.

The objective of the study was to investigate whether OSCEs with or without phantom head scenarios were equally acceptable to students, and whether they were perceived to be equally acceptable in the assessment of both diagnostic and operative clinical skills in dentistry.

Results

Table 2 outlines the results of the two questionnaires in response to questions 1–6. These reveal a strong consensus that the aims and objectives were understood for both circuits and that a wide range of skills and dental disciplines was tested. Circuit A was, however, considered by the students to be much less useful, being perceived to be a relatively poor test of clinical diagnostic skills (only 41% agreed) and not a good indicator of which areas needed to be improved or revised.

Table 2

In addition, it was interesting to note that in circuit A, where the students were advised to expect that their clinical operative skills would be tested, 66% of them still disagreed (D and SD) that this had actually occurred. In circuit B, where there was no attempt to test clinical operative skills and no scenario was set up with that in mind, 80% disagreed, as expected. However, 89% agreed that circuit B had tested their clinical diagnostic skills, compared with 41% in circuit A. The students therefore generally disagreed that their operative skills were being validly tested in either of the two OSCE circuits.

Some qualitative feedback comments recorded on the questionnaires would lend support to this impression. Examples of comments from the circuit A questionnaires included 'phantom head stations too far removed from clinical reality', 'found it difficult to take phantom head stations seriously,' and 'are we expected to wash our hands and wear gloves when dealing with phantom head procedures?'

Some of the comments in circuit B however also indicate that the students recognized the lack of clinical skills testing. Samples of comments included 'this does not really test manual dexterity at all,' 'should be more effort to assess clinical operative skills', 'better to examine communication skills on the clinics.'

The conclusion from this qualitative analysis lends support to the notion that clinical operative skills in dentistry cannot be validly assessed using unrealistic phantom head scenarios. Other student feedback comments which reflect perceived problems in the testing of operative clinical skills using OSCEs are that 'exam pressure definitely affects performance for what are normally simple clinical tasks – your mind can go blank' and 'the best place to examine clinical skills is on the clinic'.

One comment from circuit B however encompasses the positive side of OSCEs and their perceived usefulness for assessment of diagnostic and treatment planning skills: 'An excellent idea to enhance and expand clinical knowledge. It would be in our best interests as future clinicians to get as much experience in doing these as possible so that we can deliver concise, accurate, well presented care for our patients.'

Discussion

It is important that methods are developed in undergraduate dentistry for the assessment of operative clinical skills, and the OSCE has certain advantages, being standardised and reasonably objective. Attempts to set up clinical scenarios using 'phantom heads' and acrylic models in an effort to simulate the chairside clinical situation have been monitored with the help of staff and student feedback. From a staff debriefing meeting held after the examination, and from the feedback elicited by the student questionnaires, the following main drawbacks of these scenarios were identified:

  1. Lack of clinical authenticity. The scenarios were unrealistic compared with authentic clinical situations in the completion of routine clinical tasks such as soft tissue manipulation (e.g. cheek and tongue retraction), moisture control, and the management of bleeding or crevicular fluid problems. They were also unrealistic with regard to the assessment of clinical effectiveness, such as the administration of LA (e.g. was the LA administered effectively and did it achieve adequate anaesthesia?) and cross-infection procedures.

  2. Lack of communication skills testing. There was no opportunity for testing communication skills – not only the routine communication skills of 'interviewing', such as opening and closing, but also those associated with clinical tasks: prior explanation of a clinical procedure or treatment options, establishment of rapport with a patient, motivating patients, and the use of appropriate language when dealing with children, teenagers and adults before and during clinical procedures.

  3. Lack of patient management / behavioural scenarios. There are certain 'routine' patient management problems in dentistry, such as dealing with apprehension, restlessness and anxiety, or managing a gagging reflex during the taking of an impression or the making of a complete denture. It is not possible to set up OSCEs for the observation of such clinical scenarios.

This highlights the shortcomings of phantom head scenarios, which are inadequate not only from the operative viewpoint but also in their lack of authenticity from the interpersonal skills, behaviour management and contingency management viewpoints. Simulated patients may help to offset some of the aforementioned communication problems, provided the scenarios involve 'non-invasive' procedures.

There is convincing evidence justifying the place of OSCEs in the armamentarium for assessment of clinical skills, and in medicine it is reported that OSCEs have a beneficial effect on student learning by encouraging an orientation towards clinical aspects.

In curricula where OSCEs are part of the summative assessment, students spend more time learning in the clinic as opposed to the library, they concentrate more on clinical skills, and a greater degree of motivation for clinical work is reported.15 This is testament to the formative element of this type of examination.

Conclusion

While there is no doubt that OSCEs are a valuable and versatile method of assessment in clinical disciplines, it is apparent that they are best suited to the assessment of diagnostic, interpretation and treatment planning scenarios and have limitations in the assessment of clinical operative procedures. Furthermore, students are sensitive to these limitations.

In dentistry, therefore, there is a need to develop and evaluate objective methods for assessment of invasive clinical operative procedures. A method of assessment designed to address this need is being developed and a detailed analysis will be presented in a sister publication.

References

  1. General Dental Council. The First Five Years – The Undergraduate Dental Curriculum. March 1997.

  2. Harden R M, Stevenson M, Downie W W, Wilson G M. Assessment of clinical competencies using objective structured examination. Br Med J 1975; 1: 447–451.

  3. Cuschieri A, Gleeson F A, Harden R M, Wood R A. A new approach to a final examination in surgery: use of the objective structured clinical examination. Ann R Coll Surg Engl 1979; 61: 400–405.

  4. Thomson D M. The objective structured clinical examination for general practice: design, validity and reliability. J R Coll Gen Pract 1987; 37: 149–153.

  5. Walker R, Walker B. Use of the OSCE for assessment of vocational trainees for general practice. J R Coll Gen Pract 1987; 37: 123–124.

  6. Jewell D. Learning through examinations: use of an objective structured clinical examination as a teaching method in general practice. J R Coll Gen Pract 1988; 38: 506–508.

  7. Sloan D A, Donnelly M B, Johnson S B, Schwartz R W, Strodel W E. Use of an Objective Structured Clinical Examination (OSCE) to measure improvement in clinical competence during the surgical internship. Surgery 1993; 114: 343–350.

  8. Gerritsma J G, Smal J A. An interactive patient simulation for the study of medical decision-making. Med Educ 1988; 22: 118–123.

  9. Gordon J, Sanson-Fisher R, Saunders N A. Identification of simulated patients by interns in a casualty setting. Med Educ 1988; 22: 533–538.

  10. McAvoy B. Teaching clinical skills to medical students: the use of simulated patients and videotaping in general practice. Med Educ 1988; 22: 193–199.

  11. Davies M. The way ahead: teaching with simulated patients. Med Teach 1989; 11: 315–320.

  12. Clinical decision making – an art or a science? Part III: To treat or not to treat? Br Dent J 1995; 178: 153–155.

  13. Mossey P A. Clinical skills assessment in dentistry. In: Godfrey, Heylings (eds) Guide to Assessment of Students' Progress and Achievements. 1997: 78–81.

  14. Davenport E S, Davis E C, Cushing A M, Holsgrove G J. An innovation in the assessment of future dentists. Br Dent J 1998; 184: 192–194.

  15. Feickert J A, Harris I B, Anderson B C, Bland C J, Allen S, Poland G A, Satran L, Miller W J. Senior medical students as simulated patients in an objective structured clinical examination: motivation and benefits. Med Teach 1992; 14: 167–177.


Mossey, P., Newton, J. & Stirrups, D. Scope of the OSCE in the assessment of clinical skills in dentistry. Br Dent J 190, 323–326 (2001). https://doi.org/10.1038/sj.bdj.4800961
