Introduction

Recognising that NHS Quality Improvement Scotland was about to publish draft standards for primary dental care,1 NHS Education for Scotland (NES) and Greater Glasgow NHS (GGNHS) jointly funded a pilot Quality Practice Initiative (QPI) with the stated aim: 'To provide a structured approach to improving performance while minimising the risk to patients, practitioners, staff, and the organisation through underperformance'.

The Action Plan for Improving Oral Health and Modernising NHS Dental Services in Scotland2 gave a commitment to supporting quality. This was further evidenced in the text of the Policy Memorandum on the Smoking, Health and Social Care (Scotland) Bill,3 which received the Royal Assent on 5 August 2005.4 The Memorandum advised that the policy intention underlying the new Regulations was to allow Health Boards to give financial help for GDS providers to support 'for example, staff, premises, infrastructure and quality'. It is likely, therefore, that general dental practitioners will be expected to demonstrate quality improvements and a practice's ability to achieve national standards of care if they are to access such financial support.

Although a number of organisations have developed mechanisms to address quality in dental practice either pro-actively5,6,7 or reactively,8 there is a lack of scientific evidence to recommend any particular approach to quality management in healthcare.9,10,11,12

The QPI was regarded as a means of identifying and quantifying the support required by practitioners in achieving the national standards while improving the quality of patient care within the NHS.

Specific objectives included:

  • Development of a local 'Quality Practice Award' with a number of levels

  • Recruitment of a group of 16 dental practices at the First Level of the Award over a period of three years

  • Provision of meaningful and tangible incentives to participating practices

  • Assessment of the quantity and types of support needed to meet the required standards.

Assessment criteria

Criteria for achievement of the 'Level One Award' were agreed by the Dental Director and Dental Practice Advisers as follows:

  • Practice records, comprising: patient records, certification, and audit and peer review

  • Communications

  • Health and safety

  • Infection control

  • Employment policies and procedures

  • Radiology.

The general principle behind the Level 1 checklist was that it should indicate what was needed to meet, and begin to exceed, the legal minimum requirements for quality assurance. Standards were comparable in degree to, but not the same as, those required for Vocational Training accreditation.

Methods

Sample

All 200 practice teams in Greater Glasgow were invited to attend a meeting in February 2003 to raise awareness of, and to recruit volunteers to, the pilot. The event was attended by 95 delegates from 53 practices, comprising principals, practice managers, receptionists, dental nurses and hygienists. This was followed by a mailshot to all practices enclosing a self-assessment proforma based on the assessment criteria (Appendix 1).

Table 5 Appendix 1: QUALITY IN DENTAL PRACTICE INITIATIVE - PRACTICE SELF ASSESSMENT

From the 30 practices that completed the self-assessment and expressed an interest in QPI, 16 practice teams were selected to receive direct support from the Clinical Governance Advisers (CGAs) in working towards the Level 1 QPI Award. These practices represented a reasonable cross-section of practice 'types', and comprised single-handed practices, practices with up to four dentists, practices with/without practice managers, and one Vocational Training practice. The practices also represented a wide geographical spread across Greater Glasgow. All were made aware that participation was entirely voluntary. The only pre-determined criterion for inclusion was that, according to their self-assessments, none met the standards required for the Level 1 Award.

Interventions

Practice meetings and Clinical Governance Adviser support

Two general dental practitioners were recruited as Clinical Governance Advisers (CGAs), each working two sessions per week. The mainstay of the support given to practices was intended to be through mentoring visits and facilitation of practice meetings supplemented by training targeted towards any knowledge gaps identified in the process.

This generally involved advising on, and facilitating reflection upon, aspects of practice that needed to change in order to comply with the Level 1 checklist, but was not confined to this. Advice was also given on a wide range of issues important to the practices, though not strictly related to the requirements of the QPI, such as:

  • Appropriate emergency drugs

  • Design of new surgeries

  • Filing systems and storage of records

  • Employment law

  • CPD requirements

  • Procurement of surgery equipment (especially in relation to infection control and radiation protection)

  • Conducting effective meetings

  • Dealing with failed appointments.

The exact methods employed and specific inputs varied according to the observed and expressed needs of the individual practices. As the Initiative progressed a 'method' for facilitating progress emerged, as shown in Table 1.

Table 1 Facilitation process

Provision of templates for documentation

Where the required documentation was absent (eg risk assessments), a computer disk containing examples of relevant items from other practices was forwarded for discussion and adaptation to individual requirements. Teams were also actively encouraged to make full use of the Primary Care Division's Dental practitioner's manual which contained guidance and templates on all of the assessment criteria.

Design of 'whole group' support interventions

In areas where most of the participants in the pilot indicated that they felt a clear training need, efforts were made to provide a formal course accessible to the whole team in all of the practices.

Practice-based training

For the first time in Scotland, CPD allowance approval was obtained for a workshop held within individual practices, facilitated by the Clinical Governance Advisers and involving the practice team. This was the preferred format for the cross-infection control training provided, as it allowed a tailored, problem-solving approach by the whole team working in collaboration, specific to their particular working environment and style of practice. At the time of writing the course has been taken to seven practices.

Participants' forum

A participants' email group and 'Bulletin Board' was instituted, to be used as a 'virtual forum' for teams to share ideas and facilitate each other's progress towards achieving the award.

Development of the assessment process

The two CGAs and the Primary Care Division's two Dental Practice Advisers (DPAs) established a procedure and tolerance level for assessing each item on the checklist during a practice's assessment visit.

Wherever possible, hard evidence that a criterion had been met was requested. For example, either all dentists had documented their CPD or they had not, and documentation had to be seen for the assessment to be successful.

In cases where this type of rigid assessment was impossible, other strategies were agreed, for example:

  • Tolerance levels were agreed in situations where 100% compliance with the checklist would have been unreasonable to expect (for example in certain aspects of record keeping)

  • Where complicated processes were involved, a member of staff was asked to demonstrate their normal practice. For example, to assess cross-infection control, a tray of instruments was set out in the surgery, including a disposable impression tray, a matrix band, and a bristle brush, and a dental nurse was asked to demonstrate 'what you would normally do' in dealing with the instruments before the next patient entered the surgery. The processes were assessed by the DPA and CGA, who took notes where required and discussed after the visit whether there were any significant deviations from established good practice

  • A combination of methods was used where both direct inspection with a tolerance level and a description of the process were required, eg assessment of procedures for updating medical history forms. The practice receptionist would be asked to describe the process involved, and 10 medical history forms for patients currently under treatment would be examined. A maximum of two forms not signed and dated within the last year was tolerated.
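The combined tolerance rule described above can be expressed as a simple decision procedure. The following is a minimal illustrative sketch only; the function name and data representation are hypothetical and form no part of the QPI checklist itself, which was applied by the assessors on paper.

```python
from datetime import date, timedelta

# Hypothetical sketch of the medical-history assessment rule:
# examine 10 forms, tolerate at most 2 that are not signed and
# dated within the last year.
SAMPLE_SIZE = 10   # forms examined per assessment visit
TOLERANCE = 2      # maximum forms allowed outside the one-year window

def passes_tolerance(form_dates, today):
    """form_dates: list of dates a form was last signed/dated,
    with None representing an unsigned or undated form."""
    assert len(form_dates) == SAMPLE_SIZE, "assessment examines 10 forms"
    cutoff = today - timedelta(days=365)
    stale = sum(1 for d in form_dates if d is None or d < cutoff)
    return stale <= TOLERANCE

# Example: 8 current forms, 1 undated, 1 out of date -> within tolerance
dates = [date(2005, 1, 1)] * 8 + [None, date(2002, 1, 1)]
print(passes_tolerance(dates, today=date(2005, 6, 1)))  # True
```

The point of the sketch is simply that each criterion was reduced to an explicit, reproducible pass/fail decision, rather than left to an assessor's overall impression.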

To secure as much objectivity as possible, the final assessment of a practice's achievement of Level 1 was conducted by a Dental Practice Adviser unconnected with the support and facilitation process.

Data collection and analysis

A variety of data were collected to include both measurable, objective criteria and subjective impressions or observational data. Data were also collected under the headings shown in Table 2 to assist with strategy, monitoring and evaluation. For the purposes of this paper, however, only a selection of results from the Level 1 checklist are presented in detail.

Table 2 Data collected

Results

Quantitative data

Thirty practices out of 53 returned completed Level 1 proformas.

Table 3 shows selected results from the 16 pilot practices at the February 2003 baseline. One result from each of the three sub-sections of the practice records component of the checklist is presented, together with one result from each of the other five checklist headings.

Table 3 Baseline data for pilot practices (n = 16)

Figures for the 'Intervention' group were derived from the 13 practices still participating in the pilot at 18 months. In these cases, the checklists were completed by the CGAs based on direct observations.

The data in Table 4 allow comparison between the 'Intervention' and 'Non-Intervention' practices in their progress towards achieving the Level 1 criteria. The data for the non-intervention practices are shown in italics.

Table 4 Comparison between intervention and non-intervention practices

Whole group support interventions

The two most commonly expressed needs for further training were in communication skills and in cross-infection control. Consequently, CPD allowance-approved courses were arranged in both of these areas.

Summary of other relevant findings

  • The most significant barriers to progress, as expressed by participating dentists on postal questionnaires, were the costs of making improvements/lack of financial incentive, and turnover of support staff

  • At the time of writing, six practices have passed assessment for the Level 1 award

  • Analysis of postal questionnaires also revealed that practice teams felt that they were receiving the correct amount of support from the CGAs

  • Very few messages were posted on the web-forum; informal enquiry indicated that this was largely due to the number of participants without internet access.

Discussion

Data collection and analysis

The original purpose of data collection in the QPI was to assist project management, not to produce statistically valid results; hence, no potentially misleading statistical tests have been carried out. The aim of presenting the results is to share information with others involved in promoting quality improvements in dental practice.

Baseline data for pilot practices

It is reasonable to assume that the pilot practices were broadly representative of Greater Glasgow practices in general. The only qualification would be that those attending the launch would perhaps be more likely to be closer to meeting the standards required than those who did not. It could be expected therefore, that the overall picture in Greater Glasgow would show more deficiencies than the pilot group suggests.

In general, even within the pilot group, clinical governance systems were initially very weak. Many very basic areas of quality assurance and risk management were seriously deficient. The importance of the possible sequelae of these deficiencies needs no discussion, except to state that the status quo was unacceptable.

Based on the observations made by the CGAs when visiting practices, it became clear that the practice self-assessments lacked objectivity. There are a number of reasons why this could be the case, including:

  • The specific points under examination being misunderstood (eg widespread confusion over the meaning of the term 'PAT testing')

  • A tendency towards leniency in self-assessment generally,13 and within the public sector in particular14

  • Basic untruths revealed by CGAs' visits to the practices. Probably the most common of these would be where a practice claimed that they routinely disposed of plastic impression trays, but direct observation in that practice revealed drawers full of used trays 'ready' for re-use

  • A discrepancy between what the dentist completing the self-assessment thought happened in the practice and the actual practices of the team members. This was seen frequently in a variety of areas, and specifically, in several practices in the monitoring of emergency drug supplies. It was relatively common for a practice owner to be able to say who was responsible for checking that emergency drugs were in date, and when the checks should be carried out, but for the CGAs to find that most of the emergency drugs at those practices had passed their expiry date

  • A misunderstanding of the detail of a procedure or process being examined. For example, almost none of the practices who said in the 'Record Keeping' section that they updated all medical histories annually, had any written or electronic evidence of this having been done.

The significance of these discrepancies is two-fold.

Firstly, because all assessments carried out after the initial self-assessment were done by, or under the guidance of, the CGAs, it follows that the apparent relative improvements for the 'Intervention Group' presented in Table 4 will tend to be minimised.

Furthermore, it is worth noting that a number of national bodies employ self-assessment as the main form of evaluation for various forms of accreditation or award, although it has been demonstrated in medicine and dentistry that self-assessment of neither knowledge,15,16,17 nor clinical skills,18,19 relates closely to the results of objective assessments. Similarly, participants in studies of the reliability of self-assessment do not demonstrate an increase in validity with experience of, or training in, the process of self-assessment.20,21

It has also been reported in a study of applicants for employment in a public sector organisation, that there are gender related differences in the 'leniency' or 'halo' effects in responses to self assessment questionnaires,22 and that the differences are related to the style of questionnaire used.

Another study examining self-assessments of house officers, concluded that:

'[self-evaluation instruments] should not be used to judge the “accuracy” of the individual's evaluation' but, that they are: 'best used to help individuals analyse their work practices and to promote reflection on performance.'23

This mirrors the use of self-assessments in the QPI in that the checklists were employed mainly to promote reflective self-evaluation and to facilitate the production of action plans.

Comparison between pilot practices and non-intervention practices

Despite the previously discussed tendency for the methods used to minimise the apparent effect of improvements made following intervention by the CGAs, marked changes in practice were observed across the range of criteria included in the checklist.

The data that demonstrate improvements in cross-infection control have been selected for presentation here as this has been especially topical in Scotland since publication of the Glennie Group's report24 highlighted deficiencies in local decontamination practices. This report emphasised the need for improved training in decontamination for the dental team, and it is the authors' opinion that QPI's successes in this regard are due to:

  • Involvement of the whole practice team in the training

  • Taking the training to individual practice locations

  • Adopting a non-judgemental attitude and problem-solving format.

A point particularly highlighted by the examination of cross-infection control procedures was the importance of observing processes within the practice. The DPAs were surprised on more than one occasion to witness the procedures being employed in practices where a satisfactory conventional practice inspection visit had been undertaken.

The heading on oral cancer screening in Table 4 was included to illustrate the way in which the introduction of a single process in a practice can improve patient care and record-keeping, and potentially avoid medico-legal problems.

Communication

The increase in the percentage of QPI practices holding eight or more minuted staff meetings per year was presented in Table 4 as the CGAs believe it reflected the overall trend of improvements in the QPI practices. There was a strong subjective impression throughout the Initiative that the practices which had the most effective meetings, where the whole team was involved in planning quality improvements, were those who made most practical progress towards achieving the award. Conversely, practices appearing reluctant to engage in open and honest communication showed less progress.

The role of the Clinical Governance Adviser

The Clinical Governance Advisers adopted a flexible response to the needs of individual practices and provided intensive support where required. A 'soft' communications style rather than a rigid approach enabled them to achieve the rapport and trust necessary to influence change.

Conclusions and recommendations

Clinical Governance systems were found to be poor in a broadly representative sample of practices in Glasgow. In the absence of data for the rest of Scotland, this needs to be addressed urgently on a national basis.

It is the authors' view that the direct support provided to the QPI practices produced meaningful improvements in quality assurance. It is suggested therefore that QPI should inform the development of a national system of support for dental practices employing formally trained 'coaches' or 'facilitators'.

The improvements seen were made without any financial incentives being available to the practices concerned, and the majority of the changes made involved direct costs. As most practices cited cost as a barrier to progress, it seems reasonable to conclude that achievement of the required standards should be linked, by any of a variety of possible mechanisms, to the payment of a 'Quality Award'.

Self-assessment checklists should be used to promote reflective learning. They should not be relied upon as an indicator of standards.

Assessment of process is of primary importance in determining quality in healthcare9 and must be included in any future system of practice inspection and accreditation. This would be done most efficiently through a unified system in which a practice is assessed by one person. Currently, Dental Reference Officers assess outcomes, while DPAs and Practice Inspectors largely assess structural elements; process is missed. A unified system would clearly be more efficient and more effective.