
Stephen Hancocks' thoughtful editorial 'Does D put the dilemma in CPD?'1 raises some serious points on the 'verifiability' of some CPD, which deserve wider discussion.

One of the many valid points in the editorial questioned just how dentists actually prove that they have learnt anything from reading a particular paper in a clinical or scientific journal. Realistically, the same question could be asked about dentists, or any other dental professionals, attending a postgraduate lecture, course or meeting, a seminar, a webinar, or indeed any hands-on course. More importantly, how can they then demonstrate conclusively to a cynical public or regulator that what they are now putting into practice is what they 'learnt' from that particular article, lecture, seminar, webinar or hands-on course?

I am writing as someone who, as a postgraduate dental tutor for more than 25 years, was responsible for running hundreds of postgraduate courses at King's College Hospital and later also at Guy's Hospital. These involved lectures, seminars, workshops and hands-on courses. While happy to help by running these courses, and to participate in many of them there and elsewhere, I have often been concerned as to whether anything mentioned or emphasised in them has actually made any substantial difference in achieving better outcomes for patients through the application of any knowledge or skills gained from this postgraduate activity.

In writing that, I am very well aware that many committed, conscientious dental professionals attending meetings, seminars and webinars, or reading various papers, already have plenty of sensible ideas and sound dental knowledge, experience and honed clinical skills. It is very difficult, therefore, to be specific about the details: for example, whether someone learnt a particular skill at a particular time, or improved their knowledge or philosophy of treatment after reading a particular paper or doing a particular course. The knowledge may well have been 'latent' within them already, but somebody may have said, or written, something that made that 'new' information seem much more relevant and appropriate to apply for the benefit of their particular patients.

The elephant in the room, however, is that while people may have the requisite, or apparently up-to-date, knowledge and skills, if they are put back into a corrupting system which does not reward them for doing the 'right' thing, then often the wrong thing (that perversely does get rewarded) gets done instead. For instance, some proven and audited minimally destructive approaches to tooth wear or improving tooth colour bring minimal financial rewards under the current NHS. You can talk until the cows come home, or show people how to do things practically, such as endodontics, but unless there is a reasonable reward for applying those skills and that treatment philosophy, then nothing really beneficial will happen to change things for the better and thereby improve the long-term outcomes for patients.

My view, for what it is worth, is that some proof should be required from any attendee at a course, or person participating in any online educational experience, to demonstrate that they understood more than, for example, 60% (that is, better than an even chance) of the relevant points from a particular paper, lecture or seminar, and that they are therefore likely to have a decent chance of being able to apply them to help their patients.

One possible idea could be based on the topical Olympic model: CPD participants could be given a bronze certificate for 60% correct answers, silver for 80% and gold for 100%, and this could be verified as an OUTCOME measure. That could be good fun and would reward the conscientious.

As an example, Dental Protection Ltd employs a very good model at their annual Premier Symposium. They do not give a verifiable CPD certificate for this until the 'signed in course attendee' has submitted the answers to the relevant multiple choice questions. The answers are graded and the attendee is told how well they have done with appropriate feedback.

In a cynical UK society, which increasingly questions the integrity and motivations of almost all professions, it is important to be able to demonstrate that the outcome of any CPD activity (that is, the application of any knowledge gained) is safer or better treatment for patients.

It seems bizarre that in some situations hours of verifiable CPD can still be awarded even when the majority of questions in a computer-based CPD programme are answered wrongly (and with all the information straight in front of one). These rewards for failure can be gained with each issue of many journals. That seems to me not just wrong but senseless, and it should not be allowed to continue. It would do nothing to reassure a cynical public if they really knew that this sort of verifiable CPD tokenism was rife. The BDJ and other journals granting such certificates need to be very aware of the potentially very real 'brand damage' if this practice is allowed to continue.

Various correspondents have questioned the daftness of the current position on several occasions, and in different arenas, over the years, making valid points which deserve to be widely debated and many of which should be adopted. That laudable approach would, one hopes, lead to a more satisfactory outcome for all concerned, because dentists would then be better able to demonstrate that they had at least comprehended the messages before, hopefully, applying the sound dental principles learnt or reinforced through whichever type of reliable and appropriate CPD activity the individual chooses.

In that way, a cynical public might be more reassured of the general honesty, integrity and trustworthiness of the dental profession at large in relation to CPD.