The value of situational judgement tests for assessing non-academic attributes in dental selection

Key Points

  • Suggests the content and context of situational judgement tests (SJTs) must be updated regularly to ensure their fairness, relevance and realism for current cohorts of applicants.

  • Discusses how SJTs are efficient and cost-effective for high-volume selection contexts, in comparison to other popular methods such as face-to-face interviews.

  • Suggests that practice tests should be available for high-stakes SJTs.


Situational judgement tests (SJTs) have been shown to be reliable and valid tools for assessing non-academic attributes across numerous healthcare professions. However, within the context of selection into dental foundation training (DFT) in the UK, the introduction of an SJT is relatively new. This expert opinion highlights four key considerations regarding the DFT SJT in order to inform further debate amongst researchers and stakeholders. We clarify that SJTs measure non-academic attributes important for success in dental training, and that their context and content must be updated regularly to ensure their relevance, realism and fairness to current applicants. We outline that SJTs are efficient and cost-effective for high-volume selection in the long term, in comparison to face-to-face interviews. Finally, we summarise the value of practice material being available for high-stakes SJTs, such as the DFT SJT. Implications for practice are discussed throughout.


Research has consistently shown situational judgement tests (SJTs) to be a reliable and valid selection method in the healthcare professions for identifying a range of professional, non-academic attributes (such as professional integrity, teamworking and resilience).1,2 Given this evidence, SJTs have received a great deal of attention both in the academic literature and in stakeholder commentary. While there is a long history of research evidence on SJTs in other settings,3,4,5 it is acknowledged that, as a relatively new method within dental selection, stakeholder reactions may not be wholly positive. As such, we highlight four key considerations regarding the use of the dental foundation training (DFT) SJT, to inform further debate.

1. SJTs measure non-academic attributes important for success in dental training

It has been widely recognised in research and practice internationally that there is a critical need for robust measures of non-academic attributes at the point of selection into training for healthcare roles. For example, recent UK government enquiries6,7 highlight concerns regarding the decline in compassionate care within healthcare, emphasising the critical role of the workforce in ensuring the provision of high quality and safe healthcare services. Undoubtedly, an important first step is ensuring that individuals appointed to any training place have the 'appropriate attributes to work in healthcare',8 including ethical and moral judgement. Thus, the value of selection methods which assess these types of attributes cannot be overstated, especially at a point in the education pathway where the emphasis is on aptitude for entering clinical practice, such as DFT.

SJTs are designed to measure important inter- and intra-personal attributes for a given role, and do not require clinical knowledge to score highly.3 Historically, researchers in this field used the term 'non-cognitive' to describe these professional, non-clinical attributes.9 More recently, however, the research literature has termed these 'non-academic' attributes, and this is what the DFT SJT is measuring. In relation to this, Affleck et al.10 state that 'to regard [the SJT] simply as a test of putatively non-cognitive professional attributes is perplexing', referring to Patterson and colleagues' article in this Journal11 which provided proof of concept for the DFT SJT. Although Affleck and colleagues appropriately raise the point that the term 'non-cognitive' is problematic, as it may imply that no thought is required to perform well on the SJT, this is arguably a matter of semantics. Certainly they put forward an important consideration, which is also supported in the wider research literature: SJTs assess individuals' judgement regarding situations which occur in the workplace, which clearly requires some element of cognition (that is, situational judgement), but does not require clinical knowledge. That these judgements require some awareness of what it means to be an ethical dental practitioner, as Affleck et al. also suggest, is implicit in the nature of SJT design and content.

2. The content and context of SJTs must be updated regularly

Some stakeholders have voiced concerns that the consensus about the most appropriate thing to do in challenging work-based situations may change over time, meaning that response keys for some scenarios may become outdated during the lifecycle of an SJT.10 Certainly this is a valid point, and any method used in a high-stakes setting, such as selection into DFT, should be regularly reviewed and its content updated accordingly. Practically, an important design consideration for SJTs is to ensure that the context, content and difficulty of the test are commensurate with candidates' level of training.3 For these reasons, and in line with best practice,12 the content and relevance of existing DFT SJT items are regularly reviewed by both SJT item-writing experts and expert panel members, including training programme directors, educational supervisors and postgraduate dental deans. These experts provide input from the perspective of an employer of a foundation dentist who is being employed to work and train in a general dental practice. Any items that are no longer appropriate are retired or significantly updated. Similarly, new content is created and piloted each year within the SJT, meaning that the content of the test is refreshed annually to reflect the latest thinking in practice. Again, this process involves creating new SJT items in close consultation with expert panel members, which undergo extensive and rigorous design and review processes before being piloted. Only items which perform well psychometrically during piloting are entered into the pool of operational test items. (See Patterson et al.3 for a detailed outline of the development stages for SJT content.)

3. SJTs are efficient and cost-effective

A number of authors in recent publications have called for face-to-face interviews in place of SJTs in selection systems for postgraduate healthcare training positions.10,13 We agree that interviews (either structured or multiple-mini interviews) are an important part of the selection process for both the candidate and the employer to assess person-organisation fit,14 and to allow candidates to provide rationales for their given responses. However, while it is important to retain face-to-face methodologies in a selection process, using interviews in lieu of SJTs would present a number of challenges in terms of both practicality and validity. Taking DFT as an example, it would be impractical to attempt to assess all of the attributes that the SJT measures (teamworking, resilience and coping with pressure, professional integrity, and empathy and communication) in a single interview station at the selection centre, and attempts to do so are likely to have low reliability and validity.15 It is worth noting that the DFT selection centre includes two interview stations (in addition to the SJT) which provide the opportunity for probing questions and for candidates to provide rationales for their reasoning. Clinical skills are a prerequisite for successful completion of undergraduate dental degrees, hence these are not assessed at the point of selection into DFT.

Moreover, the benefits of SJTs as a complementary method to face-to-face interviews are well documented.3 SJTs are a standardised way of assessing large numbers of applicants, such as for DFT which has c. 1,400 applicants annually, as they are machine-marked, and are a significantly more practicable, and less resource-intensive, solution than adding another face-to-face interview to the selection centre. The latter would be very costly to a health service already severely constrained by available resources. As such, recent suggestions that the use of SJTs in dental selection may represent an 'overreliance' on the methodology10 are somewhat misguided.

4. Practice tests should be available for SJTs

A common concern for many applicants is whether access to coaching will increase their scores on an SJT. However, the emerging literature does not support coaching as an effective method for improving performance on SJTs.16,17 In practice, however, for recruitment and selection purposes it is important that as much information as possible, including appropriate practice material, is available to all candidates. This links to procedural and distributive justice theory, which relates to candidates' perceived fairness of the selection process itself and of its outcome, respectively.18,19 Positive candidate reactions to selection methods are important for a number of reasons, including the fact that candidates who perceive processes to be unfair may legally challenge an organisation,20 and unsuccessful candidates with negative perceptions may criticise the process and be deterred from applying again.21

Affleck and colleagues10 have called for a practice test for the DFT SJT, including a rationale for the appropriateness of different response options. We concur with this suggestion, because – as stated by Affleck et al. – although practising the SJT content itself should not improve candidates' scores, it is beneficial for candidates to be able to familiarise themselves with the format of the test to enable them to focus on the content of their answers when they sit the operational SJT. This is an avenue of development that is currently being explored in the DFT context.


In summary, SJTs represent a relatively new selection methodology in the postgraduate dental setting, and it is therefore both understandable and expected that stakeholder reactions will be mixed. Importantly, there is a wealth of evidence demonstrating the reliability, validity and robustness of SJTs in similar high-stakes, high-volume settings. We have aimed to demonstrate that the concerns raised by researchers and stakeholders in the context of the DFT SJT are already addressed (or shortly due to be addressed) in the design, delivery and ongoing improvement of the SJT. One possible area for development is the creation of a practice test for the DFT SJT.


  1. Patterson F, Knight A, Dowell J, Nicholson S, Cousans F, Cleland J. How effective are selection methods in medical education and training? Evidence from a systematic review. Med Educ 2016; 50: 36–60.

  2. Cabrera M A M, Nguyen N T. Situational judgment tests: a review of practice and constructs assessed. Int J Sel Assess 2001; 9: 103–113.

  3. Patterson F, Zibarras L, Ashworth V. Situational judgement tests in medical education and training: research, theory and practice: AMEE Guide No. 100. Med Teach 2016; 38: 3–17.

  4. Lievens F, Peeters H, Schollaert E. Situational judgment tests: a review of recent research. Pers Rev 2008; 37: 426–441.

  5. Legree P J, Kilcullen R, Psotka J, Putka D, Ginter R N. Scoring situational judgement tests using profile similarity metrics. Technical Report 1272. United States Army Research Institute for the Behavioral and Social Sciences, 2010.

  6. Francis R. Report of the Mid Staffordshire NHS Foundation Trust Public Inquiry: Executive Summary. Mid Staffordshire NHS Foundation Trust Public Inquiry, 2013. DOI: 10.1002/yd.20044.

  7. Cavendish C. The Cavendish Review: An independent review into healthcare assistants and support workers in the NHS and social care settings. 2013. Available online at (accessed May 2016).

  8. Patterson F, Prescott-Clements L, Zibarras L, Edwards H, Kerrin M, Cousans F. Recruiting for values in healthcare: a preliminary review of the evidence. Adv Health Sci Educ 2015; DOI: 10.1007/s10459-014-9579-4.

  9. Sommerfeld A. Recasting non-cognitive factors in college readiness as what they truly are: non-academic factors. J College Admission 2011; 213: 18–22.

  10. Affleck P, Bowman M, Wardman M, Sinclair S, Adams R. Can we improve on situational judgement tests? Br Dent J 2016; 220: 9–10.

  11. Patterson F, Ashworth V, Mehra S, Falcon H. Could situational judgement tests be used for selection into dental foundation training? Br Dent J 2012; 213: 23–26.

  12. Lievens F, Sackett P R. Situational judgment tests in high-stakes settings: issues and strategies with generating alternate forms. J Appl Psychol 2007; 92: 1043–1055.

  13. Najim M, Rabee R, Sherwani Y et al. The situational judgement test: a student's worst nightmare. Adv Med Educ Pract 2014; 6: 577–578.

  14. Rynes S L, Gerhart B. Interviewer assessments of applicant 'fit': an exploratory investigation. Pers Psychol 1990; 43: 13–35.

  15. British Psychological Society. The design and delivery of assessment centres. A standard produced by the British Psychological Society's Division of Occupational Psychology. Leicester: British Psychological Society, 2015.

  16. Lievens F, Buyse T, Sackett P R, Connelly B S. The effects of coaching on situational judgment tests in high-stakes selection. Int J Sel Assess 2012; 20: 272–282.

  17. Stemig M S, Sackett P R, Lievens F. Effects of organizationally endorsed coaching on performance and validity of situational judgment tests. Int J Sel Assess 2015; 23: 174–181.

  18. Gilliland S. Effects of procedural and distributive justice on reactions to a selection system. J Appl Psychol 1994; 79: 691–701.

  19. Gilliland S W. The perceived fairness of selection systems: an organizational justice perspective. Acad Manag Rev 1993; 18: 694–734.

  20. Macan T, Avedon M, Paese M, Smith D. The effects of applicants' reactions to cognitive ability tests and an assessment centre. Pers Psychol 1994; 47: 715–738.

  21. Bauer T N, Truxillo D M, Sanchez R J et al. Applicant reactions to selection: development of the Selection Procedural Justice Scale (SPJS). Pers Psychol 2001; 54: 387–419.

Author information
Corresponding author

Correspondence to F. Patterson.

Additional information

Refereed Paper

Cite this article

Taylor, N., Mehra, S., Elley, K. et al. The value of situational judgement tests for assessing non-academic attributes in dental selection. Br Dent J 220, 565–566 (2016).
