Introduction

Michael Gove’s declaration during the 2016 referendum on UK membership of the European Union that ‘the people of this country have had enough of experts’ (YouTube, 2016) is now notorious—and could be taken as proof that Ministers are not interested in evidence. Even advocates for evidence-based policy worry that we live in a ‘post-truth’ age (Breckon, 2016). But it is important not to allow a stereotype to take hold: after all, as a Minister, even Michael Gove commissioned evidence (for example, Goldacre, 2013). Meanwhile, a recent survey for the Institute for Government (IfG) indicated that 85% of people wanted politicians to consult professionals and experts when making difficult decisions, and 83% wanted government to make decisions based on objective evidence—and that these percentages had increased since the previous survey in 2014 (IfG, 2016a). A survey of those who are called to advise Ministers—senior officials—emphasised their openness to evidence and the factors bearing on their assimilation and adoption of it (Talbot and Talbot, 2014). A recent survey of over 50 UK Parliamentarians by academics proposing to establish a science-based Parliamentary Evidence Information Service identified a clear desire to engage with evidence in the formation of policy, and uncovered a range of factors affecting how Parliamentarians came to make judgements about policy (Lawrence et al., 2016).

Of course, specific examples of evidence-free policy do arise. But I will argue that, in practice, the use of evidence is now largely normative in governmental policy-making, and that requirements for evidence are built into the processes of government. Additionally, the development of devolved government within the UK provides new spaces for policy trials and experiments, although policy learning and transfer as a result of devolution remains at an early stage of development. I give specific examples from the Welsh context of such initiatives. Finally, I draw on my experience as a minister in the Welsh Government (Andrews, 2014) to set out some examples of how ministers commission and draw on evidence and can be persuaded of the value of expertise, and I urge a greater understanding of three factors which bear on ministers’ approaches to policy decisions: time constraints; confidence in evidence sources; and the wider political ‘authorising environment’.

Ministerial engagement in policy-making

There are principally three reasons why the myth of evidence-free Ministers persists. First, as Cairney explains (2016), the process of policy-making is not well-understood, including amongst academics who wish to see their research having a stronger bearing on policy. Second, as Rhodes (2012) has argued, political science literature has little to say about what Ministers actually do. Third, as the former head of Tony Blair’s Prime Minister’s Delivery Unit has asserted (Barber, 2015), there is little in the literature on government about how policy change is carried through to delivery. I will take these arguments in turn.

First, policy is not formulated in a sterile laboratory environment. Cairney points out that:

In the real world, the evidence is contested, the policy process contains a large number of influential actors, scientific evidence is one of many sources of information, and policymakers base their decisions on a mixture of emotions, knowledge and shortcuts to gather relevant evidence.

Policymakers rely on evidence sources whom they trust. Building such relationships of trust can take time—and some policy solutions take years to be adopted, or to be seen as the new ‘common sense’. Persuasion, argument and framing of issues are all important in that process. Cairney also points out that policymakers exist in a complex system, shaped by institutions, ideas, networks and events both anticipated and unexpected (Cairney, 2016).

Second, political leadership and its development is ‘under-theorised and under-researched’ (Hartley, 2010, p 146). The study of Ministers often concentrates more on the political circumstances leading to the emergence of policies or decisions than on the role of Ministers or political leaders in the detailed development of such policies. As Rhodes (2012) says, ‘The surprise is that mainstream political science should have had so little to say about the occupation of politician.’ Ministers are ‘missing links’ in academic research (Pollitt, 2006). Klein and Marmor (2008) argue for an understanding of the perspectives, motivations and concerns of decision-makers in government, urging ‘empathy in the sense of capturing what drives policy actors and entering into their assumptive worlds.’ They warn that policy-making can be ‘as much drudgery as drama, a constant process of tinkering and repairing’. If analysis of what UK Ministers actually do on a day-to-day basis is thin, research on ministers in devolved administrations is virtually absent. Lynch (2006), in his work on First Ministers in Scotland and Wales, complained that there was ‘insufficient evidence’ to analyse properly the relevance of theories of governance within the devolved administrations.

The advice given to civil servants on working with Ministers is directly relevant to academics seeking to influence policy:

Unless we can make that imaginative leap to see the world from a minister’s perspective, we cannot help them discharge the exacting duties upon which successful government depends (Jary, 2015).

Third, the lack of study of the day-to-day ‘drudgery’ or routines of government is a block to understanding (Barber, 2007, p 111):

Stubborn persistence, relentless monotony, attention to detail and glorying in routine are vastly underestimated in the literature on government and political history.

More recently, he has said that ‘very few of the books and little of the commentary focus on how to run a government so that it delivers the change it has promised’ (2015, p 11). Perhaps there is no surprise at that: political commentators on national newspapers are more interested, in the words of one columnist and former editor, in ‘fireworks, war-cries, blood on the decks’. Journalists, he says, ‘are bored by strategy, consistency, plugging on’ (Moore, 2010). Barber argues that without routines you get government ‘by spasm’ (2017)—routines are ‘a way of making a complex and often anarchic world seem manageable’ (Bevir and Rhodes, 2010, p 202).

I would argue that analysis of the ‘routines’ of government would demonstrate that evidence-based policy-making is broadly normative now in the UK system of government. The need for evidence is reinforced by Treasury guidance in the Green Book and the Magenta Book on frameworks for the appraisal and evaluation of major projects (see, for example, Price and Thurston, 2012; Davies, 2012). While it has been argued that the ROAMEF cycle of policy-making (Rationale, Objectives, Appraisal, Monitoring, Evaluation, Feedback) is unrealistic (IfG, 2011), it remains current Treasury guidance. Draft UK Government and devolved administration laws are expected to be accompanied by detailed impact assessments. Indeed, the duty to provide advice based on evidence is enshrined in the Civil Service Code (Civil Service Code, 2015). There is therefore what we might call an institutional basis for evidence-based policy-making in the UK. Indeed, recent political history suggests that evidence-based policy is institutionally embedded on a bipartisan basis.

The continuing importance of evidence

The development of evidence-based policy in the UK is attributed to the commitment of the New Labour Government of 1997–2010, with the 1999 White Paper, Modernising Government, and its advocacy of ‘What Works’ (Burnett and Duncan, 2008; Davies, 2004; Downe et al., 2012; Nutley et al., 2002; Parsons, 2002; Solesbury, 2001; Wells, 2007). Commitment to the importance of evidence was continued through initiatives of the UK Coalition Government of 2010–2015 and the Conservative Government from 2015, such as the creation of the Behavioural Insights Team (BIT), with its commitment to randomised control trials (Halpern, 2015; BIT, 2015), and the Open Data Institute. The What Works centres have continued and been expanded (Cabinet Office, 2011).

In respect of data, the then UK Labour Government legislated in 2007 to create the UK Statistics Authority as an independent statutory body, operating at arm’s length from government as a non-ministerial department and reporting directly to the UK Parliament, the Scottish Parliament, the National Assembly for Wales and the Northern Ireland Assembly. The three arms of UK Government statistics (the Authority; the Office for National Statistics, the Authority’s executive arm; and the Government Statistical Service, a ‘community of all those involved in the production of official statistics in the UK’) form what is known as the UK Statistical System, a key means of generating and publishing data on an open and transparent basis (UKSA, 2016). The Open Data Institute was established separately from government by Sir Tim Berners-Lee and Professor Nigel Shadbolt and backed in its initial phase by Southampton University (Shadbolt, 2013; ODI, 2015), with a £2 million initial investment from the Technology Strategy Board.

Devolution as a policy laboratory

The former First Minister of Wales, the late Rhodri Morgan, saw devolution as offering the opportunity for four ‘living laboratories’ throughout the UK for policy development (WASC, 2010). The extent to which this has happened has been examined in Keating et al. (2012) and McCormick (2013). Faulty UK-wide policy design by UK Government departments, forgetful of the different educational structures in the devolved nations, has been a significant problem (Andrews, 2014, pp 353–368). UK-based organisations such as the Alliance for Useful Evidence (AUE) and the IfG have sought to assist evidence transfer and policy learning between governments across the UK (AUE, 2015; IfG, 2015; Paun et al., 2016).

While there is a long way to go on this agenda, there are some recent signs of attempts to widen Whitehall’s understanding of the devolved administrations, their role, and the necessary interaction between them and Whitehall (Jones, 2016; Civil Service Blog, 2016; Rycroft, 2016). This has particularly been the case at the level of the Policy Profession network within the UK and devolved governments (Pendlebury, 2016). Further analysis of policy learning across the UK, on the lines of work on the smoking bans (Cairney, 2009), would clearly be useful.

The Public Policy Institute for Wales (PPIW)

The PPIW was a deliberate attempt to bring evidence-based policy-making into the heart of government. The Welsh Labour manifesto for the 2011 National Assembly elections contained a clear commitment to the establishment of ‘a pan-Wales public policy institute.’ The express purpose was ‘to develop the engagement of the wider Welsh civic society, including the higher education sector, with the Assembly Government’s policy-making process’, with the objective of creating higher-quality strategic research (Welsh Labour, 2011). A paper exploring the concept was taken to the Welsh Government Cabinet by the First Minister (Welsh Government, 2012a). It was envisaged that the Institute would support the Cabinet and Ministers by undertaking work commissioned by Ministers, helping Ministers identify research needs, advising Ministers on available expertise and making connections across research activity, and, working alongside the Welsh Government’s own Knowledge and Analytical Services, helping to inform the commissioning and communication of policy and research (see Nicholl, 2013).

The PPIW opened for business in October 2013, after a period of one-to-one discussions with Ministers on their priorities (Martin, 2013), and was formally and publicly launched in January 2014 (PPIW, 2014). The First Minister of Wales said on the first anniversary of the PPIW’s establishment that the Welsh Government was using its advice ‘to inform our decisions, to focus our interventions, to target our policies’ (Cardiff University, 2015).

The PPIW has produced dozens of reports in a number of specific devolved policy areas, namely Economy, Education, Finance, Health and Social Care, Housing, Natural Resources and Sustainable Development, Public Services, Tackling Poverty and Transport (PPIW, 2015, 2016a). The Institute has established a network of experts drawn from higher education institutions and others from across the UK and beyond. It has also contributed extensively to discussions about policy diffusion (Bristow, 2016; PPIW, 2016b; Shepherd, 2016; Smith, 2016; St Denny, 2016; White, 2016). A tender for a new Welsh Centre for Public Policy to take over from the PPIW was announced by the ESRC, and Cardiff University won the bid in 2017 (PPIW, 2017).

The Welsh Government has historically held open sessions for academics on how best they can engage with policymakers (Wales DTC, 2014; Thurston, 2014). Randomised control trials (RCTs) for the development of policy have been a feature of Welsh policy-making for some time. Professor Jonathan Shepherd’s work on alcohol-related violence, which used data-sharing and RCTs for different experiments, has been widely cited (see, for example, Henderson, 2012, pp 178–181; John, 2016; Moore et al., 2014; Shepherd, 2016). In the Welsh NHS, policy trials have been run for some time (Roberts, 2012). The present author brought BIT in to work with the Welsh Government from 2015 on a number of specific policy areas where it was felt RCTs might assist policy development (see Andrews, 2015; Andrews, 2016a; Written Assembly Questions, 2016). The Welsh Government Cabinet approved a paper on open data in February 2016, recognising ‘how real-time data was informing citizen choice, and how management and statistical information was being used effectively to drive performance improvement’, and released its Open Data Plan in March 2016 (Welsh Government, 2016a, b). Academics in the Wales Institute of Social & Economic Research, Data & Methods have developed a data portal as a web application which allows National Assembly research staff, Assembly Members and their support staff to access and map a wide range of data for National Assembly for Wales Constituencies and Regions (Wiserd, 2017).

What influences ministers: a personal perspective

I would argue that there are three key factors affecting the ability or willingness of ministers to consider evidence: first, the limitations of time; second, the issue of trusted sources of evidence; and third, the broad authorising environment which conditions ministerial work. Understanding these factors is likely to assist anyone seeking to engage ministers with evidence in their domain of expertise. Individual ministers will have different personal styles and will operate in different ways, but what I personally would have expected of anyone seeking to influence me would have been: first, an understanding of the time constraints I faced, and of the infrastructure I developed with others to help me address those constraints and develop policies over time; second, a willingness to invest time in building relationships, recognising that trust does not happen overnight, even for those with established reputations in particular domains; and third, visibility within the networks and media which shaped the environment within which policies were developed, recognising that policies do not emerge from nowhere, but are shaped through discussion and debate.

Time

The former head of Tony Blair’s strategic communications unit, Peter Hyman, once wrote: ‘too often in government the urgent crowds out the important’ (2005). It is a refrain which can be traced back to US President Eisenhower, and the balancing of the urgent and the important has become a staple of business management texts (Eisenhower, 1961). The value of clarity and brevity in briefing has been emphasised by politicians at least since Churchill’s famous memo of 1940 (Edwards, 2017; National Archives, 2013).

The problem for politicians is not a dearth of evidence. Government is not an ‘evidence-free zone’ (Nutley et al., 2002). In fact, the lives of Ministers in particular are littered with evidence—so the major obstacle to reviewing evidence is often the availability of time. Michael D. Higgins, now the President of Ireland, reflecting on the time-pressure he faced as the Irish culture minister, said:

I’ve had to now develop an economy of what I am doing, and I am trying to pull back for more consideration of what I am doing and I have a very definite set of priorities. (Kelly, 1994)

Wicks (2012) analysed his own experience as a minister and stressed the ‘mundane’ and the ‘routine’, such as the signing of correspondence, as taking up a considerable amount of time. He recorded that ‘one fundamental fact about ministerial life is that it is an exceedingly busy one’, stressing the short-cuts which ministers must effect to make judgements:

If family and sleep were not to be altogether sacrificed, the Minister has to make careful judgements. What can be quickly scanned and authorized?

This necessarily implies that ministers sometimes have to rely on others—both permanent civil servants and special advisers—as ‘evidence filters’ in decision-making.

Rhodes identifies that at most 20% of a UK Minister’s day can be spent on policy issues—and says that is probably an overestimate (2011, p 102). Ministers in the UK systems of government are of course also constituency or regional representatives; in terms of time this can certainly create a problem of role conflict, but it also means that ministers get direct feedback on the operation of a public service from their constituents in a way that their officials will not.

I would argue that intellectually-confident ministers will often be keen to bring in external expertise. I commissioned external expertise for a variety of reasons, including to

  • Assess processes to learn from best practice, especially if that meant moving money to the frontline (Welsh Government, 2011a, 2015)

  • Benchmark our system against others (OECD, 2014)

  • Gather best practice (e.g., Estyn, 2011)

  • Evaluate existing practice, impact and implementation (e.g., Taylor et al., 2014)

  • Explain best practice (e.g., Harris and Jones, 2017)

  • Develop new programmes (e.g., see Hadfield et al., 2017; Hayward, 2012)

  • Identify capacity gaps at both departmental and system levels (e.g., Welsh Government, 2011b)

  • Examine the scope for new policy developments (Welsh Government, 2012b)

Some of these projects were carried out by teams led by or including academics; others by consultants; some by our school inspectorate, Estyn; some by specialists within the field, such as head-teachers. Feedback loops were also important in this process, as Shepherd suggests in his analysis of the ‘evidence eco-system’ for the What Works network (Shepherd, 2014). I established new feedback loops as Education Minister precisely so that I could evaluate how initiatives and their implementation were being perceived on the ground (Andrews, 2014, pp 45–51).

In any policy domain, there is a plethora of evidence sources, ranging from active stakeholder groups to academics and individual members of the public with a specific interest. The weight given to stakeholders, and their provenance, will often vary from field to field. In some fields, such as education policy, the key stakeholders—at least until the growth of academy trusts in England—were in my experience most likely to be public, third-sector, academic or trade-union sources (Andrews, 2014). By contrast, in the field of media policy-making, in which I have prior experience (Andrews, 2005), commercial companies have tended to be dominant as sources of evidence (Freedman, 2008).

As a minister, I held a monthly policy board with senior officials to consider broader and thornier policy questions within my portfolio. Sometimes academics would be invited in to share latest evidence so that we could understand the relevance of new research to our system. The advantage of the monthly policy board was that it provided a regular punctuation point in a busy schedule to take stock and assess progress on policy development and implementation, and space for consideration of new challenges, including the need for further evidence. It was a way of balancing the urgent and the important.

Ministers’ judgements of time are not only conditioned by the day-to-day: they are also conscious of the time they have to make an impact before they may move portfolios, lose their jobs or face an election (Rose, 1972). In the UK government specifically, churn amongst ministers is frequent (Cleary and Reeves, 2009): for example, there were 13 housing ministers between 1995 and 2015 (Raynsford, 2016). These factors explain why politicians may operate, to borrow an expression from US presidential politics, as though they are in a permanent campaign (Kelly, 1993). All of these factors bear on ministerial prioritisation. So those seeking to influence ministers must start from the proposition that they do not want to be time-wasters, and be prepared to explain concisely what they have to offer, preferably with a sense of how evidence can help in the short and medium term as well as the long term. Impatience with evaluations which report slowly, sometimes only after the completion of a programme, is evident amongst all ministers and ex-ministers that I know.

Sometimes policies themselves are adopted or passed into law but the time between announcement and implementation may seem an age. For example, my decision in the autumn of 2010 to ensure that Welsh students did not have to pay £9,000 tuition fees did not come into effect until the autumn of 2012. Decisions on the framework for school closures, initially announced in January 2010, could not be completed until the passage of legislation in 2013 (Andrews, 2014, p 22). A wider range of theoretical reflections on the impact of temporal issues on policy-making is addressed in Pierson (2004) and Pollitt (2008).

Trusted sources

A recent literature review concluded that ‘little of substance has been written on the subject of ministerial effectiveness’ (Drabble, 2011). Certainly no-one teaches you to be a Minister. You learn on the job, and you bring with you the learning from prior political roles and your previous career (Hartley, 2011). If time is the greatest challenge for politicians in digesting evidence, a sometimes unconscious barrier for evidence transmission may be confirmation bias: in the melee of information through which you are driving forward, it is inevitable that you will come to rely on sources you trust, whether they be organisations or individuals. The importance of trust is heavily emphasised in advice given to civil servants on working with ministers:

A good relationship with a minister, special adviser or other officials is based primarily on trust, rather than necessarily on personal liking, and effort must be devoted to winning and maintaining that trust (Jary, 2015).

Building trust, which can rest on a reputation for credibility and reliability, takes time, but it is arguably the most fundamental element the policy community needs to address if evidence is to be translated into policy. I made an early decision as Education Minister to ask Sir Michael Barber to contribute to our department’s consideration of priorities. I had never met him, but I had read his book setting out his account of his time as head of the Prime Minister’s Delivery Unit (Barber, 2007); one of my former specialist advisers, Dr. Tim Williams, had previously worked with him, and so had one of our senior civil servants, Chris Tweedale (Andrews, 2014). His engagement allowed me to set the agenda for the Department in my first six weeks: he was seen as an authoritative source, and his involvement also sent a signal within the department about the likely direction of travel.

Trust underpins the capacity of the policy community to turn evidence-generation into policy. Understanding the stakeholder networks where policy ideas are generated, and the points at which intervention may be most valuable (earlier in the development of a policy, rather than later, when ideas may have been filtered by internal assessments within government, the availability of resources or the outcomes of consultation), is therefore key to practical engagement. As Mulgan says (2013):

Instead it matters a lot who gives the advice—and whether they are trusted and reputable. It matters how advice is given, and in particular how it is framed—preferably fitting the cognitive style of the receiver, and with a tone that is neither hectoring nor patronising.

There are many routes into identifying opportunities for evidence contributions. Ministers may use speeches to float ideas that are at an early stage of development: I certainly did (Andrews, 2014, p 44). (So, incidentally, did Michael Gove: see Nelson, 2012). Indeed, ministerial speeches are to policy what academics’ conference papers are to journal articles. Circulating within those policy networks at conferences, seminars and other events, developing ideas which are related to—or indeed sometimes challenge—the direction of policy, builds the cultural and social capital, and individual and institutional capacity, on which the transmission of evidence may be based: indeed, in the media sector, some think of it as ‘a methodology of hanging about with businessmen (and women), policy makers and politicians’ (Collins, 2009, p 10). Policy inquiries by committees in Parliament, the devolved institutions or the European Parliament may themselves provide opportunities for engagement. Analysis of policy exchanges in committees or in plenary sessions in Parliament or the devolved institutions will also provide a better understanding of context. (As a Minister moving into a new portfolio in December 2009, I asked my Private Office to pull out the relevant Assembly Question and Answer sessions my predecessor had undertaken over the previous year or two, simply to get a feel for how existing issues were being addressed by the Opposition parties and by stakeholders.)

Committee inquiries themselves, and government consultations, will also provide platforms for the presentation of ideas in written and sometimes oral format. Presenting complex ideas in simple terms to parliamentary committees or individual parliamentarians is a skill in itself to be cultivated. Davies, reviewing the success of evidence-based policymaking worldwide, and drawing on CHSRF (2001), reiterates the importance of presenting research findings in a ‘1:3:25’ format: one page of main messages, a three-page executive summary, and the findings presented in no more than 25 jargon-free pages (Davies, 2004). Certainly this is a helpful approach for the dissemination of research findings, not only to ministers but to the general media.

It is important to identify likely sources of influence on Ministers. Ministers will usually, though not always, have had some connection with their policy portfolio beforehand. They may have served in opposition with the same brief; they may have served on a committee relevant to their portfolio as a backbencher; some may themselves have been stakeholders, in earlier lives, in the portfolio area in which they now operate as a Minister. They may have long-standing connections with think-tanks, unions, pressure groups or industry. (For example, Ball and Exley (2010) and Schlesinger (2009a, b) have separately and in different contexts mapped the networks of leading New Labour special advisers from 1997–2010). They develop points of connection and reach judgements on whom to rely. Academics can strengthen the political capital of their evidence by building trust over time—turning that evidence into a useful ‘information subsidy’ for governments. As Schlesinger and Tumber argued in their study of the reporting of crime:

Just as information subsidies flow from sources to journalists, so too do they flow from pressure groups to legislators. This process may be linked to the credibility that a given group has achieved with politicians over a period of time. (Schlesinger and Tumber, 1994, p 96)

Ministers are likely to be supported not only by policy civil servants who will be taking part in those networks but also by special advisers who will themselves form judgements about the expertise on offer both within and outside Government. (Indeed, special advisers may play a critical role for Ministers in identifying gaps in policy knowledge, expertise or skill within the department: my special advisers in the Welsh Government certainly played that role in the past, in some cases identifying academic specialists on whose expertise we could draw). Identifying opportunities to bring special advisers or policy officials into discussions within a university or with a research team through hosting events is a valuable exercise. These interactions may in fact themselves generate further research or impact ideas. Think-tanks can provide a useful bridge between academics and policy-makers (Taylor, 2011). As Education Minister in Wales I also had a Ministerial Advisory Board, with a Nolan-appointed membership drawing on outside expertise (Andrews, 2014): this enabled me to examine longer-term issues and also to gather expertise that I could weigh against the advice offered by officials. This arrangement was continued by my successor (Welsh Government, 2014).

The authorising environment

Finally, ministerial decisions are not generally taken in a vacuum. There is a wider authorising environment (Moore, 1995), and those seeking to influence policy also need to be influencing that wider environment. Ministers will be operating in an environment conditioned by a variety of external and internal factors. Internally, they will want to win allies amongst other Ministers—and retain the support of the person who appointed them, the First Minister or Prime Minister. They will want to keep on board, and elicit vocal support from, their party colleagues, both within the political institution and outside. They will have a sense of the political players who have a bearing on their own party’s decision-making, from unions to pressure groups to local political office-holders. In other words, ministers have ‘situated agency’ (Bevir and Rhodes, 2006, p 4)—they are situated within a context defined by a party programme, a history of prior policies, a balance of power within a Cabinet (particularly in a coalition context), and a budgetary framework.

Above all, they are likely to be concerned about the media perception of their actions, as that will condition the climate for the delivery and implementation of policy: it will also condition how others—Prime and First Ministers and the wider public—judge the effectiveness of their policies (Jary, 2015). Where specialist policy correspondents exist, Ministers will probably have long-standing relationships and a keen understanding of which ‘experts’ have the ear of the journalist. Specialist media outlets such as weekly magazines may carry a disproportionate level of influence with key stakeholders. The media can reinforce evidence—or can undermine it, particularly if the evidence flies in the face of the underpinning ideological position of a particular newspaper. However, the media can provide platforms for evidence-holders to get ideas across to other stakeholders, to officials, to advisers, and occasionally to ministers themselves. While constant carping will never be persuasive, intelligent, constructive criticism and policy entrepreneurialism or the provision of solutions will be better received. Print media outlets offer opportunities to provide more popularly-packaged summaries of key elements of research. Online outlets, sometimes self-generated, sometimes institutionally-hosted, provide further platforms.

There is also a historical context to the environment in which ministers operate. As Wicks points out, some aspects of policy build on legislation developed decades ago, such as the creation of National Insurance. Certain policies therefore may be path dependent. Developments in more recent times should be retained within the corporate memory of a department—but that isn’t always so (Wicks, 2012, p 595; Hillman, 2016, p 331; Andrews, 2014, p 33). Academics with detailed specialist knowledge may be in a position to fill that gap in memory, as the ‘History and Policy’ initiative has begun to do (History & Policy, 2017).

All of this therefore requires sensitivity to the context of ministerial decision-taking. Understanding how evidence can help a minister achieve their goals and the wider government goals is therefore key to any intervention. Policy-making can be a complex, contested and crowded arena—nevertheless there is space for interventions and interpellations by academics and others. After all, others do so, often on limited resources: the academic literature on pressure groups is full of examples of inadequately-resourced groups having a nonetheless disproportionate impact on policy (see, for example, Donnison, 1981; Maloney et al., 1994; Seyd, 1976; Field, 1982; Wilson, 1970).

Indeed, as Rhodes has pointed out, in some domains, interest groups ‘become institutionalised’:

These routine, standardized patterns of interaction between government and insider interests become policy networks (2015).

Those policy networks are part of the overall authorising environment: theoretically informal, but in practice a routine part of decision-making which needs to be accommodated. They are ‘policy advisory systems’, which help to construct the context of decision-making (Halligan, 1995; Craft and Howlett, 2012, 2013). They illustrate the porous nature of modern governance, providing channels for influence by academics and other specialists. Indeed, sometimes governments may wish to out-source policy thinking to organised policy networks, particularly on emerging and controversial areas, as with the recent work undertaken by the Royal Society and British Academy on data governance (Royal Society, 2017).

Conclusion

I have been a government minister (Andrews, 2014)—and I have been in the position of seeking to influence governments (Andrews, 2005). In both roles, timing, trust and context have been key determinants in realising policy goals. To be effective in grounding evidence in governmental policy, communications need to be concise, and messages clear, delivered consistently and at relevant times. There is no shortage of tool-kits to advise those wishing to promote evidence to policy-makers. The AUE, the IfG and indeed the ESRC have provided guides on how best to communicate evidence to policy-makers, as well as assessments of how better policy can be made (ESRC, 2016; IfG, 2011; IfG, 2013; Maybin, 2013; AUE, 2016a, b). Indeed, I sometimes wonder if we are creating an evidence-based policy industry (Davies, 2005).

Expertise is still valued. But it needs to be well-directed. It is also possible that there will be a shift in the nature of the evidence that is seen as most valuable. In my time as a Minister I saw randomised control trials and policy pilots becoming more important and evaluations less so; design principles being adopted more widely in the approach to policy development (Hilton, 2015, p 44); a continuing desire to focus on what has been proved to work; foresight and scenario planning growing in importance as technology changed rapidly; and, above all, experience of implementation and delivery becoming a sought-after skill.

Of course, evidence-free policy will attract headlines, particularly where it ostentatiously fails (King and Crewe, 2013). But attention to the routines of government would demonstrate that evidence is not only welcomed, but has become institutionally embedded in the processes of government: and that with four governmental spaces developing and implementing policies in the UK, new approaches are being constantly, if not consistently, trialled.

Government policy-making, whether at UK or devolved levels, will always be contested and always complex. Ensuring evidence is appropriately valued places a responsibility on academics: to understand how specific ministers develop mechanisms to manage the time constraints they face; to invest the time needed to develop relationships of trust and confidence, which will neither happen overnight nor necessarily pay off immediately while policy is in development; and to circulate, and indeed mobilise, within the interest groups or media which help to shape the policy environment.

Data availability

Data sharing is not applicable to this paper as no datasets were generated or analysed.