
How can we demonstrate the public value of evidence-based policy making when government ministers declare that the people ‘have had enough of experts’?


Recent political campaigns on both sides of the Atlantic have led some to argue that we live in the age of ‘post-factual’ or ‘post-truth’ politics, suggesting evidence has a limited role in debate and public policy. How can we demonstrate the public value of evidence-informed debate under those circumstances? Survey evidence on public attitudes to expertise offers some hope that the tone of much of this debate is unduly pessimistic. While policy-making always develops in an environment where evidence is contested, this paper will argue that understanding of the routines through which Ministers work and assimilate evidence is actually under-researched. Not only are Ministers open to evidence, but there is an institutional grounding for evidence-based policy in government. Meanwhile, the creation of devolved institutions has created new sites in the UK for evidence-based policy-making, despite the political tensions between UK and devolved governments. Drawing on academic and think tank insights, and experience as a Welsh Government Minister between 2007 and 2016, this paper argues for three key approaches for the academic community to adopt: understanding the temporal focus of ministers, building trust amongst ministers and those who advise them in the evidence-promoting capacity of the academic policy community, and shaping the wider authorising environment, including the media that contributes to the framing of key policy debates.


Michael Gove’s declaration during the 2016 referendum on UK membership of the European Union that ‘the people of this country have had enough of experts’ (YouTube, 2016) is now notorious—and could be taken as proof that Ministers are not interested in evidence. Even advocates for evidence-based policy worry that we live in a ‘post-truth’ age (Breckon, 2016). But it is important not to allow a stereotype to take hold: after all, as a Minister, even Michael Gove commissioned evidence (for example, Goldacre, 2013). Meanwhile, a recent survey for the Institute for Government (IfG) indicated that 85% of people wanted politicians to consult professionals and experts when making difficult decisions, and 83% wanted government to make decisions based on objective evidence—and that these percentages had increased since their previous survey in 2014 (IfG, 2016a). A survey of those who are called to advise Ministers—senior officials—emphasised their openness to evidence and the factors bearing on their assimilation and adoption of it (Talbot and Talbot, 2014). A recent survey of over 50 UK Parliamentarians by academics proposing to establish a science-based Parliamentary Evidence Information Service identified a clear desire to engage with evidence in the formation of policy, and uncovered a range of factors affecting how Parliamentarians came to make judgements about policy (Lawrence et al., 2016).

Of course, specific examples of evidence-free policy do arise. But I will argue that, in practice, the use of evidence is now largely normative in governmental policy-making, and that requirements for evidence are built into the processes of government. Additionally, the development of devolved government within the UK provides new spaces for policy trials and experiments, although policy learning and transfer as a result of devolution remain at an early stage of development. I give specific examples from the Welsh context of such initiatives. Finally, I draw on experience as a minister in the Welsh Government (Andrews, 2014) to set out some examples of how ministers commission and draw on evidence and can be persuaded of the value of expertise, and urge a greater understanding of three factors which bear on ministers’ approaches to policy decisions: time constraints; confidence in evidence sources; and the wider political ‘authorising environment’.

Ministerial engagement in policy-making

There are principally three reasons why the myth of evidence-free Ministers persists. First, as Cairney explains (2016), the process of policy-making is not well-understood, including amongst academics who wish to see their research having a stronger bearing on policy. Second, as Rhodes (2012) has argued, political science literature has little to say about what Ministers actually do. Third, as the former head of Tony Blair’s Prime Minister’s Delivery Unit has asserted (Barber, 2015), there is little in the literature on government about how policy change is carried through to delivery. I will take these arguments in turn.

First, policy is not formulated in a sterile laboratory environment. Cairney points out that:

In the real world, the evidence is contested, the policy process contains a large number of influential actors, scientific evidence is one of many sources of information, and policymakers base their decisions on a mixture of emotions, knowledge and shortcuts to gather relevant evidence.

Policymakers rely on evidence sources whom they trust. Building such relationships of trust can take time—and some policy solutions take years to become adopted, or to be seen as the new ‘common sense’. Persuasion, argument and framing of issues are all important in that process. Cairney also points out that policymakers exist in a complex system, shaped by institutions, ideas, networks and events anticipated and unexpected. (Cairney, 2016).

Second, political leadership and its development is ‘under-theorised and under-researched’ (Hartley, 2010, p 146). The study of Ministers often concentrates more on the political circumstances leading to the emergence of policies or decisions than on the role of Ministers or political leaders in the detailed development of such policies. As Rhodes (2012) says, ‘The surprise is that mainstream political science should have had so little to say about the occupation of politician.’ Ministers are ‘missing links’ in academic research (Pollitt, 2006). Klein and Marmor (2008) argue for an understanding of the perspectives, motivations and concerns of decision-makers in government, urging ‘empathy in the sense of capturing what drives policy actors and entering into their assumptive worlds.’ They warn that policy-making can be ‘as much drudgery as drama, a constant process of tinkering and repairing’. If analysis of what UK Ministers actually do on a day-to-day basis is thin, research on ministers in devolved administrations is virtually absent. Lynch (2006), in his work on First Ministers in Scotland and Wales, complained that there was ‘insufficient evidence’ to analyse properly the relevance of theories of governance within the devolved administrations.

The advice given to civil servants on working with Ministers is directly relevant to academics seeking to influence policy:

Unless we can make that imaginative leap to see the world from a minister’s perspective, we cannot help them discharge the exacting duties upon which successful government depends (Jary, 2015).

Third, the lack of study of the day-to-day ‘drudgery’ or routines of government is a block to understanding (Barber, 2007, p 111):

Stubborn persistence, relentless monotony, attention to detail and glorying in routine are vastly underestimated in the literature on government and political history.

More recently, he has said that ‘very few of the books and little of the commentary focus on how to run a government so that it delivers the change it has promised’ (2015, p 11). Perhaps that is no surprise: political commentators on national newspapers are more interested, in the words of one columnist and former editor, in ‘fireworks, war-cries, blood on the decks’. Journalists, he says, ‘are bored by strategy, consistency, plugging on’ (Moore, 2010). Barber argues that without routines, you get government ‘by spasm’ (2017)—routines are ‘a way of making a complex and often anarchic world seem manageable’ (Bevir and Rhodes, 2010, p 202).

I would argue that analysis of the ‘routines’ of government would demonstrate that evidence-based policy-making is broadly normative now in the UK system of government. The need for evidence is reinforced by Treasury guidance in the Green Book and the Magenta Book on frameworks for appraisal and evaluation of major projects (see, for example, Price and Thurston, 2012; Davies, 2012). While it has been argued that the ROAMEF cycle of policy-making (Rationale, Objectives, Appraisal, Monitoring, Evaluation, Feedback) is unrealistic (IfG, 2011), it is still current Treasury guidance. Draft UK Government and devolved administration laws are expected to be accompanied by detailed impact assessments. Indeed, the duty to provide advice based on evidence is enshrined in the Civil Service Code (Civil Service Code, 2015). There is therefore what we might call an institutional basis for evidence-based policy-making in the UK. Indeed, recent political history suggests that evidence-based policy is institutionally embedded on a bipartisan basis.

The continuing importance of evidence

The development of evidence-based policy in the UK is attributed to the commitment of the New Labour Government from 1997–2010, with the 1999 White Paper, Modernising Government, and its advocacy of ‘What Works’ (Burnett and Duncan, 2008; Davies, 2004; Downe et al., 2012; Nutley et al., 2002; Parsons, 2002; Solesbury, 2001; Wells, 2007). Commitment to the importance of evidence was continued by certain of the initiatives of the UK Coalition Government from 2010–2015, and the Conservative Government from 2015, such as the creation of the Behavioural Insights Team (BIT), with its commitment to randomised control trials (Halpern, 2015; BIT, 2015) and the Open Data Institute. The What Works centres have continued and been expanded (Cabinet Office, 2011).

In respect of data, the then UK Labour Government legislated in 2007 to create the UK Statistics Authority as an independent statutory body, operating at arm’s length from government as a non-ministerial department and reporting directly to the UK Parliament, the Scottish Parliament, the National Assembly for Wales and the Northern Ireland Assembly. The three arms of UK Government statistics (the Authority; the Office for National Statistics, the Authority’s executive arm; and the Government Statistical Service, a ‘community of all those involved in the production of official statistics in the UK’) form what is known as the UK Statistical System, a key means of generating and publishing data on an open and transparent basis (UKSA, 2016). The Open Data Institute was established separately from government by Sir Tim Berners-Lee and Professor Nigel Shadbolt and backed in its initial phase by Southampton University (Shadbolt, 2013; ODI, 2015), with a £2 million initial investment from the Technology Strategy Board.

Devolution as a policy laboratory

The former First Minister of Wales, the late Rhodri Morgan, saw devolution as offering the opportunity for four ‘living laboratories’ throughout the UK for policy development (WASC, 2010). The extent to which this has happened has been examined in Keating et al. (2012) and McCormick (2013). Faulty UK-wide policy design by UK Government departments, forgetful of the different educational structures in the devolved nations, has been a significant problem (Andrews, 2014, pp 353–368). UK-based organisations such as the Alliance for Useful Evidence (AUE) and the IfG have sought to assist evidence transfer and policy learning between governments across the UK (AUE, 2015; IfG, 2015; Paun et al., 2016).

While there is a long way to go on this agenda, there are some recent signs that there have been attempts to widen Whitehall’s understanding of the devolved administrations, their role, and the necessary interaction between them and Whitehall. (Jones, 2016; Civil Service blog 2016; Rycroft, 2016). This has particularly been the case at the level of the Policy Profession network within the UK and devolved governments. (Pendlebury, 2016). Further analysis of policy learning across the UK, on the lines of work on the smoking bans (Cairney, 2009), would clearly be useful.

The Public Policy Institute for Wales (PPIW)

The PPIW was a deliberate attempt to bring evidence-based policy-making into the heart of government. The Welsh Labour manifesto for the 2011 National Assembly elections contained a clear commitment to the establishment of ‘a pan-Wales public policy institute.’ The express purpose was ‘to develop the engagement of the wider Welsh civic society, including the higher education sector, with the Assembly Government’s policy-making process’ with the objective of creating higher quality strategic research (Welsh Labour, 2011). A paper exploring the concept was taken to the Welsh Government Cabinet by the First Minister (Welsh Government, 2012a). It was envisaged that the Institute would support the Cabinet and Ministers by undertaking work commissioned by Ministers, helping Ministers identify research needs, advising Ministers on available expertise and making connections across research activity, and, working alongside the Welsh Government’s own Knowledge and Analytical Services, helping to inform the commissioning and communication of policy and research (see Nicholl, 2013).

The PPIW opened for business in October 2013 after a period of one-to-one discussions with Ministers on their priorities (Martin, 2013) and was formally and publicly launched in January 2014 (PPIW, 2014). The First Minister of Wales said on the first anniversary of the PPIW’s establishment that the Welsh Government was using its advice ‘to inform our decisions, to focus our interventions, to target our policies’ (Cardiff University, 2015).

The PPIW has produced dozens of reports in a number of specific devolved policy areas, namely Economy, Education, Finance, Health and Social Care, Housing, Natural Resources and Sustainable Development, Public Services, Tackling Poverty and Transport (PPIW, 2015, 2016a). The Institute has established a network of experts drawn from higher education institutions and others from across the UK and beyond. It has also contributed extensively to discussions about policy diffusion (Bristow, 2016; PPIW, 2016b; Shepherd, 2016; Smith, 2016; St Denny, 2016; White, 2016). A tender for a new Welsh Centre for Public Policy, to take over from the PPIW, was announced by the ESRC, and Cardiff University won the bid in 2017 (PPIW, 2017).

The Welsh Government has historically held open sessions for academics on how best they can engage with policymakers (Wales DTC, 2014; Thurston, 2014). Randomised control trials (RCTs) for the development of policy have been a feature of Welsh policy-making for some time. Professor Jonathan Shepherd’s work on alcohol-related violence, which used data-sharing and RCTs for different experiments, has been widely cited (see, for example, Henderson, 2012, pp 178–181; John, 2016; Moore et al., 2014; Shepherd, 2016). In the Welsh NHS, policy trials have been run for some time (Roberts, 2012). The present author brought BIT in to work with the Welsh Government from 2015 on a number of specific policy areas where it was felt RCTs might assist policy development (see Andrews, 2015; Andrews, 2016a; Written Assembly Questions, 2016). The Welsh Government Cabinet approved a paper on open data in February 2016, recognising ‘how real-time data was informing citizen choice, and how management and statistical information was being used effectively to drive performance improvement’, and released its Open Data Plan in March 2016 (Welsh Government, 2016a and b). Academics in the Wales Institute of Social & Economic Research, Data & Methods have developed a data portal as a web application which allows National Assembly research staff, Assembly Members and their support staff to access and map a wide range of data for National Assembly for Wales Constituencies and Regions (Wiserd, 2017).

What influences ministers: a personal perspective

I would argue that there are three key factors affecting the ability or willingness of ministers to consider evidence: first, the limitations of time; second, the issue of trusted sources of evidence; and third, the broad authorising environment which conditions ministerial work. Understanding these factors is likely to assist anyone seeking to engage ministers with evidence in their domain of expertise. Individual ministers have different personal styles and operate in different ways, but what I would have expected of anyone seeking to influence me was, first, an understanding of the time constraints I faced and of the infrastructure I developed with others to address them and to develop policies over time; second, a willingness to invest time in building relationships, recognising that trust does not happen overnight, even for those with established reputations in particular domains; and third, visibility within the networks and media which shaped the environment in which policies were developed, recognising that policies do not emerge from nowhere, but are shaped through discussion and debate.


The former head of Tony Blair’s strategic communications unit, Peter Hyman, once wrote: ‘too often in government the urgent crowds out the important’ (2005). It is a refrain which can be traced back to US President Eisenhower, and the balancing of the urgent and the important has become a standard theme in business management (Eisenhower, 1961). The value of clarity and brevity in briefing has been emphasised by politicians at least since Churchill’s famous memo of 1940 (Edwards, 2017; National Archives, 2013).

The problem for politicians is not a dearth of evidence. Government is not an ‘evidence-free zone’ (Nutley et al., 2002). In fact, the lives of Ministers in particular are littered with evidence—so the major obstacle to reviewing evidence is often the availability of time. Michael D. Higgins, now the President of Ireland, reflecting on the time-pressure he faced as the Irish culture minister, said:

I’ve had to now develop an economy of what I am doing, and I am trying to pull back for more consideration of what I am doing and I have a very definite set of priorities. (Kelly, 1994)

Wicks (2012) analysed his own experience as a minister and stressed the ‘mundane’ and the ‘routine’ as taking up a considerable amount of time—such as the signing of correspondence. He recorded that ‘one fundamental fact about ministerial life is that it is an exceedingly busy one’, stressing the short-cuts which ministers must make to form judgements:

If family and sleep were not to be altogether sacrificed, the Minister has to make careful judgements. What can be quickly scanned and authorized?

This necessarily implies that ministers sometimes have to rely on others—both permanent civil servants and special advisers—as ‘evidence filters’ in decision-making.

Rhodes identifies that at most 20% of a UK Minister’s day can be spent on policy issues—and says that is probably an overestimate (2011, p 102). Ministers in the UK systems of government are of course also constituency or regional representatives. The competing demands on their time can certainly create role conflict, but the constituency role also means that ministers get direct feedback on the operation of a public service from their constituents in a way that their officials will not.

I would argue that intellectually-confident ministers will often be keen to bring in external expertise. I commissioned external expertise for a variety of reasons, including to

  • Assess processes to learn from best practice, especially if that meant moving money to the frontline (Welsh Government, 2011a, 2015)

  • Benchmark our system against others (OECD, 2014)

  • Gather best practice (e.g., Estyn, 2011)

  • Evaluate existing practice, impact and implementation (e.g., Taylor et al., 2014)

  • Explain best practice (e.g., Harris and Jones, 2017)

  • Develop new programmes (e.g., see Hadfield et al., 2017; Hayward, 2012)

  • Identify capacity gaps at both departmental and system levels (e.g. Welsh Government, 2011b)

  • Examine the scope for new policy developments (Welsh Government, 2012b)

Some of these projects were carried out by teams led by or including academics; others by consultants; some by our school inspectorate, Estyn; some by specialists within the field, such as head-teachers. Feedback loops were also important in this process, as Shepherd suggests in his analysis of the ‘evidence eco-system’ for the What Works network (Shepherd, 2014). I established new feedback loops as Education Minister precisely so that I could evaluate how initiatives and their implementation were being perceived on the ground (Andrews, 2014, pp 45–51).

In any policy domain, there is a plethora of evidence sources ranging from active stakeholder groups to academics and individual members of the public with a specific interest. The weight given to stakeholders, and their provenance, will often vary from field to field. In some fields, such as education policy, the key stake-holders—at least until the growth of academy trusts in England—were in my experience most likely to be public, third-sector, academic or trade-union sources (Andrews, 2014). By contrast, in the field of media policy-making, in which I have prior experience (Andrews, 2005), commercial companies as sources of evidence have tended to be dominant (Freedman, 2008).

As a minister, I held a monthly policy board with senior officials to consider broader and thornier policy questions within my portfolio. Sometimes academics would be invited in to share latest evidence so that we could understand the relevance of new research to our system. The advantage of the monthly policy board was that it provided a regular punctuation point in a busy schedule to take stock and assess progress on policy development and implementation, and space for consideration of new challenges, including the need for further evidence. It was a way of balancing the urgent and the important.

Ministers’ judgements of time are not only conditioned by the day-to-day: they are also conscious of the time they have to make an impact before they may move portfolios, lose their jobs or face an election (Rose, 1972). In the UK government specifically, churn amongst ministers is frequent (Cleary and Reeves, 2009): for example, there were 13 housing ministers between 1995 and 2015 (Raynsford, 2016). These factors explain why politicians may operate, to borrow an expression from US presidential politics, as though they are in a permanent campaign (Kelly, 1993). All of these factors bear on ministerial prioritisation. So those seeking to influence ministers must start from the proposition that they should not waste ministers’ time, and be prepared to explain concisely what they have to offer, preferably with a sense of how evidence can help in the short and medium term as well as the long term. Impatience with evaluations which report slowly, sometimes only after the completion of a programme, is evident amongst all the ministers and ex-ministers I know.

Sometimes policies themselves are adopted or passed into law but the time between announcement and implementation may seem an age. For example, my decision in the autumn of 2010 to ensure that Welsh students did not have to pay £9,000 tuition fees did not come into effect until the autumn of 2012. Decisions on the framework for school closures, initially announced in January 2010, could not be completed until the passage of legislation in 2013 (Andrews, 2014, p 22). A wider range of theoretical reflections on the impact of temporal issues on policy-making is addressed in Pierson (2004) and Pollitt (2008).

Trusted sources

A recent literature review concluded that ‘little of substance has been written on the subject of ministerial effectiveness’ (Drabble, 2011). Certainly no-one teaches you to be a Minister. You learn on the job, and you bring with you the learning from prior political roles and your previous career (Hartley, 2011). If time is the greatest challenge for politicians in digesting evidence, a sometimes unconscious barrier to evidence transmission may be confirmation bias: in the melee of information through which you are driving forward, it is inevitable that you will come to rely on sources you trust, whether they be organisations or individuals. The importance of trust is heavily emphasised in advice given to civil servants on working with ministers:

A good relationship with a minister, special adviser or other officials is based primarily on trust, rather than necessarily on personal liking, and effort must be devoted to winning and maintaining that trust (Jary, 2015).

Building trust, which can rest on a reputation for credibility and reliability, takes time, but it is arguably the most fundamental element the policy community needs to address if evidence is to be translated into policy. I made an early decision as Education Minister to ask Sir Michael Barber to contribute to our department’s consideration of priorities. I had never met him, but I had read his book setting out his account of his time as head of the Prime Minister’s Delivery Unit (Barber, 2007); one of my former specialist advisers, Dr. Tim Williams, had previously worked with him, and so had one of our senior civil servants, Chris Tweedale (Andrews, 2014). His engagement allowed me to set the agenda for the Department in my first 6 weeks: he was seen as an authoritative source, and his involvement also sent a signal within the department about the likely direction of travel.

Trust underpins the capacity of the policy community to turn evidence-generation into policy. Understanding the stakeholder networks where policy ideas are generated, and the points at which intervention may be most valuable (earlier in the development of a policy, rather than later, when ideas may have been filtered by internal assessments within government, the availability of resources or the outcomes of consultation), is therefore key to practical engagement. As Mulgan (2013) says:

Instead it matters a lot who gives the advice—and whether they are trusted and reputable. It matters how advice is given, and in particular how it is framed—preferably fitting the cognitive style of the receiver, and with a tone that is neither hectoring nor patronising.

There are many routes into identifying opportunities for evidence contributions. Ministers may use speeches to float ideas that are at an early stage of development: I certainly did (Andrews, 2014, p 44). (So, incidentally, did Michael Gove: see Nelson, 2012). Indeed, ministerial speeches are to policy what academics’ conference papers are to journal articles. Circulating within those policy networks at conferences, seminars, and other events, developing ideas which are related to—or indeed, sometimes challenge—the direction of policy, builds the cultural and social capital, and individual and institutional capacity, on which the transmission of evidence may be based: indeed, in the media sector, some think of it as ‘a methodology of hanging about with businessmen (and women), policy makers and politicians’ (Collins, 2009, p 10). Policy inquiries by committees in Parliament, the devolved institutions or the European Parliament may themselves provide opportunities for engagement. Analysis of policy exchanges in committees or in plenary sessions in Parliament or the devolved institutions will also provide better understanding of context. (As a Minister moving into a new portfolio in December 2009, I asked my Private Office to pull out the relevant Assembly Question and Answer sessions my predecessor had undertaken over the previous year or two, simply to get a feel for how existing issues were being addressed by the Opposition parties and by stakeholders.)

Committee inquiries themselves, and government consultations, will also provide platforms for presentation of ideas in written and sometimes oral format. Presenting complex ideas in simple terms to parliamentary committees or individual parliamentarians is a skill in itself to be cultivated. Davies, reviewing the success of evidence-based policymaking worldwide, and drawing on CHSRF (2001), reiterates the importance of presenting research findings in a ‘1:3:25’ format—one page of main messages, a three-page executive summary, and the findings in no more than 25 jargon-free pages (Davies, 2004). Certainly this is a helpful approach for dissemination of research findings, not only to ministers but to the general media.

It is important to identify likely sources of influence on Ministers. Ministers will usually, though not always, have had some connection with their policy portfolio beforehand. They may have served in opposition with the same brief; they may have served on a committee relevant to their portfolio as a backbencher; some may have been stakeholders themselves in earlier lives in the portfolio area in which they now operate as a Minister. They may have long-standing connections with think-tanks, unions, pressure groups, industry. (For example, Ball and Exley (2010) and Schlesinger (2009a, b) have separately and in different contexts mapped the networks of leading New Labour special advisers from 1997–2010). They develop points of connection and reach judgements on whom to rely. Academics can strengthen the political capital of their evidence by building trust over time—turning that evidence into a useful ‘information subsidy’ for governments. As Schlesinger and Tumber argued in their study of the reporting of crime:

Just as information subsidies flow from sources to journalists, so too do they flow from pressure groups to legislators. This process may be linked to the credibility that a given group has achieved with politicians over a period of time. (Schlesinger and Tumber, 1994, p 96)

Ministers are likely to be supported not only by policy civil servants who will be taking part in those networks but also by special advisers who will themselves form judgements about the expertise on offer both within and outside Government. (Indeed, special advisers may play a critical role for Ministers in identifying gaps in policy knowledge, expertise or skill within the department: my special advisers in the Welsh Government certainly played that role in the past, in some cases identifying academic specialists on whose expertise we could draw.) Identifying opportunities to bring special advisers or policy officials into discussions within a university or with a research team through hosting events is a valuable exercise. These interactions may in fact generate further research or impact ideas. Think-tanks can provide a useful bridge between academics and policy-makers (Taylor, 2011). As Education Minister in Wales I also had a Ministerial Advisory Board, with a Nolan-appointed membership drawing on outside expertise (Andrews, 2014): this enabled me to examine longer-term issues and to gather expertise that I could weigh against the advice offered by officials. The board was continued by my successor (Welsh Government, 2014).

The authorising environment

Finally, ministerial decisions are not generally taken in a vacuum. There is a wider authorising environment (Moore, 1995), and those seeking to influence policy also need to be influencing that wider environment. Ministers will be operating in an environment conditioned by a variety of external and internal factors. Internally, they will want to win allies amongst other Ministers—and retain the support of the person who appointed them, the First Minister or Prime Minister. They will want to keep on board, and elicit vocal support from, their party colleagues, both within the political institution and outside it. They will have a sense of the political players who have a bearing on their own party decision-making, from unions to pressure groups to local political office-holders. In other words, ministers have ‘situated agency’ (Bevir and Rhodes, 2006, p 4)—they are situated within a context defined by a party programme, a history of prior policies, a balance of power within a Cabinet, particularly in a coalition context, and a budgetary framework.

Above all, they are likely to be concerned about the media perception of their actions, as that will condition the climate for the delivery and implementation of policy: it will also condition how others—Prime and First Ministers and the wider public—judge the effectiveness of their policies (Jary, 2015). Where specialist policy correspondents exist, Ministers will probably have long-standing relationships and a keen understanding of which ‘experts’ have the ear of the journalist. Specialist media outlets such as weekly magazines may carry a disproportionate level of influence with key stakeholders. The media can reinforce evidence—or can undermine it, particularly if the evidence flies in the face of the underpinning ideological position of a particular newspaper. However, the media can provide platforms for evidence-holders to get ideas across to other stakeholders, to officials, to advisers, and occasionally to ministers themselves. While constant carping will never be persuasive, intelligent, constructive criticism and policy entrepreneurialism or provision of solutions will be better received. Print media outlets offer opportunities to provide more popularly packaged summaries of key elements of research. Online outlets, sometimes self-generated, sometimes institutionally hosted, provide further platforms.

There is also a historical context to the environment in which ministers operate. As Wicks points out, some aspects of policy build on legislation developed decades ago, such as the creation of National Insurance. Certain policies therefore may be path dependent. Developments in more recent times should be retained within the corporate memory of a department—but that isn’t always so (Wicks, 2012, p 595; Hillman, 2016, p 331; Andrews, 2014, p 33). Academics with detailed specialist knowledge may be in a position to fill that gap in memory, as the ‘History and Policy’ initiative has begun to do (History & Policy, 2017).

All of this therefore requires sensitivity to the context of ministerial decision-taking. Understanding how evidence can help a minister achieve their goals and the wider government goals is therefore key to any intervention. Policy-making can be a complex, contested and crowded arena—nevertheless there is space for interventions and interpellations by academics and others. After all, others do so, often on limited resources: the academic literature on pressure groups is full of examples of inadequately-resourced groups having a nonetheless disproportionate impact on policy (see, for example, Donnison, 1981; Maloney et al., 1994; Seyd, 1976; Field, 1982; Wilson, 1970).

Indeed, as Rhodes has pointed out, in some domains, interest groups ‘become institutionalised’:

These routine, standardized patterns of interaction between government and insider interests become policy networks (2015).

Those policy networks are part of the overall authorising environment: theoretically informal, but in practice a routine part of decision-making which needs to be accommodated. They are ‘policy advisory systems’, which help to construct the context of decision-making (Halligan, 1995; Craft and Howlett, 2012, 2013). They illustrate the porous nature of modern governance, providing channels for influence by academics and other specialists. Indeed, sometimes governments may wish to out-source policy thinking to organised policy networks, particularly on emerging and controversial areas, as with the recent work undertaken by the Royal Society and British Academy on data governance (Royal Society, 2017).


I have been a government minister (Andrews, 2014)—and I have been in the position of seeking to influence governments (Andrews, 2005). In both roles timing, trust and context have been key determinants in realising policy goals. To be effective in grounding evidence in governmental policy, communications need to be concise, and messages clear, delivered consistently and at relevant times. There is no shortage of toolkits to advise those wishing to promote evidence to policy-makers. The AUE, the IfG and indeed the ESRC have provided guides on how best to communicate evidence to policy-makers, as well as assessments of how better policy can be made (ESRC, 2016; IfG, 2011, 2013; Maybin, 2013; AUE, 2016a, b). Indeed, I sometimes wonder if we are creating an evidence-based policy industry (Davies, 2005).

Expertise is still valued. But it needs to be well-directed. It is also possible that there will be a shift in the nature of the evidence that is seen as most valuable. In my time as a Minister I saw randomised controlled trials and policy pilots become more important and evaluations less so; design principles adopted more widely in the approach to policy development (Hilton, 2015, p 44); a continuing desire to focus on what has been proved to work; foresight and scenario planning growing in importance with rapid changes in technology; but, above all, experience of implementation and delivery becoming a sought-after skill.

Of course, evidence-free policy will attract headlines, particularly where it ostentatiously fails (King and Crewe, 2013). But attention to the routines of government would demonstrate that evidence is not only welcomed, but has become institutionally embedded in the processes of government: and that with four governmental spaces developing and implementing policies in the UK, new approaches are being constantly, if not consistently, trialled.

Government policy-making, whether at UK or devolved levels, will always be contested and always complex. Ensuring evidence is appropriately valued places a responsibility on academics to understand how specific ministers develop mechanisms to manage the time constraints they face; to invest the time needed to develop relationships of trust and confidence, which will neither happen overnight nor necessarily pay off immediately while policy is in development; and to circulate, and indeed mobilise, within the interest groups and media which help to shape the policy environment.

Data availability

Data sharing is not applicable to this paper as no datasets were generated or analysed.

References
  1. Andrews L (2005) A UK Case: Lobbying for a new BBC charter. In: Harris P, Fleisher CS (eds) The handbook of public affairs. Sage, London, p 247–268


  2. Andrews L (2014) Ministering to education. Parthian, Cardigan


  3. Andrews L (2015) How Welsh public services benefit from cross-UK evidence exchange, Institute for Government. Accessed 27 Nov

  4. Andrews L (2016a) Written statement—increasing electoral registration in Wales by Leighton Andrews, Minister for Public Services, Welsh Government. Accessed 16 Mar

  5. AUE (2015) Increasing divergence: More or less reason for exchange. Alliance for Useful Evidence, London

  6. AUE (2016a) Using research evidence: A practice guide.

  7. AUE (2016b) Using evidence: What works.

  8. Ball SJ, Exley S (2010) Making policy with ‘good ideas’: Policy networks and the ‘intellectuals’ of New Labour. J Educ Policy 25(2):151–169


  9. Barber M (2007) Instruction to deliver. Politico’s, London


  10. Barber M (2015) How to run a government so that citizens benefit and taxpayers don’t go crazy. Allen Lane, London


  11. Barber M (2017) How governments can get things done. Presentation to Cardiff Business School, Cardiff, 2 March

  12. Bevir M, Rhodes RAW (2006) Governance stories. Routledge, London


  13. Bevir M, Rhodes RAW (2010) The State as Cultural Practice, OUP, Oxford

  14. BIT (2015) Update Report 2013–2015.

  15. Breckon J (2016) Evidence in an era of ‘post-truth’ politics. Alliance for Useful Evidence, London

  16. Bristow D (2016) Fuelling the evidence ecosystem: Thoughts from the PPIW second anniversary event, 30 March.

  17. Burnett J, Duncan S (2008) Reflections and observations: An interview with the UK’s first chief government social researcher. Crit Soc Policy 28(3):283–298


  18. Cabinet Office (2011) The Cabinet Manual.

  19. Cairney P (2009) The role of ideas in policy transfer: the case of UK smoking bans since devolution. J Eur Public Policy 16(3):471–488


  20. Cairney P (2016) The politics of evidence-based policy making. Palgrave Macmillan, London


  21. Cardiff University (2015) Welsh think tank marks a successful first year Accessed 19 Dec 2016

  22. CHSRF (2001) ‘Reader-friendly writing: 1:3:25’, Canadian Health Services Research Foundation.

  23. Civil Service blog (2016) Devolution and you—how to get the most out of ‘interchange shadowing’ 9 March.

  24. Civil Service Code (2015) Civil Service Code.

  25. Cleary H, Reeves R (2009) Ministerial churn, Demos, London

  26. Collins R (2009) Three myths of internet governance. Intellect Books, Bristol


  27. Craft J, Howlett M (2012) Policy formulation, governance shifts and policy influence. J Public Policy 32(2):79–98


  28. Craft J, Howlett M (2013) The dual dynamics of policy advisory systems. Policy Soc.

  29. Davies, P (2004) Is evidence-based government possible? Jerry Lee Lecture 2004 Annual Campbell Collaboration Colloquium, Washington D.C., 19 Feb.

  30. Davies P (2012) The state of evidence based policymaking and its role in policy formation. Natl Inst Econ Rev 219:1241–1252


  31. Davies W (2005) Evidence-based policy and democracy. Open Democracy. Accessed 23 Nov

  32. Donnison D (1981) The politics of poverty. Wiley-Blackwell, London


  33. Downe J, Martin S, Bovaird T (2012) Learning from complex policy evaluations. Policy & Politics 40(4):505–523

  34. Drabble S (2011) Ministerial effectiveness, a literature review. Institute for Government, London


  35. Edwards J (2017) This memo from Winston Churchill on ‘Brevity’ is all you need to improve your writing. Business Insider. Accessed 26 May

  36. Eisenhower D (1961) Speech to the century association, 7 Dec

  37. ESRC (2016) Presenting your case.

  38. Estyn (2011) Tackling poverty and disadvantage in schools: Working with the community and other services.

  39. Field F (1982) Poverty and politics. Heinemann, London


  40. Freedman D (2008) The politics of media policy. Polity Press, Cambridge


  41. Goldacre B (2013) Teachers! What would evidence based practice look like? Accessed 15 March.

  42. Hadfield M et al. (2017) Developing the capacity to support beginning teachers in Wales: Lessons learnt from the masters in educational practice. Wales J Educ 19(1):90–106


  43. Halligan J (1995) Policy advice and the public sector. In: Peters BG, Savoie DT (eds) Governance in a changing environment. McGill-Queen’s University Press, Montreal, p 138–172


  44. Halpern D (2015) Inside the nudge unit. WH Allen, London


  45. Harris A, Jones M (2017) Professional learning communities: A strategy for school and system improvement? Wales J Educ 19(1):16–38


  46. Hartley J (2010) Political leadership. In: Brookes S, Grint K (eds) The new public leadership challenge. Palgrave Macmillan, Basingstoke

  47. Hartley J (2011) Learning in the whirlwind: Politicians and leadership development. Publ Money Manag 31(5):331–338


  48. Hayward J (2012) Find it, make it, use it share it: learning in digital Wales.

  49. Henderson M (2012) The Geek manifesto. Bantam Press, London


  50. Hillman N (2016) The coalition’s higher education reforms in England. Oxf Rev Educ 42(3):330–345


  51. Hilton S (2015) More human. WH Allen, London


  52. History & Policy (2017) History & Policy: Who we are.

  53. Hyman P (2005) Reforming zeal is no substitute for better education policies. Guardian.

  54. IfG (2011) Policy making in the real world.

  55. IfG (2013) Is evidence enough? The limits of evidence-based policy making.

  56. IfG (2015) Akash Paun and Robyn Munro, ‘Governing in a Looser Union’ Institute for Government, 2015.

  57. IfG (2016a) ‘Trust in government is growing – but it needs to deliver’. Institute for Government, 19 September 2016.

  58. Jary C (2015) Working with ministers, Civil Service Policy profession.

  59. John P (2016) The Potential for Welsh Nudges. Accessed 6 Jun

  60. Jones Sir D. (2016) The ultimate ‘policy lab’ - how the four UK nations can learn from each other. Accessed 26 Apr

  61. Keating M, Cairney P, Hepburn E (2012) Policy convergence, transfer and learning in the UK under devolution. J Reg Fed Stud 22:289–307


  62. Kelly A (1994) Interview with Michael D. Higgins, minister for the arts, culture and the Gaeltacht. Int J Cult Policy 1(1):73–89


  63. Kelly M (1993) Man in the News: A master of the image: David Richmond Gergen. New York Times, 30 May

  64. King A, Crewe I (2013) The blunders of our governments. OneWorld publications, London


  65. Klein R, Marmor TR (2008) Reflections on policy analysis: Putting it together again. In: Goodin RE, Moran M, Rein M (eds) The Oxford handbook of public policy. Oxford University Press, Oxford


  66. Lawrence NS, Chambers JC, Morrison SM, Bestmann S, O’Grady G, Chambers CD, Kythreotis A (2016) The Evidence Information Service as a new platform for supporting evidence-based policy: a consultation of UK parliamentarians. Evid Policy, 6 June, Open Access

  67. Lynch P (2006) Governing devolution: understanding the office of First Minister in Scotland and Wales. Parliam Aff 59(3):420–436

  68. Maloney WA, Jordan G, McLoughlin AM (1994) Interest groups and public policy: The insider/outsider model revisited. J Publ Policy 14(1):17–38


  69. Martin S (2013) PPIW opens for business, Accessed 1 Oct

  70. Maybin J (2013) Experience-based policymaking. Accessed 16 Apr

  71. McCormick J (2013) Evidence exchange: Learning from social policy across the UK. Carnegie UK, Joseph Rowntree Foundation.

  72. Moore C (2010) Ignore the grumblers, David Cameron is following in a noble tradition. Daily Telegraph, 19 Feb.

  73. Moore M (1995) Creating public value. Harvard University Press, Cambridge, Mass


  74. Moore SC, O’Brien C, Alam M, Cohen D, Hood K, Huang C, Moore L, Murphy S, Playle R, Sivarajasingam V, Spasic I, Williams A, Shepherd J (2014) All-Wales licensed premises intervention (AWLPI): a randomised controlled trial to reduce alcohol-related violence. BMC Public Health 14.

  75. Mulgan G (2013) Experts and experimental government. Accessed 5 Apr

  76. National Archives (2013) ‘Churchill’s call for brevity’. Accessed 17 Oct

  77. Nelson F (2012) Pay attention in class! Michael Gove is teaching the art of politics. Daily Telegraph. Accessed 15 Nov

  78. Nicholl A (2013) The capacity of the civil service in Wales. In: Osmond J, Upton S (eds) A stable, sustainable future for Wales. UK’s Changing Union Project, Cardiff


  79. Nutley S, Davies, H, Walter I (2002) Evidence-based policy and practice: cross-sector lessons from the UK, ESRC UK Centre for Evidence Based Policy and Practice, Working Paper, St Andrews, 9 August

  80. ODI (2015) ODI to forge stronger connections between UK data innovators and government. Accessed 3 Nov

  81. OECD (2014) Improving schools in Wales: An OECD perspective.

  82. Parsons W (2002) From muddling through to muddling up: evidence based policy making and the modernisation of British government. Publ Policy Adm 17(3):43–60


  83. Paun A, Rutter J, Nicholl A (2016) Devolution as a policy laboratory. Accessed Feb 2016

  84. Pendlebury G (2016) Banging the drum for policy-making across the UK. Accessed 20 May

  85. Pierson P (2004) Politics in time. Princeton University Press, Princeton


  86. Pollitt C (2006) Performance information for democracy: The missing link. Evaluation 12(1):38–55


  87. Pollitt C (2008) Time, policy, management. Oxford University Press, Oxford


  88. PPIW (2014) First minister launches the public policy institute for Wales.

  89. PPIW (2015) Using evidence to improve policy. PPIW, Cardiff

  90. PPIW (2016a) Publications, Public Policy Institute for Wales.

  91. PPIW (2016b) What works in promoting what works? Accessed 5 Oct

  92. PPIW (2017) Cardiff University wins £6m funding to create the new Wales Centre for Public Policy. Accessed 14 Jun

  93. Price J, Thurston R (2012) Presentation.

  94. Raynsford N (2016) Substance not spin. Policy Press, Bristol


  95. Rhodes RAW (2011) Everyday life in British Government. Oxford University Press, Oxford

  96. Rhodes RAW (2012) Theory, method and British political life history. Polit Stud Rev 10(2):161–176


  97. Roberts C (2012) Developing the use of research evidence through policy trials: undated blog, 2012.

  98. Rose R (1972) The making of cabinet ministers. Br J Polit Sci 1(4):393–414


  99. Royal Society (2017) Royal Society and British Academy Data management and use: Governance in the 21st century.

  100. Rycroft P (2016) Getting to grips with the devolution challenge. Accessed 5 Feb

  101. Schlesinger P (2009a) The Politics of Media and Cultural Policy, MEDIA@LSE Electronic Working Papers No. 17.

  102. Schlesinger P (2009b) Creativity and the experts: New Labour, think-tanks and the policy process. Int J Press Polit 14(1):3–20


  103. Schlesinger P, Tumber H (1994) Reporting Crime. Clarendon Press, Oxford


  104. Seyd P (1976) Political quarterly 47(2):189–202


  105. Shadbolt N (2013) The geeky revolution that will change our lives. The Times, 28 October.

  106. Shepherd J (2014) How to achieve more effective services: The evidence ecosystem, what works network.

  107. Shepherd J (2016) Evidence—generation, compilation, evaluation and dissemination in the next assembly term. Accessed 3 Aug

  108. Smith R (2016) Mind the gap: Engaging education practitioners with evidence. Accessed 9 Sep

  109. Solesbury W (2001) Evidence based policy: Whence it came and where it’s going: ESRC centre for evidence based policy and practice, October.

  110. St Denny E (2016) Public policy ‘Made in Wales’? PPIW Blog Accessed 29 Nov

  111. Talbot C, Talbot C (2014) Sir Humphrey and the Professors: What does Whitehall want from academics? Policy@Manchester, University of Manchester

  112. Taylor C, Maynard T, Davies R, Waldron S, Rhys M, Power S, Moore L, Blackaby D, Plewis I (2014) Evaluating the foundation phase: Final Report.

  113. Taylor M (2011) Think tanks, public policy and academia. Publ Money Manag 31(1)

  114. Thurston R (2014) The role of evidence in policy-making: A Welsh Government perspective.

  115. UKSA (2016) UK Statistical System.

  116. Wales DTC (2014) The role of evidence in policy making: a Welsh government perspective, 30 January presentation by Richard Thurston, Head of the Knowledge and Analytics Service, Welsh Government.

  117. WASC (2010) Welsh Affairs Select Committee, Evidence given by Rt Hon Rhodri Morgan AM. Accessed 11 Jan

  118. Wells P (2007) New labour and evidence based policy making: 1997–2007. People Place Policy 1(1).

  119. Welsh Government (2011a): Review of the cost of administering the education system in Wales. Accessed 29 Jun

  120. Welsh Government (2011b) Structure of education services review task and finish group.

  121. Welsh Government (2012a) Minutes and papers of a meeting of the Cabinet. Accessed 27 Mar

  122. Welsh Government (2012b) Review of qualifications for 14 to 19-year-olds in Wales.

  123. Welsh Government (2014) Public Appointments to the Education and Skills Ministerial Advisory Board.

  124. Welsh Government (2015) Written statement—review of local authority administrative costs. Accessed 12 Jun

  125. Welsh Government (2016a) Minutes and Papers of a meeting of the Cabinet. Accessed 9 Feb

  126. Welsh Government (2016b) Open Data Plan. Accessed 31 Mar

  127. Welsh Labour (2011) Standing up for Wales. Welsh Labour manifesto, Cardiff


  128. White H (2016) ‘Putting Global Knowledge to Work Locally: The role of knowledge brokering in evidence-based policy and practice. Accessed 19 Dec

  129. Wicks M (2012) What ministers do. Polit Quart 88(3):585–598


  130. Wilson D (1970) I know it was the place’s fault. Oliphants, London

  131. Wiserd (2017) WISERD Dashboard.

  132. Written Assembly Questions (2016) Written Assembly Questions, WAQ71003. Accessed 28 Sept

  133. Youtube (2016) Gove: Britons have had enough of experts. Accessed 21 Jun


Author information



Corresponding author

Correspondence to Leighton Andrews.

Ethics declarations

Competing interests

The author declares no competing financial interests.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit


About this article


Cite this article

Andrews, L. How can we demonstrate the public value of evidence-based policy making when government ministers declare that the people ‘have had enough of experts’?. Palgrave Commun 3, 11 (2017).


