How did UK policymaking in the COVID-19 response use science? Evidence from scientific advisers

  • 1 University of Liverpool, UK
  • 2 University of Oxford, UK

Abstract

Background

Responses to COVID-19 have invested heavily in science. How this science was used is therefore important. Our work extends existing knowledge on the use of science in the pandemic by capturing scientific advisers’ experiences in real time.

Aims and objectives

Our aim was to present generalisable messages on key qualifications or difficulties involved in speaking of ‘following the science’.

Methods

Ninety-three interviews with UK scientific advisors and government officials captured their activities and perceptions during the pandemic in real time. We also examined Parliamentary Select Committee transcripts and government documents. This material was analysed for thematic content.

Findings and discussion

(1) Many scientists sought guidance from policymakers about their goals, yet the COVID-19 response demonstrated the absence of a clear steer, and a tendency to change course quickly; (2) many scientists did not want to offer policy advice, but rather to provide evidence; and (3) a range of knowledge informed the UK’s pandemic response: we examine which kinds were privileged, and demonstrate the absence of clarity on how government synthesised the different forms of evidence being used.

Conclusions

Understanding the reasons for a lack of clarity about policy goals would help us better understand the use of science in policy. Realisation that policy goals sometimes alter rapidly would help us better understand the logistics of scientific advice. Many scientists want their evidence to inform policy rather than determine the options selected. Since the process by which evidence leads to decisions is obscure, policy cannot be said to be evidence-based.

Key messages

  1. Scientific advisors need to know policy goals, but these can be obscure and changeable.
  2. Many scientists want their evidence to inform policy rather than determine the policy selected.
  3. Evidence feeds into decisions in obscure ways, so policy cannot be said to be evidence-based.
  4. ‘Evidence-informed’ policy is a more feasible aim than ‘evidence-based’ policy.

Background

Faced with a pandemic high-consequence infectious disease caused by a novel pathogen, responses to COVID-19 have invested heavily in science. How this science was used is therefore important. Our work extends existing knowledge on the use of science in the pandemic (Atkinson et al, 2020; WHO, 2020) by capturing scientific advisors’ experiences in real time. We show how a rhetoric of ‘following the science’ or ‘evidence-based policymaking’ pays insufficient attention to the meaning of these phrases or how they are operationalised. The case of the UK helps us think through what it means to use evidence for policy by offering a series of generalisable messages that sum up key qualifications or difficulties. First, we find that many scientists sought guidance from policymakers about their goals: they wanted to focus their work where it would be used. Yet cases like COVID-19 demonstrate the absence of a clear steer, and a tendency to change course quickly. Second, we show that many scientists did not want to offer policy advice, which suggests that (for them) policy could never simply be ‘evidence-based’. What they wished to provide was evidence to inform value-based policy choices. Finally, we note the range of knowledge that informed the UK’s pandemic response, examine which kinds were privileged, and demonstrate the absence of clarity on how government synthesised the different forms of evidence being used.

In the UK, cases were first detected in January 2020: this led to the convening of the government’s Scientific Advisory Group for Emergencies (SAGE). In March 2020 a complete national lockdown was announced. This was released from June 2020 as the infection rate fell, but lockdown was reintroduced in September 2020 as infections grew again. Restrictions were eased over Christmas but returned until gradual release began again in March 2021, by which time vaccination was having an impact.

Existing literature covers topics including governments’ use of science to depoliticise policy decisions, what types of science are used and why, and how scientists interact with government. On the first of these points, Weible et al (2020) noted that expert advice was used ‘to inform and legitimise governments’ choices, especially in high-stakes situations… experts become part of the rationale of governments’ responses and serve as a means to reassure the public’. Flinders (2020) characterised the UK government’s rhetoric as a device, by no means unique to the UK, to avoid blame for unpalatable decisions such as lockdowns.

The tendency to use science to depoliticise, or bureaucratise, decisions is heightened in the response to pandemics, a situation characterised by Baekkeskov and others as a time of ‘powerful experts and hands-off political leaders’ (Baekkeskov and Rubin, 2014; Rubin et al, 2021). These authors also note some consequences of the scientific uncertainty created by a novel pathogen: the need to act despite great uncertainty, rapid learning (including the retraction of publications, even in major scientific journals), and an absence of consensus about how strong evidence has to be to support policy decisions, an issue explored by Lancaster et al (2020).

Claims that policy is ‘led by the science’ illustrate Kavanagh et al’s (2021) point that ‘the idea of “evidence-based” policy is itself deeply political’. Others have warned how the government’s use of science can be selective (Stevens, 2020) and, more fundamentally, that science is a source of evidence about ‘what is’, whereas policy is about ‘what ought to be’ – since decisions are based on values, which science cannot provide (Atkinson, 2020). This reaffirmation of the importance of politics as societies’ way to manage disagreements about values is at odds with the outlook of many scientists, that policies should be determined by scientific evidence and that politics is an obstacle to this.

A second topic in the literature is the types of science used in epidemic responses. Salajan et al (2020) reviewed the literature on how evidence supports decision making during infectious disease outbreaks. The urgent tasks in a disease outbreak are to characterise the risks and identify the most effective management strategies: they found that policymakers relied for this principally on epidemiological data and mathematical modelling, sometimes without the necessary understanding of their limitations. Scientific uncertainty about how the disease will develop (a defining feature of pandemics) contributed to contested views about policy choices (though Salajan et al (2020) might have added that disagreements about values would have produced controversy in any case).

The prominence of mathematical modelling of infection rates, particularly in response to different interventions, has drawn much attention. Hine’s independent review of the UK response to the 2009 H1N1 pandemic concluded that the emphasis on modelling reduced opportunities for contribution by other disciplines (Hine, 2010). As Boin et al (2020) note, the ‘allure’ of modelling is that it looks authoritative, amid all the uncertainty. Leach et al (2021) argued that during COVID-19 ‘the UK government has claimed to “follow the science”, yet framed “the science” narrowly through reliance on risk-based epidemiological modelling’. They perhaps overstated this case: behavioural sciences were well represented on SAGE and its behavioural sub-group, the Scientific Pandemic Insights Group on Behaviours (SPI-B), although it can be argued that insufficient weight was given to this advice.
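
To make concrete the kind of modelling at issue, the sketch below is a minimal, illustrative SIR-type simulation in which an intervention is represented simply as a reduction in the transmission rate from a chosen day; all parameter values and the intervention_day argument are invented for illustration and are not drawn from SAGE or SPI-M work. The concreteness of the resulting numbers is part of the ‘allure’ Boin et al describe.

```python
# Minimal, illustrative SIR-type sketch of how an intervention can be
# represented in an infection model: the transmission rate (beta) is cut
# from a chosen day onwards. All values are assumptions for illustration,
# not estimates used by SAGE or SPI-M.

def simulate_sir(population=60_000_000, initial_infected=1_000,
                 beta=0.3, gamma=0.1, days=200,
                 intervention_day=None, beta_after=0.1):
    """Discrete-time SIR simulation; returns daily counts of infectious people."""
    s = population - initial_infected
    i = float(initial_infected)
    r = 0.0
    infectious = []
    for day in range(days):
        intervened = intervention_day is not None and day >= intervention_day
        b = beta_after if intervened else beta
        new_infections = b * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        infectious.append(i)
    return infectious

if __name__ == "__main__":
    no_action = simulate_sir()
    with_action = simulate_sir(intervention_day=60)
    print(f"Peak infectious, no intervention:        {max(no_action):,.0f}")
    print(f"Peak infectious, intervention on day 60: {max(with_action):,.0f}")
```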

A third topic in the literature concerns the ways scientists interact with government. Previous literature on using science in policymaking discussed ‘evidence-based policymaking’ and analysed why policymakers were governed by other considerations alongside science. The work of Lomas (2000) on the contextual influences on policy decision making is a good example, illustrating the problematic assumption that policy should be based on (rather than just informed by) scientific evidence. More recent work addresses the political nature of evidence: for example Hawkins and Parkhurst (2016) set out some ‘good governance’ criteria for its use. Recent publications by Cairney (2016; 2021), Oliver and Faul (2018) and Weible (2020) shed more light on the use of evidence. Our present article is an empirically rich contribution to a better understanding of the use of evidence in crisis situations, using data from people involved in the inner workings of the decision-making process.

Some literature on evidence-based policy making views ‘science’ and ‘policy’ analytically as two communities, a view that over-simplifies accounts of their interactions. The more recent ‘policy communities’ literature overcomes some of these problems, providing more insight into why some experts have more influence than others. Oliver and Faul (2018) discuss how these communities are constructed, describing them as ‘evidence-policy ecosystems’ where research knowledge is used alongside the views of other policy actors. Elsewhere, policymakers are described as paying varying attention to different experts based on their resources (for example, of relevant knowledge), and their choice of ‘insider’ or ‘outsider’ strategies in their approach to government (Jordan and Cairney, 2013; Cairney, 2016; Dunlop et al, 2018). Cairney (2021) describes the production and use of evidence in this way as ‘part of a political process in which the status, power, and strategies of participants can matter more than “the evidence”… scientists often face a stark choice: to “speak truth to power”… or follow the “rules of the game”… if they seek to inform government policy’.

Methods

We used two main sources of data: interviewing, and the evidence of witnesses to Parliamentary Select Committees. Beginning in February 2020 we rapidly identified interviewees who were involved in the provision of scientific advice to the government, using the existing networks of the NIHR Health Protection Research Unit in Emerging and Zoonotic Infections. Speed was essential to capture longitudinal data as early as possible. This provided a rare and valuable opportunity to hear individuals’ experiences and opinions in real time, and at least partially escape the issue of hindsight bias. We secured 93 interviews with ten people (Table 1). Interviews took the form of semi-structured calls in which we asked participants to update us on their activities. Data capture at the height of a pandemic was opportunistic: one participant spoke to us roughly every one to two weeks until July 2021; the others spoke less frequently, depending on their availability during hectic pandemic work schedules. Consent was obtained before interviewing began. Interviews were conducted via telephone or videoconference by PA and were audio-recorded and transcribed verbatim. Interviewees spoke on the condition of anonymity and we do not disclose the organisations where they work. Those quoted here have agreed to the use of their responses. The study was conducted under institutional ethical approval from the University of Liverpool Research Ethics Committee (ref. 5465).

Table 1:

Interview participants and interview details

Interviewee | Number of interviews (date range)
A | 4 (06.03.2020 – 03.06.2020)
B | 7 (17.02.2020 – 14.07.2020)
C | 5 (28.02.2020 – 02.07.2020)
D | 54 (26.03.2020 – 22.07.2021)
E | 1 (28.10.2020)
F | 6 (28.02.2020 – 26.11.2020)
G | 1 (28.10.2020)
H | 13 (04.03.2020 – 14.08.2020)
J | 1 (09.10.2020)
K | 1 (13.10.2020)

Alongside the interview transcripts, we examined Parliamentary Select Committee Minutes of Evidence for material from people we were unable to interview, including the joint chairs of SAGE, Patrick Vallance (the government’s Chief Scientific Advisor) and Chris Whitty (Chief Medical Officer). This source also included evidence from policymakers, a term we define as Ministers, their political advisors, and their most senior administrative civil servants (Table 2). We also accessed and analysed key government policy documents and official statements. The result is a rich assembly of sources reflecting varied points of view, combining public and anonymous material. Data from all types of source were coded in NVivo 12 Pro to abstract and synthesise information. We began from inductive coding, staying close to our data. Three themes were then identified as salient in the analysis, drawing also on previous literature and discussions with colleagues inside and outside the research team. These themes were: scientists’ search for guidance from policymakers about their goals; scientists’ reluctance to offer policy advice; and the difficulties of synthesising expert input of diverse kinds.

Table 2:

UK House of Commons Select Committee inquiries and witnesses used as data sources

Committee and inquiry | Date of evidence | Witness | Role in scientific advisory/decision making | Affiliation
Science and Technology Committee Inquiry: ‘UK science, research and technology capability and influence in global disease outbreaks’ (2021) | 25.03.2020 | Neil Ferguson | SAGE and SPI-M | Director, MRC Centre for Global Infectious Disease Analysis, Imperial College
 | 25.03.2020 | Patrick Vallance | Government Chief Scientific Advisor | UK Government
 | 16.04.2020 | James Rubin | SPI-B | Reader in the Psychology of Emerging Health Risks, King’s College London
 | 16.04.2020 | Graham Medley | SAGE and Chair of SPI-M | Professor of Infectious Disease Modelling, London School of Hygiene and Tropical Medicine
 | 03.11.2020 | Patrick Vallance | Government Chief Scientific Advisor | UK Government
 | 03.11.2020 | Chris Whitty | Chief Medical Officer | UK Government
 | 09.03.2021 | Patrick Vallance | Government Chief Scientific Advisor | UK Government
 | 09.03.2021 | Chris Whitty | Chief Medical Officer | UK Government
Joint Health and Social Care and Science and Technology Committees Inquiry: ‘Coronavirus: lessons learned’ (2021) | 21.10.2020 | Clare Gardiner | Director, Joint Biosecurity Centre | UK Government
 | 21.10.2020 | John Edmunds | SAGE, SPI-M | Professor at London School of Hygiene and Tropical Medicine
 | 21.10.2020 | Mark Woolhouse | SPI-M | Professor of Infectious Disease Epidemiology at University of Edinburgh
 | 04.11.2020 | Kate Bingham | Chair, UK Government Vaccine Taskforce | UK Government
 | 24.11.2020 | Matt Hancock | Secretary of State, DHSC | UK Government
 | 24.11.2020 | Patrick Vallance | Government Chief Scientific Advisor | UK Government
 | 24.11.2020 | Chris Whitty | Chief Medical Officer | UK Government
 | 26.05.2021 | Dominic Cummings | Former chief advisor to the Prime Minister | UK Government
 | 10.06.2021 | Matt Hancock | Secretary of State, DHSC | UK Government
Liaison Committee | 23.05.2020 | Boris Johnson | Prime Minister | UK Government

Findings and discussion

This section presents and discusses our findings on each of the three themes in turn, examining each under its own subheadings.

Scientists seeking to understand policymakers’ goals

Here, we look first at our interviewees’ more general comments about seeking to understand policy goals; then give more detail by looking at Reasonable Worst Case Scenarios (RWCS); we then note how policy goals became clearer during the pandemic, and finally how scientists found unpredictably timed announcements of new policy problematic.

Problems when the policy goal of COVID-19 response was unclear

Early in the pandemic, an interviewee told us: “[policymakers] would say, ‘what should we do?’ And [scientists] say ‘well what do you want to achieve?’ And we just go round and round in circles” (Interviewee D, 29 April 2020).

Previous literature shows that experienced scientific advisors want policymakers to communicate the goals of their policies: only then can scientific advisors focus on the areas where policymakers will use their help. Cairney (2021) and Smith et al (2021) argue that a more specific and actionable goal than ‘make the pandemic stop’ or ‘save lives’ was needed. These scholars set out a range of possible goals: suppressing the virus through strict lockdown measures, flattening the peak of infections to protect health services from being overwhelmed, protecting those most vulnerable to infection, minimising extreme poverty, and mitigating economic loss. As Smith et al (2021) add, clear and justified goals also provide the basis for allocating scarce resources to different elements of the pandemic response (for example who gets vaccines first).

For all these reasons, policy cannot ‘follow the science’ but should rather be ‘informed by the science’ (Atkinson et al, 2020). The Institute for Government’s report on science advice in the COVID-19 response cites the report of the Phillips Inquiry into Bovine Spongiform Encephalopathy (BSE) and Creutzfeldt-Jakob disease (CJD) in the UK to make the same point (Haddon et al, 2020).

When (and how) were UK goals set? In practice, although a ‘flattening the peak’ policy (to avoid a steep spike in hospital admissions that would overwhelm services) was announced in mid-March 2020, ministers did not communicate their priorities to science advisors clearly.

The basic strategy – suppression of the virus – was known. But beyond that… scientists … often had little idea of what politicians’ objectives were and what actions they were prepared to consider taking. …

[P]articularly in the initial months of the crisis, the questions that came to [SAGE] were often poorly formulated. With limited understanding of ministers’ thinking, scientists often struggled to answer them. This undermined the ability of scientific expertise to feed into political decisions. (Haddon et al, 2020)

Our evidence bears out Haddon et al’s view. Early in the response, advisors reported challenges in delivering scientific advice, due to the lack of an overall plan: “we can’t advise what the best next step is, because that is contingent on what the overall plan is, and there isn’t one. So it makes it quite hard for us to do our job in terms of scientific advice” (Interviewee D, 8 April 2020).

The example of reasonable worst-case scenarios

Because of their structured nature, official RWCS illustrate especially clearly the problems which arise when policy goals are unclear. The UK’s approach to managing ‘civil contingencies’ requires scientific advisors and officials to prepare RWCS, which represent the worst plausible manifestation of the risk to enable proportionate planning of policy responses (HM Government, 2020a). Rules for the preparation of RWCS required scientific advisors to avoid assuming future policy changes. In a somewhat Kafkaesque way:

it is the policy that determines the course of the epidemic, and the reasonable worst-case has to be policy neutral… if they can’t tell us what the policy is then there is no way in which we can model a reasonable worst-case for them… if they can’t tell us what the policy is going to be we can’t tell them what is going to happen. (Interviewee D, 14 October 2020)

On a number of occasions in 2020, RWCS were published and then exceeded, only then to be followed by a change in policy. Exceeding the RWCS seemed to act as a signal that an unpalatable choice could no longer be delayed. Scientific advisors started to feel that the RWCS had lost its original role and become a trajectory of how infection and death rates were expected to develop: “if that pattern continues then the reasonable worst-case isn’t a reasonable worst-case, because basically we go up to it and then the government reacts. It actually becomes what the epidemic is going to be” (Interviewee D, 14 October 2020).
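
The dilemma can be made concrete with a toy projection (a sketch with invented parameters and a hypothetical reaction threshold, not a reconstruction of any official model): a ‘policy-neutral’ worst case corresponds to a run in which no government reaction is assumed, while a run that assumes government cuts transmission once infections cross a threshold rises towards the worst case and then turns, which is the pattern Interviewee D describes.

```python
# Toy sketch of why a 'policy-neutral' reasonable worst case is hard to
# define: the projected trajectory depends on whether, and when, government
# is assumed to react. All numbers are invented for illustration only.

def project(population=60_000_000, seed=1_000, beta=0.3, gamma=0.1,
            days=200, reaction_threshold=None, beta_after=0.08):
    """Discrete-time SIR projection; returns daily counts of infectious people."""
    s, i, r = population - seed, float(seed), 0.0
    reacted = False
    trajectory = []
    for _ in range(days):
        if reaction_threshold is not None and i >= reaction_threshold:
            reacted = True                     # government assumed to act here
        b = beta_after if reacted else beta
        new_infections = b * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        trajectory.append(i)
    return trajectory

if __name__ == "__main__":
    worst_case = project()                            # no reaction assumed
    reactive = project(reaction_threshold=1_000_000)  # reaction assumed
    print(f"Peak infectious, no assumed reaction: {max(worst_case):,.0f}")
    print(f"Peak infectious, assumed reaction:    {max(reactive):,.0f}")
```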

Developments in goal-setting after December 2020

From December 2020 the Cabinet Office appeared to become better at stating a goal, and its interaction with scientific advisors became more effective. (No specific reason is clear, though changes in personnel probably played a part.) As an input to the RWCS, scientific advisors reported they were now permitted to use the government’s previous decisions as indications of how it might react next. More widely, the Winter Plan (HM Government, 2020b) of November 2020 provided a longer-term view of the government’s plans for the first time: “the Winter Plan, looking forward, the Chief Medical Officer said the other day… that they are now looking forward to the end of April [2021], which is a huge improvement… thinking about what impact the decisions have now on… five months ahead” (Interviewee D, 16 December 2020).

For this interviewee, late 2020 seemed a very significant turning point. Preparation began for a ‘Roadmap out of Lockdown’, published on 22 February 2021 (Cabinet Office, 2021), setting out the actions to be taken in a series of steps, and the tests to be met before each could be taken. Indicative dates were provided, though it was emphasised that the triggers would be ‘data, not dates’ (Johnson, 2021).

The frustrations of rapidly changing policy

There are good reasons why policy might need to change rapidly in a situation of (scientific) uncertainty. However, scientific advisors found it more frustrating when the timing of announcements was driven by the political process. At times, advisors reported responding to a request for evidence on a subject, only for policy announcements to be made before the work was completed or discussed in SAGE. In such cases, how can the government claim to have been ‘guided by the science’?

A couple of times… we have had a commission one week, got people working over the weekend ready for it to go to SAGE on Thursday…. And over the weekend, while people are working on it, the decision has been announced… that is quite demoralising for the people doing it. (Interviewee D, 27 January 2021)

In the modern policy environment of 24-hour news and social media, it is part of political accountability in an open society that policymakers will pay attention to this kind of ‘news management’, in which shaping public discourse becomes the most pressing requirement. We note merely that this situation makes it harder to secure good quality science advice.

Scientific advisors’ reluctance to give policy advice

This theme discusses scientific advisors’ concern that policymakers wanted them to exceed their advisory role, and then identifies fear of blame for policy decisions as one factor. We then look at scientists’ public explanations that their role was to give one input to policy alongside others. This theme closes by noting how, later in 2020, politicians became more ready to take ownership of policy, while some scientific advisors became readier to tell them what policy should be.

Advisors’ concern at being pressed to decide policy

At first, policymakers’ failure to set out a policy goal left scientific advisors feeling that government wanted SAGE to decide policy. That would exceed SAGE’s terms of reference, which say: ‘SAGE is responsible for ensuring that timely and coordinated scientific advice is made available to decision makers to support UK cross-government decisions in the Cabinet Office briefing room (COBR). The advice provided by SAGE does not represent official government policy’ (SAGE, 2020).

Recommending a policy to government would also contravene longer-standing guidance: ‘As well as providing scientific evidence, advisors may be asked to identify policy options as part of the advisory process, however the line between advising on/identifying policy options and making the decision on the final policy must be respected’ (Government Office for Science, 2010).

In the chaotic first few months of 2020, the government was simultaneously failing to give scientists a policy lead about what it wanted to achieve, yet trying to reassure the public that it was in control by presenting the steps it took as inevitable consequences of scientific advice. There were times when advisors had to filter requests, sending back any that were ‘not a scientific question’, and were viewed as ‘interpretation of a policy question’ (Interviewee C, 30 March 2020). Cairney’s (2021) description of a continuum, from ‘minimal’ guidance (helping resolve uncertainties of fact) to ‘maximal’ (helping reduce ambiguity about how to define and solve the policy problem), is useful here. Sending back requests viewed as asking advisors to interpret a policy question is a clear example of declining to give ‘maximal guidance’.

Vallance also sought to limit the scope of scientific advice in a national newspaper article about the pandemic response in May 2020. He stressed that science advice: ‘is advice. Ministers must decide and have to take many other factors into consideration. In a democracy, that is the only way it should be’ (Vallance, 2020).

Using the example of facemasks, one scientific advisor we interviewed described the difficulty of separating scientific evidence from other considerations such as the constrained availability of personal protective equipment (PPE). Again the emphasis is on resolving uncertainty and not going further:

We can say what the evidence is, and point out the weaknesses of that, but then we are pushed to give recommendations… we have to… try to avoid giving a recommendation… that is… a political choice, because there are much wider concerns, like diverting masks from healthcare workers, or risk compensation… many of these are not scientific, but they are important considerations nevertheless… no, we should interpret the science and give the best estimates of what the science is telling us, but the decision and all of the other evidence is a policy decision. (Interviewee C, 21 April 2020)

The question of blame

Linked to this was a concern from scientists not to be blamed for policymakers’ decisions:

Policies are being very scientific[ally] informed…. But also slightly nervous that… these are policy decisions and we are not taking them as scientists and it would be unfair if we were blamed for policies that didn’t follow the science, or based on scientific uncertainty were implemented and didn’t have the desired effect… we are giving the scientific advice but we are not making policy and sometimes that is a little bit blurred. (Interviewee C, 21 April 2020)

One of our interviewees went further, suggesting ministers might even be shifting responsibility to scientists for policies which did not ‘follow the science’:

It is absolutely right that government makes policy, that is government’s prerogative… but… transparently, either… following scientific advice, or… not…. They can do either, but they can’t do both. They can’t make their own decisions and say they are following … scientific advice, but not follow scientific advice. (Interviewee A, 6 March 2020)

Telling the public that other factors inform policy besides science

When asked by Parliamentary Select Committees how comfortable he was with the phrase ‘following the science’, SAGE member John Edmunds responded:

Pretty uncomfortable. It can hide a lot of things. It is pretty apparent that there is not one scientific view anyway. It never has been the case that it is just following the science… government have to weigh these things up against other things – the impact on the economy being one of the other very important aspects.… They have been weighing it all along.… They should perhaps be a little more honest and say, “Look, we are doing this”. I think maybe they are being a bit more honest about that now. (House of Commons Joint Health and Social Care and Science and Technology Committees, 2021)

Edmunds’ final sentence reflects scientific advisors’ success in resisting the ‘following the science’ rhetoric. Vallance and Whitty were instrumental in this. As early as March 2020, Vallance told a Parliamentary Select Committee:

I think the government have listened to the advice of SAGE very carefully and followed it. Clearly, there are decisions that need to be made by politicians on how they want to implement that advice, and those areas are, rightly, political decisions and not scientific ones. (House of Commons Science and Technology Committee, 2021)

Whitty’s gloss on this, in evidence given a year later, was: ‘The science is part of a decision-making process. It is not the full decision-making process in these very big, societally very important decisions’ (House of Commons Science and Technology Committee, 2021).

The discretion an official like Vallance has to exercise in evidence to a Parliamentary Committee is clear in the imprecision of words such as ‘listen’, ‘follow’ and ‘implement’. We get closer to the relationship between SAGE and policy by looking at its procedure. As in previous crises such as BSE and H1N1 influenza, SAGE defined its task as to provide an expert consensus on the best scientific evidence. SAGE distinguished the ‘evidence’, which it collectively synthesised, from the ‘advice’, based upon it, which Vallance and Whitty, as government officials, would draft (Interviewee D, 8 April 2020). This distinction between ‘evidence’ and ‘advice’ dates from, at the latest, the Hine report (Hine, 2010).

The changing position later in 2020

From about May 2020, Ministers began to move away from simply claiming to follow the science (Atkinson, 2020). It could be that by this stage they sometimes wished to prioritise economic imperatives over epidemiological ones: summer 2020 was a period when, according to (then) chief policy advisor Dominic Cummings, Prime Minister Johnson was denying that lockdowns worked, and taking steps to release the first lockdown with the ‘plan to rebuild’, which laid out ambitions to reopen the economy and society (Cabinet Office, 2020; Cummings, 2021). By November 2020, Matt Hancock, then Secretary of State for Health and Social Care, was also drawing this distinction between spring and summer 2020, telling a Select Committee:

If you are following the science, it implies an automaticity, as opposed to ministerial judgment, taking into account all of the effects based on the science. That is a truer reflection of what we do.… in the period that you are talking about [early March 2020], we absolutely… followed the scientific advice… and based our decisions on that. My point about saying that the better phrase is to be guided by the science is that there are times when that was not the case. (House of Commons Joint Health and Social Care and Science and Technology Committees, 2021)

By September 2020, many scientific advisors were frustrated that the government was not accepting the epidemiological case for a second lockdown. Here we see them taking a different view to their earlier preference for what Cairney labels ‘minimal guidance’. They now tried to persuade policymakers to take stronger measures such as a ‘circuit breaker’ lockdown, by: “writing one-page summaries and policy papers and trying to get a message to land that we’re actually where we were in late February early March and that we didn’t want to be back in the same position again” (Interviewee D, 23 September 2020).

To sum up the theme of scientists’ reluctance to give policy advice, the large amount of public discourse, for instance in Committee hearings, about the idea of ‘following the science’, itself indicates that the government’s early use of the term was a discursive choice with important implications. Cairney’s work (2021) on the early period of SAGE’s involvement with COVID-19, using his distinction between minimal and maximal guidance, helps to show what was at stake. He concludes that, during that period, SAGE’s advice ‘underpinned how ministers defined the policy problem’, with ‘a major impact on the initial substance of policy and timing’. Our interviewees said they did not want this: that they were much more comfortable confining themselves to the task of diminishing uncertainty. Certainly their public statements recorded here (such as Vallance’s and Whitty’s) push back against the argument that scientists could decide policy. And yet some did indeed advocate particular policy options, such as a ‘circuit breaker’ lockdown in autumn 2020. It seems that ‘minimal’ was their considered preference, but some resorted to ‘maximal’ guidance in a short-term crisis of rising infection rates, a position defended on ethical grounds by Birch (2021).

Disparate kinds of science and the problem of synthesis

This theme discusses our evidence about what kinds of knowledge were used and which were privileged, before turning to the difficulty of synthesising these, and the solution adopted by the government in the COVID-19 response.

The different types of useful ‘science’ and ‘knowledge’

An effective pandemic response has to draw on knowledge of many kinds. Which of these society treats as ‘science’ matters, because that label privileges such knowledge over other kinds. The UK response was inevitably shaped by the UK’s existing science policy and political culture, which emphasise the biological, physical and mathematical sciences over the social sciences (with the humanities nowhere). In any emergency, UK government turns for academic advice first to the numerate disciplines. Cummings’ enthusiasm about being advised on coronavirus (and other challenges) by ‘a smart physicist’ is one illustration (Cummings, 2021). This emphasis on quantification and what are (revealingly) called the ‘hard sciences’ has benefits: for example, RWCS based on modelling are valuable (though they can be misunderstood or misused).

However, it also limits the influence of advice from valuable ‘soft’ sciences such as the behavioural sciences. Reflecting recent pre-coronavirus trends in UK policymaking, such as the creation of a Behavioural Insights Team in the Cabinet Office in 2011, behavioural sciences are present in SAGE and have their own subgroup (SPI-B). However, the advice of behavioural scientists has influenced government decisions less than that of the modellers and infectious disease specialists. For example, their advice to make compliance with lockdown easier for the worst-off by increasing financial compensation to those required to self-isolate at home, and to emphasise rewarding compliance over punishing non-compliance, has not been reflected in government policy.

There is at times a fairly problematic assumption that SAGE and its use of evidence are, since they represent ‘science’, somehow unbiased and purely objective representations of reality. For example, one interviewee told us that science advisors on SAGE and its subgroups are there to: ‘generate unfettered and pure evidence, and it then goes to a different group of people to then weigh up the different evidence and say we are going to do X rather than Y’ (Interviewee E, 28 October 2020), while another spoke of what science is ‘telling us’, as though it spoke unequivocally. Most of our interviewees thought that the proper role of SAGE was: ‘to give the science advice, not to make the decision and not to give economic advice’ (Interviewee K, 13 October 2020).

We consider the relationship between scientific advice and economic advice in the next section.

How different forms of knowledge are synthesised

The UK government acknowledges that science does not give policy solutions, but rather, ‘gives you a range of issues and evidence on which you need to take a broader decision’ (Interviewee G, 28 October 2020). For those decisions it also uses evidence from elsewhere:

The government understands the importance of considering a range of relevant evidence and factors in all its decision making in relation to the pandemic. This has been its approach throughout. Alongside the views of the scientific community… [it] has also undertaken significant wider analysis and evaluation to inform decisions. (Department of Health and Social Care, 2021)

In the words of a SAGE member: “there is multidisciplinary science on SAGE, but questions about societal preferences and economic questions are not represented on SAGE, and of course they are huge aspects of making the decision” (Interviewee D, 16 April 2020).

Policymakers interpret and weigh up a range of sometimes contradictory voices. It has been hard to see the mechanisms for achieving this synthesis. Where (for instance) does economics come in? We have seen how Whitty and Vallance distil advice out of SAGE’s consensus about the evidence. That advice goes into the central government machinery for emergency response: the Cabinet Office, including its Civil Contingencies Secretariat, and COBR with its ministerial committees. These coordinate policy for the UK government (Cabinet Office, 2012). These central policymaking processes considered many different inputs to the COVID-19 response, alongside Whitty and Vallance’s advice.

Our sources were aware of, and responded to, the view that SAGE could integrate economic evidence with scientific evidence. After all, economists take part in the Welsh Government’s Technical Advisory Group (TAG), an approximate equivalent of SAGE (Welsh Parliament, 2020), so why not also in England? One interviewee described how the Welsh TAG benefited from careful selection of members so that all the necessary expert disciplines were present, alongside government officials (Interviewee D, 18 November 2020).

Some of the discussion about whether to use economics in SAGE suffers from eliding two different things: a capacity to synthesise evidence about economic matters (which appears to serve the Welsh TAG well), and the incorporation of economic concerns in policymaking, for example weighing the effects on employment or public spending in choosing between options: “You know, [SAGE] come out and lay everything as open and transparently as we can, and then Treasury looks at their secret book and says ‘oh no, we can’t afford that’. Without actually sharing what it is that they have done” (Interviewee D, 28 October 2020).

Whatever their reasoning was, we observe that the most senior participants backed the government view that economic input of any kind should occur within government, not at SAGE. This illustrates Cairney’s argument (2021) that, to become insiders, experts have to operate by the rules of the policymakers’ game:

It is inappropriate for SAGE to be the place where all economic advice gets integrated with the health advice… we have been very clearly instructed that the economic impact of this sits in [the Treasury]. [The Treasury] looks at the economic impact. Therefore we do not look at the economic impacts and we are not mandated to. (House of Commons Science and Technology Committee, 2021)

The question of where economics might be integrated is linked to the question of transparency, as Interviewee D’s comment above shows. This is because SAGE evidence on COVID-19, unlike SAGE evidence in earlier emergencies, began to be published almost in real time from March 2020, whereas most other inputs to policy remained unpublished. Those who would like to see greater transparency therefore favour the incorporation of more kinds of evidence within SAGE’s work. Government resists this: when a Parliamentary committee asked Matt Hancock why crucial advice (including economic) is not published in the same way as SAGE papers, he responded:

I think it is reasonable for Cabinet Committees to [receive] papers by the civil service… on which they make decisions, that are not fettered by the thought that they may soon be published. It is a long-standing convention of how you run government that there has to be a protected space for decision making… I see the Cabinet papers for the economic assessment, and I think it is reasonable that they should be written without the expectation of imminent publication. (House of Commons Joint Health and Social Care and Science and Technology Committees, 2021)

This restates the UK constitutional position that officials’ advice to Ministers is confidential. Adding economists to SAGE would not alter it, and would not make public the Treasury’s additional sources of economic evidence, or the ways all this evidence affected policy decisions. To sum up the discussion of evidence synthesis, it is important not to let the relative visibility of SAGE and science make us think that this is all of the advice government is using: there are many other elements. It is, as is normal in policymaking, unclear how they are synthesised.

Conclusions

The strength of this study lies in the unique access we had in real time to a group of ten key scientific advisors, as their experiences and views of the COVID-19 response developed. On the other hand, these interviewees needed to be given anonymity, making it harder to situate their evidence in context. A wider pool of advisors could have given additional perspectives. We approached Ministers, their political advisors and their senior officials for interviews, but they were unavailable. Their views are represented here by their public statements.

We have three main conclusions, each with implications beyond the context of the COVID-19 response. First, experienced scientific advisors do not produce findings and then seek a policy audience for them. Rather, they know that they operate in policy communities, where their influence depends on delivering immediately useful knowledge: material that helps policymakers achieve their goals. Therefore they seek to understand what those goals are. More work on the reasons for a lack of clarity about policy goals would help us better understand the use of science in policy. A realisation that policy goals sometimes alter rapidly would help us better understand the logistics of scientific advice.

Second, the influence over policy which scientific advisors seek can vary, from diminishing the uncertainty (the ‘minimal’) to reframing the problem and the options (the ‘maximal’), or even to advocating one option. We found a range of behaviour, of which the most frequent, and the most interesting, because less expected, was the reluctance to do more than inform policy by reducing uncertainty. Our interviewees rarely wanted to advocate one policy option over another. If our interviewees wanted evidence-based policymaking, then it was in this limited sense, better described as ‘evidence-informed’.

Finally, we have shown that, despite the greater transparency of SAGE’s work since 2020, it remains very unclear how scientific advice is combined with other kinds of evidence and leads to the making of policy. Those who want to see ‘evidence-based’ policy can take little comfort from the COVID-19 experience: evidence is still processed in non-transparent ways, from which policy emerges. Measures taken, such as publication of SAGE papers, or discussed, such as considering economic evidence at SAGE, do not change this. Those who want to see ‘evidence-informed’ policy, however, at least have the satisfaction that scientific evidence has become more visible. Accordingly, deductions about how far it has informed policy have become easier.

Funding

UKRI/NIHR Grant No. CV220-202.

Acknowledgements

We are most grateful to our ten interviewees. Thanks are also due to two anonymous reviewers. Besides the project funding from the NIHR, our broader work on the Governance of Health is supported by Wellcome Trust award No. 104845. We also received support from the National Institute for Health Research (NIHR) Health Protection Research Unit in Emerging and Zoonotic Infections at University of Liverpool in partnership with Public Health England, and in collaboration with Liverpool School of Tropical Medicine and the University of Oxford (Grant No. NIHR200907). The views expressed are those of the Authors and not necessarily those of the NHS, the NIHR, the Department of Health and Social Care or PHE. We are grateful also for the support of Liverpool Health Partners, and the Centre of Excellence in Infectious Disease Research (CEIDR), Liverpool.

Research ethics statement

The study was conducted under institutional ethics approval from the University of Liverpool (ref 5465) on 10/2/20. All participants were provided with written information prior to taking part in the research and had the opportunity to ask questions; all participants gave informed consent to take part in the study.

Contributor statement

PA and SS conceptualised the study. PA and HM designed the study, with input from CP, SS, and TS. PA conducted the interviews. A-MM, HM and PA transcribed interview data. HM and PA conducted data analysis and interpretation, with contributions from A-MM, CP, SS and AJB. PA and HM wrote the first draft and revised subsequent drafts of the manuscript, with comments and edits from A-MM, CP, AJB, SS and TS.

Conflict of interest statement

The authors declare that there is no conflict of interest.

References

  • Atkinson, P. (2020) The policy dynamics of COVID-19: what science can and cannot do, https://blogs.lse.ac.uk/politicsandpolicy/what-science-can-and-cannot-do/.
  • Atkinson, P., Gobat, N., Lant, S., Mableson, H., Pilbeam, C., Solomon, T., Tonkin-Crine, S. and Sheard, S. (2020) Understanding the policy dynamics of COVID-19 in the UK: early findings from interviews with policymakers and healthcare professionals, Social Science and Medicine, 266: 113423. doi: 10.1016/j.socscimed.2020.113423
  • Baekkeskov, E. and Rubin, O. (2014) Why pandemic response is unique: powerful experts and hands-off political leaders, Disaster Prevention and Management, 23(1): 81–93. doi: 10.1108/DPM-05-2012-0060
  • Birch, J. (2021) Science and policy in extremis: the UK’s initial response to COVID-19, European Journal for Philosophy of Science, 11(3): 90. doi: 10.1007/s13194-021-00407-z
  • Boin, A., Lodge, M. and Luesink, M. (2020) Learning from the COVID-19 crisis: an initial analysis of national responses, Policy Design and Practice, 3(3): 189–204. doi: 10.1080/25741292.2020.1823670
  • Cabinet Office (2012) Enhanced SAGE Guidance: A Strategic Framework for the Scientific Advisory Group for Emergencies (SAGE), London: Cabinet Office.
  • Cabinet Office (2020) The Next Chapter in Our Plan to Rebuild: the UK Government’s COVID-19 Recovery Strategy, London: Cabinet Office.
  • Cabinet Office (2021) COVID-19 Response – Spring 2021, London: Cabinet Office.
  • Cairney, P. (2016) The Politics of Evidence-based Policymaking, London: Palgrave Pivot.
  • Cairney, P. (2021) The UK government’s COVID-19 policy: what does ‘Guided by the Science’ mean in practice?, Frontiers in Political Science, 3: 624068. doi: 10.3389/fpos.2021.624068
  • Cummings, D. (2021) Dominic Cummings – the interview, BBC News Special, https://www.bbc.co.uk/iplayer/episode/m000ygcg/bbc-news-special-dominic-cummings-the-interview.
  • Department of Health and Social Care (2021) The Government’s Response to the Science and Technology Report: the UK Response to COVID-19: Use of Scientific Evidence, London: Department of Health and Social Care.
  • Dunlop, C., Radaelli, C. and Trein, P. (eds) (2018) Learning in Public Policy, London: Palgrave.
  • Flinders, M. (2020) Gotcha! Coronavirus, crises and the politics of blame games, Political Insight, 11(2): 22–25. doi: 10.1177/2041905820933371
  • Government Office for Science (2010) The Government Chief Scientific Advisor’s Guidelines on the Use of Scientific and Engineering Advice in Policymaking, London: Government Office for Science.
  • Haddon, C., Sasse, T. and Nice, A. (2020) Science Advice in a Crisis, London: Institute for Government.
  • Hawkins, B. and Parkhurst, J. (2016) The ‘good governance’ of evidence in health policy, Evidence and Policy, 12(4): 575–92. doi: 10.1332/174426415X14430058455412
  • Hine, D. (2010) The 2009 Influenza Pandemic: An Independent Review of the UK Response to the 2009 Influenza Pandemic, London: HMSO.
  • HM Government (2020a) National Risk Register, 2020 edn, London: HMSO.
  • HM Government (2020b) COVID-19 Winter Plan, London: HMSO.
  • House of Commons Joint Health and Social Care and Science and Technology Committees (2021) Coronavirus: Lessons Learned to Date, London: House of Commons.
  • House of Commons Science and Technology Committee (2021) UK science, research and technology capability and influence in global disease outbreaks, https://committees.parliament.uk/work/91/uk-science-research-and-technology-capability-and-influence-in-global-disease-outbreaks/.
  • Johnson, B. (2021) PM statement to the House of Commons on roadmap for easing lockdown restrictions in England, 22 February, https://www.gov.uk/government/speeches/pm-statement-to-the-house-of-commons-on-roadmap-for-easing-lockdown-restrictions-in-england-22-february-2021.
  • Jordan, A.G. and Cairney, P. (2013) What is the ‘dominant model’ of British policy making? Comparing majoritarian and policy community ideas, British Politics, 8(3): 233–59. doi: 10.1057/bp.2013.5
  • Kavanagh, M., Parish, K. and Gupta, S. (2021) Drivers of health policy adoption: a political economy of HIV treatment policy, Policy and Politics, 49(3): 343–68.
  • Lancaster, K., Rhodes, T. and Rosengarten, M. (2020) Making evidence and policy in public health emergencies: lessons from COVID-19 for adaptive evidence-making and intervention, Evidence and Policy, 16(3): 477–90. doi: 10.1332/174426420X15913559981103
  • Leach, M., MacGregor, M., Ripoll, S., Scoones, I. and Wilkinson, A. (2021) Rethinking disease preparedness: incertitude and the politics of knowledge, Critical Public Health, doi: 10.1080/09581596.2021.1885628
  • Lomas, J. (2000) Connecting research and policy, Isuma: Canadian Journal of Policy Research, 1(1): 140–44.
  • Oliver, K. and Faul, V. (2018) Networks and network analysis in evidence, policy and practice, Evidence and Policy, 14(3): 369–79. doi: 10.1332/174426418X15314037224597
  • Rubin, O., Errett, N.A., Upshur, R. and Baekkeskov, E. (2021) The challenges facing evidence-based decision making in the initial response to COVID-19, Scandinavian Journal of Public Health, 49: 790–96. doi: 10.1177/1403494821997227
  • Salajan, A., Tsolova, S., Ciotti, M. and Suk, J.E. (2020) To what extent does evidence support decision making during infectious disease outbreaks? A scoping literature review, Evidence and Policy, 16(3): 453–75. doi: 10.1332/174426420X15808913064302
  • SAGE (Scientific Advisory Group for Emergencies) (2020) About us, https://www.gov.uk/government/organisations/scientific-advisory-group-for-emergencies/about.
  • Smith, M.J. et al. (2021) Top five ethics lessons of COVID-19 that the world must learn, Wellcome Open Research, 6: 17. doi: 10.12688/wellcomeopenres.16568.1
  • Stevens, A. (2020) Governments cannot just ‘follow the science’ on COVID-19, Nature Human Behaviour, 4: 560. doi: 10.1038/s41562-020-0894-x
  • Vallance, P. (2020) Chief scientific advisor’s Sunday Telegraph article, 31 May, https://www.gov.uk/government/speeches/chief-scientific-advisers-sunday-telegraph-article-31-may-2020.
  • Weible, C.M., Nohrstedt, D., Cairney, P., Carter, D.P., Crow, D.A., Durnová, A.P., Heikkila, T., Ingold, K., McConnell, A. and Stone, D. (2020) COVID-19 and the policy sciences: initial reactions and perspectives, Policy Sciences, 53(2): 225–41. doi: 10.1007/s11077-020-09381-4
  • Welsh Parliament (2020) Health, Social Care and Sport Committee: Fifth Senedd, 16 September, https://record.assembly.wales/Committee/6440.
  • WHO (World Health Organization) (2020) WHO R&D Blueprint novel coronavirus: COVID-19 social science working group, https://www.who.int/docs/default-source/blue-print/socsci-tors.pdf.
