6: WHO you want to reach – policymakers and managers

Summary

This chapter sets out a few examples of high-impact research which has changed and influenced policy as well as practice. I then look at the theoretical and empirical research which tells us how policymakers make decisions and use evidence in the real world. Researchers need to understand this context and the messy and dispersed nature of policymaking, in a world of competing demands. Policymakers may rely on their instincts when responding to and acting on research. They also depend on trusted individuals and organisations, like thinktanks, to make sense of evidence. Researchers need to understand these chains of influence in their field. I interview the head of a What Works Centre and share learning on effective mechanisms of evidence use. As policymaking is so diffuse, it is worth looking at managers and health and social care system leaders as well as central government. I consider recent studies on how managers use evidence which confirms the central notion that ‘evidence does not speak for itself’. The chapter concludes with research on packaging evidence for policymakers, with practical tips for writing effective policy briefs which may make your research more likely to be used by key decision-makers.

Who makes the decisions?

As well as the public, patients and staff, research can also make a difference to policy decisions and practice. This does not only mean understanding central policymakers in Westminster, Whitehall or Cardiff. In this chapter, when talking about policymakers, I mean those making decisions at a system or organisational (rather than team) level. This includes system leaders and managers. In health and social care, decision-making happens at all levels in a diffuse way. Your research might be useful to local directors of adult social care services, care home chains, local commissioners, charity heads or hospital chief executives, as well as civil servants and ministers. This chapter looks at what we already know from evidence on how policymakers access and use research, and at strategies to maximise uptake of findings. Although work in this area emphasises the complexity of policy systems and decision-making, researchers should not lose heart. There are many practical steps you can take, working in partnership with others, to give your findings a better chance of influencing policy and shaping services.

Step one: ask the right questions

Let me start with a couple of examples of research which has influenced policy in different ways. These feature high-impact studies on widening access to talking therapies (Box 6.1) and research on nurse staffing levels (Box 6.2).

Research example – talking therapies

It’s good to talk

David Clark, a clinical psychologist at Oxford University, is one of the founders of the successful national programme to widen access to talking therapies. Despite clinical guidelines confirming the effectiveness of psychological therapies as the first choice for treating anxiety and depression, few patients were receiving this recommended care. Many were waiting more than 18 months to be seen. Since the formal launch of the programme to improve access to these therapies in 2008, more than 10,500 therapists have been trained to deliver psychological therapies. More than a million people are now treated every year, seen on average within five weeks.

With hindsight, success like this looks inevitable. But even with strong research at its core, it was by no means a given and took many years. Hearing David Clark speak, it is clear that his clinical advocacy, persistence, connections and direct lobbying helped to achieve his ambitious goal of a nationally funded programme with a new workforce to deliver effective care. Some interesting features emerge from his story.

One was the force of serendipity – and networks – which helped to pair him with a labour economist, Richard Layard, after chatting in a coffee queue at a conference. The clinician and economist together wrote articles on the economic case for expanding access to proven psychological therapies. Their argument was compelling in terms of reduced suffering but also increased wellbeing and productivity at relatively modest cost (Layard et al, 2007). This also led to a public-facing book (Thrive) and an easy-read version produced with the charity MIND (We Need to Talk).

As well as building broad public and targeted stakeholder support, Clark also employed direct lobbying tactics. He and Layard sent a memo to Prime Minister Tony Blair’s Policy Unit in the run-up to the 2005 General Election. They were then invited to give a seminar to the Cabinet Office in January 2005. They distilled their argument into a few ‘killer messages’, including an estimate that untreated anxiety and depression reduced GDP by 4 per cent. They made the case, and in May 2005 the programme to improve access to talking therapies was in the manifesto. Another feature of this successful campaign was monitoring and measurement. The newly elected government supported demonstration sites in Doncaster and Newham, and evaluations of the pilots confirmed the predicted economic and clinical benefits (Clark 2018). The full national programme was launched in 2008 and has since inspired similar models in Canada, Australia and Norway, among other countries.

Of course, not every researcher can land a direct briefing at Number 10. But the combination of having the right information – not just critical data on effectiveness and impact, but a range of tailored briefings – at the right time to influence policy action, including manifesto pledges, was crucial.

Research example – safer staffing

What difference do nurses make?

Another example where research has influenced policy as well as practice is workforce research. Over the last 20 years there has been high-quality research linking nurse staffing levels to hospital death rates, from the work of Linda Aiken in the US to the more recent multi-country cross-sectional RN4CAST study in Europe (Aiken et al 2014). While such research could not prove causal links, these high-quality cross-sectional studies provided good evidence of associations between staffing and outcomes. However, when NICE commissioned two reviews of evidence to inform national guidance on safe staffing levels, in the wake of concerns about failings relating to under-staffing at Mid Staffordshire, translating this evidence into firm guidance proved difficult.

The evidence base has since been advanced by the work of Peter Griffiths and Jane Ball in Southampton (Griffiths et al 2018). Their NIHR study, part of a wider programme of work, showed that higher levels of registered nurse staffing were associated with fewer missed observations (or ‘care left undone’), reduced length of stay and fewer adverse events, including deaths. The authors sought to address some of the limitations in the evidence base by linking data at the patient level and modelling the economic impact of changes in ward staffing. In this way, they were able to test possible causal mechanisms. Results showed that the relative risk of death increased by 3 per cent for every day registered nurse staffing fell below the ward average. Alongside these headline findings, there were interesting analyses of the relationship between registered nurses and support staff, suggesting that healthcare assistants are unlikely to make up for a shortfall of qualified nurses in terms of patient outcomes.

Interview – Peter Griffiths

Can research answer policy questions?

Although the evidence on nurse staffing and patient outcomes is as good as it is likely to get, Peter Griffiths, Professor of Health Services Research at Southampton University (who styles himself a ‘workforce epidemiologist’), says there will always be a gap between the evidence that can reasonably be provided and the questions policymakers want answered.

As he puts it: ‘We have a body of evidence that is being used to answer the question “how many nurses do we need?”. The answer from the research so far has come back with the answer “more”. But we do not know how much. More research might help us to better quantify the effects of investing in nursing staff and identify points at which there are diminishing returns, but decisions about what is “optimal” can never just be a matter of evidence. Defining an optimal staffing level depends on values – what level of quality we want, what outcomes we value most and what we are willing (and able) to pay for as a society as a whole.’

What can we take from this? We may be looking in the wrong place if we expect research to deliver a magic number, uncontested, which can address what is essentially a political and policy decision around priorities, constraints and levels of spend. However, high-quality evidence can go some way to setting the parameters for that decision and debate.

These examples show different ways in which research may create ripples of influence, sometimes over many years. Not all research has immediate traction with policy and policymakers. It is not always possible to set out a direct and trackable journey from research to decision. We know that it is often difficult or inappropriate to attribute effect to single studies – indeed, that is largely not how science or knowledge works. The slow drip of accumulating knowledge and the meandering ways in which research may or may not reach places and people of influence can be hard to measure. Short-term impacts and instrumental effects are easier to track than research which changes the conversation.

Step two: understand the context

It is worth taking a step back to consider what we know about the context of policymaking in which evidence gets used. We can draw on the work of scholars like Paul Cairney1 (Box 6.4), who uses theory and evidence from the fields of political science and policy studies to illuminate how research is used by decision-makers. Annette Boaz and Kathryn Oliver have also provided many insights into the interface between evidence and policy, drawing on many disciplines and fields (Oliver and Boaz 2019).

Interview – Paul Cairney

Real-world policy and evidence use

I spoke to Paul Cairney, Professor of Politics and Public Policy at the University of Stirling, about his work on the relationship between research evidence and policy.

‘The starting point for many researchers is that they want to know why policymakers “ignore” the evidence. Why is there this gap between good evidence and policy decisions?

There is a whole body of theory and research from political studies over some years which helps us answer that question.

First, think again about the question. Framing the problem as a gap between the research you produce and what happens to it is not helpful. Instead, find out how the policy process works and how your research might fit with this.

Second, understand the limited capacity of policymakers to pay attention. They have to have a very strong reason to read your research given the volume of information moving across their desk, so you need to articulate what is special and helpful about your work.

Third, policymaking is complex and distributed. People often think of a small group of people in charge making rational, considered decisions. In practice, it is messy and episodic, with important discussions happening with networks and people outside the room. You need to understand and work with these chains of influence.’

How policymakers use evidence

It is worth elaborating on some of these points, as they are critical to our understanding of how research has influence and impact on policymaking.

Rather like the shift in thinking on knowledge mobilisation, the policy process is no longer seen as a simple linear sequence of stages, from agenda setting to option appraisal and implementation (Cairney 2020). Instead, the reality is more dispersed, distributed and messier than ‘policymaking-as-imagined’. Drawing on an established body of research on policymakers, Cairney notes:

The political process encourages them to make decisions more quickly, in the face of uncertainty, while their attention tends to lurch, rather unpredictably, from issue to issue. Consequently, their demand for information may be unpredictable, and their ability to devote sufficient time, to understand the evidence, is very limited. Crucially, they still make decisions. (Cairney 2016: 16; emphasis in original)

Another point important for researchers seeking to understand the mindset of policymakers is that ‘they must find efficient ways to ignore almost all information, to make timely choices’ (Cairney 2020). Given the size and scope of the state and the decisions to be made, their aim is to reduce ambiguity – hence the power of the nuggets of ‘evidence’ in the earlier example from Clark and Layard on the case for psychological therapies.

This information overload means that only clear, compelling messages achieve cut-through. Such messages work best when they use emotional or belief-driven shortcuts, often drawing on the power of storytelling (for a great discussion of the role of emotion in policymaking, see Drew Westen’s book The Political Brain; Westen 2008). Policymakers, from cabinet ministers to civil servants, are people who will have visceral responses to research which is relevant to them personally – perhaps a frail mother recovering from hip surgery, or the risks of online grooming to their teenage children.

It is not just how decisions are made but who makes the decisions that is different from what might be imagined. Rather than an individual or committee weighing formal evidence in a deliberative way at a particular moment, much decision-making happens over time in dispersed and distributed systems of influence. Ministers or other top decision-makers rely on civil servants or supporting staff who will themselves have networks with advocacy groups and trusted advisers. Again, rather than a visible cycle of decision-making at the top, the reality is that most activity happens in policy communities out of sight.

What can researchers do about this? Other parts of this chapter will look at involving the right people and the relational work of building or connecting with coalitions of interest; the role of intermediary bodies in the wider system, like What Works Centres; and how to shape your research into tailored formats for policymakers, like policy briefs. But first, it is worth looking at how local-level decision-makers in the health system use research.

How healthcare managers use evidence

Many of the lessons from Cairney’s work on public policymaking resonate with a cohort of NIHR-funded observational studies on how service leaders in healthcare use evidence. For instance, Nicolini’s ethnographic study shadowing hospital chief executives characterised their information use as ‘effective scanning activities’ (Nicolini et al 2014). The chief executives rarely used discrete bits of knowledge or actively searched for formal research findings – instead, they delegated this to their top team or trusted staff (the study also showed how small the ‘inner conversational circle’ was – a feature often seen in central government as well). Their job was not to access research themselves but to ‘join the dots’ and make sense of a range of evidence from disparate sources.

In these studies we come back to many earlier themes of this book. What ‘counts’ as evidence is itself contested. Formal research has to compete with other, often more powerful, forms of knowledge. One study of the evidence used by commissioners to develop and implement alcohol policies (Rushmer et al 2015) found that evidence on alcohol-related harms was augmented, interpreted and sometimes overtaken by local data and knowledge about context, geography, what had been tried before and local fit. Research by Dopson et al (2013) showed that formal research findings were the least valued source of information for health managers. This was confirmed in observational work by Wye et al, which noted:

Media such as conversations and stories fitted particularly well with the fast-changing, flexible world of commissioning, and often ‘trumped’ hard data that could be questioned or sidelined on account of their low perceived usability. Local data often were more persuasive than national or research-based information. (Wye et al 2015: 131)

This illustrates that it is not just the ‘what’ but the ‘who’ and the ‘how’. Looking at the empirical evidence on managers’ use of research, it is clear, in the words of many, that ‘research does not speak for itself’. Managers often relied on others, such as medical directors or clinical advisers, to interpret and make sense of research, combining it with local data and experience of what had worked recently. Indeed, Swan notes that it often required people with authority to advocate for the evidence and how it related to the problem in hand – ‘Bringing the “evidence” to the table without the expert is almost always inadequate’ (Swan et al 2012: 180). The importance of trusted advisers with personal influence making the case for particular pieces of research is a theme which recurs in the literature.

Another aspect of the ‘what’ (with an element of ‘how’) is the importance of data on costs and impact, seen in the example on psychological therapies. A review by Wallace et al on barriers and facilitators to the use of systematic reviews by decision-makers showed the need for information on local applicability and costs, and for contextualisation of findings (Wallace et al 2012). What would it cost to implement here? How many emergency admissions could we avoid, or for how many more months could older people in our district stay living independently in their own homes? We will look in more detail later in this chapter at formats for packaging research in ways which are more likely to appeal to policymakers and managers.

Step three: involve the right people

As we discovered earlier, policymakers and managers are unlikely to turn directly to academic journals for evidence. Instead, they rely on intermediary bodies, like thinktanks, and on trusted advisers and colleagues who have already digested and interpreted research to meet the policy need. Jo Maybin carried out ethnographic work in England’s Department of Health, looking at how policymakers made decisions (Maybin 2016). She concluded that policymakers relied on personal networks – colleagues seen to have knowledge, and contacts (or ‘contacts of contacts’) – with influence coming from professional bodies, charities, thinktanks and advocacy groups. Policymakers were more likely to seek personal contact – what she calls ‘embodied knowledge’ – wanting judgement as well as ‘facts’ from trusted sources. The risk of losing organisational memory when colleagues or external contacts move on makes this a tricky strategy, but it is hard-wired into the ways of working of senior policymakers. The same was true in Nicolini’s study of chief executives of NHS organisations and their over-reliance on a small cabal of trusted advisers and colleagues (Nicolini et al 2014). Researchers with something to say need to find ways to influence policy and decision-making when and where it happens.

Since personal contacts with policymakers are difficult to forge and maintain, researchers working on a topic need to form effective coalitions with other interested parties, from professional bodies to patient groups. Working with others reassures decision-makers of a concerted focus or ‘single voice’ and enables a wider set of influencers to make the most of opportunities as they arise. This is essential given the time it takes for evidence to influence policy and the diffuse nature of policymaking, as shown in Cairney’s case study of tobacco control, with its decades-long efforts by scientists and health groups to get recognition of the problem as a public health priority and support for evidence-based policy solutions (Cairney 2016).

Step four: partner with organisations, networks and champions

Many policymakers in health and social care rely for evidence on intermediary bodies such as thinktanks, charities and research institutes. In the UK, there is also the What Works Network, a set of bodies aiming to put evidence at the heart of public policy and decision-making. This is explicit in their remit – ‘to improve the way government and other organisations create, share and use (or “generate, transmit and adopt”) high quality evidence for decision-making’.2 These centres focus on a range of public services and arenas, from criminal justice and local economic growth to education and wellbeing. The inspiration for these bodies was the setting up of NICE in 1999, with its focus on evidence-based treatments and mechanisms for making decisions about high-cost new technologies. A good account of these bodies is given by Jonathan Breckon and David Gough, including the ways in which their activities have been contested and perceived as exercises in managerial control of research (Breckon and Gough 2019). I interviewed the head of one What Works Centre (Box 6.5), who gave me interesting insights into how evidence does and doesn’t influence policy.

Interview – Nancy Hey

Finding opportunities for influence

The What Works Centre for Wellbeing was set up in 2014 with a focus on wellbeing and what government, business, communities and individuals can do to improve it. Nancy Hey was the founding director of the Centre and spoke to me about her experience of getting research to decision-makers.

‘From my time working across government, I know that officials want to use evidence, but they’re going to make decisions regardless and events move fast, so the best thing you can do is to get as much evidence into their thinking as possible.’

Follow the science – mechanisms and behaviour

‘When we set up our What Works Centre, we did a review of what works in research use (see Figure 6.1). We found six mechanisms. We over-weight our efforts on mechanisms like getting policymakers and researchers together in a room by holding a roundtable or a reception for MPs. Events like these are very popular and very resource-intensive, and yet there is very little evidence to show that they make much difference – unlike other mechanisms which we really know are important, like good communication and building research into decision-making systems, such as system flags prompting clinicians to ask about smoking, which don’t rely on individuals being hugely enthusiastic themselves. And this is where our knowledge of mechanisms needs to relate to what we know from behavioural insights – we tend to over-estimate people’s motivation as a driver.’

Make it easy

‘Busy people don’t want to click on lots of different headings and levels to find out what something’s about. Make it as easy as possible to get to the parts that matter. When I brief a government minister on a complex topic like, say, community resilience, you have a couple of sides maximum, but they won’t read all of it. Start with the findings. And try to be as clear as possible. There is good advice from government on web design and creating accessible documents (www.gov.uk/guidance/publishing-accessible-documents).’

Engaging audiences

‘Strategic communication is important – engaging stakeholders and segmenting our audience. Each strand has many dimensions to it; for instance, government includes national, devolved and local levels, elected members, officials and analysts in the public sector. The more specific you can be in addressing your particular audience, the better. And know their channels – for instance, a piece from us on wellbeing at work or corporate responsibility in Business Insider will reach places our website won’t.

It’s too late at the end of your project to try to get important policymakers involved. If the right people are involved in some way early on, there is a pull for the evidence which follows – your audience is ready.’

Figure 6.1: Prioritising engagement activity according to the evidence – what works in research use

Source: What Works Centre for Wellbeing under the Creative Commons licence 4.0, adapted from Langer et al 2016.

Communication, dissemination and engagement activities at the What Works Centre for Wellbeing are prioritised in a strategic way, drawing on a useful review of evidence on approaches to getting research to decision-makers (Langer et al 2016). As well as formal evidence from 25 published reviews – skewed towards health, the source of most evidence to date on evidence-based policy and practice – there is a broader overview of learning from other fields in the social sciences, like behavioural science, social marketing and adult learning theory.

Looking at the more than 150 interventions in the formal review, across these different kinds of evidence-use mechanisms (sometimes used in combination), it was striking how little evidence of impact there was in certain areas. There was reasonable evidence to support good communication strategies and design, skills-building in evidence literacy and structural ways of embedding evidence (Figure 6.1). There was less evidence of impact for relationship-building exchanges between researchers and policymakers – although this may simply reflect how difficult it is to evaluate and measure these kinds of informal activities in systematic ways. Overall, there were richer insights and learning from the broad scoping work across diverse fields of social science than from the narrow effectiveness search in the formal literature review.

Strategies and tactics to optimise research uptake include, in Cairney’s formulation, ‘identifying where the action takes place; learning about the properties of subsystems, the rules of the game, and how to frame evidence to fit policy agendas; forming coalitions with other influential actors; and, engaging in the policy process long enough to exploit windows of opportunity’ (Cairney 2016: 81). It also means understanding the limits of policymakers’ attention and the need for short, targeted summaries which appeal to the frames and feelings of the reader.

Step five: present content which is engaging and accessible

Writing a policy brief

There is a small but growing evidence base on what managers and policymakers say they value about formats of evidence. This is largely qualitative research where individuals or groups of managers are asked to identify features that they find engaging or off-putting in research and research summaries. Findings from this research are summarised in reviews (Oliver et al 2014; Wallace et al 2014; Tricco et al 2016), which include the following barriers (I have picked out the main items here):

  • lack of relevant content;

  • lack of contextualisation of findings;

  • length of paper or report;

  • poor presentation format;

  • too much on methods or research quality.

In terms of what they liked in the way of format, the positives largely mirrored this list of negatives, with some additional suggestions from managers, including:

  • one-page top summary;

  • graded format with key messages upfront;

  • use of white space and bullets;

  • more on implications for policy;

  • web-based format preferred;

  • framing the title as a question.

Particularly important for decision-makers is the need for information on local applicability and costs (Wallace et al 2012). Others have noted the importance for decision-makers of research where the implementation, economic and equity impacts are explicitly considered (Vogel et al 2013). It is a good exercise to test out the relevance of your findings for decision-makers. For service leaders, this might mean extrapolating findings to the footprint of an integrated care system – how could this change affect the number of emergency hospital admissions in our area? At a national level, it might mean projecting outcomes using employment and criminal justice data if new approaches to care leaver support were adopted.

Wickremasinghe and colleagues (2016) carried out a literature search, including grey literature such as advice for civil servants, and expert interviews, to identify what forms of evidence synthesis policymakers want. An important finding was that policymakers often want answers to broad questions, while researchers need to frame tight and narrow research or review questions as part of a commitment to reliability and reproducibility. But there is good agreement on the importance of readability, relevance and rigour – noting the tensions and trade-offs between rigour and relevance, if policy windows (see Chapter 7 on timing) are too short for a complete and systematic review of evidence on a given topic (Thomson 2013).

One important aspect is to start with the problem facing the decision-maker and not with your study. This means knowing which topics have high policy salience and providing enough context for your research to underline its relevance and importance to topical issues of the day (Moat et al 2013). Keeping up to date with health and social care policy through service and practice journals, thinktank briefings and conferences is helpful. Framing your study according to policy need is the difference between ‘pull’, where the decision-maker wants the information, and ‘push’, where you as a researcher are promoting your study. At our evidence centre, we pulled together relevant evidence starting with the problems facing policymakers, from safe levels of nurse staffing to improving care and outcomes for people with learning disabilities.

Briefings for politicians

The Parliamentary Office of Science and Technology was set up 30 years ago to provide reliable evidence for the UK Parliament. It provides helpful tips on how to prepare policy-facing briefs: https://post.parliament.uk/how-to-write-a-policy-briefing/. While much of this is familiar advice on good writing, there are some particular tips for evidence aimed at parliamentarians, such as connecting the topic to data – numbers and stories – for particular areas and regions to get the attention of MPs. There is a library of structured short briefings for Parliament on a range of topics, from cloud computing to marine renewables: https://post.parliament.uk/type/postnote/.

The policy brief championed by John Lavis and colleagues (2009) and others is a way of focusing first on the pressing policy problem. Their series describing and supporting tools for increasing the uptake of trials and systematic reviews by decision-makers was a landmark in the health-related evidence-informed policy debate. They pioneered a format of briefing for decision-makers, including those in low- and middle-income countries.

To get a sense of the format of policy briefs, it is a good idea to browse existing resources (Box 6.6). Some of the key format and content principles useful for getting – and keeping – the attention of decision-makers include:

  • Start with the findings upfront

Begin with a declarative title or paragraph in which you explain your key findings and what they mean.

  • Explain why it is important

Set the findings in the context of policy and why this is important now. Identify uncertainties in current decision-making and how this research will fill some of these gaps. Link to recent events or crises which underline why new policy is needed.

  • Keep methods to a minimum

Say enough about study design for the reader to understand the weight of evidence. Put some critical information in sidebars or boxes, so you don’t detract from the main messages.

  • Implications, not recommendations

State what the implications for policy might be from your research, but don’t stray too far into explicit policy recommendations as you may be out of your depth (Whitty 2015). This might include consideration of costs and sustainability of initiatives, if scaled up.

The last word should come from Kathryn Oliver and Paul Cairney (2019), who carefully reviewed formal research and grey literature across different disciplines to arrive at the following practical advice for researchers seeking to reach policymakers (Box 6.7).

Tips for researchers to influence policy

  • do high quality research;

  • make your research relevant and readable;

  • understand policy processes;

  • be accessible to policymakers: engage routinely, flexibly and humbly;

  • decide if you want to be an issue advocate or honest broker;

  • build relationships (and ground rules) with policymakers;

  • be ‘entrepreneurial’ or find someone who is;

  • reflect continuously: should you engage, do you want to, and is it working?

Source: Oliver, K. and Cairney, P. (2019)

This paper is well worth reading in full, as an overview of existing guidance from a wide and dispersed evidence base (Oliver and Cairney 2019). As a critical review, it also helpfully highlights the tensions and unexamined contradictions in much ‘how to’ advice on influencing policymakers – for instance, whether researchers should present themselves as disinterested voices of science or as champions for particular causes. Both are legitimate, but require different activities and skillsets (and perhaps personalities).

There is a danger that the sum effect of this chapter is to make researchers feel disheartened about their ability to influence policy and decision-making. I am aware that research in this area emphasises the complexity (Lamont 2020) and diffuse nature of policymaking, which may lead to a counsel of despair. However, there are also examples of research throughout this book which have shaped policy, from understanding the weekend effect in hospital admissions to the organisation of stroke services. Effective research teams I know use a range of approaches, hard and soft, to make their research land in policy circles.

Practical pointers to reach policymakers and managers

Understand chains of influence for your field of study

Map out the players in your field, some of whom you may already have come across in your research. Who are the influencers at foundations, lobby groups or thinktanks with a special interest in your field? Who do they work with? How does your work link into ongoing policy discussions and questions? As a researcher studying self-funded residential care, you might want to dip into sessions of a parliamentary select committee inquiry on the future funding of adult social care. As a radiographer, you may want to find out more about central policy direction on imaging networks as a context and frame for your study.

Write a policy brief

Construct your findings as a one-page policy brief. Start with the headline findings upfront. Ground this in the context of the important policy problem it addresses, explaining what this adds to what we already know. Present implications of your research in terms of critical health or wellbeing benefits, costs, service efficiencies or equity impacts. Include important data points as boxed items. Use white space and consider the layout to emphasise the main take-home points. Test this brief out with policy leads or proxies.

Follow the debate

You can be a ‘lurker’ and perhaps a participant in Twitter discussions. Identify people you admire in your field with a policy bent, read their posts and see who they follow, and understand what the debates are and how they are framed. Identify relevant hashtags for events, campaigns or communities that may be interested in your work. Join in when you can, for instance by signposting useful research articles at the right time in Twitter exchanges or Tweetchats. Attend relevant thinktank briefings, webinars and conferences – many of which are now free and online. Share a few insights from speakers which resonated with you and make links to relevant research. You can play your part in enriching policy debates with useful research.

  • Adams, C.E., Jayaram, M., Bodart, A.Y.M., Sampson, S., Zhao, S. and Montgomery, A.A. (2016) ‘Tweeting links to Cochrane Schizophrenia Group reviews: a randomised controlled trial’, BMJ Open, 6(3): e010509.

  • Adams, R.C., Challenger, A., Bratton, L., Boivin, J., Bott, L., Powell, G., Williams, A., Chambers, C.D. and Sumner, P. (2019) ‘Claims of causality in health news: a randomised trial’, BMC Medicine, 17(1): 1–11.

  • Aiken, L.H., Sloane, D.M., Bruyneel, L., Van den Heede, K., Griffiths, P., Busse, R. et al (2014) ‘Nurse staffing and education and hospital mortality in nine European countries: a retrospective observational study’, The Lancet, 383: 1824–30.

  • Alvesson, M., Gabriel, Y. and Paulsen, R. (2017) Return to Meaning: A Social Science with Something to Say, Oxford: Oxford University Press.

  • Appleby, J., Raleigh, V., Frosini, F., Bevan, G., Gao, H. and Lyscom, T. (2011) Variations in Health Care: The Good, the Bad and the Inexplicable, London: King’s Fund.

  • Atkins, L., Smith, J.A., Kelly, M.P. and Michie, S. (2013) ‘The process of developing evidence-based guidance in medicine and public health: a qualitative study of views from the inside’, Implementation Science, 8(1): 1–12.

  • Badenoch, D. and Tomlin, A. (2015) ‘Keeping up to date with reliable mental health research: Minervation White Paper’. Available from: www.minervation.com (accessed 19 October 2020).

  • Banks, S., Herrington, T. and Carter, K. (2017) ‘Pathways to co-impact: action research and community organising’, Educational Action Research, 25(4): 541–59.

  • Bath, P.M., Woodhouse, L.J., Appleton, J.P., Beridze, M., Christensen, H., Dineen, R.A. et al (2017) ‘Antiplatelet therapy with aspirin, clopidogrel, and dipyridamole versus clopidogrel alone or aspirin and dipyridamole in patients with acute cerebral ischaemia (TARDIS): a randomised, open-label, phase 3 superiority trial’, The Lancet, 391(10123): 850–9.

  • Baxter, K., Heavey, E. and Birks, Y. (2020) ‘Choice and control in social care: experiences of older self-funders in England’, Social Policy & Administration, 54(3): 460–74.

  • Bayley, J. and Phipps, D. (2019) ‘Extending the concept of research impact literacy: levels of literacy, institutional role and ethical considerations’, Emerald Open Research, 1: 14.

  • Bellos, D. (2012) Is That a Fish in Your Ear? Translation and the Meaning of Everything, London: Penguin Books.

  • Beresford, P. (2016) All Our Welfare: Towards Participatory Social Policy, Bristol: Policy Press.

  • Best, A. and Holmes, B. (2010) ‘Systems thinking, knowledge and action: towards better models and methods’, Evidence & Policy: A Journal of Research, Debate and Practice, 6(2): 145–59.

  • Bickerdike, L., Booth, A., Wilson, P.M., Farley, K. and Wright, K. (2017) ‘Social prescribing: less rhetoric and more reality. A systematic review of the evidence’, BMJ Open, 7(4): e013384.

  • Boaz, A. and Nutley, S. (2019) ‘Using evidence’, in A. Boaz, H. Davies, A. Fraser and S. Nutley (eds) What Works Now? Evidence-Informed Policy and Practice, Bristol: Policy Press, pp 251–77.

  • Boaz, A., Biri, D. and McKevitt, C. (2016) ‘Rethinking the relationship between science and society: has there been a shift in attitudes to patient and public involvement and public engagement in science in the United Kingdom?’, Health Expectations, 19(3): 592–601.

  • Boaz, A., Hanney, S., Borst, R., O’Shea, A. and Kok, M. (2018) ‘How to engage stakeholders in research: design principles to support improvement’, Health Research Policy and Systems, 16(1): 1–9.

  • Boaz, A., Davies, H., Fraser, A. and Nutley, S. (eds) (2019) What Works Now? Evidence-informed Policy and Practice, Bristol: Policy Press.

  • Bornbaum, C.C., Kornas, K., Peirson, L. and Rosella, L.C. (2015) ‘Exploring the function and effectiveness of knowledge brokers as facilitators of knowledge translation in health-related settings: a systematic review and thematic analysis’, Implementation Science, 10: 1–12.

  • Bowman, D. (2019) ‘I’m a Professor of Medical Ethics, but having cancer changed my beliefs about medicine’. Available from: www.royalmarsden.nhs.uk/im-professor-medical-ethics-having-cancer-changed-my-beliefs-about-medicine (accessed 24 October 2020).

  • Boyd, B. (2009) On the Origin of Stories: Evolution, Cognition, and Fiction, Cambridge, Mass: Harvard University Press.

  • Braithwaite, J., Glasziou, P. and Westbrook, J. (2020) ‘The three numbers you need to know about healthcare: the 60–30–10 challenge’, BMC Medicine, 18: 1–8.

  • Breckon, J. and Gough, D. (2019) ‘Using evidence in the UK’, in A. Boaz, H. Davies, A. Fraser and S. Nutley (eds) What Works Now? Evidence-Informed Policy and Practice, Bristol: Policy Press, pp 285–302.

  • Brooks, P. (1984) Reading for the Plot: Design and Intention in Narrative, New York: AA Knopf.

  • Brown, J.S. and Duguid, P. (2017) The Social Life of Information: Updated, with a New Preface, Boston, Mass: Harvard Business Review Press.

  • Cairney, P. (2016) The Politics of Evidence-Based Policymaking, London: Palgrave Macmillan.

  • Cairney, P. (2020) Understanding Public Policy (2nd edn), London: Red Globe Press.

  • Cairney, P. and Kwiatkowski, R. (2017) ‘How to communicate effectively with policymakers: combine insights from psychology and policy studies’, Palgrave Communications, 3(1): 1–8.

  • Campbell, J. (2008) The Hero with a Thousand Faces (3rd edn), Novato, Calif: New World Library.

  • Carroll, N. and Conboy, K. (2020) ‘Normalising the “new normal”: changing tech-driven work practices under pandemic time pressure’, International Journal of Information Management, 55: 102186.

  • Chakravarthy, U., Harding, S.P., Rogers, C.A., Downes, S.M., Lotery, A.J., Culliford, L.A. et al (2013) ‘Alternative treatments to inhibit VEGF in age-related choroidal neovascularisation: 2-year findings of the IVAN randomised controlled trial’, The Lancet, 382 (9900): 1258–67.

  • Chalmers, I. and Glasziou P. (2009) ‘Avoidable waste in the production and reporting of research evidence’, The Lancet, 374(9683): 86–9.

  • Chapman, A.L. and Greenhow, C. (2019) ‘Citizen-scholars: social media and the changing nature of scholarship’, Publications, 7(1): 11.

  • Chapman, S. (2017) ‘Frozen shoulder: making choices about treatment’, 12 October. Available from: www.evidentlycochrane.net/frozen-shoulder-2/ (accessed 24 October 2020).

  • Charon, R. (2008) Narrative Medicine: Honoring the Stories of Illness, Oxford: Oxford University Press.

  • Christensen, C.M. (2013) The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail, Boston, Mass: Harvard Business Review Press.

  • Clark, D.M. (2018) ‘Realizing the mass public benefit of evidence-based psychological therapies: the IAPT program’, Annual Review of Clinical Psychology, 14: 159–83.

  • Cook, E. (1913) The Life of Florence Nightingale Vol 2 (1862–1910), London: Macmillan, pp 25–35. Available as ebook (2012), Urbana Illinois: Project Gutenberg, at http://www.gutenberg.org/files/40058/40058-h/40058-h.htm (accessed 22 March 2021).

  • Correll, C.U., Galling, B., Pawar, A., Krivko, A., Bonetto, C., Ruggeri, M. et al (2018) ‘Comparison of early intervention services vs treatment as usual for early-phase psychosis: a systematic review, meta-analysis, and meta-regression’, JAMA Psychiatry, 75: 555–65.

  • Cowan, K. and Oliver S. (2021) The James Lind Alliance Guidebook (Version 10), Southampton: National Institute for Health Research Evaluation, Trials and Studies Coordinating Centre. Available from: www.jla.nihr.ac.uk/jla-guidebook/ (accessed 13 March 2021).

  • Currie, G., Waring, J. and Finn, R. (2008) ‘The limits of knowledge management for UK public services modernization: the case of patient safety and service quality’, Public Administration, 86(2): 363–85.

  • Davies, H.T.O., Powell, A.E. and Nutley, S.M. (2015) ‘Mobilising knowledge to improve UK health care: learning from other countries and other sectors – a multimethod mapping study’, Health Services & Delivery Research, 3(27), https://doi.org/10.3310/hsdr03270

  • Dixon-Woods M. (2014) ‘The problem of context in quality improvement’, Perspectives on Context: A Selection of Essays Considering the Role of Context in Successful Quality Improvement, Health Foundation. Available from: www.health.org.uk/publications/perspectives-on-context (accessed 14 March 2021).

  • Dopson, S., Bennett, C., Fitzgerald, L., Ferlie, E., Fischer, M., Ledger, J., McCulloch, J. and McGivern, G. (2013) ‘Health care managers’ access and use of management research’, NIHR Service Delivery and Organisation Programme.

  • Drummond, M. and Banta, D. (2009) ‘Health technology assessment in the United Kingdom’, International Journal of Technology Assessment in Health Care, 25(S1): 178–81.

  • Dunleavy, P. and Tinkler, J. (2020) Maximizing the Impacts of Academic Research: How to Grow the Recognition, Influence, Practical Application and Public Understanding of Science and Scholarship, London: Macmillan.

  • DuVal, G. and Shah, S. (2020) ‘When does evidence from clinical trials influence health policy? A qualitative study of officials in nine African countries of the factors behind the HIV policy decision to adopt Option B+’, Evidence & Policy: A Journal of Research, Debate and Practice, 16(1): 123–44.

  • Elbow, P. (2013) ‘Maybe academics aren’t so stupid after all’, OUPblog, 6 February. Available from: https://blog.oup.com/2013/02/academic-speech-patterns-linguistics/ (accessed 20 October 2020).

  • Elliott, J.H., Turner, T., Clavisi, O., Thomas, J., Higgins, J.P., Mavergames, C. and Gruen, R.L. (2014) ‘Living systematic reviews: an emerging opportunity to narrow the evidence-practice gap’, PLOS Medicine, 11(2): e1001603.

  • Engebretsen, M. and Kennedy, H. (eds) (2020) Data Visualization in Society, Amsterdam: Amsterdam University Press.

  • Evans, S. and Scarbrough, H. (2014) ‘Supporting knowledge translation through collaborative translational research initiatives: “bridging” versus “blurring” boundary-spanning approaches in the UK CLAHRC initiative’, Social Science & Medicine, 106: 119–27.

  • Fanshawe, T.R., Halliwell, W., Lindson, N., Aveyard, P., Livingstone-Banks, J. and Hartmann-Boyce, J. (2017) ‘Tobacco cessation interventions for young people’, Cochrane Database of Systematic Reviews, 11: CD003289.

  • Featherstone, K., Northcott, A., Harden, J., Harrison-Denning, K., Tope, R., Bale, S. and Bridges, J. (2019) ‘Refusal and resistance to care by people living with dementia being cared for within acute hospital wards: an ethnographic study’, Health Services & Delivery Research, 7(11), https://doi.org/10.3310/hsdr07110

  • Franck, G. (2019) ‘The economy of attention’, Journal of Sociology, 55(1): 8–19.

  • Freeman, R. (2007) ‘Epistemological bricolage: how practitioners make sense of learning’, Administration & Society, 39(4): 476–96.

  • Fulop, N.J., Ramsay, A.I.G., Hunter, R.M., McKevitt, C., Perry, C., Turner, S.J. et al (2019) ‘Evaluation of reconfigurations of acute stroke services in different regions of England and lessons for implementation: a mixed-methods study’, Health Services and Delivery Research, 7(7), https://doi.org/10.3310/hsdr07070

  • Gabbay, J. and le May, A. (2011) Practice-based Evidence for Health Care: Clinical Mindlines, Abingdon: Routledge.

  • Gates, S., Lall, R., Quinn, T., Deakin, C.D., Cooke, M.W., Horton, J., Lamb, S.E., Slowther, A.M., Woollard, M., Carson, A. and Smyth, M. (2017) ‘Prehospital randomised assessment of a mechanical compression device in out-of-hospital cardiac arrest (PARAMEDIC): a pragmatic, cluster randomised trial and economic evaluation’, Health Technology Assessment, 21(11), https://doi.org/10.3310/hta21110

  • Gawande, A. (2014) Being Mortal: Medicine and What Matters in the End, New York: Metropolitan Books.

  • Glasziou, P. and Chalmers, I. (2018) ‘Research waste is still a scandal: an essay by Paul Glasziou and Iain Chalmers’, BMJ, 363: k4645.

  • Glenton, C. (2017) ‘How to write a plain language summary of a Cochrane intervention review’, Cochrane Norway. Available from: www.cochrane.no/sites/cochrane.no/files/public/uploads/how_to_write_a_cochrane_pls_12th_february_2019.pdf (accessed 26 February 2021).

  • Glenton, C., Rosenbaum, S. and Fønhus, M.S. (2019) ‘Checklist and guidance for disseminating findings from Cochrane intervention reviews’, Cochrane. Available from: https://training.cochrane.org/sites/training.cochrane.org/files/public/uploads/Checklist%20FINAL%20version%201.1%20April%202020pdf.pdf (accessed 26 February 2021).

  • Gough, D., Maidment, C. and Sharples, J. (2018) UK What Works Centres: Aims, Methods and Contexts, London: EPPI-Centre. Available from: https://eppi.ioe.ac.uk/cms/Default.aspx?tabid=3731 (accessed 14 March 2021).

  • Graff, G. and Birkenstein, C. (2010) ‘They Say/I Say’: The Moves that Matter in Persuasive Writing (2nd edn), New York: Norton.

  • Graham, I.D. and Tetroe, J. (2007) ‘Some theoretical underpinnings of knowledge translation’, Academy of Emergency Medicine, 14(11): 936–41.

  • Gravier, E. (2019) ‘Spending 2 hours in nature each week can make you happier and healthier, new study says’, 2 July, cnbc.com (online), www.cnbc.com/2019/07/02/spending-2-hours-in-nature-per-week-can-make-you-happier-and-healthier.html (accessed 14 March 2021).

  • Green, L.W. (2008) ‘Making research relevant: if it is an evidence-based practice, where’s the practice-based evidence?’ Family Practice, 25(Suppl 1): i20–4.

  • Greenhalgh, T. (2018) How to Implement Evidence-Based Healthcare, Oxford: Wiley Blackwell.

  • Greenhalgh, T. and Wieringa, S. (2011) ‘Is it time to drop the “knowledge translation” metaphor? A critical literature review’, Journal of the Royal Society of Medicine, 104(12): 501–09.

  • Greenhalgh, T. and Fahy, N. (2015) ‘Research impact in the community-based health sciences: an analysis of 162 case studies from the 2014 UK Research Excellence Framework’, BMC Medicine, 13(1): 1–12.

  • Greenhalgh, T., Schmid, M.B., Czypionka, T., Bassler, D. and Gruer, L. (2020) ‘Face masks for the public during the covid-19 crisis’, BMJ, 369: m1435.

  • Grey, C. (2012) Decoding Organization: Bletchley Park, Codebreaking and Organization Studies, Cambridge: Cambridge University Press.

  • Griffiths, P., Ball, J., Bloor, K., Böhning, D., Briggs, J., Dall’Ora, C. et al (2018) ‘Nurse staffing levels, missed vital signs and mortality in hospitals: retrospective longitudinal observational study’, Health Services & Delivery Research, 6(38), https://doi.org/10.3310/hsdr06380

  • Hanney, S.R., Castle-Clarke, S., Grant, J., Guthrie, S., Henshall, C., Mestre-Ferrandiz, J., Pistollato, M., Pollitt, A., Sussex, J. and Wooding, S. (2015) ‘How long does biomedical research take? Studying the time taken between biomedical and health research and its translation into products, policy, and practice’, Health Research Policy and Systems, 13(1): 1–18.

  • Harris, R., Sims, S., Leamy, M., Levenson, R., Davies, N., Brearley, S. et al (2019) ‘Intentional rounding in hospital wards to improve regular interaction and engagement between nurses and patients: a realist evaluation’, Health Services & Delivery Research, 7(35), https://doi.org/10.3310/hsdr07350

  • Haux, T. (2019) Dimensions of Impact in the Social Sciences: The Case of Social Policy, Sociology and Political Science Research, Bristol: Policy Press.

  • Hickey, G., Richards, T. and Sheehy, J. (2018) ‘Co-production from proposal to paper’, Nature, 562: 29–31.

  • Hogwood, B.W. and Gunn, L.A. (1984) Policy Analysis for the Real World, Oxford: Oxford University Press.

  • Holmes, A., Dixon-Woods, M., Ahmad, R., Brewster, E., Castro Sanchez, E.M., Secci, F., Zingg, W. et al (2015) Infection Prevention and Control: Lessons from Acute Care in England. Towards a Whole Health Economy Approach, Health Foundation. Available from: www.health.org.uk/publications/infection-prevention-and-control-lessons-from-acute-care-in-england (accessed 14 March 2021).

  • Holmes, B.J., Best, A., Davies, H., Hunter, D., Kelly, M.P., Marshall, M. and Rycroft-Malone, J. (2017) ‘Mobilising knowledge in complex health systems: a call to action’, Evidence & Policy, 13(3): 539–60.

  • Hook, D.W., Calvert, I. and Hahnel, M. (2019) The Ascent of Open Access: An Analysis of the Open Access Landscape since the Turn of the Millennium. Available from: https://digitalscience.figshare.com/articles/report/The_Ascent_of_Open_Access/7618751 (accessed 14 March 2021).

  • Hopkins, C. (1923) Scientific Advertising, New York: Crown Publishers.

  • Houghton, C., Meskell, P., Delaney, H., Smalle, M., Glenton, C., Booth, A. et al (2020) ‘Barriers and facilitators to healthcare workers’ adherence with infection prevention and control (IPC) guidelines for respiratory infectious diseases: a rapid qualitative evidence synthesis’, Cochrane Database of Systematic Reviews, 4.

  • Hutchinson, J.R. and Huberman, M. (1994) ‘Knowledge dissemination and use in science and mathematics education: a literature review’, Journal of Science Education and Technology, 3: 27–47.

  • Isett, K.R. and Hicks, D. (2020) ‘Pathways from research into public decision making: intermediaries as the third community’, Perspectives on Public Management and Governance, 3(1): 45–58.

  • Johnson, D., Deterding, S., Kuhn, K.A., Staneva, A., Stoyanov, S. and Hides, L. (2016) ‘Gamification for health and wellbeing: a systematic review of the literature’, Internet Interventions, 6: 89–106.

  • Kam, C.D. (2005) ‘Who toes the party line? Cues, values, and individual differences’, Political Behavior, 27(2): 163–82.

  • Kazdin, A. E. (2008) ‘Evidence-based treatment and practice: new opportunities to bridge clinical research and practice, enhance the knowledge base, and improve patient care’, American Psychologist, 63(3), 146–59.

  • Kincheloe, J.L. (2001) ‘Describing the bricolage: conceptualizing a new rigor in qualitative research’, Qualitative Inquiry, 7(6): 679–92.

  • Kingdon, J. (1995) Agendas, Alternatives and Public Policies (2nd edn), New York: Harper Collins.

  • Lamont, T. (2020) ‘But does it work? Evidence, policy-making and systems thinking: comment on “what can policy-makers get out of systems thinking? Policy partners’ experiences of a systems-focused research collaboration in preventive health”’, International Journal of Health Policy and Management, 10(5): 287–9, doi: 10.34172/ijhpm.2020.71

  • Landhuis, E. (2016) ‘Scientific literature: information overload’, Nature, 535: 457–8.

  • Langer, L., Tripney, J. and Gough, D.A. (2016) The Science of Using Science: Researching the Use of Research Evidence in Decision-Making, London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London. Available from: https://eppi.ioe.ac.uk/cms/Default.aspx?tabid=3504 (accessed 14 March 2021).

  • Larivière, V., Gingras, Y. and Archambault, É. (2009) ‘The decline in the concentration of citations, 1900–2007’, Journal of the American Society for Information Science and Technology, 60(4): 858–62.

  • Lave, J. and Wenger, E. (1991) Situated Learning: Legitimate Peripheral Participation, New York: Cambridge University Press.

  • Lavis, J.N., Permanand, G., Oxman, A.D., Lewin, S. and Fretheim, A. (2009) ‘SUPPORT Tools for evidence-informed health policymaking (STP) 13: preparing and using policy briefs to support evidence-informed policymaking’, Health Research Policy and Systems, 7(Suppl 1): S13, doi: 10.1186/1478-4505-7-S1-S13

  • Layard, R., Clark, D.M., Knapp, M. and Mayraz, G. (2007) ‘Cost-benefit analysis of psychological therapy’, National Institute Economic Review, 202(1): 90–8.

  • Leder, D. (1990) The Absent Body, Chicago: University of Chicago Press.

  • Leith, S. (2012) You Talkin’ to Me: Rhetoric from Aristotle to Obama, London: Profile Books.

  • Lindstrom, M. (2012) Brandwashed: Tricks Companies Use to Manipulate our Minds and Persuade Us to Buy, London: Kogan Page Publishers.

  • Lomas, J. (2000) ‘Using “linkage and exchange” to move research into policy at a Canadian foundation’, Health Affairs (Millwood), 19(3): 236–40.

  • Maben, J., Peccei, R., Adams, M., Robert, G., Richardson, A., Murrells, T. and Morrow, E. (2012) Exploring the Relationship between Patients’ Experiences of Care and the Influence of Staff Motivation, Affect and Wellbeing, Final Report, Southampton: NIHR Service Delivery and Organisation Programme.

  • Maguire, L.K. and Clarke, M. (2014) ‘How much do you need: a randomised experiment of whether readers can understand the key messages from summaries of Cochrane Reviews without reading the full review’, Journal of the Royal Society of Medicine, 107(11): 444–9.

  • Marshall, M.N. (2014) ‘Bridging the ivory towers and the swampy lowlands: increasing the impact of health services research on quality improvement’, International Journal for Quality in Health Care, 26(1): 1–5.

  • May, C.R. and Finch, T. (2009) ‘Implementing, embedding, and integrating practices: an outline of normalization process theory’, Sociology, 43(3): 535–54.

  • May, C.R., Eton, D.T., Boehmer, K., Gallacher, K., Hunt, K., MacDonald, S. et al (2014) ‘Rethinking the patient: using Burden of Treatment Theory to understand the changing dynamics of illness’, BMC Health Services Research, 14(1): 1–11.

  • Maybin, J. (2016) Producing Health Policy: Knowledge and Knowing in Government Policy Work, Basingstoke: Palgrave Macmillan.

  • McDonald, L. (ed) (2005) Collected Works of Florence Nightingale (Vol. 8), Waterloo ON: Wilfrid Laurier University Press.

  • McDonald, L. (2015) ‘Florence Nightingale: a research-based approach to health, healthcare and hospital safety’, in F. Collyer (ed) The Palgrave Handbook of Social Theory in Health, Illness and Medicine, New York: Springer, pp 59–74.

  • Meacock, R., Anselmi, L., Kristensen, S.R., Doran, T. and Sutton, M. (2017) ‘Higher mortality rates amongst emergency patients admitted to hospital at weekends reflect a lower probability of admission’, Journal of Health Services Research & Policy, 22(1): 12–19.

  • Mintrom, M. (2019) ‘So you want to be a policy entrepreneur?’, Policy Design and Practice, 2(4): 307–23.

  • Mir, G., Salway, S., Kai, J., Karlsen, S., Bhopal, R., Ellison, G.T. and Sheikh, A. (2013) ‘Principles for research on ethnicity and health: the Leeds Consensus Statement’, The European Journal of Public Health, 23(3): 504–10.

  • Mitchell, K.R., Purcell, C., Forsyth, R., Barry, S., Hunter, R., Simpson, S.A. et al (2020) ‘A peer-led intervention to promote sexual health in secondary schools: the STASH feasibility study’, Public Health Research, 8(15), https://doi.org/10.3310/phr08150

  • Moat, K.A., Lavis, J.N. and Abelson, J. (2013) ‘How contexts and issues influence the use of policy-relevant research syntheses: a critical interpretive synthesis’, The Milbank Quarterly, 91(3): 604–48.

  • Morris, S., Hunter, R.M., Ramsay, A.I.G., Boaden, R., McKevitt, C., Perry, C. et al (2014) ‘Impact of centralising acute stroke services in English metropolitan areas on mortality and length of hospital stay: difference-in-differences analysis’, BMJ, 349: g4757.

  • Morris, S., Ramsay, A.I.G., Boaden, R.J., Hunter, R.M., McKevitt, C., Paley, L. et al (2019) ‘Impact and sustainability of centralising acute stroke services in English metropolitan areas: retrospective analysis of hospital episode statistics and stroke national audit data’, BMJ, 364: l1.

  • Morris, Z.S., Wooding, S. and Grant, J. (2011) ‘The answer is 17 years, what is the question: understanding time lags in translational research’, Journal of the Royal Society of Medicine, 104(12): 510–20.

  • National Institute for Health and Care Excellence (NICE) (2016) Psychosis and Schizophrenia in Children and Young People: Recognition and Management. Clinical Guideline (CG155). Available from: www.nice.org.uk/guidance/cg155 (accessed 4 February 2021).

  • National Institute for Health and Care Excellence (NICE) (2017) Intrapartum Care for Healthy Women and Babies. Clinical Guideline (CG190). Available from: www.nice.org.uk/guidance/cg190 (accessed 4 February 2021).

  • National Institute for Health and Care Excellence (NICE) (2020) Developing NICE Guidelines: The Manual. Process and Methods (PMG20). Available from: www.nice.org.uk/process/pmg20/chapter/glossary (accessed 4 February 2021).

  • National Institute for Health Research (NIHR) (2020) Living with COVID-19, NIHR Evidence. Available from: https://evidence.nihr.ac.uk/themedreview/living-with-covid19/ (accessed 30 October 2020).

  • National Institute for Health Research (NIHR) (2021) Living with COVID-19, Second Review, NIHR Evidence. Available from: https://evidence.nihr.ac.uk/themedreview/living-with-covid19-second-review/ (accessed 20 March 2021).

  • Newbould, J., Ball, S., Abel, G., Barclay, M., Brown, T., Corbett, J. et al (2019) ‘A “telephone first” approach to demand management in English general practice: a multimethod evaluation’, Health Services & Delivery Research, 7(17), https://doi.org/10.3310/hsdr07170

  • Newman, T.B. (2003) ‘The power of stories over statistics’, BMJ, 327(7429): 1424–7.

  • NHS (2019) The NHS Long Term Plan. Available from: https://www.longtermplan.nhs.uk/

  • Nicholson, J. and Ioannidis, J. (2012) ‘Conform and be funded’, Nature, 492(7427): 34–6.

  • Nicolini, D., Powell, J. and Korica, M. (2014) ‘Keeping knowledgeable: how NHS Chief Executives mobilise knowledge and information in their daily work’, Health Services & Delivery Research, 2(26), https://doi.org/10.3310/hsdr02260

  • Nixon, J., Smith, I.L., Brown, S., McGinnis, E., Vargas-Palacios, A., Nelson, E.A., Coleman, S., Collier, H., Fernandez, C., Gilberts, R. and Henderson, V. (2019) ‘Pressure relieving support surfaces for pressure ulcer prevention (PRESSURE 2): clinical and health economic results of a randomised controlled trial’, EClinicalMedicine, 14: 42–52.

  • Ocloo, J. and Matthews, J. (2016) ‘From tokenism to empowerment: progressing patient and public involvement in healthcare improvement’, BMJ Quality & Safety, 25(8): 626–32.

  • Oliver, K. and Boaz, A. (2019) ‘Transforming evidence for policy and practice: creating space for new conversations’, Palgrave Communications, 5: 60.

  • Oliver, K. and Cairney, P. (2019) ‘The dos and don’ts of influencing policy: a systematic review of advice to academics’, Palgrave Communications, 5: 21.

  • Oliver, K., Innvar, S., Lorenc, T., Woodman, J. and Thomas J. (2014) ‘A systematic review of barriers to and facilitators of the use of evidence by policymakers’, BMC Health Services Research, 14(1): 1–12.

  • Oliver, K., Kothari, A. and Mays, N. (2019) ‘The dark side of coproduction: do the costs outweigh the benefits for health research?’, Health Research Policy and Systems, 17(1): 33.

  • Oran, D.P. and Topol, E.J. (2020) ‘Prevalence of asymptomatic SARS-CoV-2 infection: a narrative review’, Annals of Internal Medicine, 173: 362–7.

  • Oxman, A.D., Glenton, C., Flottorp, S., Lewin, S., Rosenbaum, S. and Fretheim, A. (2020) ‘Development of a checklist for people communicating evidence-based information about the effects of healthcare interventions: a mixed methods study’, BMJ Open, 10(7): e036348.

  • Pagel, C., Rogers, L., Brown, K., Ambler, G., Anderson, D., Barron, D. et al (2017) ‘Improving risk adjustment in the PRAiS (Partial Risk Adjustment in Surgery) model for mortality after paediatric cardiac surgery and improving public understanding of its use in monitoring outcomes’, Health Services & Delivery Research, 5(23), https://doi.org/10.3310/hsdr05230

  • Petchey, R., Hughes, J., Pinder, R., Needle, J., Partington, J. and Sims, D. (2013) Allied Health Professionals and Management: An Ethnographic Study, Southampton: National Institute for Health Research.

  • Phillips, D.P., Kanter, E.J., Bednarczyk, B. and Tastad, P.L. (1991) ‘Importance of the lay press in the transmission of medical knowledge to the scientific community’, New England Journal of Medicine, 325: 1180–3.

  • Pollitt, C. and Bouckaert, G. (2011) Public Management Reform: A Comparative Analysis: NPM, Governance and the Neo-Weberian State (3rd edn), Oxford: Oxford University Press.

  • Porter, M.E. (1985) The Competitive Advantage: Creating and Sustaining Superior Performance, New York: Free Press.

  • Powell, P. (2010) The Interrogative Mood, London: Profile Books.

  • Prichard, C. (2013) ‘All the lonely papers, where do they all belong?’, Organization, 20(1): 143–50.

  • Public Health England (2017) Public Health Outcomes Framework: Health Equity Report Focus on Ethnicity, London: Public Health England. Available from: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/733093/PHOF_Health_Equity_Report.pdf (accessed 1 February 2021).

  • Pyrko, I., Dörfler, V. and Eden, C. (2017) ‘Thinking together: what makes communities of practice work?’, Human Relations, 70(4): 389–409.

  • Radford, M. (2011) ‘A manifesto for the simple scribe: my 25 commandments for journalists’, The Guardian (online) 19 January. Available from: www.theguardian.com/science/blog/2011/jan/19/manifesto-simple-scribe-commandments-journalists (accessed 24 October 2020).

  • Rangan, A., Handoll, H., Brealey, S., Jefferson, L., Keding, A., Martin, B.C. et al (2015) ‘Surgical vs nonsurgical treatment of adults with displaced fractures of the proximal humerus: the PROFHER randomized clinical trial’, JAMA, 313(10): 1037–47.

  • RECOVERY Collaborative Group (2021) ‘Dexamethasone in hospitalized patients with Covid-19’, New England Journal of Medicine, 384(8): 693–704.

  • Reed, M. (2018) The Research Impact Handbook (2nd edn), Aberdeenshire: Fast Track Impact.

  • REF2021 (2019) ‘Assessment framework and guidance on submissions’. Available from: www.ref.ac.uk/publications/guidance-on-submissions-201901/ (accessed 17 March 2021).

  • Renolen, Å., Høye, S., Hjälmhult, E., Danbolt, L.J. and Kirkevold, M. (2018) ‘“Keeping on track” – hospital nurses’ struggles with maintaining workflow while seeking to integrate evidence-based practice into their daily work: a grounded theory study’, International Journal of Nursing Studies, 77: 179–88.

  • Rickinson, M., Walsh, L., Cirkony, C., Salisbury, M. and Gleeson, J. (2020) Quality Use of Research Evidence Framework, Melbourne: Monash University. Available from: www.monash.edu/education/research/projects/qproject/publications/quality-use-of-research-evidence-framework-qure-report (accessed 17 March 2021).

  • Roumbanis, L. (2019) ‘Peer review or lottery? A critical analysis of two different forms of decision-making mechanisms for allocation of research grants’, Science, Technology, & Human Values, 44(6): 994–1019.

  • Rudd, A.G., Bowen, A., Young, G. and James, M.A. (2017) ‘National clinical guideline for stroke’, Clinical Medicine, 17: 154–5.

  • Rushmer, R.K., Cheetham, M., Cox, L., Crosland, A., Gray, J., Hughes, L. et al (2015) ‘Research utilisation and knowledge mobilisation in the commissioning and joint planning of public health interventions to reduce alcohol-related harms: a qualitative case design using a cocreation approach’, Health Services & Delivery Research, 3(33), https://doi.org/10.3310/hsdr03330

  • Rutter, H., Savona, N., Glonti, K., Bibby, J., Cummins, S., Finegood, D.T. et al (2017) ‘The need for a complex systems model of evidence for public health’, The Lancet, 390(10112): 2602–04.

  • Rycroft-Malone, J., Burton, C., Wilkinson, J., Harvey, G., McCormack, B., Baker, R. et al (2015) ‘Collective action for knowledge mobilisation: a realist evaluation of the Collaborations for Leadership in Applied Health Research and Care’, Health Services & Delivery Research, 3(44), https://doi.org/10.3310/hsdr03440

  • Sacks, O. (2014) The Man Who Mistook His Wife for a Hat, London: Pan Macmillan.

  • Santesso, N., Rader, T., Nilsen, E.S., Glenton, C., Rosenbaum, S., Ciapponi, A. et al (2015) ‘A summary to communicate evidence from systematic reviews to the public improved understanding and accessibility of information: a randomized controlled trial’, Journal of Clinical Epidemiology, 68(2): 182–90.

  • Santesso, N., Morgano, G.P., Jack, S.M., Haynes, R.B., Hill, S., Treweek, S. and Schünemann, H.J. (2016) ‘Dissemination of clinical practice guidelines: a content analysis of patient versions’, Medical Decision Making, 36(6): 692–702.

  • Scales, K., Bailey, S., Middleton, J. and Schneider, J. (2017) ‘Power, empowerment, and person-centred care: using ethnography to examine the everyday practice of unregistered dementia care staff’, Sociology of Health & Illness, 39(2): 227–43.

  • Shaw, I. and Lunt, N. (2018) ‘Forms of practitioner research’, British Journal of Social Work, 48(1): 141–57.

  • Sheikh, K. (2019) ‘How much nature is enough? 120 minutes a week, doctors say’, The New York Times, 13 June. Available from: www.nytimes.com/2019/06/13/health/nature-outdoors-health.html (accessed 17 March 2021).

  • Shojania, K.G., Sampson, M., Ansari, M.T., Ji, J., Doucette, S. and Moher, D. (2007) ‘How quickly do systematic reviews go out of date? A survival analysis’, Annals of Internal Medicine, 147(4): 224–33.

  • Smith, K.E., Bandola-Gill, J., Meer, N., Stewart, E. and Watermeyer, R. (2020) The Impact Agenda: Controversies, Consequences and Challenges, Bristol: Policy Press.

  • Smith, R. (2006) ‘Peer review: a flawed process at the heart of science and journals’, Journal of the Royal Society of Medicine, 99(4): 178–82.

  • Soares-Weiser, K. (2011) ‘Audit of the abstract, plain language summary and summary of findings tables in published Cochrane reviews’, Cochrane Collaboration. Available from: www.dropbox.com/s/39mp8t1jc7817ik/Abstract%20audit%20report%20CEU%202012.pdf (accessed 17 March 2021).

  • Squires, J.E., Hutchinson, A.M., Boström, A.M., O’Rourke, H.M., Cobban, S.J. and Estabrooks, C.A. (2011) ‘To what extent do nurses use research in clinical practice? A systematic review’, Implementation Science, 6(1): 1–17.

  • Staley, K., Crowe, S., Crocker, J.C., Madden, M. and Greenhalgh, T. (2020) ‘What happens after James Lind Alliance priority setting partnerships? A qualitative study of contexts, processes and impacts’, Research Involvement and Engagement, 6(1): 41.

  • Storr, W. (2019) The Science of Storytelling, London: William Collins.

  • Straus, S., Tetroe, J. and Graham, I.D. (eds) (2013) Knowledge Translation in Health Care: Moving from Evidence to Practice (2nd edn), Chichester: John Wiley & Sons.

  • Sugimoto, C.R., Work, S., Larivière, V. and Haustein, S. (2017) ‘Scholarly use of social media and altmetrics: a review of the literature’, Journal of the Association for Information Science and Technology, 68(9): 2037–62.

  • Sumner, P., Vivian-Griffiths, S., Boivin, J., Williams, A., Venetis, C.A., Davies, A. et al (2014) ‘The association between exaggeration in health related science news and academic press releases: retrospective observational study’, BMJ, 349.

  • Sumner, P., Vivian-Griffiths, S., Boivin, J., Williams, A., Bott, L., Adams, R.C. et al (2016) ‘Exaggerations and caveats in press releases and health-related science news’, PLoS One, 11(12): e0168217.

  • Swan, J., Clarke, A., Nicolini, D., Powell, J., Scarbrough, H., Roginski, C. et al (2012) Evidence in Management Decisions (EMD): Advancing Knowledge Utilization in Healthcare Management, NIHR Service Delivery and Organisation Programme.

  • Sword, H. (2012) Stylish Academic Writing, Cambridge, Mass: Harvard University Press.

  • Synnot, A.J., Lowe, D., Merner, B. and Hill, S.J. (2018) ‘The evolution of Cochrane evidence summaries in health communication and participation: seeking and responding to stakeholder feedback’, Evidence & Policy, 14(2): 335–47.

  • Terkel, S. (1970) Hard Times: An Oral History of the Great Depression, New York: Pantheon Press.

  • Thompson, G.N., Estabrooks, C.A. and Degner L.F. (2006) ‘Clarifying the concepts in knowledge transfer: a literature review’, Journal of Advanced Nursing, 53(6): 691–701.

  • Thomson, H. (2013) ‘Improving utility of evidence synthesis for healthy public policy: the three Rs (relevance, rigor, and readability [and resources])’, American Journal of Public Health, 103: e17–23.

  • Tierney, S., Wong, G., Roberts, N., Boylan, A.-M., Park, S., Abrams, R. et al (2020) ‘Supporting social prescribing in primary care by linking people to local assets: a realist review’, BMC Medicine, 18(1): 1–15.

  • Timmins, N., Rawlins, M. and Appleby, J. (2017) ‘A terrible beauty: a short history of NICE, the National Institute for Health and Care Excellence [version 1; not peer reviewed]’, F1000Research, 6: 915.

  • Tricco, A.C., Cardoso, R., Thomas, S.M., Motiwala, S., Sullivan, S., Kealey, M.R. et al (2016) ‘Barriers and facilitators to uptake of systematic reviews by policy makers and health care managers: a scoping review’, Implementation Science, 11(1): 1–20.

  • Turnbull, J., McKenna, G., Prichard, J., Rogers, A., Crouch, R., Lennon, A. and Pope, C. (2019) ‘Sense-making strategies and help-seeking behaviours associated with urgent care services: a mixed-methods study’, Health Services & Delivery Research, 7(26), https://doi.org/10.3310/hsdr07260

  • Van de Ven, A.H. (2007) Engaged Scholarship: A Guide for Organizational and Social Research, Oxford: Oxford University Press.

  • Van de Ven, A.H. and Johnson, P.E. (2006) ‘Knowledge for theory and practice’, Academy of Management Review, 31(4): 802–21.

  • Van Noorden R. (2014) ‘Global scientific output doubles every nine years’, Nature news blog (online) 7 May. Available from: http://blogs.nature.com/news/2014/05/global-scientific-output-doubles-every-nine-years.html (accessed 17 March 2021).

  • Vogel, J.P., Oxman, A.D., Glenton, C., Rosenbaum, S., Lewin, S., Gülmezoglu, A.M. and Souza, J.P. (2013) ‘Policymakers’ and other stakeholders’ perceptions of key considerations for health system decisions and the presentation of evidence to inform those considerations: an international survey’, Health Research Policy and Systems, 11(1): 1–9.

  • Vosoughi, S., Roy, D. and Aral, S. (2018) ‘The spread of true and false news online’, Science, 359(6380): 1146–51.

  • Wallace, J., Byrne, C. and Clarke, M. (2012) ‘Making evidence more wanted: a systematic review of facilitators to enhance the uptake of evidence from systematic reviews and meta-analyses’, International Journal of Evidence Based Healthcare, 10(4): 338–46.

  • Wallace, J., Byrne, C. and Clarke, M.J. (2014) ‘Improving the uptake of systematic reviews: a systematic review of intervention effectiveness and relevance’, BMJ Open, 4: e005834.

  • Waller, R. (2011) Simplification: What Is Gained and What Is Lost. Technical report. Available from: www.academia.edu/3385977/Simplification_what_is_gained_and_what_is_lost (accessed 17 March 2021).

  • Walshe, K. and Rundall, T.G. (2001) ‘Evidence-based management: from theory to practice in health care’, The Milbank Quarterly, 79(3): 429–57.

  • Ward, V., House, A. and Hamer, S. (2009) ‘Knowledge brokering: the missing link in the evidence to action chain?’ Evidence & Policy, 5(3): 267–79.

  • Weick, K. (1995) Sensemaking in Organisations, Thousand Oaks, California: Sage Publications.

  • Welsh, J., Lu, Y., Dhruva, S.S., Bikdeli, B., Desai, N.R., Benchetrit, L. et al (2018) ‘Age of data at the time of publication of contemporary clinical trials’, JAMA Network Open, 1(4): e181065.

  • Westen, D. (2008) The Political Brain: The Role of Emotion in Deciding the Fate of the Nation, New York: PublicAffairs Books.

  • White, M.P., Alcock, I., Grellier, J., Wheeler, B.W., Hartig, T., Warber, S.L. et al (2019) ‘Spending at least 120 minutes a week in nature is associated with good health and wellbeing’, Scientific Reports, 9(1): 1–11.

  • Whitty, C.J.M. (2015) ‘What makes an academic paper useful for health policy?’, BMC Medicine, 13: 301.

  • Wickremasinghe, D., Kuruvilla, S., Mays, N. and Avan, B.I. (2016) ‘Taking knowledge users’ knowledge needs into account in health: an evidence synthesis framework’, Health Policy and Planning, 31(4): 527–37.

  • Widdowson, H.G. (1979) Explorations in Applied Linguistics, London: Oxford University Press.

  • Wieringa, S. and Greenhalgh, T. (2015) ‘10 years of mindlines: a systematic review and commentary’, Implementation Science, 10(1): 45.

  • Williams, O. and Annandale, E. (2020) ‘Obesity, stigma and reflexive embodiment: feeling the “weight” of expectation’, Health, 24(4): 421–41.

  • Williams, O., Sarre, S., Papoulias, S.C., Knowles, S., Robert, G., Beresford, P. et al (2020) ‘Lost in the shadows: reflections on the dark side of co-production’, Health Research Policy and Systems, 18: 1–10.

  • Wilsdon, J.R. (2017) ‘Responsible metrics’, in T. Strike (ed) Higher Education Strategy and Planning: A Professional Guide, Abingdon: Routledge, pp 247–53.

  • Wilson, P. and Sheldon, T.A. (2019) ‘Using evidence in health and healthcare’, in A. Boaz, H. Davies, A. Fraser and S. Nutley (eds) What Works Now? Evidence-Informed Policy and Practice, Bristol: Policy Press, pp 67–88.

  • Wye, L., Brangan, E., Cameron, A., Gabbay, J., Klein, J. and Pope, C. (2015) ‘Knowledge exchange in health-care commissioning: case studies of the use of commercial, not-for-profit and public sector agencies’, Health Services & Delivery Research, 3(19).
