Improving the use of evidence in legislatures: the case of the UK Parliament

Authors:
David Christian Rose, University of Reading, UK
Caroline Kenny, University College London, UK
Abbi Hobbs, University College London, UK
Chris Tyler, University College London, UK


Abstract

Despite claims that we now live in a post-truth society, it remains commonplace for policy makers to consult research evidence to increase the robustness of decision making. Few scholars of evidence-policy interfaces, however, have used legislatures as sites of study, despite the critical role that legislatures play in modern democracies. There is thus limited knowledge of how research evidence is sourced and used in legislatures, which presents challenges for academics and science advisory groups, as well as for others interested in ensuring that democratic decisions are evidence-informed. Here, we present results from an empirical study of the use of research in the UK Parliament, obtained through a mixed methodology that included interviews and surveys of 157 people in Parliament, as well as an ethnographic investigation of four committees. We are specifically interested in identifying the factors affecting the use of research evidence in Parliament, with the aim of improving its use. We focus on providing advice for the higher education sector, which includes improving knowledge of, and engagement in, parliamentary processes; reforming academic incentives to stimulate the production of policy-relevant information and to assist engagement; and working with trusted knowledge brokers. Implementing this advice should improve the chances that parliamentary decision making is informed by research evidence.

Key messages

  • The terms ‘research’ and ‘evidence’ are interpreted broadly by parliamentarians and the staff supporting them.

  • The use of research evidence in the UK Parliament is influenced by four key factors: credibility, relevance, accessibility, and timing.

  • Academic research evidence is valued, but its use was reported to be limited because of perceptions that it: is overly specialised for a policy audience (lacks relevance); has low visibility as an information source and can be difficult to obtain or understand (lacks accessibility); and is often poorly attuned to the timing of parliamentary decision-making processes, such as select committee inquiries.

  • We argue that deeper engagement between the higher education sector and legislatures could enhance each other’s ability to address key challenges, but that achieving this would require changes to incentive and support structures in academia.

1. Introduction

The relationship between evidence and decision making has been a popular topic of study in the policy sciences. Scholars are well aware that interactions between evidence and policy are complex and non-linear, and dependent on a number of contextual factors (Cairney, 2016; Parkhurst, 2017). Finding a way through the messy reality of evidence-policy interfaces has been the subject of research in many different disciplines, with several recommendations being provided to those who wish to improve the uptake of evidence into policy (see, for example, Oliver and Cairney, 2019). Despite the ever-growing body of work, however, the specific policy venue of legislatures has tended to be overlooked in favour of the executive or science advisory groups (Tyler, 2013; Geddes et al, 2018; Kenny et al, 2017a).

A neglect of legislatures underplays the important role that they play in shaping policy and legislation (Padilla and Hobbs, 2013; Goodwin and Bates, 2015; Kenny et al, 2017a). Legislatures around the world represent citizens’ interests and play a significant role in democratically setting the boundaries within which executives design, execute, and iterate policy plans (Russell and Cowley, 2016). They tend to perform the same core functions: (i) to facilitate public deliberation over any and all matters of societal concern (debate); (ii) to create and change legal frameworks that guide how those matters should be addressed (legislation); (iii) to oversee and challenge programmes enacted by government (scrutiny); and (iv) to check and approve government spending and taxation (budget). Processes operating here can set the tone of government policy or lead to policy change; for example, in the UK Government’s negotiations to leave the European Union, the UK Parliament has demonstrated its ability to assert itself over the executive and influence Government policy.

Since legislatures can play an important role, it is important to consider how and why evidence is used to underpin debate, scrutiny and legislation, and the contribution that it makes to effective decision making, alongside other important influences, such as ideology, stakeholder interests and values, and public opinion (Nutley et al, 2019). Aspects of parliamentary decision-making processes have received scholarly attention, such as the role of committees in scrutinising government policy and legislation (Brazier et al, 2008; Benton and Russell, 2013; Turnpenny et al, 2013; Fisher, 2015; Thompson, 2016; White, 2016). However, it is only more recently that studies have begun to investigate aspects of the wider culture of the work of Parliament, including the micro-level rules, processes and norms that govern the day-to-day work of parliamentarians (and the staff that support them), and which affect how research feeds into parliamentary work (Crewe, 2015; 2017). There have also been very few empirical studies specifically focused on the role of research evidence (that is, evidence generated by the scientific method, see section 2.2) in parliamentary decision making, and those few tend to be focused on a specific Bill (Kettell, 2010; Bates et al, 2014; Goodwin and Bates, 2015). More recently, Geddes et al. (2018) undertook a workshop with eight parliamentary staff from across the UK Parliament to examine knowledge requirements, and concluded that academic strategies for engagement needed to acknowledge that the UK Parliament was not a homogeneous institution but that different sites within it have different knowledge requirements.

To our knowledge, the study reported here, conducted by researchers embedded in the Parliamentary Office of Science and Technology (POST) in the UK Parliament, is the first of its kind to try to understand empirically both how research evidence feeds into a specific legislature and how it is subsequently used in that legislature, across different groups and processes.

This paper presents results on how the terms ‘research’ and ‘evidence’ are used in the UK Parliament and on the role of research evidence within the different aspects of parliamentary work undertaken by parliamentarians and the staff that support them. Our aim is to contribute to academic discussion on evidence-informed policy in a legislative context, and to consider whether evidence use in the UK Parliament reflects broader debates about the use of evidence in public policy in other domains in the UK, as well as in other geographical areas, such as those detailed in the recent expert review by Boaz et al. (2019). In so doing, we seek to highlight the subtle processes by which evidence can and does inform parliamentary decision making, in order to make suggestions as to how evidence supply from the research community could be more effectively aligned to parliamentary needs.

2. Methods

This study sought to capture perspectives towards research from a wide variety of parliamentarians and the staff supporting them. In total, the study included 157 participants, through a survey of 125 people and semi-structured interviews with 87 people1, as well as four case studies of select committee and legislative work, using participant observation and documentary analysis. A steering group formed of Members of Parliament, internal representatives from across both Houses, and external academics and practitioners, guided the study and provided expert advice.2

2.1. The case of the UK Parliament

The UK Parliament is currently composed of 650 Members of the House of Commons (MPs) and (at the time of writing) 776 eligible Members of the House of Lords (Peers: used as a shorthand to refer to all eligible members of the House of Lords, including Bishops). Individual MPs and Peers also come together through a variety of groups and formal processes to make decisions. Select committees are composed of cross-party groups of parliamentarians given a specific remit to investigate and report back to the relevant House. They are the main mechanism through which government policy and spending are scrutinised; they usually gather evidence from a range of sources and have the power to receive written and oral evidence (Geddes et al, 2018). Parliamentarians also come together at committee stage to scrutinise legislation between the Second and Third Reading of a Bill. At this stage in the Commons, a Public Bill Committee of MPs is established (which has the power to receive written and oral evidence), while in the Lords, a Grand Committee of Peers is formed (without a formal evidence-taking session). Other groups of MPs and Peers include political parties and All-Party Parliamentary Groups, both of which are supported by institutional apparatus external to Parliament.

A range of bodies provide evidence support to parliamentarians in their individual and group roles. These include political staff that Members employ directly to support them, and research services provided through the parliamentary political parties. It also includes a range of politically impartial services provided directly by the House of Commons and the House of Lords. These include the Parliamentary libraries, which provide research services to individual parliamentarians and their staff, producing briefings to support parliamentary debate and legislation. It also includes Committee staff, who provide research services to members of select committees to support parliamentary scrutiny. There are also a number of in-house bodies which provide support on issues of a more specialist or technical nature, including the House of Commons Scrutiny Unit (legal and economic) and the bicameral POST (science and technology, and research evidence).

2.2. ‘Research’ and ‘evidence’ in Parliament

We note the challenges of defining what is meant by ‘evidence’ in the context of evidence-informed policy making. This is a problem considered, for example, by Nesta and the Alliance for Useful Evidence3. They note the dictionary definition of ‘evidence’ as the available body of facts or information indicating whether a belief or proposition is true or valid, but go on to highlight a particular form of evidence, ‘research evidence’, as having advantages in the way it is produced and assessed:

The conduct and publication of research involves the explicit documentation of methods, peer review and external scrutiny, resulting in rigour and openness. These features contribute to its systematic nature and help provide a means to judge the trustworthiness of findings. (Nutley et al, 2013)

However, in a pre-study workshop for this project with various representatives from Parliament4, it was clear that the terms ‘research’ and ‘evidence’ had different connotations from those dominant in academia and the wider external research community, and that there were also multiple definitions in use across different groups in the UK Parliament. ‘Evidence’ was generally used in a legalistic sense to refer to information received in an official capacity by, for example, select committees. ‘Research’ on the other hand was generally perceived as the process of sourcing information of all kinds needed to support parliamentary business, such as the work of the parliamentary libraries and Members’ staff, though sometimes it was used to refer to material either specifically from academic sources or produced through particular methods (see Figure 1). This understanding was used to inform the methods used in this study.

Figure 1: Venn diagram showing dominant definitions of ‘research’ and ‘evidence’ used in the UK Parliament

2.3. Survey/interviews

A survey was developed based on the workshop and previous literature on the use of research in policy settings, including studies by Avey and Desch (2014) and Talbot and Talbot (2014). The survey posed questions relating to the sourcing and use of research in Parliament (the full set of questions is available at https://www.parliament.uk/documents/post/The-Role-of-Research-in-the-UK-Parliament-2017-Volume-2.pdf). Because of the multiple definitions of research and evidence in use across the UK Parliament, and our aim to understand the role of a specific type of evidence, research evidence (as defined by Nutley et al, 2013, above), across different groups in Westminster, we opted to use the wider term ‘research’ in the survey and to use the interviews to explore more fully participants’ interpretation of the term and how they construed the role of research evidence within this broader context.

A sample of MPs, MPs’ staff, and parliamentary staff was developed. Our objective was to understand the perspectives of a wide range of people in Parliament with differing characteristics. For MPs, the sampling frame took into account political party, gender, and years of experience, while sampling for parliamentary staff took account of gender, as well as spread across different departments in the Houses of Parliament (for example, Commons Library, Lords Library). There was no accessible list of MPs’ staff from which to draw a sample, so this group was recruited via emails and letters sent to constituency offices or through MPs. Peers were unavailable for this survey. In total, 125 responses were gained from the survey (Table 1).

Table 1:

Breakdown of study participants

Method | MPs | Peers | MPs’ staff | Parliamentary staff (Commons) | Parliamentary staff (Lords) | Parliamentary staff (Anon) | Total number of respondents to each method
Survey | 24 | 0 | 35 | 27 | 16 | 23 | 125
Interviews | 36 | 16 | 5 | 14 | 16 | – | 87
Follow-up interviews to survey* | 24 | 0 | 5 | 10 | 16 | – | 55
Case-study interviews | 12 | 16 | 0 | 4 | 0 | – | 32
Total number of unique individuals | 36 | 16 | 35 | 31 | 16 | 23 | 157

* These are a subset of the individuals who undertook the survey; however, all MPs chose to undertake the survey face-to-face so the number of follow-up interviews with MPs is equal to the number of survey responses from MPs.

In addition to the survey, 87 semi-structured interviews were conducted; 55 as follow-up to the survey, and 32 with individuals involved in the case studies (Table 1). Follow-up interviews focused on exploring participants’ responses to particular questions, and the reasons for their selection of multiple-choice responses; case-study interviews (see section 2.4) focused on the sourcing and use of evidence in the particular context of the case study. Interviews lasted between 15 and 30 minutes and were recorded and transcribed in full. These were coded and analysed in NVivo.

2.4. Case studies

Case studies followed two select committees for a period of three months each, and two Bills as they progressed through the Committee stages of the UK Parliament (Table 2). Case studies involved three methods: participant observation, documentary analysis, and semi-structured interviews with relevant Members and parliamentary staff. By observing processes, and in some cases participating in activities, the researcher (CK) was able to see and understand the everyday behaviour of parliamentarians and parliamentary staff (for example, on select committees, or in public bill or grand committees), both in terms of the decisions they took and the language they used. Participant observation does raise issues of bias and reactivity, but has previously been highlighted as an appropriate approach for the study of both evidence use and Parliament more broadly (for example, Crewe, 2017). Detailed ethnographic notes were made and analysed thematically by hand.

Table 2:

Case studies

Case study type | Name | Dates
Select Committee | House of Commons Communities and Local Government Committee | Oct 2014–Feb 2015
Select Committee | House of Lords National Policy for the Built Environment Committee | Jul–Oct 2015
Legislation | Small Business, Enterprise and Employment Bill | Jun 2014–Mar 2015
Legislation | National Insurance Contributions Bill | Jul 2014–Feb 2015

2.5. Ethical considerations

The study was approved by UCL Research Ethics Committee (project ID 6468/001) and was registered with the UCL Data Protection Officer. We took necessary steps to obtain informed consent from study participants, to ensure confidentiality and participants’ anonymity and to store the data securely.

2.6. Limitations

The study adopted a considered approach to sampling and, for the survey, where possible, the study populations were stratified according to characteristics judged to be relevant, with samples drawn at random from these strata. However, the survey findings are not representative of Parliament as a whole, so specific findings cannot be generalised beyond the persons, times, and settings directly studied. There was a low response rate to the survey amongst MPs (18%) and MPs’ staff (5%), and a possibility that self-selection occurred. Further, it was not possible to survey Peers because of leadership changes at the time; as such, the sample is biased towards the House of Commons. However, the breadth and depth of the methodology compares favourably with previous studies (for example, Brazier et al, 2008; Geddes et al, 2018), and importantly it also included access to parliamentary processes that are usually beyond the scope of research.

Finally, the position of the principal researcher in POST may have encouraged participants to appear positive about research and its usefulness and role within parliamentary processes. However, the study team took various steps to protect against subjectivity, triangulating data from interviews and participant observation with other studies, double-coding at the data-analysis stage, and establishing a steering group of internal and external experts to ‘sense-check’ the approach taken and findings drawn out.

3. Results

In the survey, 85 people answered a question about how useful research had been to them over the last 12 months, with 98% of respondents strongly agreeing or agreeing that research was useful (11/11 MPs; 24/24 MPs’ staff; 48/50 parliamentary staff); 75 people answered a question about how often they had consulted research as part of their job over the last 12 months, with about half (53%) reporting that they used it daily (3/3 MPs; 9/25 MPs’ staff; 28/47 parliamentary staff). It is important to note that the term ‘research’ tended to be understood broadly by respondents, so these top-level findings likely refer to the importance of information of all types to parliamentary work.
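
As an illustration of how these aggregate figures follow from the subgroup counts reported in the text, the short Python snippet below reproduces the percentages; it is purely illustrative and was not part of the study’s analysis.

    # Illustrative only: reproduce the aggregate percentages reported above from
    # the subgroup counts in the text, given as (number agreeing, number asked).
    subgroups_useful = {"MPs": (11, 11), "MPs' staff": (24, 24), "Parliamentary staff": (48, 50)}
    subgroups_daily = {"MPs": (3, 3), "MPs' staff": (9, 25), "Parliamentary staff": (28, 47)}

    def aggregate(counts):
        """Return (numerator, denominator, rounded percentage) across all subgroups."""
        agreed = sum(a for a, _ in counts.values())
        asked = sum(n for _, n in counts.values())
        return agreed, asked, round(100 * agreed / asked)

    print(aggregate(subgroups_useful))  # (83, 85, 98): 98% found research useful
    print(aggregate(subgroups_daily))   # (40, 75, 53): 53% reported daily use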

3.1. What is research used for?

In the survey, MPs and their staff were asked to rank areas of parliamentary work where research had been the most important. Of the 16 MPs who responded to this question, the top three areas were debates, All-Party Parliamentary Groups, and select committees. Of the 26 MPs’ staff who responded to this question, the top three areas were constituency work, parliamentary debates and stakeholder engagement. Across the interviews with MPs, Peers, MPs’ staff and parliamentary staff, some participants discussed the purpose of research use in more detail. Overall, the dominant theme, put forward by around a third of interviewees, was the use of research to directly inform their immediate representative, legislative or scrutiny work, such as questioning select committee witnesses, tabling an amendment to a Bill, or dealing with a constituency issue. For example, one MP discussed the role of research as “one weapon in our armoury to do more effective scrutiny” (MP, interview 6).

The second dominant theme overall, put forward by about a fifth of participants, was for background knowledge, for example providing a comprehensive overview of a policy issue in a select committee inquiry or ahead of a debate. A Peer commented:

I am speaking in two debates on Thursday… and I can’t personally imagine how you can stand up and speak without going away and doing some background research. (Peer, interview 14).

Further uses were put forward, with evident differences between groups. For example, about a sixth of parliamentarians strongly emphasised the use of research to substantiate pre-existing views. One MP commented that “[T]he most frequent and obvious… purpose of [accessing research] is… to back up something that we are advocating from a policy perspective” (MP, interview 75).

In contrast, maintaining strict political impartiality was perceived as fundamental to all work undertaken by parliamentary staff. For example, a member of staff in the Lords Library commented that:

There is a lot of information out there and we always try to balance that information, so that if you have a left-wing think tank you will make sure you have got a centre and right-wing think tank as well (Lords Library staff, interview 55).

The process of sourcing information to support these diverse purposes occurs under significant time pressure. In our survey, 82 people responded to a question about whether they had enough time to find and use research in their parliamentary work. Overall, about half (49%) agreed or strongly agreed that they did; however, there were clear differences between groups, with 64% (30/47) of parliamentary staff agreeing that they did, but no MPs agreeing (0/11). Although we note that the sample size for MPs is small, the need to gather information to be able to respond to new issues very quickly was a key theme across the interviews. Participants attributed the restrictions on their time to several factors. These included the nature of parliamentary timetabling, especially in the Commons, meaning that MPs and the staff supporting them usually only have a few days’ notice of Chamber business and hence have very little time to prepare to deal with a vast array of different policy areas. For example, one MP (interview 75) highlighted how they needed to get hold of information “for something that is happening, if you are lucky, tomorrow, and if you are not lucky it is happening in a couple of hours’ time”.

3.2. What types of research are used?

Since research was interpreted broadly by actors in Parliament, the types of research reported to be used by participants to meet the purposes outlined above were similarly varied. In the survey, respondents were provided with 13 types of research to choose from and asked which were most useful in their parliamentary work (based on Avey and Desch, 2014; Talbot and Talbot, 2014). There was no restriction placed on how many they could choose, and 94 respondents answered this question (see Table 3). Overall, the top two types of research reported to be used most frequently were statistics (77%) and expert opinion (75%), and these were also the top two types for each subgroup (MPs, MPs’ staff, parliamentary staff). Across all groups, this was followed by public opinion polls, used by just under half of respondents (44%), with around a quarter citing research with service users or those affected, whether by survey (31%) or interviews (27%), as well as observations (28%). The numbers suggest that there may be slight differences between groups, with parliamentary staff citing expert opinion above statistics.

Table 3:

Types of research used most often in parliamentary work

Type of research | MPs (n=12) | MPs’ staff (n=33) | Parliamentary staff (n=49) | Total (n=94)
Statistics | 11 | 28 | 33 | 72 (77%)
Expert opinion | 3 | 21 | 46 | 70 (75%)
Public opinion polls | 1 | 18 | 22 | 41 (44%)
Surveys with service users or those affected | 2 | 12 | 15 | 29 (31%)
Observations | 2 | 13 | 11 | 26 (28%)
Interviews with service users or those affected | 1 | 10 | 14 | 25 (27%)

The interviews with MPs, Peers, MPs’ staff and parliamentary staff provided some interesting commentary on users’ perspectives about statistics. For example, one MP said that they “trust numbers more than opinions” (MP, interview 70), and another could not think of a single piece of work where statistical research was not used to support their argument (MP, interview 64). Statistics were portrayed by MPs’ staff as “objective” (MPs’ staff, interview 46), as serving to “imply more gravity and fact” (MPs’ staff, interview 90), and as providing a “comfort blanket” (MPs’ staff, interview 84) for their MP. However, it is important to note that interviewees in our study tended not to reflect critically on how statistical information had been derived.

Interviews also highlighted two further types of research considered key to parliamentary work, especially select committee work: case studies and internationally comparable data. Case study examples of ‘what works’ were considered highly valuable, although participants tended not to reflect on the underlying methodology of these, for example whether they were case studies derived from randomised controlled trials. Participant observation of the two select committees suggested that formal written submissions that contained success stories gave Members more confidence in taking ideas forward. One MP explained that “one example… [can be] more powerful… [than] all sorts of figures”, especially when communicating to constituents (MP, interview 86). International case studies of ‘what works’, and other forms of internationally comparable evidence, were also considered highly valuable. For example, one Peer reported that having international evidence was “immensely useful” but that obtaining it was like “gold dust” (Peer, interview 28). One MP explained the value as being able to “search around the world to see what is working and then reapply it” (MP, interview 50).

3.3. How is research sourced?

The survey asked respondents how research was received and sourced, with 91 people answering this question. Overall, 70% of respondents said that they actively searched for research; however, there were interesting differences within groups. For example, while actively searching for research was the dominant route for parliamentary staff (81%, 39/47) and MPs’ staff (69%, 18/26), the majority of MPs reported using research they received unsolicited via post or email (61%, 11/18). While noting that the sample size for MPs is small, in interviews MPs talked about being “inundated” (MP, interview 61), “overloaded” (MP, interview 79), and “bombarded” (MP, interview 86) with information from external organisations, personal contacts, and constituents. A few MPs and Peers reflected that while unsolicited information may be biased, it was generally useful to have.

Respondents to the survey were also asked to indicate which sources of information they consulted, from a list of 15 options, and asked to select as many as applied; 107 people answered this question (see Table 4). This showed that the most popular sources, used by around two-thirds of respondents overall, were government departments (69%), the parliamentary libraries (66%), named organisations (65%), and mainstream media (newspapers, TV, radio) (61%). About half of respondents also reported using experts (53%) and websites of international organisations, such as the EU and Organisation for Economic Co-operation and Development (OECD) (51%). This was followed by information received by post or email (43%), or produced by select committees (41%), as well as specific individuals, for example advisers or academics (41%), and academic books or articles (37%).

Table 4:

Top ten sources of research reported to be used regularly for parliamentary work5

Source | MPs (n=23) | MPs’ staff (n=32) | Parliamentary staff (n=52) | Total (n=107)
Government departments | 8 | 27 | 39 | 74 (69%)
Commons/Lords Library* | 18 | 25 | 28 | 71 (66%)
Specific organisations | 20 | 17 | 33 | 70 (65%)
Media (news, TV, radio) | 4 | 22 | 39 | 65 (61%)
Experts in the specific area | 8 | 16 | 33 | 57 (53%)
Websites of international organisations (for example, EU, OECD, WHO) | 3 | 23 | 29 | 55 (51%)
Information received by post and/or email | 7 | 21 | 18 | 46 (43%)
Select Committees | 9 | 12 | 23 | 44 (41%)
Specific individuals (for example policy makers, advisers, academics) | 3 | 16 | 25 | 44 (41%)
Academic books and/or articles | 5 | 9 | 26 | 40 (37%)

* MPs and MPs’ staff can only directly use the services of the Commons Library, so the majority of this use is likely to be of the Commons Library.

There are some differences by respondent role. For example, when looking at the top five sources within each group, select committees feature highly for MPs, and international organisations feature highly for both parliamentary staff and MPs’ staff, the latter of whom also reported regularly using information sent to them in the post or by email; however, we note the small sample sizes for some subgroups.

3.4. What factors affect the use of research?

The survey asked respondents to rank factors that were important to them in deciding whether to read or use a piece of research; 88 people answered this question and relevance, credibility, and ease of finding were ranked in the top five by all groups (Table 5). For MPs, the importance of the issue to them personally and to their constituents was also key, whereas for MPs’ staff and parliamentary staff, presentation and the appropriateness of the methods and approaches used were in the top five.

Table 5:

Top five factors shaping use of research based on survey ranking*

Rank | MPs (n=22) | MPs’ staff (n=20) | Parliamentary staff (n=46)
1 | Relevance | Credibility | Credibility
2 | Ease of finding | Presentation | Relevance
3 | Credibility | Ease of finding | Appropriateness of approaches/methods used
4 | Personal importance of issue | Relevance | Presentation
5 | Importance of issue in constituency | Appropriateness of approaches/methods used | Ease of finding

* Ranking was calculated in SurveyMonkey based on the average ranking for each answer choice.
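
The average-ranking approach described in the note above can be illustrated with a minimal sketch, assuming the ranking is derived by averaging the rank position each respondent assigns to each factor and ordering factors from lowest to highest mean rank (SurveyMonkey’s exact weighting may differ). The factor names below are taken from Table 5; the responses are invented for illustration only.

    # Minimal sketch of an average-ranking calculation (illustrative only; the
    # factor names come from Table 5, but the responses below are invented).
    from statistics import mean

    responses = [
        {"Relevance": 1, "Credibility": 2, "Ease of finding": 3, "Presentation": 4},
        {"Credibility": 1, "Relevance": 2, "Presentation": 3, "Ease of finding": 4},
        {"Relevance": 1, "Ease of finding": 2, "Credibility": 3, "Presentation": 4},
    ]

    # Mean rank position per factor; a lower mean rank means the factor was,
    # on average, ranked as more important.
    avg_rank = {factor: mean(r[factor] for r in responses) for factor in responses[0]}

    for factor, score in sorted(avg_rank.items(), key=lambda item: item[1]):
        print(f"{factor}: mean rank {score:.2f}")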

The question of influential factors was explored in much greater breadth and depth in the interviews. Analysis of these data identified four key factors affecting the use of research, which reflect the findings from the survey. Aspects of these factors were cited by more than half of interviewees, and nearly all interviewees emphasised at least one of them as being particularly important to their use of research. These are summarised in Table 6 and explored in more detail below.

Table 6:

Four key factors shaping use of research based on interviews

Factor | Description
Credibility | Perceived believability of the information, related to the perceived quality and validity of the information. Credibility was typically judged on the basis of the information source, rather than on the knowledge-production process.
Relevance | The salience of the information to the purpose of use: for example, providing a comprehensive overview of a policy issue; for parliamentarians, helping them to form or substantiate a position on an issue; and, for parliamentary staff, providing balance.
Accessibility | The extent to which research can be easily found, comprehended and digested by non-specialists. Incorporates visibility of research, use of jargon, structure, length and presentation.
Timing | Whether research is available or communicated during a policy ‘window’ of opportunity: for example, whether it is readily available and promoted when an issue is prominent in the media, or submitted as a formal evidence submission to a select committee inquiry.

3.4.1. Credibility

Interviewees tended not to explicitly define credibility, but it was generally constructed as the believability of the information, based on perceived quality and validity. In turn, these were generally judged on the basis of the perceived independence of the knowledge-production process and on the relevance and accessibility of the information source. For example, a member of staff in the Commons Library explained:

Good quality, what does that mean? For me, good quality research has to be research that has good methodology behind it but also research that is written up in an easily digestible way… which is directly relevant to the areas that I am [working on]… that can answer my question. (Commons Library staff, interview 33)

Interestingly, the dominant notion of independence was often related to political impartiality and balance in representation of stakeholder perspectives, rather than the perceived robustness of the research methodology. In this context, sources that had built up a good reputation in Parliament for withstanding challenge in the public eye, and sources that had been recommended by colleagues or existing contacts, were more likely to be considered credible. For example, an MP said that they looked for “well known, national, reputable sources that few people would challenge the veracity of the findings” (MP, interview 79). The parliamentary libraries were frequently put forward as particularly credible and were the most prominent source identified by parliamentarians in the context of seeking independent research. For example, one MP said that the “great thing about the Library note, of course, is that it is accepted across the House as a statement; it is beyond dispute” (MP, interview 17).

Importantly however, many interviewees highlighted how research could lack credibility and nonetheless be highly valuable if it offered understanding about the perspectives of a particular group of stakeholders. For example, research by charities was often reported to be used to inform legislative or scrutiny work, even though it was not considered impartial. One MP talking about written questions reflected that:

[W]e rely on charities in those areas but we don’t really stand up and say, ‘We believe X,’ we will say, ‘This charity informs me of X,’ because we don’t trust the data 100%. Whereas if POST [an internal service] did some documents on that, the data would be trusted. (MP, interview 64)

3.4.2. Relevance

Perceived relevance was crucial for use, for example in providing a comprehensive background overview of a policy issue and stakeholder perspectives or, for parliamentarians, to help them to form or substantiate a position on an issue. As one MP commented, “It’s got to be… relevant for you to… stick time into [reading] it… otherwise it’s a waste of the hours” (MP, interview 81). In this context, interviewees noted the need for more user-friendly summaries and overviews of a body of research findings, rather than detailed information about single studies.

3.4.3. Accessibility

The extent to which research could be easily found, comprehended and digested by non-specialists was also fundamental to its use. This incorporated several aspects, including the visibility of research, use of jargon and technical terms, structure, length and presentation. For example, one MP (interview 3) emphasised that: “[I]t needs to be user-friendly, recognising that politicians are all running around like headless chickens trying to do more than it will ever be feasible to do in any one day”.

3.4.4. Timing

Whether research was available or communicated during a policy ‘window of opportunity’ (Rose et al, 2017) was another key factor, because of the breadth and number of issues that parliamentarians had to make decisions about at any one time. For example, a Peer noted:

For most people in politics, their interest in the subject is like a lighthouse beam. When the beam is on that subject it is only on that subject, but then their focus moves, and that subject goes to outer darkness. (Peer, interview 16)

This meant that whether research was readily available and promoted in an accessible way, or submitted as formal evidence to a select committee inquiry at the right moment, was crucial to its use. As one MP (interview 93) explained: “If you think about the vast array of things that could have an impact on policy development or decision making, one of the key things is whether it comes to your attention”.

Several interviewees commented that because charities wanted to have their voice heard about the issue they focused on, they tended to invest resources in understanding parliamentary procedures and were very willing to provide support to parliamentarians interested in the issue. For example, one Peer stated:

Hats off to them [charities]; they do know how to influence what is going on because they are very attuned to the legislative process… they understand the timing and the scheduling and what is going to be helpful and what isn’t going to be helpful. (Peer, interview 5)

3.5. The use of academic research evidence

Academic research evidence was used by people in Parliament, but a number of criticisms were raised along the lines of the four interrelated factors presented above. First, interviewees perceived it to lack relevance, finding it too specialised or focused to be of real use. This included perceptions that there was a lack of available research evidence on issues of relevance to them. For example, one Peer commented that Committees were often “ahead of the game compared to the academic community… wrestling with things where there is no body of evidence” (Peer, interview 28). Furthermore, one staff member in the Commons Library commented that “the problem with academic research is that it might be very interesting… but it is often not very connected with the immediate political concerns of the day” (Commons Library staff, interview 38). Interviewees also strongly emphasised the need for broader syntheses or reviews of a body of research, rather than single articles. For example, a member of staff in the Commons Committee office (interview 37) noted that “we would be looking for the meta-analysis or the review article at most”, and an MP reflected:

Academic research feeds in a very limited capacity because it’s probably too specialised. What [I] need to know, in practical terms, is 80% of the high-level subject and I don’t need to know, or haven’t got the time to know, the other 20%. (MP, interview 70)

Second, academic research evidence was often criticised on the grounds of accessibility. This included reference to academic research being behind paywalls, having low visibility, and not featuring highly on search engines when users were looking for information. For example, one MP’s staff commented that universities had “closed doors… nine times out of ten we don’t know what they are working on and we don’t know who to call” (MP’s staff, interview 77). It also included the lack of user-friendliness of research evidence. Interviewees noted that research evidence could be “opaque” (Lords Library staff, interview 62) and could be difficult for a non-expert to understand:

Particularly when you are reading academic reports, people can assume a lot of knowledge; I was looking at a piece of research only yesterday, which assumed a lot of understanding of statistics to be able to accurately interpret it. (MP, interview 66)

Third, academic research evidence was criticised for poor timing. Interviewees were less likely to report receiving academic research unsolicited or proactively, especially in comparison to research conducted by the charitable sector, and several commented on the low levels of participation by the academic sector in formal committee processes, such as through submitting written evidence. Interviewees’ perceptions of the reasons for academics’ low engagement varied, but included a lack of understanding of how to engage with the UK Parliament and its processes, and a lack of motivation to do so. For example, one Peer commented:

I am very involved in the Private Members’ Bill on [x] and it is the campaign group called [y] that is doing most of the work; it is not academics because, in a sense, you have to believe in the cause. (Peer, interview 4).

Several interviewees, however, noted that they thought that there was room for academic research evidence to play a stronger role in the UK Parliament, for example, in supporting parliamentarians to develop expertise on issues they needed to make decisions about:

By definition, most politicians are not experts on most subjects; on the one hand, we have to be generalists, on the other you are expected to be expert, and there is a massive tension between those two that you would hope academic research could arbitrate. (MP, interview 89)

Other interviewees noted that they felt that a stronger culture of engagement was beginning to emerge, and several noted the ‘impact agenda’ as a driving factor. For example, a staff member in the Commons Committee office reflected:

I think university departments and other bodies are producing more and more glossy stuff which distils key results… I think researchers… are alert to the fact that they have a role to explain their work to the public and the members of parliament who represent the public. (Commons committee staff, interview 37)

4. Implications for academic engagement with Parliament

Our in-depth investigation of how research is sourced and used in the UK Parliament suggests that the key factors shaping the use of research there overlap substantially with those at play in other policy arenas, since the salience of concepts such as credibility, relevance, accessibility and timing has been well documented outside legislatures (Oliver et al, 2014; Heink et al, 2015; Boaz et al, 2019). However, it also supports findings (Heink et al, 2015) that the meanings attached to these concepts are context-specific. Here, in a space largely occupied by academics, we devote attention to suggesting how we think the academic community can, and should, better engage with Parliament to enhance each other’s ability to address key social and economic challenges.

The question of how the research community can improve evidence uptake into policy has, of course, received widespread attention outside legislative settings, with a number of suggestions being made (see, for example, Cairney and Oliver, 2018; Oliver and Cairney, 2019). We consider that many commonly suggested solutions would likely all contribute to better evidence use in the UK Parliament, despite the fact that most of the underlying research has not been undertaken in this policy venue: making research open access (see the Plan S project; Else, 2018); improving researcher communication skills (Tinch et al, 2018); incentivising policy engagement and the provision of synthesised knowledge, as well as providing policy support units in universities (Tyler, 2017; Donnelly et al, 2018; Gavine et al, 2018); and working with trusted knowledge brokers (Bednarek et al, 2018). However, we consider that it is worth reiterating some of these solutions to highlight how individual researchers and the wider higher education sector could have more influence, specifically on the work of the UK Parliament.

The first suggestion we make is to improve knowledge of, and engagement with, the UK Parliament across the research community. Academics should not expect their research to be found by people in Parliament who are unlikely to know it has been conducted, may not have time to look for it, or may not find it even when looking. Better knowledge of parliamentary processes and timetables could increase awareness of existing ‘windows of opportunity’6 (Rose et al, 2017) in which academics can engage. For example, with regard to select committees, research evidence may be influential early on in a new inquiry when parliamentary staff are scoping a topic and setting terms of reference, and again when a formal call for evidence has been issued. Academics need to be aware of calls for evidence and supported to respond to them, and to understand that submitting a written response could result in further opportunities for impact through being called to give evidence orally. Furthermore, our research showed that information gathered by parliamentary staff and MPs’ staff feeds into parliamentary work at different levels, and thus the research community could seek to build professional relationships with these staff (for example, evidence specialists in select committees), as well as sending information directly to Members when it is relevant to an issue being considered in Parliament. Part of this networking should involve seeking to establish a credible reputation, since this tended to be more influential in determining research use in Parliament than users’ perceptions of the quality of methodology.

However, encouraging more proactive engagement with Parliament will require incentives. For example, the shift to impact case studies comprising 25% of the overall score in the Research Excellence Framework 2021 provides a broad framework to strengthen engagement, and anecdotally has already led to a notable increase in the number of academics contacting parliamentary offices in order to get involved. But it also places increasing demands on academics’ time; to be sustainable, mechanisms and approaches that facilitate enduring relationships and recognise differences in individuals’ skills and strengths are also required. Knowledge exchange staff and public policy support functions in universities can play a key role here, by providing a dedicated resource to monitor and publicise relevant opportunities and sources of support to academics, and by equipping them to take advantage of these, for example through advice or editorial support to academics making evidence submissions. We consider that there is also a need for more universities to include knowledge exchange in promotion criteria, and to develop specific career pathways in knowledge exchange or other means of incentivising and enabling staff to have the time and resources to dedicate to maximising societal and policy impact from their research (for example, Tyler, 2017). New networks like UPEN (Universities Policy Engagement Network; www.upen.ac.uk) and new offices like STEaPP’s Policy Impact Unit (Department of Science, Technology, Engineering and Public Policy; www.ucl.ac.uk/steapp/collaborate/policy-impact-unit; Tyler, 2017) are a step in the right direction.

Secondly, reform is clearly needed to alter the value of particular types of academic output, in order to improve relevance of academic research for legislative and non-legislative policy settings. Despite long-standing criticism of the failure of academia to recognise the value of critical appraisal and systematic reviews, and the scientific and ethical importance of ensuring that new research builds on existing knowledge (Chalmers, 2005; Donnelly et al, 2018), such outputs are still generally perceived as less valuable than novel, primary research. There are good examples in academia where a body of evidence has been systematically reviewed and summarised so that it is available quickly for policy makers to consult (Sutherland et al, 2017). However, further incentive is needed so that academics can be encouraged and rewarded for synthesising evidence proactively, rather than reactively. This will require funders to prioritise policy-relevant research more than is currently practised (Tyler, 2017).

Finally, even with improved training and support from within the research community to engage with Parliament, trustworthy knowledge brokers will still be required to enhance influence by acting as skilled intermediaries between research and policy (Bednarek et al, 2018). Analysis of Research Excellence Framework 2014 impact case studies showed that third-party organisations are an important route for many academics to engage with, and have impact on, Parliament (Kenny, 2015). Working with such organisations, including charities, learned societies, and specialist boundary organisations (for example, POST), could enable academics to benefit from their policy and public affairs expertise and resources, and to build the informal networks necessary for establishing credibility in Parliament.

Concluding remarks

For those interested in improving the use of research evidence in the UK Parliament, our research has identified a number of concrete suggestions that could be implemented. Perhaps concerningly, many of the suggestions made by people in Parliament, including making research open access, improving the presentation of research evidence, and seizing windows of opportunity for evidence uptake, have been made several times before in the policy sciences literature. The fact that the same barriers to evidence use (for example, paywalls) are still being identified by those in policy positions suggests that there is much progress to be made. However, we argue that a better understanding of the unique processes operating in the UK Parliament, which have thus far received little scholarly attention, is the first step towards more productive engagement with the research community.

POST has already taken significant steps to make the process of engaging with Parliament more streamlined, through establishing a Knowledge Exchange Unit (www.parliament.uk/get-involved/research-impact-at-the-uk-parliament/knowledge-exchange-at-uk-parliament); developing a new web hub for academics (www.parliament.uk/research-impact); and providing training to support engagement. Putting our suggestions into action, these research findings have formed the basis of evidence submissions to the 2019 inquiry on effective scrutiny by the House of Commons Liaison Committee, and to the review of House of Lords investigative and scrutiny committees by the House of Lords Liaison Committee7. These reports include recommendations for Parliament to enhance cooperation, collaboration and partnerships with the wider research community, and for the publicly funded research sector to recognise the value of contributing to public debate and parliamentary scrutiny, and to reward academic institutions which contribute to this goal. This opens up substantial opportunities for the research community, including academics and other experts on ‘what works’ in evidence-informed policy, individual Higher Education Institutes, knowledge mobilisers, and research funders, to work with POST.

These developments open up a space for deeper reflection between the Higher Education sector and the UK Parliament on the purpose of academic engagement, which takes into account the different goals and values of scientists and policy makers, and the desirability of public scrutiny and accountability of both scientists and policy makers in democratic societies (SAPEA, 2019). It also provides concrete opportunities to make much-needed progress in addressing barriers to research evidence use, starting with improving understanding and cooperation and working towards deeper change, such as the creation of thematic research hubs of proactively synthesised and accessible bodies of evidence.

Research ethics statement

The study was approved by UCL Research Ethics Committee (project ID 6468/001) and was registered with the UCL Data Protection Officer. We took necessary steps to obtain informed consent from study participants, to ensure confidentiality and participants’ anonymity, and to store the data securely.

Acknowledgements

This research was generously funded by the ESRC. The Houses of Parliament kindly granted wide-ranging access to undertake this research.

Contribution statement

DCR wrote the first draft of the manuscript and led the drafting of revisions, with comments from CT, CK, and AH. AH made an important contribution to the first revision. All authors were involved in data analysis, with CK leading this. CK and AH were involved in data collection and project management. CT oversaw the research.

Conflict of interest

CT was director of POST from 2012 to 2017. CK worked at POST from 2014 to 2019. AH has worked at POST from 2013 to the present.

Notes

1

Some of the semi-structured interviews were a follow-up to the survey, see section 2.3 for more information.

4

Representatives included: staff from the Committee Office, the House of Commons Scrutiny Unit, the House of Commons and Lords Libraries, the House of Commons Outreach and Engagement Service, House of Commons and Lords Clerks of Committees, and the Director of Research Development. These eight staff were joined by a group of academics and researchers, as well as representatives from the ESRC and the National Audit Office.

5

Figures here may differ from Kenny et al (2017b) as further categorisation was carried out and checked for this paper.

6

We use ‘windows of opportunity’ in this sense to refer to the value of timing of evidence submission to Parliament, and not to refer, for instance, to Kingdon’s (2003) work on policy windows brought about by the converging streams of politics, policies, and problems.

References

• Avey, P.C. and Desch, M.C. (2014) What do policymakers want from us? Results of a survey of current and former senior national security decision makers, International Studies Quarterly, 58: 227–46. doi: 10.1111/isqu.12111
• Bates, S., Jenkins, L. and Amery, F. (2014) (De)politicisation and the Father’s Clause parliamentary debates, Policy & Politics, 42: 243–58.
• Bednarek, A.T., Wyborn, C., Cvitanovic, C., Meyer, R., Colvin, R.M. et al. (2018) Boundary spanning at the science-policy interface: the practitioners’ perspectives, Sustainability Science, 13(4): 1175–83. doi: 10.1007/s11625-018-0550-9
• Benton, M. and Russell, M. (2013) Assessing the impact of parliamentary oversight committees: the select committees in the British House of Commons, Parliamentary Affairs, 66(4): 772–97. doi: 10.1093/pa/gss009
• Boaz, A., Davies, H., Fraser, A. and Nutley, S. (2019) What Works Now? Evidence Informed Policy and Practice, Bristol: Bristol University Press.
• Brazier, A., Kalitowski, S., Rosenblatt, G. and Korris, M. (2008) Law in the Making: Influence and Change in the Legislative Process, London: Hansard Society.
• Cairney, P. (2016) The Politics of Evidence-Based Policymaking, London: Palgrave Pivot.
• Cairney, P. and Oliver, K. (2018) How should academics engage in policymaking to achieve impact?, Political Studies Review, https://doi.org/10.1177/1478929918807714
• Chalmers, I. (2005) Academia’s failure to support systematic reviews, The Lancet, 365(9458): 469. doi: 10.1016/S0140-6736(05)17854-4
• Crewe, E. (2015) The House of Commons: An Anthropology of MPs at Work, London: Bloomsbury Academic.
• Crewe, E. (2017) Ethnography of Parliament: finding culture and politics entangled in the Commons and the Lords, Parliamentary Affairs, 70: 155–72. doi: 10.1093/pa/gsw012
• Donnelly, C.A., Boyd, I., Campbell, P., Vallance, P., Walport, M., Whitty, C.J.M., Woods, E. and Wormald, C. (2018) Four principles to make evidence synthesis more useful for policy, Nature, 558: 361–4. doi: 10.1038/d41586-018-05414-4
• Else, H. (2018) Radical open-access plan could spell end to journal subscriptions, Nature, 561: 17–18. doi: 10.1038/d41586-018-06178-7
• Fisher, L. (2015) The growing power and autonomy of House of Commons select committees: causes and effects, Political Quarterly, 86: 419–26. doi: 10.1111/1467-923X.12190
• Gavine, A., Macgillivray, S., Ross-Davie, M., Campbell, K., White, L. and Renfrew, M. (2018) Maximising the availability and use of high-quality evidence for policymaking: collaborative, targeted and efficient evidence reviews, Palgrave Communications, doi: 10.1057/s41599-017-0054-8
• Geddes, M., Dommett, K. and Prosser, B. (2018) A recipe for impact? Exploring knowledge requirements in the UK Parliament and beyond, Evidence & Policy, 14(2): 259–76, https://doi.org/10.1332/174426417X14945838375115
• Goodwin, M. and Bates, S. (2015) The ‘powerless parliament’? Agenda-setting and the role of the UK Parliament in the Human Fertilisation and Embryology Act 2008, British Politics, 11(2): 232–55. doi: 10.1057/bp.2015.37
• Heink, U., Marquard, E., Heubach, K., Jax, K., Kugel, C. et al. (2015) Conceptualizing credibility, relevance and legitimacy for evaluating the effectiveness of science-policy interfaces: challenges and opportunities, Science and Public Policy, 42(5): 676–89. doi: 10.1093/scipol/scu082
• Kenny, C. (2015) The impact of academia on Parliament: 45 percent of Parliament-focused impact case studies were from social sciences, https://blogs.lse.ac.uk/impactofsocialsciences/2015/10/19/the-impact-of-uk-academia-on-parliament/
• Kenny, C., Washbourne, C-L., Tyler, C. and Blackstock, J.J. (2017a) Legislative science advice in Europe: the case for international comparative research, Palgrave Communications, 3. doi: 10.1057/palcomms.2017.30
• Kenny, C., Rose, D.C., Hobbs, A., Tyler, C. and Blackstock, J. (2017b) The Role of Research in the UK Parliament, Volume One and Volume Two, London: House of Commons.
• Kettell, S. (2010) Rites of passage: discursive strategies in the 2008 Human Fertilisation and Embryology Bill debate, Political Studies, 58: 789–808. doi: 10.1111/j.1467-9248.2010.00847.x
• Kingdon, J. (2003) Agendas, Alternatives, and Public Policies, 2nd edn, New York: Longman Press.
• Nutley, S., Powell, A. and Davies, H. (2013) What counts as good evidence?, https://www.alliance4usefulevidence.org/assets/What-Counts-as-Good-Evidence-WEB.pdf
• Nutley, S., Boaz, A., Davies, H. and Fraser, A. (2019) New development: what works now? Continuity and change in the use of evidence to improve public policy and service delivery, Public Money and Management, 39(4): 310–16. doi: 10.1080/09540962.2019.1598202
• Oliver, K. and Cairney, P. (2019) The dos and don’ts of influencing policy: a systematic review of advice to academics, Palgrave Communications, 5.
• Oliver, K., Innvar, S., Lorenc, T., Woodman, J. and Thomas, J. (2014) A systematic review of barriers to and facilitators of the use of evidence by policymakers, BMC Health Services Research, 14. doi: 10.1186/1472-6963-14-2
• Padilla, A. and Hobbs, A. (2013) Science and Technology Related Induction Needs in the House of Lords, London: Parliamentary Office of Science and Technology (POST).
• Parkhurst, J. (2017) The Politics of Evidence: From Evidence-based Policy to the Good Governance of Evidence, Abingdon: Routledge.
• Rose, D.C., Mukherjee, N., Simmons, B.I., Tew, E.R., Robertson, R.J. et al. (2017) Policy windows for the environment: tips for improving the uptake of scientific knowledge, Environmental Science and Policy, https://doi.org/10.1016/j.envsci.2017.07.013
• Russell, M. and Cowley, P. (2016) The policy power of the Westminster Parliament: the ‘parliamentary state’ and the empirical evidence, Governance, 29(1): 121–37. doi: 10.1111/gove.12149
• Science Advice for Policy by European Academies (SAPEA) (2019) Making sense of science for policy under conditions of complexity and uncertainty, https://www.sapea.info/topics/making-sense-of-science/
• Sutherland, W.J., Dicks, L.V., Ockendon, N. and Smith, R.K. (2017) What Works in Conservation, Cambridge: OpenBook Publishers.
• Talbot, C. and Talbot, C. (2014) Sir Humphrey and the Professors: What does Whitehall want from Academics?, Manchester: Policy@Manchester, University of Manchester, http://hummedia.manchester.ac.uk/faculty/policy/1008_Policy@Manchester_Senior_Civil_Servants_Survey_v4(1).pdf
• Thompson, L. (2016) Debunking the myths of Bill committees in the British House of Commons, Politics, 36(1): 36–48. doi: 10.1111/1467-9256.12094
• Tinch, R., Balian, E., Carss, D., De Blas, E., Geamana, N.A. et al. (2018) Science-policy interfaces for biodiversity: dynamic learning environments for successful impact, Biodiversity and Conservation, 27(7): 1679–702. doi: 10.1007/s10531-016-1155-1
• Turnpenny, J., Russel, D. and Rayner, T. (2013) The complexity of evidence for sustainable development policy: analysing the boundary work of the UK parliamentary environmental audit committee, Transactions of the Institute of British Geographers, 38: 586–98. doi: 10.1111/j.1475-5661.2012.00549.x
• Tyler, C. (2013) Scientific advice in parliament, in R. Doubleday and J. Wilsdon (eds) Future Directions for Scientific Advice in Whitehall, Cambridge/Sussex: University of Cambridge Centre for Science and Policy; Science Policy Research Unit (SPRU) and ESRC STEPS Centre at the University of Sussex; Alliance for Useful Evidence; Institute for Government; Sciencewise-ERC.
• Tyler, C. (2017) Wanted: academics wise to the needs of government, Nature, doi: 10.1038/d41586-017-07744-1
• White, H. (2016) Select Committees Under Scrutiny: The Impact of Parliamentary Committee Inquiries on Government, London: Institute for Government.
Figure 1: Venn diagram showing dominant definitions of ‘research’ and ‘evidence’ used in the UK Parliament
