Abstract

Background:

Government-funded knowledge brokering organisations (KBOs) are increasingly prevalent yet under-researched. Working in the space between knowledge and policy, yet framing themselves as distinct from think tanks and academic research centres, these organisations broker evidence into policy.

Aims and objectives:

This article examines how three organisations on different continents develop similar narratives and strategies to attempt to inform policymaking and build legitimacy.

Methods:

Using documentary analysis and semi-structured interviews, it shows how the organisations construct their credibility and legitimacy, and make sense of their emergence, activities and relationships with policymakers.

Findings:

The study responds to the lack of political focus in many existing studies, examining how KBOs make sense of their origins and roles, articulate notions of evidence, and mobilise different types of legitimacies in doing so. The research also addresses an empirical gap surrounding the emergence and activities of KBOs (as opposed to individuals), analysing organisations on three different continents.

Discussion and conclusions:

KBOs develop similar narratives of origins and functions, despite emerging in different contexts. Furthermore, they build their legitimacy/ies in similar ways. Our research improves our understanding of how a new ‘tool’ in the evidence-informed policymaking (EIPM) arsenal – KBOs – is being mobilised by different governments in similar ways.

Key messages

  • Government-funded KBOs are increasingly prevalent yet under-researched.

  • KBOs mobilise similar emergence narratives in different contexts.

  • KBOs build their credibility in shifting ways, tapping into legitimacies that hinge on their origins, contexts, tools and staff.

  • KBOs are a new EIPM tool that seems to be mobilised in similar ways by different governments.

Background

Evidence-informed policymaking (EIPM) has become a major concern for academics and policymakers since the 1990s. Considerable resources have been invested by governments to increase and organise the role and use of evidence in policymaking.

Within this trend, what we term Knowledge Brokering Organisations (KBOs) or evidence intermediaries (Gough et al, 2018) have emerged in the last decade or so, operating alongside traditional think tanks and academic policy institutes and brokering evidence into policy (Stone, 1996; Rich, 2004; Bell and Head, 2017). KBOs are distinguished by a combination of three criteria, some of which may be found in other bodies, but their combination is key to KBOs. First is KBOs’ articulation of evidence, which is central to their work, mission and practices; the centrality of evidence in their everyday work distinguishes them from other bodies. Second are the structures, relationships and practices set up by KBOs, including staff with multiple/boundary-spanning backgrounds and their knowledge-brokering tools. Third is their closeness to government, despite being separate from it. Existing research has examined knowledge brokering (Knight and Lightowler, 2010; Bandola-Gill and Lyall, 2017) with a particular focus on individuals (for example, Pielke Jr, 2007) rather than KBOs. Furthermore, the politics of knowledge brokering and of EIPM more broadly are often not acknowledged (MacKillop et al, 2020). Existing studies also examine knowledge brokering within single organisations rather than across organisations and countries (Hoeijmakers et al, 2013; Bednarek et al, 2016). This article’s contribution is to focus on how three KBOs – the Africa Centre for Evidence, the Mowat Centre in Ontario, Canada, and the Wales Centre for Public Policy – operating in different contexts on three different continents emerge and conduct their brokering work by mobilising different types of legitimacies to appeal to different audiences. We show how, as new policy actors, all three emphasise the need to establish and maintain their credibility, because their work exists within an inherently uncertain political context in which a KBO could be abolished by government at any moment.

The paper addresses three key research questions:

  1. Why do KBOs emerge?

  2. What roles are played by KBOs in informing policymaking and how do they build their credibility/legitimacy?

  3. What are the relationships at play between KBOs and policymakers?

There is a developing literature taking a critical approach to examining EIPM research (for example, Smith, 2013; Oliver et al, 2014a; Cairney, 2016; Parkhurst, 2017). Our study makes two contributions to this. First, responding to the lack of political focus in existing studies of knowledge brokering (Meyer and Kearnes, 2013; Oliver et al, 2014b), it examines how KBOs articulate their origins and role. It analyses the politics of KBOs by assessing how ideas of evidence and knowledge are mobilised in these organisations’ narratives and why, and how they build their legitimacy. Second, the study addresses an empirical gap surrounding the emergence and activities of knowledge brokering organisations (as opposed to individuals). It analyses three cases, including one from the less-studied African continent, and finds surprising similarities in origins, activities and relations with policymakers.

We begin by reviewing the literature on credibility/legitimacy, before outlining our research methods. We then present findings from our three cases, before setting out our conclusions and areas for future research.

Knowledge brokering and credibility

In this section, we examine how credibility and legitimacy are discussed in the literature on think tanks and other evidence intermediaries, to start problematising how KBOs construct their credibility in their work.

One way in which credibility is discussed concerns independence, usually understood as financial independence (Rich, 2004; McLevey, 2014; Abelson, 2018), with consequences for credibility when governments provide funding. The advocacy-evidence prism, developed to differentiate between partisan think tanks and ‘old guard’ university institutes (Stone, 1996), also examines credibility. The emergence of KBOs is framed by the same argument, with attempts to hybridise the ‘old guard’ academic think tanks into bodies tailored to synthesise evidence for use in policymaking. Medvetz (2007) adds to this body of work by examining how think tanks help to shift the political debate on certain issues. Rather than relying on a traditional, academically rooted definition of credibility for knowledge, they convince policymakers and the public through ideas and narratives. It is therefore important to examine the stories organisations tell in building their credibility.

Doberstein’s work on credibility takes a more positivist approach, mobilising randomised controlled trials (RCTs) to test whether different types of knowledge, specifically research from think tanks and academia, are seen as more or less credible by policymakers (Doberstein, 2017a; 2017b). In this vein, for others, credibility depends on a number of factors, such as how the evidence is presented, the logic of the argument, and cues such as authority and credentials (Landsbergen and Bozeman, 1987). However, how can researchers access these ‘real’ perceptions?

Another way of looking at credibility is to examine how it is constructed by KBOs in convincing or persuading others. Scientific authority has been eroded, meaning that what counts as credible varies. KBOs can appeal both to the scientific authority of academia – for example, systematic reviews and peer-review processes – and to think tank narratives – for example, of being close to policymakers and acting as brokers of evidence – mobilising different discursive tools in doing so. The growing use of certain words such as ‘evidence’ rather than ‘opinion’ or ‘ideas’, quantitative methods which are perceived by some as more scientific and objective in accessing ‘truth’, and linkages with academia – for example, hiring staff with PhDs or visiting fellows from academia – are tools mobilised by KBOs to confer credibility on their activities. Similar boundary work takes place according to the Boundary Organisation literature (Guston, 2001). In this context, what matters is to understand how credibility is defined and practised (Plehwe, 2014). Of particular interest are recent studies examining how organisations build legitimacy. Williams analyses research actors in the international development field, describing the process of gaining legitimacy as a constant and shifting one where actors ‘must inscribe knowledge, compile evidence and gain resources in balance with diverse audiences with competing interest and rules of the game’ (Williams, 2018: 54). These actors, like our KBOs, must satisfy different masters, including funders and stakeholders, at the same time as fulfilling their mission and building their legitimacy through ‘language and action’ (Williams, 2018: 55). Williams examines the cognitive (whether organisations pursue goals seen as proper), moral (looking at processes to maintain independence, integrity and transparency), and pragmatic (creating the right outputs, audience and impact) legitimacies of these research actors. In another study, Williams (2021) examines how brokerage is performed to gain credibility, notably how actors mobilise conceptual ‘distances’ drawn from the ‘world of ideas’ and the ‘world of policy and practice’. In their pursuit of legitimacy, policy-research actors play different positions in different ways – disciplinary vs un-disciplinary, complex specialist research vs direct digestible outputs, and slow rigorous research vs agile responsive analysis.

Bandola-Gill (2021) also looks at legitimacy by examining how experts such as those at the World Bank build their legitimacy between politics and expertise. She conceptualises this as ‘constructed via navigation between specific practices of knowledge production’ (Bandola-Gill, 2021: 1) balancing relevant but robust knowledge, constantly working the distance between experts and policymakers, and establishing clear institutional cultures of evidence of the organisation within which expert advice is given. Bandola-Gill observes how ‘the institutional setting in which knowledge is produced and disseminated is one of the central factors shaping the format of expert legitimacy’ but is least explored in the current legitimacy literature (Bandola-Gill, 2021: 7).

We help address these gaps by focusing on how specific organisations – KBOs – construct and mobilise that legitimacy.

Critically investigating knowledge brokering organisations

A focus on meaning-making can help to examine how these organisations articulate their existence, activities and relationships. Although we focus mainly on the voices of these organisations in this paper, these voices provide significant insight into the complex muddling together of multiple narratives, linked, for example, to both EIPM and the politics of evidence. Communication is important here, and it relies on how people speak about themselves, their actions and the world. Fischer and Forester (1993) stress the role of language in policymaking, where arguments and stories about the real world are attempts to make sense of it.

New spaces and entities often appear in a governance context without being subject to specific rules about how they are governed (Hajer and Wagenaar, 2003). Åm (2013) suggests focusing on organisations’ practices by conducting ‘an analysis of context-based case studies and to address specific questions: what were the conditions of possibility for a particular intermediary institution to emerge, what characterizes its practices, and how can we evaluate its success in this particular context?’ (Åm, 2013: 468).

Our paper follows a similar approach, interviewing members of these organisations to understand their meaning-making practices: how they make sense of their emergence, roles and relationships. These practices include the storytelling around seamless and cost-effective EIPM, the mobilisation of legitimacy from think tank and academic narratives, and how evidence is mobilised in shifting ways to fit the demands of brokers themselves and their policymaking clients.

Methods

It is necessary to look not only at the history of KBOs, but also at the context within which they emerged and work. This makes it possible to identify the meaning-making practices mobilised by different players in the policy community, including KBOs themselves, in legitimising the creation of these organisations, constructing their credibility, and forging relationships with policymakers.

We produced a long-list of KBOs in different countries. We looked for organisations not previously examined in the literature and which dealt with issues broader than healthcare and public health, as these dominate existing analyses. We narrowed our choice of potential cases using documentary analysis and, ultimately, according to which organisations responded to our requests for interviews. KBOs in three countries on three different continents were selected to analyse the politics of knowledge brokering: the Africa Centre for Evidence (ACE) at the University of Johannesburg, South Africa, the Mowat Centre in Ontario, Canada, and the Wales Centre for Public Policy (WCPP) in Wales, UK. Table 1 outlines the three organisations and how they compare according to a range of key criteria. These case studies were selected for three principal reasons. First, each body works across social policy areas, whereas most KBOs specialise in health or social care. Second, they are based on a demand-led approach, meaning that policymakers and other clients commission them for outputs such as reports, events or evidence syntheses. Third, the three bodies had emerged and worked in policy communities with different characteristics. For instance, Canada is a federal state and the Mowat Centre was set up by the Ontario government. In contrast, WCPP was set up in cooperation with the devolved Welsh government1, and ACE emerged from a UK Department for International Development (DFID) initiative in South Africa and now works with several governments and other decision makers across Africa. Overall, the selection of the case studies was based on whether they would help us to understand how KBOs emerge and work in different policy contexts and develop critical findings regarding these bodies (Flyvbjerg, 2006; Yin, 2009).

Table 1:

The three case study knowledge brokering organisations

Comparator* | Africa Centre for Evidence (ACE) | Mowat Centre | Wales Centre for Public Policy (WCPP)
Date of creation | 2016 (2014†) | 2010 (partly disbanded in 2019) | 2017 (2013§)
Organisational setup | University research institute | Independent public policy think tank | University research institute
Number of staff | 15 | 14 | 20
Policy area(s) | Development, health, women’s empowerment, labour, environment, agriculture, water, human settlements, child protection, and more | Social policy with two research hubs focusing on not-for-profit and the energy sector | Social policy areas with relevance to the Welsh Government
Audience | Decision makers in Africa: politicians, civil servants, practitioners | Ontario policymakers; Ontario public | Welsh Government Ministers and sub-national levels of decision making
Outputs | Evidence maps and syntheses (mostly but not only commissioned by government departments); evidenced capacities (capacity development); evidence communities (relationship building including via Africa Evidence Network); academic research | Ontario government commissioned reports presenting options and recommendations for policymakers based on evidence review; reports commissioned by other public, private or third sector bodies | Welsh Government commissioned reports presenting evidence review on a specific policy question;** events, for example, roundtables, expert panels; academic research
Aim | Reduce poverty and inequality in Africa | A better fiscal deal for Ontario; federal policy that serves Ontario; enhanced policy development capacity for Ontario government | Providing ministers, civil servants and public services with high quality evidence and independent advice that helps them improve policy decisions and outcomes
Example of impact | Ensuring national policy on human settlements was informed by evidence | Reduction in Ontario’s tax contribution to the Federation | Report on how proposed childcare tax credits would not have reached the expected social groups
Measurement of impact | Case studies relying on quantitative and qualitative data, for example, ‘stories of change’ | Impact methodology developed in 2015 blending qualitative and quantitative data; performance indicators related to three roles of Mowat, for example, publications, citations, access | Research Excellence Framework (REF) impact case study; impact methodology in development combining qualitative and quantitative data
Engagement with users | Evidence users are engaged in every stage of their work | Section in annual reports dedicated to ‘who is using our work’ | Evidence users are engaged in every stage of their work
Funding source(s) (2018) | Hewlett Foundation (39%); ESPA†† (32%); DFID (13%); Universities (10%); 3ie (3.4%); RSA government (1%); Other (1.6%) | Ontario government grant (35%); Grants and sponsorships (26.8%); Research contracts (14.4%); Training courses (1.4%); Events (0.5%); Cash reserves (21.8%) | Welsh Government (40%); ESRC‡‡ (40%); Cardiff University (30%)
Funding model | Ad hoc, contracts and longer-term funding | Grants, contracts, ad hoc | Five-year core grant and smaller ad hoc grants and contracts for specific projects
Budget (annual) | £534,740 (2018) | £1.63 million | £1.6 million

* Data are for 2017 except if specified otherwise.

Sources: (Gough et al, 2018; Lalande et al, 2019; websites’ data and interviews)

† UK Department for International Development (DFID) grant from 2014–2016.

§ The Public Policy Institute for Wales (PPIW) was set up in 2013 with Welsh Government funding only. The WCPP saw ESRC and Cardiff University become funders alongside the Welsh Government.

** WCPP works with Welsh Government (WG stream) and with public services (PS stream). This paper, in its empirical data, focuses on the Welsh Government stream.

†† Ecosystem Services for Poverty Alleviation programme, partly funded by DFID and ESRC.

‡‡ Economic and Social Research Council, an academic funding body in the UK.

We conducted ten semi-structured interviews with senior members of the three organisations, including directors and policy and research leads. These interviews were conducted between January and August 2019, either face-to-face or remotely. In 2020–2021, three more interviews were conducted with experts in the field of evidence and policy in the UK, Canada and the US to get alternative views on KBOs’ work and roles (see Table 2). All interviews were recorded and fully transcribed. Even though the case study organisations are identified, we agreed that participants would be anonymised to provide a confidential environment in which to share their views. Our interviewees included members of these organisations, which reflects our focus on how these organisations make sense of their roles and activities. Our questions focused on what they thought of the wider political mechanisms and agendas that KBOs are entwined in, and how they reconciled competing agendas and purposes, such as being seen as independent, yet being close enough to policymakers to influence policy (see Appendix 1 for topic guide).

Table 2:

List of interviewees (anonymised)

Code | Organisation | Country
ACE 1 | KBO | South Africa
ACE 2 | KBO | South Africa
Mowat 1 | KBO | Canada (Ontario)
Mowat 2 | KBO | Canada (Ontario)
Mowat 3 | KBO | Canada (Ontario)
Mowat 4 | KBO | Canada (Ontario)
WCPP 1 | KBO | UK (Wales)
WCPP 2 | KBO | UK (Wales)
WCPP 3 | KBO | UK (Wales)
WCPP 4 | KBO | UK (Wales)
UK 1 | External stakeholder | UK
UK 2 | External stakeholder | UK
Canada 1 | Higher Education Policy Institute | Canada

These questions allowed us to interrogate the workings and politics – that is, negotiation over meaning – of KBOs, how they mobilised their knowledge as credible, how their outputs were used and why, as well as their normative positions on the evidence-policy relationship.

Additionally, we examined documents produced by each KBO, including annual reports, key outputs and academic publications, other literature on these bodies, and government and newspaper publications mentioning the organisations. These documents helped to analyse further the official stories told by these organisations, and to contextualise and compare the different events mentioned by interviewees. This comparison allowed us to understand how the same event or anecdote is made sense of in different ways according to different narratives.

The data were reflected upon and discussed between the co-authors and key scholars in the fields of EIPM and policy studies, seeking to problematise and critically explain the phenomenon of KBOs in different countries. We used NVivo to create general codes or themes following the interview questions, before outlining more detailed codes such as the difficulty these organisations had in measuring their impact or how they built trust with policymakers. Inspired by an iterative approach which moves back and forth between empirics and theory in order to ensure that the data address the study’s objectives (Bassett, 2010), we continued to analyse and refine these codes, and our framework, to develop the best possible explanation, which we present in the following section. As critical researchers, we remained engaged with and reflective about our own positions as members of one of the organisations during the research process.

Findings

In this section, we discuss our evidence on the three research questions: (1) the stories told about their emergence; (2) their roles and activities and how they build their credibility; and (3) their relationships with policymakers.

Emergence of the knowledge brokering organisations

We begin by examining why and how these bodies emerged; the rationale for their creation and the key people involved in their setup; and their funding sources.

There has been a growth in the number of KBOs worldwide since the late 2000s. Their emergence has been articulated or framed by KBOs themselves and others in the policy community as solutions to a number of problems affecting policymaking and society. Three major sets of conditions were articulated across the case studies, all leading to the same solution of KBOs.

The first reason given for their emergence related to the perceived lack of capacity for advice within government. Interviewees talked about the decreasing “capacity internally” within government (Mowat 4) and the need for quick access to evidence. For WCPP, interviewees understood that government ministers felt that they would benefit from greater access to external expertise and evidence to complement internal policy advice. The Welsh Labour manifesto stated that: ‘In order to create greater critical mass in high quality, strategic public policy making and research. We will: Establish a pan-Wales public policy institute’ (Welsh Labour, 2011: 22). This was supported by other interviewees outside of these KBOs:

‘Wouldn’t it be great if I, as a policymaker, could just pick up the phone and speak to an academic and they could tell me exactly what I wanted that day. I think it is that kind of rationale that might help to explain the demand for those kinds of brokerage organisations’. (UK 1)

Second was the desire to create a “one-stop evidence shop” to provide flexibility and readily available outsourced evidence for policymakers. For ACE, it was the attraction of “draw[ing] on that expertise on the African continent” rather than from somewhere else (ACE 1), combined with “the pressure on resources” (ACE 2). For WCPP, they identified the need for the best available evidence or expertise worldwide: “A need for public services to be helped to think about independent authoritative evidence and expertise and gain access to them” (WCPP 4).

The third set of conditions related to the provision of credible and independent evidence for policymakers in an age of fake news and wicked policy issues. One interviewee talked about the atmosphere of “fake news… that is increasingly polarised and sceptical about the motivations of government” (Mowat 4). For ACE, it was the novelty of democracy which pushed for new evidence solutions: “I think there’s some really exciting and interesting stuff if you look across the world at some of the new emerging democracies and how they are formalising policy development processes and incorporating evidence into that process” (ACE 2).

The role played by key individuals, within and outside government, was also framed as being important in the creation of all three bodies. For instance, at Mowat, it “came down to the right individual being in the right place and the right time” and without him, the idea “probably wouldn’t have come to pass” (Mowat 4). This emergence was also linked to projects preceding the creation of these organisations, such as a UK Department for International Development (DFID) funded project leading to the introduction of ACE.

In this story of emergence, the question of funding was central. Even though they were all funded (in)directly by governments, our KBOs stressed their independence, calling on academic-type legitimacy tools such as rigour, their methodology, peer review and right to publish, to counterbalance any potential conflicts. As depicted in Table 1, the three bodies had different funding sources, from long-term grants from research bodies or charitable foundations, to a plethora of individually-funded projects such as with Google (for example, Mowat). Many of our respondents cited the difficult “balance to strike between what you want to do and what funding is available” with “sometimes, the funders steer[ing] the agenda” because of the increasingly short-term nature of funding (ACE 2). For Mowat, their neutrality was counterbalanced with being funded by particular parties in government: “The real background story to Mowat is it’s always been funded by liberal governments. We are neutral and bipartisan but our funding came initially from a liberal government” (Mowat 3).

On this funding question, our KBOs stressed how their potential lack of credibility regarding their funding was balanced by the necessity for their funders – especially governments using their evidence – that these KBOs continued to be seen as independent by the wider policy community. This illustrates how different levels of legitimacy were being counterbalanced by KBOs (Bandola-Gill, 2021).

The context of each country also determined how and why individual KBOs emerged. For instance, at ACE, there were specificities linked to the African context in why evidence mattered: “[T]here’s a difference between supporting evidence use in low, middle-income countries than there is in northern countries because the pressures… of poverty and inequality mean that the motivations for evidence use are much stronger and urgent” (ACE 2). Whereas in Ontario, it was argued that there was a dearth of good evidence producers: “The private sector is not involved as they should. The government is under pressure to cut expenditure. It’s not a big constituency, so that makes the landscape tough” (Mowat 4).

Overall, the interviewees told similar stories of lack of capacity within government, the need for flexibility in sourcing credible evidence, and a renewed EIPM narrative of the potential of evidence to resolve wicked policy problems. However, more than a homogeneous story of the rise of KBOs, our findings also document the importance of local conditions and specificities in how and why these bodies emerged (such as the role played by a key individual), being mobilised differently in local discourses to reinforce or renew established government powers and agendas. This latter point illustrates how these bodies are created to respond to specific policy and political needs and are the product of politics despite the omnipresent narrative of evidence.

Roles played by knowledge brokering organisations and types of legitimacy

Despite playing different roles and mobilising different practices, one strategy transcended all organisations: the need to build legitimacy. Similarly to Williams’ cases (Williams, 2018; 2021), the KBOs were in a constant process of constructing and recalibrating their legitimacy in balance with different audiences with competing interests. The KBOs were at times generating their own evidence via traditional academic research, compiling and synthesising evidence, advising decision makers and helping to formulate policy questions and solutions, and even advocating for particular interventions. For instance, regarding their identities, or cognitive legitimacy (Williams, 2018), KBOs were generally loosely defined, straddling the different worlds of policy and research. Yet simultaneously, they all aimed to deploy a clear and simple organisational identity such as key messages, missions and theories of change, which built on more corporate/managerial type of legitimacies.

Many of the strategies deployed by KBOs in articulating their credibility mobilised elements from academic and think tank narratives. Interviewees articulated the academic background of their staff (WCPP 4); the composition of their Advisory Board (Mowat 2); the types of evidence-linked activities that they undertook such as evidence syntheses, evidence reviews and capacity building (ACE 1 and 2); their peer-reviewed research activities which emphasised their academic credibility (WCPP 3); and their rigorous methodology which, simultaneously, chimed with academic canons, but was relevant and timely: “There’s a whole lot of methodological geeky stuff where academics define rigour on those criteria and the line I use is, ‘If your beautiful systematic review, or whatever the research report is, is one hour late for a government policy decision-making committee meeting, it’s not rigorous’” (ACE 2).

In relation to the think-tank narrative, interviewees articulated their closeness to government, illustrating how connected they were to what policymakers wanted and needed: “[W]ith the Welsh Government, we are part of the landscape… We talk regularly to Ministers, and that access is all pretty straightforward” (WCPP 1).

This closeness was, for instance, mobilised by an Ontario interviewee as equivalent to trust and their funder not finding them threatening: “That wasn’t because we did what they said, or produced evidence that they liked, but they knew we weren’t trying to screw them over or embarrass them” (Mowat 2). This relationship with government was articulated as coproduction by some respondents: “So our focus, and a lot of what you will read about us, focuses on relationships and it focuses on coproduction, and the way we externally relate to our stakeholders” (ACE 1).

In constructing their pragmatic legitimacy (for example, outcomes such as outputs, audiences and impact), the three KBOs blended different types of legitimacies in managing their outcomes, combining evidence reviews, university assessment of research (for example, Research Excellence Framework impact case studies for WCPP), and academic publications/conferences, with policy briefings, executive summaries, roundtables, and informal meetings with decision makers. These activities allowed them to tap into different types of legitimacies, keeping in mind the balance within the organisation itself. These documents generally also established clear rules of engagement with stakeholders and clients such as right to publish, the types of evidence valued, and how they collated and presented evidence.

Because of the two types of narratives – academic and think-tank – and credibilities being articulated, many respondents spent time comparing themselves to other organisations. They discussed how they competed or collaborated with others. This competition-collaboration discussion allowed KBOs to differentiate themselves from both academia and think tanks, as illustrated here: “I think we’re unique in that regard. We’re different from think tanks and we’re different from academic centres” (WCPP 2).

One of our academic experts discussed the careful balancing involved in building these organisations’ credibility, which went beyond academic vs think tank, to include credibility as being part of a network:

‘When you talk to people about why they trust evidence, they talk about the credibility of the brokers or of the source – and/or the source of that information, so we know how important it is. For academics, a lot of what we trade on in terms of credibility is our academic base, so it’s our institutional home.… Credibility comes from the fact that people know people who know you, or know of the work you do…’. (UK 2)

In summary, a number of strategies and tools were mobilised by the bodies in negotiating and constructing their credibility, often involving the articulation of evidence and academic and think-tank signifiers and strategies to illustrate the quality of their work and their closeness to policymakers. Rather than this closeness to government being seen by our participants as in contrast with their independence, interviewees emphasised how they could have both, “dancing the dance” (Mowat) and “walking the tight rope” between independence and influence (Phipps and Morton, 2013). This ambivalent credibility also illustrated the constant renegotiation and reworking of these KBOs’ roles and activities, constantly evolving according to policy and political demands, and reworking or ‘blurring’ the legitimacies they were tapping into: “I made some active choices in those conversations, to guide those conversations to a particular place. I think that that’s the right outcome, but arguably I’m advocating for playing an honest broker role. So some of these things break down and blur” (WCPP 1).

In some cases, even seemingly clear-cut responses about their role were ambiguous and illustrated the mixing of different legitimacies – for example, think-tank and academic – and different tools – for example, evidence versus recommendations: “I think [we’re an] honest broker. We tended not to do advocacy, obviously if you write a paper and there are recommendations connected to it there’s a soft, or implicit, advocacy” (Mowat 2). We now discuss relationships with policymakers.

Relationships with policymakers and consequences for policy

In this final section, we analyse how KBOs relate to policymakers, notably regarding how they construct the impact of their work on policy. We also discuss broader questions of the role that evidence plays in policymaking, and how KBOs perceive policymakers to understand and use evidence in their everyday practice.

In addition to the closeness versus independence dichotomy already discussed, KBOs emphasised the freedom and agency they enjoyed while also recognising the imperatives of meeting the needs of their policy ‘clients’ and being relevant to them: “[W]e had a lot of freedom and the processes around were extremely informal. There was no written mandate. We didn’t get a letter every year saying, ‘Here’s what we think your priorities should be’” (Mowat 4). “[I]n practice, I don’t think I’ve ever felt constrained because we get so much influence over what it is that we work on. I think what it does is to rule out some things” (WCPP 4).

The KBOs dealt with potential moral credibility issues by stressing their (real or rhetorical) agency, for instance with their decision to say ‘no’ to certain policymaking demands: “The kinds of questions we get asked have changed, and our approach to picking up or not particular questions, or reframing questions that we do decide to pick up, has changed” (WCPP 1).

‘I think with [particular project], we had a funder that really wanted us to take a different direction with the paper and we didn’t. We convinced them to let us do something else but there was some pushback. You need to have the credibility with them too to say no’. (Mowat 3)

This freedom stood in tension with savvy KBO strategies, such as not “deal[ing] with public policy issues that were provincial in jurisdiction and were currently heavily contested” (Mowat 2). Above all, they focused on providing knowledge relevant to what decision makers wanted, keeping in mind the “chances of them renewing funding” (Mowat 4). Others made similar points: “[T]he way that we present our work is that it always needs to be responsive to what people want” (ACE 1); or even being relevant to the point of policymakers anticipating KBOs’ answers: “[I]s there sufficient challenge around the table to Welsh Government, or are we just providing them with answers that they’re sort of already expecting?” (WCPP 3).

Thus, KBOs’ legitimacy needs to be carefully curated: they must be seen as useful to government – providing relevant evidence – but also not to ‘rock the boat’ for fear of reduced funding or even abolition. Similarly, this legitimacy depends on other stakeholders in the research community who, in contrast, would not always see usefulness to government as constituting legitimacy. Thus, the objects and processes of legitimacy that these KBOs must address are different, involving complexity and contradictions.

KBOs’ views on the best ways of informing policymaking were among the most discussed topics in our interviews. This clearly illustrates the importance of impact for these bodies in the construction of their image and that of their ‘right’ outputs (Williams, 2018). Among their top tips were ways of building trusted relationships with policymakers, connections and networks, relevance and thoroughness, leadership, and communication; all factors present in the literature on the enablers of EIPM (Oliver et al, 2014b). What was novel, however, emerged when interviewees were asked to discuss their organisation’s most impactful case. Across the examples provided, what transpired as most important about KBOs’ role was sense-making: clarifying research questions, highlighting new solutions to messy and complicated policy issues, and depoliticising a situation: “Of course, it’s not always a policy announcement or a funding announcement. It’s just improved their knowledge when they go into conversations or discussions with others” (WCPP 2). “Some examples are where our knowledge brokerage has shaped how decision makers are talking and engaging with evidence” (ACE 2).

Illustrating with the example of Mowat’s work being used by the Ontario Government to contest a Federal change in taxation, a Mowat interviewee explained: “So why were we successful? Because we depoliticised this issue and explained it” (Mowat 4). This more informal impact is far from the clear and usually quantifiable examples of impact that knowledge brokers usually discuss (the three KBOs also took part in traditional, more quantified evaluations of their impact, with annual and mid-term reviews, external evaluations, altmetrics, and stories of impact (Bandola-Gill and Smith, 2021)). In fact, this informal impact can be pictured as the blend of their multiple sources of legitimacies – academic rigour, knowledge brokering networks, evidence, trust and closeness to government. These findings point to the value of ‘informal brokering’ – that is, brokering achieved by meeting and negotiating with policymakers – alongside the more formal brokering roles. They also speak to how the credibility of KBOs becomes articulated in particular contexts, to fit desired policy solutions. This illustrates the need for these bodies to develop new ways of demonstrating their impact which take account of these subtleties, using stories of impact and in-depth qualitative data. Furthermore, this impact must be viewed in the wider context within which KBOs exist, one dominated by policy and politics, where their impact will be limited and/or determined by what policymakers do with the evidence.

Finally, we are also interested in understanding how policymakers view and use evidence and KBOs, leading interviewees to discuss how they thought their research was sometimes used by policymakers. An interviewee framed the mobilisation of KBOs’ outputs by policymakers as potential “ammunition”, more so for civil servants than politicians: “[M]uch more on the coaching side of the public servants, like, ‘Okay, well, we understand your objective. Here are good ways of getting to your objective and here’s the evidence that shows that we’re not just making this up and we’re not going to lead you astray’” (Mowat 4). Linked to this was the role that evidence ought to play in policymaking, illustrating the politicisation of evidence in a policymaking context:

‘[E]vidence is always going to be contested, evidence is always going to be contextualised.’ (Mowat 2)

‘It ought to be part of the process, but I don’t think it will ever drive the process. What’s going to drive policymaking is going to be politics.’ (Canada 1)

Respondents with academic backgrounds (often in health) were more inclined towards an EIPM vision. For example, one interviewee argued that evidence “is absolutely essential to any policy and to decision making more broadly” (ACE 1), with another asking to “see policymaking, especially in these very contentious values-based areas [for example, climate change], be treated as something where it’s not optional, and that it really is used to guide our decision making” (Mowat 1).

Discussion and conclusions

With the growing influence of KBOs and increased investment in evidence and knowledge transfer, it is important to examine and critically assess how and why these phenomena are playing out and how they are linked to broader issues of technocratisation, government by experts, and depoliticisation (Fischer, 1990; Wood, 2015; Griggs et al, 2017). KBOs may even present a democratic problem, depending on why they were created and how they work – that is, on the politics of KBOs. With our three case studies in different countries, combined with additional data gathered from experts, we have analysed how a new manifestation of EIPM is being mobilised in similar ways in different political and socioeconomic contexts.

Despite emerging and working in different contexts, the three KBOs resembled each other in many ways, illustrating the almost hegemonic position of EIPM in governing evidence. The comparisons between them highlighted how KBOs develop related narratives of origins and functions, and play comparable roles in the policy process, whether that is in South Africa, Canada or Wales. Moreover, they all build their credibility and construct their impact stories similarly. Indeed, despite the different local conditions within which these bodies exist, how they discussed their emergence and functions drew on similar narratives rather than emphasising how unique their existence was. Our research improves understanding of how one new tool in the EIPM arsenal – KBOs – is being mobilised by different governments in similar ways and with similar tools.

On why and how KBOs emerge, interviewees discussed gaps in the current provision of evidence within government and the demand for an improvement in how policymakers can draw upon existing evidence. However, our analysis also highlighted how context matters, notably local histories, structures and policy communities, with particular takes on evidence and its role in policy. KBOs played diverse roles and undertook different activities across countries, depending on how they were set up and the role of individual leaders.

Answering our second research question, we have demonstrated how the role of these KBOs is a constant ‘work-in-progress’, drawing on different types of legitimacies, and balancing multiple activities, be that as evidence generator, evidence evaluator and broker, policy adviser, and advocate. We document and highlight how carefully the work of these KBOs must be curated to be ‘effective’: KBOs must be seen as independent and rigorous, yet useful to governments who usually [in]directly fund them. They must provide robust yet relevant, timely, and easily-digestible evidence (Bandola-Gill, 2021), but also not ‘rock the boat’ or run the risk of being defunded or abolished. The same conundrums exist within the research world, with any excessive closeness to one stakeholder or client potentially resulting in being delegitimised in the eyes of others. Adding to the research on credibility and legitimacy, and wider knowledge brokering research, we show how these KBOs go beyond articulating their legitimacies with different audiences. Indeed, the KBOs that we studied also constantly renegotiated what they did and how, for example, doing research, evidence synthesis, policy advice and even advocacy (thus going beyond the roles studied by other legitimacy work in this field), and so adapting the legitimacy narratives that they articulated. KBOs present a new setup and highlight how KBOs’ legitimacies and activities are almost limitless, changing according to demand, context and opportunities. The arsenal of tools and roles means that they constantly renegotiate their legitimacy, and by extension what kind of organisation they are, further blurring the boundaries between knowledge and policy. We also bring a focus on legitimacy from an organisational perspective, examining the context within which activities and legitimacies are constructed – an under-explored topic in the literature (Bandola-Gill, 2021: 7). The fact that these KBOs tended to do more and more, and develop new types of outputs, made their allegiances difficult to unpick, something which could cause difficulties for external observers examining their accountabilities.

It is significant that one of our three case studies (Mowat) was abolished during the course of our fieldwork by a new government. The interviewees from this KBO were consistent in promoting a narrative of neutrality, producing policy-relevant research that fed into the policymaking process, and having a constructive relationship with their government funder. Similar debates emerged in the WCPP case from opposition parties at the beginning of the electoral campaign for a new Welsh Parliament (Nation Cymru, 2020). These examples highlight the difficult work of these organisations in remaining close enough to influence policy, yet maintaining or cultivating an image of independence and objectivity. Ultimately, the Mowat case shows that, as the work of any government-funded KBO is not a permanent feature of the policymaking process, its existence will be partly dependent upon the changing political context within which it works. The balancing act and tension between their declared independence from governments and their simultaneous proximity was evident across these organisations which are funded to different levels by government. They must maintain their image of independence, as well as constantly emphasise their impact on and closeness to policy in order to benefit from future government funding.

Finally, we explored how KBOs relate to policymakers and the consequences for informing policymaking. Our findings reveal the complex power relationships with policymakers exhibited by the Mowat case and a familiar list of enablers of EIPM such as trust, leadership and good communications. The ways in which KBOs were able to influence policy were, however, much more subtle and informal than what the EIPM literature usually advocates, including shaping how policymakers talk about evidence and the value of depoliticising an issue. KBOs were especially sought after for their informal style, experiential and tacit knowledge, based on their staff experiences and institutional history and memory. This moved the idea of these bodies doing basic evidence brokering to a different and more complex understanding.

We recognise that our research has limitations. For example, our interviewees were predominantly from the three KBOs. There would be significant value in gathering data from other actors (both civil servants and politicians) on these bodies’ roles and impact. We intend to pursue questions such as what evidence means for these actors and whether they value the role of KBOs in the policymaking process in future research. Other concepts and frameworks could also produce useful findings for understanding knowledge brokering, notably the concept of policy entrepreneurs.

To conclude, our approach has allowed us, first, to tackle the taken-for-granted status that evidence frequently occupies in policymaking, highlighting the politics of how knowledge and evidence, and notably KBOs, are mobilised and understood. Second, the research has contributed an in-depth analysis of a new instance of EIPM – KBOs – which is the subject of increased government spending and practitioner interest, with new organisations being set up. Whereas most of what we know about these bodies tends to be produced by the bodies themselves, we have endeavoured to contribute a more methodologically robust analysis which will be of interest to academics, knowledge brokering practitioners and policymakers.

Note

1 WCPP provides evidence for Welsh Government as well as public services, the latter including Welsh local government, health, education and other organisations providing public services. For this paper, however, we focus on WCPP’s work with the Welsh Government only, to be comparable to the two other cases.

Funding

This work was supported by the Welsh Government, the Economic and Social Research Council (ESRC) and Cardiff University who fund the Wales Centre for Public Policy (ES/R00384X/1).

Acknowledgements

We would like to thank the editor and reviewers for their thorough and helpful comments and suggestions. We would also like to thank colleagues at the Wales Centre for Public Policy for commenting on various drafts of this paper, notably Andrew Connell, Hannah Durrant, and Steve Martin. Finally, we would like to thank all the interview participants for their insights and finding time to speak to us.

Contributor statement

EM wrote the first draft with comments from JD, and EM and JD wrote the subsequent drafts. EM and JD conceptualised the study. EM conducted most of the interviews, with JD conducting some of them with EM.

Research ethics statement

This research project obtained ethical approval from the Cardiff University Business School Ethics Committee on 12 February 2019.

Conflict of interest

The authors declare that there is no conflict of interest.

References

  • Abelson, D.E. (2018) Do Think Tanks Matter?, 3rd edn, Toronto: McGill-Queen’s University Press.

  • Åm, H. (2013) ‘Don’t make nanotechnology sexy, ensure its benefits, and be neutral’: studying the logics of new intermediary institutions in ambiguous governance contexts, Science and Public Policy, 40(4): 466–78.

  • Bandola-Gill, J. (2021) The legitimacy of experts in policy: navigating technocratic and political accountability in the case of global poverty governance, Evidence & Policy, 17(4): 615–33. doi: 10.1332/174426420X16000980489195

  • Bandola-Gill, J. and Lyall, C. (2017) Knowledge brokers and policy advice in policy formulation, in M. Howlett and I. Mukherjee (eds) Handbook of Policy Formulation, Cheltenham: Edward Elgar, pp 249–264.

  • Bandola-Gill, J. and Smith, K.E. (2021) Governing by narratives: REF impact case studies and restrictive storytelling in performance measurement, Studies in Higher Education. doi: 10.1080/03075079.2021.1978965

  • Bassett, R. (2010) Iterative, in A.J. Mills, G. Durepos and E. Weibe (eds) Encyclopedia of Case Study Research, London: Sage.

  • Bednarek, A.T., Shouse, B., Hudson, C.G. and Goldburg, R. (2016) Science-policy intermediaries from a practitioner’s perspective: the Lenfest Ocean Program experience, Science and Public Policy, 43(2): 291–300. doi: 10.1093/scipol/scv008

  • Bell, J. and Head, B.W. (2017) Knowledge mobilisation intermediaries operating at the research-policy-practice nexus in Australia, Developing Practice: The Child, Youth and Family Work Journal, 48: 8–29.

  • Cairney, P. (2016) The Politics of Evidence-Based Policy Making, London: Palgrave Macmillan.

  • Doberstein, C. (2017a) The credibility chasm in policy research from academics, think tanks, and advocacy organizations, Canadian Public Policy, 43(4): 363–75. doi: 10.3138/cpp.2016-067

  • Doberstein, C. (2017b) Whom do bureaucrats believe? A randomized controlled experiment testing perceptions of credibility of policy research, Policy Studies Journal, 45(2): 384–405. doi: 10.1111/psj.12166

  • Fischer, F. (1990) Technocracy and the Politics of Expertise, Newbury Park, CA: Sage.

  • Fischer, F. and Forester, J. (1993) The Argumentative Turn in Policy Analysis and Planning, Durham, NC: Duke University Press.

  • Flyvbjerg, B. (2006) Five misunderstandings about case-study research, Qualitative Inquiry, 12(2): 219–45. doi: 10.1177/1077800405284363

  • Gough, D., Maidment, C. and Sharples, J. (2018) UK What Works Centres: Aims, Methods and Contexts, London: EPPI Centre.

  • Griggs, S., Howarth, D. and MacKillop, E. (2017) The meta-governance of austerity, localism, and practices of depoliticization, in Anti-Politics, Depoliticization, and Governance, Oxford: Oxford University Press.

  • Guston, D.H. (2001) Boundary organizations in environmental policy and science: an introduction, Science, Technology, & Human Values, 26(4): 399–408.

  • Hajer, M. and Wagenaar, H. (2003) Deliberative Policy Analysis, Cambridge: Cambridge University Press.

  • Hoeijmakers, M., Harting, J. and Jansen, M. (2013) Academic collaborative centre Limburg: a platform for knowledge transfer and exchange in public health policy, research and practice?, Health Policy, 111(2): 175–83. doi: 10.1016/j.healthpol.2013.04.004

  • Knight, C. and Lightowler, C. (2010) Reflections of ‘knowledge exchange professionals’ in the social sciences: emerging opportunities and challenges for university-based knowledge brokers, Evidence & Policy, 6(4): 543–56. doi: 10.1332/174426410X535891

  • Lalande, L., Cave, J. and Jog, A. (2019) Committing to Action: Next Steps for Canada’s Evidence Ecosystem, Toronto: Mowat Centre.

  • Landsbergen, D. and Bozeman, B. (1987) Credibility logic and policy analysis: is there rationality without science?, Science Communication, 8(4): 625–48.

  • MacKillop, E., Quarmby, S., Downe, J. and Martin, S. (2020) What is knowledge brokering and its implications for policy-making and its study, Policy and Politics, 48(2): 335–53. doi: 10.1332/030557319X15740848311069

  • McLevey, J. (2014) Think tanks, funding, and the politics of policy knowledge in Canada, Canadian Review of Sociology, 51(1): 54–75. doi: 10.1111/cars.12033

  • Medvetz, T.M. (2007) Think Tanks and the Production of Policy-Knowledge in America, PhD thesis, Berkeley, CA: University of California.

  • Meyer, M. and Kearnes, M. (2013) Introduction to special section: intermediaries between science, policy and the market, Science and Public Policy, 40(4): 423–29. doi: 10.1093/scipol/sct051

  • Nation Cymru (2020) Why Darren Millar’s plan to cull the quangos may not lead to better government in Wales, https://nation.cymru/opinion/why-darren-millars-plan-to-cull-the-quangos-may-not-lead-to-better-government-in-wales/.

  • Oliver, K., Lorenc, T. and Innvaer, S. (2014a) New directions in evidence-based policy research: a critical analysis of the literature, Health Research Policy and Systems, 12: 34.

  • Oliver, K., Innvaer, S., Lorenc, T., Woodman, J. and Thomas, J. (2014b) A systematic review of barriers to and facilitators of the use of evidence by policymakers, BMC Health Services Research, 14(1): 2. doi: 10.1186/1472-6963-14-2

  • Parkhurst, J.O. (2017) The Politics of Evidence: From Evidence-Based Policy to the Good Governance of Evidence, London: Routledge.

  • Phipps, D. and Morton, S. (2013) Qualities of knowledge brokers: reflections from practice, Evidence & Policy, 9(2): 255–65.

  • Pielke Jr, R.A. (2007) The Honest Broker, Cambridge: Cambridge University Press.

  • Plehwe, D. (2014) Think tank networks and the knowledge–interest nexus: the case of climate change, Critical Policy Studies, 8(1): 101–115. doi: 10.1080/19460171.2014.883859

  • Rich, A. (2004) Think Tanks, Public Policy, and the Politics of Expertise, Cambridge: Cambridge University Press.

  • Smith, K. (2013) Beyond Evidence-Based Policy-Making in Public Health, Basingstoke: Palgrave Macmillan.

  • Stone, D. (1996) Capturing the Political Imagination: Think Tanks and the Policy Process, London: Frank Cass.

  • Welsh Labour (2011) Standing up for Wales, Cardiff: Welsh Labour.

  • Williams, K. (2018) Three strategies for attaining legitimacy in policy knowledge: coherence in identity, process and outcome, Public Administration, 96(1): 53–69. doi: 10.1111/padm.12385

  • Williams, K. (2021) Credibility in policy expertise: the function of boundaries between research and policy, Policy Studies Journal, 49(1): 37–66. doi: 10.1111/psj.12342

  • Wood, M. (2015) Depoliticisation, resilience and the Herceptin post-code lottery crisis: holding back the tide, British Journal of Politics & International Relations, 17(4): 644–64.

  • Yin, R. (2009) Case Study Research: Design and Methods, London: Sage.

Appendix

Knowledge broker organisations: a comparative study of practices to inform policymaking

Questions for interviews in knowledge broker organisations

  1. How would you describe what your organisation does? Policy areas?

  2. Could you tell us a little bit about the origins of the organisation?

    • a. Why was it created? A specific event or report?

    • b. Support from where?

    • c. Key people behind it?

    • d. Government project?

    • e. Are there further people you would recommend us to speak to?

  3. What are your main sources of funding? Do the funding arrangements (for example, single origin, short-term, multiple funders) influence the work you want to/can undertake?

  4. How do you think your structural and practical setup differs from other bodies in the EBPM sphere?

  5. Do you have any competitors? If so, how do you manage that competition?

  6. How do you present your evidence to government and other stakeholders? That is, do you promote and campaign for your reports to become policy or do you let policymakers get on with the report you submit to them? It is often not clear-cut but where do you think you are on the scale? (for example, Pielke’s advocate, honest broker, science adviser, pure scientist)

  7. What would you say are the main activities undertaken by your organisation?

  8. Does the organisation play different roles in influencing/informing policymaking?

  9. How do you influence or inform policymaking?

  10. What would you say are the best ways of informing/influencing policymaking?

  11. How important is the role played by an individual knowledge broker?

  12. What would you say are the main outputs produced by your organisation?

  13. What has been your biggest impact on policymaking? Why? (for example, context, window, support, good evidence, chance)

  14. How do you measure the impact of your work? Can you measure your influence?

  15. In your experience, how do policymakers understand and/or use evidence in their work? In what ways could this be improved?

  16. What role do you think evidence/knowledge should play in policymaking?

  17. What is a typical context/relationship with government? Other stakeholders?

  18. Why do you think there has been such a growth in the number of bodies providing evidence-based information or advice? (over 30 university-based bodies in the UK) How do these differ from think tanks?

  19. How do you deal with issues of accountability and independence raised by working with policymakers?