Building the concept of research impact literacy

  • 1 University of Lincoln, UK
  • 2 York University, Canada


Abstract

Impact is an increasingly significant part of academia internationally, both in centralised assessment processes (for example, UK) and funder drives towards knowledge mobilisation (for example, Canada). However, narrowly focused measurement-centric approaches can encourage short-termism, and assessment paradigms can overlook the scale of effort needed to convert research into effect. With no ‘one size fits all’ template possible for impact, it is essential that the ability to comprehend and critically assess impact is strengthened within the research sector. In this paper we reflect on these challenges and offer the concept of impact literacy as a means to support impact at both individual and institutional levels. Opportunities to improve impact literacy are also discussed.

Background and previous work

With research impact being progressively and formally woven into the academic landscape, it is ever more important that impact is comprehensively understood. In this paper we reflect first on a selection of challenges to creating and reporting on impact, following which we offer the concept of impact literacy as a means to support the emerging practice of research impact at individual and institutional levels sector-wide.

Within the UK, research impact has been substantially driven through its introduction to both arms of the dual funding system (Hughes et al, 2013). For the 2014 Research Excellence Framework assessment (REF; see www.ref.ac.uk for details and results), case studies on the social, economic, environmental and cultural benefits determined 20% of mainstream government quality-related (QR) research funding (see www.hefce.ac.uk/rsrch/funding/mainstream). Impact also features strongly in the parallel competitive funding arm, with Research Councils UK (RCUK) grants requiring strong ‘Pathways to impact’ statements on the planned benefits of specific projects. Internationally there are varying practices and formal requirements for impact and knowledge mobilisation. For example, in contrast to the UK, Canada does not have a centralised system of research impact assessment. It is instead driven primarily by funders’ requirements; most Canadian academic research funding agencies require a strategy for knowledge mobilisation (in the social sciences and humanities; www.sshrc-crsh.gc.ca), knowledge translation (in health; http://www.cihr-irsc.gc.ca/e/193.html) and commercialisation (in natural sciences and engineering; www.nserc-crsng.gc.ca). Thus, while both countries align on the value of strategising impact from the funding stage, the prominence (UK) versus absence (Canada) of centralised assessment differentially prioritises, incentivises and legitimates the course of impact in practice.

A key first challenge to impact is that, while research evidence can be mobilised to support end beneficiaries (Nutley et al, 2007; Morton, 2015a; Phipps et al, 2016), impact fundamentally precludes templating. Analysis of the 2014 REF results (King’s College London and Digital Science, 2015) confirms not only that there is no singular pathway to impact, but that, of the 6,647 submitted impact cases, 3,709 impact pathways were unique. Whilst commercialisation and technology transfer have become well-established practices globally since the passage of the US Bayh-Dole Act in 1980, such unidirectional approaches cannot be simply replicated in non-commercial, socially complex settings (Greenhalgh and Wieringa, 2011). Thus, impact relies on the development of pathways tailored to the precise study and social context, rather than drawing on models traditionally applied to economic effects.

A second concern is that measurement-centric approaches both (1) legitimise reductionist definitions of impact, and (2) arrogate governance over the endpoint of the research process. The Higher Education Funding Council for England (HEFCE) defines impact as ‘an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia’ (REF, 2011). However, this definition and the associated guidelines narrow the nature of eligible impact to only those meeting specific timeframe, research quality and project-relatedness criteria, discounting benefits to the academy, impacts not arising directly from demonstrably ‘excellent’ research, and those arising from broader university activities. This is compounded by the process of institutional case study selection, through which subjectively filtering out ‘poorer’ impact results in a disproportionately positive corpus of impact knowledge (see http://impact.ref.ac.uk/CaseStudies). RCUK definitions of impact (see www.rcuk.ac.uk/innovation/impacts) mirror – but are slightly broader than – those for REF, and include benefits within the scientific community. REF also stipulates precise measurement periods for the impact of research, broadly amounting to impact occurring in the seven years between assessments, generated by research stemming back over the previous 20 years. Impact, however, can take many years to achieve (Hughes and Martin, 2012), and reflections on the REF 2014 have thus raised concerns over short-termism (Stern and Nurse, 2014) and the (ir)responsible use of metrics (Wilsdon et al, 2015). With highly rated (4*) case studies being worth an average of £46,311 in REF 2014 (Reed and Kerridge, 2017), financial incentives are likely to continue to drive behaviour in the UK. Impact delivery thus relies on individuals not only to generate and monitor effects, but to navigate strategically mediated definitions of what does, and does not, count.

Thirdly, as impacts do not usually occur serendipitously, assessment paradigms can overlook the scale of effort needed to convert knowledge into real-world effects and the active role of users. The implications of this are threefold. First, impact functions (for example, planning, communication, external engagement, evidence gathering) are often diffused across many individuals, requiring multiple – potentially untrained – agents to understand impact and how their roles align. This multiple agency risks poor coordination of methods, disparate end-goals and duplication of effort within institutions. Second, overlooking these skills and efforts can leave studies under-resourced for impact, lead to ineffective articulation of ‘what works’, and leave impact-related staff with no routes for professional development (Lightowler and Knight, 2013). Third, assessment-driven approaches centralise the academic and overlook the active role of users in ‘pulling’ research (Brown, 2012). Thus, impact relies on the often unaligned efforts of individuals, who must execute the appropriate skills, meet a measurement agenda and understand barriers and facilitators to user uptake.

Towards a more effective approach: the call for impact literacy

These challenges present a series of risks to impact delivery and reporting: non-prescriptive routes to impact preclude templating and require individuals to judge how best to mobilise knowledge; measurement-centric agendas coupled with narrow definitions require individuals to judge which (of many potential effects) are realistic, achievable and demonstrable; dispersed job functions risk missed opportunities and require individuals to develop and apply skills in isolation. While much of this stems from the UK’s assessment culture, process-focused countries such as Canada are not immune. Difficulty integrating impact into the research process, the challenge of building staff capacity, and the complexity of articulating project-specific impact resonate across contexts. These factors, in combination with a breadth of other challenges inherent to a complex and changing social environment (for example, financial pressures, regulatory changes, political topicality), underscore the importance of supporting the growing community of practice to navigate these issues. For the purposes of this paper we will use the umbrella term practitioners of research impact (PRIs) to reflect all those who work, individually or in teams, to support the translation of research into impact. This includes, but is not restricted to, academic researchers (who may or may not also hold practitioner/teaching roles), impact officers, knowledge brokers, public engagement professionals, research support staff, and all those whose work aligns with realising non-academic benefits of research. Unless sufficient attention is paid to strengthening core comprehension of impact (primarily within PRIs), the sector will over-stretch, under-deliver and fail to build sufficient impact capacity.

Here therefore we present and advocate the concept of impact literacy as a central principle of impact practice. In summary, where PRIs are impact literate, they are able to identify appropriate impact goals and indicators, critically appraise and optimise impact pathways, and reflect on the skills needed to tailor approaches across contexts. This concept is derived from both authors’ extensive experience of supporting impact (UK) and knowledge mobilisation (Canada).

Impact literacy is conceptualised as the intersection of three elements of research impact:

  1. The practices that create impact (‘how’)

  2. The identification, assessment, evidencing and articulation of impact endpoints (‘what’)

  3. The successful integration of these by PRIs (‘who’)

The ‘how’

The practice of research impact assessment (the ‘what’) is inextricably linked to the methods and means of creating research impact (the ‘how’). A review of systematic reviews of the literature on methods for creating impacts of research showed that multifaceted methods are more effective than individual interventions (Boaz et al, 2011). These methods fall into two broad categories: (1) dissemination or transfer methods; and (2) co-production or engaged methods. The Canadian Institutes of Health Research describes these as ‘end of grant’ and ‘integrated’ respectively (www.cihr-irsc.gc.ca/e/45321.html), indicating that practices can occur after the research has concluded or throughout the research process, including upstream to inform the research agenda through stakeholder engagement, as has been described in disability research (Camden et al, 2014). There is general agreement that integrated methods are more effective than end of grant methods (Gagnon, 2011; Ross et al, 2003). Indeed, van de Ven and Johnson (2006) and more recently Bowen and Graham (2013) have framed the knowledge to action gap as a problem not of knowledge transfer (that is, end of grant dissemination) but of knowledge production (that is, integrated or engaged scholarship). Drawing on evaluation science, if impact is what one is seeking to achieve (the dependent variable), then both knowledge production and the selected processes of knowledge mobilisation for a given context are what one changes to influence effect (the independent variables). Measuring research impact is arguably a measure of the effectiveness of knowledge mobilisation activities to connect research to impacts beyond the academy.

Canadian organisations have also developed impact logic frameworks to support planning and assessment. The Canadian Academy of Health Sciences (CAHS, 2009) framework traces the progress from research outputs to improved health and wellbeing and economic and social prosperity. The CAHS framework is being operationalised as the research impact planning and assessment framework for Canadian provincial health research funding organisations, as exemplified by Alberta Innovates Health Solutions (Graham et al, 2012). However, such frameworks often position impact as an endpoint, without reference to accumulated benefits over time. Extending the CAHS framework by including a co-produced element throughout the logic model informed the co-produced pathway to impact, which has been adopted as the research planning framework by large, multi-million dollar Networks of Centres of Excellence (Phipps et al, 2016). This model offers indicative indicators of impact at each stage of the research process and, in so doing, helps nuance understanding of how impact progresses.

The ‘what’

Models for impact assessment and theories of change are increasingly regular features of the impact landscape (Buxton and Hanney, 1996). However, as impact cannot be templated, the process of identifying and articulating discrete changes arising from research requires clarity on multiple factors. These include (but are not restricted to) identifying indicators of impact, understanding timescales and sequences of effects (Phipps et al, 2016), appraising appropriate measures, identifying non-academic research partners and end users (required for mediating impact; Morton, 2015b) and identifying suitable evidence of effect. Whilst analysis of the 2014 REF identified 60 discrete areas of impact (King’s College London and Digital Science, 2015), the nuanced nature of impact means that determining ‘what’ effects are possible and demonstrable falls to individuals.

The ‘who’

A common feature of PRI roles (‘who’) is their support of activities that create and/or assess and articulate impacts of research beyond the academy. While communication skills are likely integral to many of these positions, these roles are distinct from communication professionals (Barwick et al, 2014). A systematic review of knowledge brokers identified 10 distinct tasks and 39 associated activities of knowledge broker practice (Bornbaum et al, 2015); however, this diversity of skills and foci has been cited as a challenge for the planning, training and sustainability of these roles (Lightowler and Knight, 2013; Chew et al, 2013). In addition to these tasks and activities, the qualities (Phipps and Morton, 2013) of knowledge brokers and their organisational context have received attention as described by Bowen and Graham:

Recognition of the importance of organizational context has resulted in a shift from focusing on individuals who broker knowledge between specific individuals to the concept of knowledge brokering as an organizational process. (Bowen and Graham, 2013, S5)

It is the effectiveness of these individuals (within their organisational context) in facilitating ‘what’ and ‘how’ collectively that leads to impact.

As expressed in Figure 1, impact literacy is only achieved through the comprehension of all three elements:

Figure 1: The intersect of What, Who and How to create Impact Literacy

This model also represents the implications of incomplete understanding:

  1. HOW and WHO in the absence of WHAT leads to insufficient understanding of the ultimate impacts, indicators, evidence and assessment thereof

  2. WHO and WHAT in the absence of HOW leads to insufficient understanding of the bespoke and nuanced processes by which impact is achieved

  3. HOW and WHAT in the absence of WHO leads to insufficient understanding of the roles and skills required to plan, generate, execute and assess impact and results in poorly informed and unsupported impact strategies

Drawing on the earlier UK–Canada comparisons, arguably Canada’s focus on supporting impacts through knowledge mobilisation/translation (‘how’), with less focus on the evidence of impact, places it at risk of (1). In contrast, the UK’s focus on planning pathways and reporting demonstrable effects (‘what’) makes (2) the more likely limitation. In countries where the impact agenda is beginning to emerge, the concept of impact literacy can support strategic thinking to de-risk national, local and institutional approaches.

Developing individual and institutional impact literacy

The next challenge is to identify the most appropriate and effective means to integrate impact literacy into the research environment. Impact is only achievable (and sustainable) if operationalised into individual and institutional practice (Rycroft-Malone et al, 2016). While the formal concept of impact literacy is in its infancy, it already raises a series of developmental opportunities for PRIs and institutions alike.

PRIs can strengthen ‘how’ by investing time to explore the range of possible engagement and translation activities, their facilitators and barriers, and the effectiveness of different knowledge brokerage tools. The impact community must therefore also share learning, both positive and negative, to build a stronger corpus of knowledge about successful processes. Institutional commitment to resourcing, recognising and supporting knowledge broker activities is crucial.

To strengthen ‘what’, PRIs must work collaboratively with end users to determine meaningful and achievable benefits, and work in tandem to track effects. PRIs must also be mindful of the progression and sequence of impacts across a timeline. Institutions can support this by investing (financially, in effort or otherwise) in mechanisms to support partnership working and by developing intelligent means to manage impact information.

Strengthening ‘who’ requires firmer commitment to the development of impact-related skills. PRIs must reflect on their own areas of competence and determine where they require training or development activities (for example, mentoring). This must be supported, and even led, by institutions offering opportunities for skills enhancement.

The conceptual model below (Figure 2) offers a first expression of how such areas of activity may be connectively built into individual and institutional culture.

Figure 2: Conceptual diagram for building impact literacy at the academic and institutional levels

Enhancing critical literacy

Alongside implementing elements of an impact literate culture, attention must also be paid to extending literacy above minimal levels. Drawing on parallels with health literacy (Guzys et al, 2015; Chinn, 2011), impact literacy must be conceived not as binary (literate or not), but as sitting along a continuum. Institutional strategies and individual development plans must support not only the development of basic understanding, but also the upward progression to a more advanced, critical level of literacy. Several sources exist to enable PRIs to enhance their literacy: practice-based guidelines, peer-reviewed and grey literature, skills-based training, drawing on tacit knowledge within the community, mentoring and shadowing schemes, and many more. While practitioners must assess the robustness of such sources, they offer a breadth of insights into impact processes and a wealth of opportunities to critically appraise ‘who’, ‘what’ and ‘how’.

Discussion

This paper presents the concept of impact literacy as a schema that aids understanding of impact and its associated processes. The intersection of ‘what’, ‘who’ and ‘how’ offers a simple description of the elements needed for research impact, and this schema may help inform training and development approaches for PRIs. Knowing how impact ‘works’ is central to guiding research impact practices and the people who support them. A model is necessarily a simplified description of complex processes, such as those in implementation science where research informs practice or policy (Nilsen, 2015). We recognise that the simplicity of the presented model risks masking the breadth of the research impact and knowledge mobilisation processes required for effective research impact. Undoubtedly, attempting to singularly configure ‘literacy’ is open to criticism, particularly from those whose work does not align with all three elements, or from those PRIs for whom increasing comprehension is challenged by a lack of training and mentorship.

While there is a natural overlap between literacy (knowing) and competencies (doing), they are deliberately decoupled here to allow separate examination of the former’s distinct characteristics. The present ‘know-do’ gap (Booth, 2011) is fuelled by insufficient recognition of the importance and interrelatedness of each. This is neatly encapsulated in Goethe’s assertion that ‘knowing is not enough; we must apply. Willing is not enough; we must do’ (Jong-Wook, 2004: 3). Notwithstanding criticisms and ongoing debates on impact itself, the principle of an underlying literacy underpins any such discussions about the optimal ways to mobilise research knowledge into effect.

The concept of impact literacy presents an opportunity for individuals, institutions and – as governors of impact expectations – funders and policymakers to scrutinise how impact is best achieved.

Conflict of interest

The authors declare that there is no conflict of interest.

References

  • Barwick, M., Phipps, D., Myers, G., Johnny, M. and Coriandoli, R. (2014) Knowledge translation and strategic communications: Unpacking differences and similarities for scholarly and research communications, Scholarly and Research Communication, vol 5, no 3, pp 1-14, http://yorkspace.library.yorku.ca/xmlui/bitstream/handle/10315/28518/Barwick%20Phipps%20Comms%20KT%20SRC%202014.pdf?sequence=1
  • Boaz, A., Baeza, J. and Fraser, A. (2011) Effective implementation of research into practice: An overview of systematic reviews of the health literature, BMC Research Notes, vol 4, p 212, www.biomedcentral.com/content/pdf/1756-0500-4-212.pdf
  • Booth, A. (2011) Bridging the ‘know-do gap’: A role for health information professionals?, Health Information and Libraries Journal, vol 28, pp 331-4
  • Bornbaum, C.C., Kornas, K., Peirson, K. and Rosella, L.C. (2015) Exploring the function and effectiveness of knowledge brokers as facilitators of knowledge translation in health-related settings: A systematic review and thematic analysis, Implementation Science, vol 10, p 162, www.implementationscience.com/content/10/1/162
  • Bowen, S.J. and Graham, I.D. (2013) From knowledge translation to engaged scholarship: Promoting research relevance and utilization, Archives of Physical Medicine and Rehabilitation, vol 94, no 1, suppl 1, pp S3-8, www.sciencedirect.com/science/article/pii/S0003999312009227
  • Brown, C. (2012) The ‘policy-preferences model’: A new perspective on how researchers can facilitate the take-up of evidence by educational policy makers, Evidence & Policy, vol 8, no 4, pp 455-72
  • Buxton, M. and Hanney, S. (1996) How can payback from health services research be assessed?, Journal of Health Services Research and Policy, vol 1, no 1, pp 35-43
  • CAHS (Canadian Academy of Health Sciences) (2009) Making an impact: A preferred framework and indicators to measure returns on investment in health research, Ottawa, ON: CAHS, http://cahs-acss.ca/making-an-impact
  • Camden, C., Shikako-Thomas, K., Nguyen, T., Graham, E., Thomas, A., Sprung, J., Morris, C. and Russell, D.J. (2014) Engaging stakeholders in rehabilitation research: A scoping review of strategies used in partnership and evaluation of impacts, Disability and Rehabilitation, vol 37, no 15, http://informahealthcare.com/doi/pdf/10.3109/09638288.2014.963705
  • Chew, S., Armstrong, N. and Martin, G. (2013) Institutionalising knowledge brokering as a sustainable knowledge translation solution in healthcare: How can it work in practice?, Evidence & Policy, vol 9, no 3, pp 335-51
  • Chinn, D. (2011) Critical health literacy: A review and critical analysis, Social Science & Medicine, vol 73, pp 60-67
  • Gagnon, M. (2011) Moving knowledge to action through dissemination and exchange, Journal of Clinical Epidemiology, vol 64, pp 25-31
  • Graham, K.E.R., Chorzempa, P.A., Valentine, P.A. and Magnan, J. (2012) Evaluating health research impact: Development and implementation of the Alberta Innovates Health Solutions impact framework, Research Evaluation, vol 21, pp 354-67, http://rev.oxfordjournals.org/content/early/2012/11/14/reseval.rvs027.full
  • Greenhalgh, T. and Wieringa, S. (2011) Is it time to drop the ‘knowledge translation’ metaphor? A critical literature review, Journal of the Royal Society of Medicine, vol 104, no 12, pp 501-09
  • Guzys, D., Kenny, A., Dickson-Swift, V. and Threlkeld, G. (2015) A critical review of population health literacy, BMC Public Health, vol 15, p 215, www.ncbi.nlm.nih.gov/pmc/articles/PMC4351936
  • Hughes, A. and Martin, B. (2012) Enhancing impact: The value of public sector R&D, Council for Industry and Higher Education and UK Innovation Research Centre, www.cbr.cam.ac.uk/fileadmin/user_upload/centre-for-business-research/downloads/special-reports/specialreport-enhancingimpact.pdf
  • Hughes, A., Kitson, M., Bullock, A. and Milner, I. (2013) The dual funding structure for research in the UK: Research Council and Funding Council allocation methods and the pathways to impact of UK academics, Cambridge: Department of Innovation and Skills, UK Innovation Research Centre
  • Jong-Wook, L. (2004) World report on knowledge for better health, World Health Organisation, http://apps.who.int/iris/bitstream/10665/43058/1/9241562811.pdf
  • King’s College London and Digital Science (2015) The nature, scale and beneficiaries of research impact: An initial analysis of Research Excellence Framework (REF) 2014 impact case studies, www.kcl.ac.uk/sspp/policy-institute/publications/Analysis-of-REF-impact.pdf
  • Lightowler, C. and Knight, C. (2013) Sustaining knowledge exchange and research impact in the social sciences and humanities: Investing in knowledge broker roles in UK universities, Evidence & Policy, vol 9, no 3, pp 317-34
  • Morton, S. (2015a) Creating research impact: The roles of research users in interactive research mobilisation, Evidence & Policy, vol 11, no 1, pp 35-55
  • Morton, S. (2015b) Progressing research impact assessment: A ‘contributions’ approach, Research Evaluation, vol 24, no 4, pp 405-19, http://rev.oxfordjournals.org/content/24/4/405
  • Nilsen, P. (2015) Making sense of implementation theories, models and frameworks, Implementation Science, vol 10, no 53, pp 1-13
  • Nutley, S., Walter, I. and Davies, H. (2007) Using evidence: How research can inform public services, Bristol: Policy Press
  • Phipps, D.J. and Morton, S. (2013) Qualities of knowledge brokers: Reflections from practice, Evidence & Policy, vol 9, no 2, pp 255-65
  • Phipps, D.J., Cummings, J., Pepler, D., Craig, W. and Cardinal, S. (2016) The co-produced pathway to impact describes knowledge mobilisation processes, Journal of Community Engagement and Scholarship, vol 9, no 1, pp 31-40
  • Reed, M. and Kerridge, S. (2017) How much was an impact case study worth in the UK Research Excellence Framework?, www.fasttrackimpact.com/single-post/2017/02/01/How-much-was-an-impact-case-study-worth-in-the-UK-Research-Excellence-Framework
  • REF (Research Excellence Framework) (2011) Assessment framework and guidance on submissions, www.ref.ac.uk/pubs/2011-02
  • Ross, S., Lavis, J., Rodriguez, C., Woodside, J. and Denis, J.L. (2003) Partnership experiences: Involving decision makers in the research process, Journal of Health Services Research and Policy, vol 8, suppl 2, pp 26-34
  • Rycroft-Malone, J., Burton, C.R., Wilkinson, J., Harvey, G., McCormack, B., Baker, R., Dopson, S., Graham, I.D., Staniszewska, S., Thompson, C., Ariss, S., Melville-Richards, L. and Williams, L. (2016) Collective action for implementation: A realist evaluation of organisational collaboration in healthcare, Implementation Science, vol 11, p 17
  • Stern, N. and Nurse, P. (2014) It’s our duty to assess the costs of REF, Times Higher Education, 11 December, https://www.timeshighereducation.com/comment/letters/its-our-duty-to-assess-the-costs-of-the-ref/2017479.article
  • van de Ven, A.H. and Johnson, P.E. (2006) Knowledge for theory and practice, Academy of Management Review, vol 31, no 4, pp 802-21
  • Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., Jones, R., Kain, R., Kerridge, S., Thelwall, M., Tinkler, J., Viney, I., Wouters, P., Hill, J. and Johnson, B. (2015) The metric tide: Report of the independent review of the role of metrics in research assessment and management, www.hefce.ac.uk/media/HEFCE,2014/Content/Pubs/Independentresearch/2015/The,Metric,Tide/2015_metric_tide.pdf