Abstract

Background:

Programmes that provide scientists and engineers with support to engage in public policy have proliferated in the United States, with many opportunities available for training, networking and placements within government and government-facing organisations. This trend suggests that an evolution may be occurring at the science–policy interface. However, there is little extant data on the structure, aims and impacts of these programmes.

Aims and objectives:

This study maps the current landscape of US programmes seeking to train researchers at all career stages to engage in policy. We focus on Virginia, a state with a substantial number and diversity of programmes, to assess: (1) how they conceptualise their audiences, activities and impacts; and (2) which roles in policy and types of evidence use they address.

Methods:

We developed a database of US policy programmes (n=174) and conducted a case study of those in Virginia through surveys and interviews with their leaders (n=12).

Findings:

The majority (57%) of science policy programmes are state-based. These programmes include student organisations, government placements and fellowships, and academic certificates, degrees and other trainings. While these reflect diverse models for how to engage researchers in policy, Virginia programme leaders across these categories conceived of long-term impacts, audiences and activities, researcher roles in policy, and types of decision-maker evidence use in similar ways. They also perceived limited ability to implement evidence-based approaches within their programmes.

Discussion and conclusion:

Building additional programmatic capacity – through shared learning and partnerships – could lend support to this emerging trend in science policy with implications for US research and governance.

Background

Since the Second World War, the relationship between science and policy has evolved in the face of continuous societal and technological change. The social contract for science and the prioritisation of basic research funding forged in the aftermath of the Manhattan Project promised ‘widely diffused benefits to society and the economy in return for according an unusual degree of intellectual autonomy and internal self-governance to the recipients of federal support’ (Brooks, 1990: 12). However, that compact lasted for only a brief period. While scholars debate the dates and precise characterisation of subsequent science policy paradigms (Guston, 2000: 141), the general consensus is that there was a marked shift towards valuing the production of knowledge for societal utilisation – including in governance – by a wide range of actors outside of academia (Gibbons et al, 1994). Facilitating these more diverse forms of knowledge creation and exchange requires new practices such as co-production (Lemos et al, 2018), boundary spanning (Bednarek et al, 2018; Goodrich et al, 2020) and collaboration with boundary organisations (Guston, 2001), activities for which scientists and engineers typically do not receive training or incentivisation.

Origins of US programmes engaging researchers in policy

The 1973 launch of the American Association for the Advancement of Science’s Science & Technology Policy Fellowships (AAAS STPF) programme established a US mechanism to place researchers in legislative offices – and eventually government agencies – for immersive learning experiences, facilitating knowledge translation, exchange and use for policy (Stine, 1995; Teich and Lita, 2003). The John A. Knauss Marine Policy Fellowship, run by the National Oceanic and Atmospheric Administration (NOAA) National Sea Grant College Program, followed in 1979 (NOAA Sea Grant, 2023). It was only 30 years later that the first programmes modelled after these forerunners arose at the state level. The California Council on Science and Technology (CCST) led the way, launching a fellowship in 2009 to serve the California State Legislature (Alberts et al, 2018). This model inspired subsequent philanthropic planning grants (NCSL, 2024) for a multitude of other state-level programmes (Diasio et al, 2020).

The government placement model is arguably an effective way to increase policy makers’ access to scientific and technological (S&T) knowledge (Golden, 1988; Fainberg, 1994). Evidence from initial assessments indicates that fellows make meaningful contributions to the efforts of the offices in which they work (Alberts et al, 2018; Pearl and Gareis, 2020). However, this is not the only mechanism available to promote knowledge exchange between the research enterprise and policy makers. In another early model, the National Academies of Sciences, Engineering, and Medicine (NASEM) created the Christine Mirzayan Science & Technology Policy Graduate Fellowship Program in 1997 to provide hands-on science policy training and education through placements within the Academy (NASEM, 2024), an organisation mandated to provide unbiased government advice (Blair, 2016). Academic faculty, like University of Michigan physicist Homer Neal, have developed courses and degree programmes focused on the role of science in policy processes and how researchers can actively participate (Smith and McCormick, 2021). Neal’s initial foray into teaching science policy in 2002 led to authorship of a popular US science policy textbook, published in 2008 (Neal et al, 2008; Neal, 2009). Furthermore, student organisations within colleges and universities around the country have been created to support their members’ interests in science policy (see, for example, NSPN, 2024). Last but not least, scientific societies and non-profit organisations like the American Association for the Advancement of Science, American Geophysical Union and Union of Concerned Scientists have created workshops, bootcamps, webinars and other policy training opportunities often combining classroom learning with hands-on experiences for engaging with policy makers and their staff, such as Capitol Hill visits (Ham, 2014; UCS, 2024). Notably, these programmes have historically served not only a wide range of disciplines, but scientists and engineers in differing career stages, from undergraduate students to those in graduate or postdoctoral positions and even established faculty. As a result, our use of the term ‘researcher’ refers to scientists and engineers across all career stages.

The growth in programmes that engage researchers in policy has occurred not just in the United States, but also internationally (Gual Soler et al, 2017). In a study similar to this one, which focused largely on programmes in the United Kingdom, Oliver et al (2022) found an upswing starting in 2010. The UK’s Research Excellence Framework (REF), which assesses societal impacts for the purpose of allocating public funding for university research (Terämä et al, 2016), was first implemented in 2014 and has broadly incentivised policy engagement (Torrance, 2020). A sizeable number of other countries have also implemented performance-based systems for funding decisions, but the United States is not among them (Hicks, 2012; Sivertsen, 2017; Banal-Estañol et al, 2023). That being said, even without explicit institutional incentives, there have long been formal and informal routes for researchers to engage in policy in the United States, such as sending letters to elected officials, participating in an expert panel or assessment, or serving in government as a science advisor (Pielke and Klein, 2009; Pain, 2014).

Documenting the emerging landscape of programmes engaging researchers in policy

Oliver et al (2022) focused their study on ‘research-policy engagement’, casting a wider net than our investigation by including organisations and activities that seek to intervene not just in facilitating researchers’ capacity to engage, but also decision-makers’ access to and use of evidence, and any number of other intervention points within research and policy systems. The majority of organisations they identified (79%) focus on broader goals of disseminating and communicating research, as opposed to building researchers’ skills (50%) and professional partnerships (50%), which are the primary intents of the US policy engagement programmes described earlier. In alignment with the goals of our study, Oliver et al (2022) developed their dataset with the intent of expanding generalisable knowledge regarding the design and potential effects of policy engagement initiatives.

In this study, we examine a subset of initiatives that focus on training researchers to engage in policy. As shorthand, we call these ‘science policy programmes’ in recognition that: (1) they seek to facilitate the boundary-spanning participation of researchers in policy processes; and (2) they focus on ‘science policy’ as a specific area of public policy, defined by Brooks (1990) as ‘policy for science’ and ‘science for policy’. However, we do not mean to imply that these programmes are necessarily formal academic programmes within the research fields of science, technology and innovation policy (Guston and Sarewitz, 2007; Lane et al, 2011), which may or may not have applied training components to facilitate researcher engagement in policy.

Due to the applied nature of science policy researcher engagement programmes, databases have been created to make information about them more easily accessible to potential participants (Muindi and Luray, 2023; Sigma Xi, 2023). In the United States in recent years, these lists – whether consisting of fellowships, training courses or student organisations – have proliferated online with varying parameters for the types of programmes included and consistency of curation (Singel, 2021; Smith and McCormick, 2021; Collu et al, nd; NSPN, nd). Our study aims to address the lack of scholarly attention to database development and analysis of the growing number of US programmes engaging researchers across all career stages in policy. A comprehensive and up-to-date corpus of information on what programmes exist – as well as their aims and impacts – is needed to establish how these interventions evolve over time and affect the science–policy interface.

Training researchers to engage in policy processes

While research on the role of science – and scientists – in policy processes is not new (Sabatier, 1988; Bogenschneider and Corbett, 2010; Cairney, 2016), few studies on interventions to promote the engagement of scientists in policy in the United States have been conducted (Scott et al, 2019). Among the handful of empirical studies, one showed that experiential learning, both in classroom settings and through community engagement, enhances graduate students’ perceived competence in policy and significantly increases their involvement in policy-related activities (Rocha, 2000). Singh et al (2014) found that perceived competence was correlated with the likelihood of participating in policy, regardless of the role researchers chose to play, from reporting evidence to engaging in policy, advocacy and decision-making. Experimental testing of the Research-to-Policy Collaboration (RPC) model, based at Pennsylvania State University, also found effects from its two-stage initiative consisting of participant capacity-building followed by collaborative partnerships with policy makers (Crowley et al, 2021a). Not only did participating researchers evince less concern about federal funding and the use of scientific evidence in policy, but they were also more likely to engage in policy in order to promote some types of evidence use by decision-makers. Furthermore, through the process, participants accrued benefits to their own policy knowledge, engagement and research (Crowley et al, 2021b).

Enhanced engagement of scientific and technical experts in policy also has impacts on decision-makers. In a study of researcher communication with civic association leaders about climate change, Levine (2020) found that when researchers conveyed respect for a decision-maker’s knowledge and time, it increased the probability of receiving a positive response to their requests to engage. The RPC team also evaluated their programme’s effects on policy processes, finding that more research was used in legislation by congressional offices randomly selected for the programme and that those offices also reported valuing research more highly (Crowley et al, 2021b).

In order for more programmes to evaluate how their activities affect science–policy engagement, they must first conceptualise how audiences and activities combine to produce desired long-term impacts, and identify which impacts they seek to achieve. Logic models are often used as an initial step to visualise how programme components relate to each other (Goldman and Schmalz, 2006; McLaughlin and Jordan, 2015). For example, the SPIRIT Action Framework Modified for Academic-Policy Engagement Interventions (SPIRIT-ME) – developed to evaluate a UK policy engagement programme – follows a similar structure relating perceived needs to actions and impacts (Mäkelä et al, 2024).

Our study builds upon the previously outlined research, focusing on science policy programmes in the United States, with the following research questions:

RQ1: What is the current landscape of US programmes to train researchers to engage in policy? What types of programmes exist, and where are they located geographically?

As a second step in our exploratory study, we seek to understand how these science policy programmes are designed and for what purpose, focusing on one state with a diverse mix: the Commonwealth of Virginia.

RQ2: How do science policy programmes conceptualise their audiences, activities and impacts?

Scientists’ roles and policy makers’ evidence use

The Singh et al (2014) and Crowley et al (2021a; 2021b) studies highlight two dimensions of the science–policy interface long identified as important by scholars: (1) the differing ways that scientists and engineers can choose to engage in policy (Steel et al, 2000; 2004; Pielke, 2007); and (2) the various means by which decision-makers can use research expertise (Weiss, 1979; Nutley et al, 2007). There are a range of potential US and international social norms around whether researchers should engage in policy, and if so how (Akerlof, 2022; Akerlof et al, 2022). For example, the debate around whether researchers should not just provide evidence, but also advocate for specific policies – which necessarily entails a value judgement that cannot be based on data alone – has long divided the scientific community (Cairney and Oliver, 2020). Furthermore, because the benefits of invoking the authority of science manifest across a range of decision-making and political contexts, policy makers can use information in a variety of ways, with varying barriers (Akerlof et al, 2024). Researchers such as Weiss (1979) and Nutley et al (2007) have described various forms of research use as conceptual (to understand problems), instrumental (to make policy decisions), and strategic or tactical (to rhetorically support and justify preferred policy solutions). For programmes seeking to engage researchers in policy processes, for which roles do they prepare their participants, and for which types of decision-maker evidence use? Answering these questions could illuminate how the science–policy interface may be changing.

RQ3: Which roles for researchers in policy do programmes address?

RQ4: Which forms of evidence use do they address?

Methods

In alignment with the joint national and state-level focus of the research questions, we conducted exploratory research to: (1) map the landscape of science policy programmes across the United States; and (2) understand how leaders in Virginia’s ecosystem of science policy programmes conceptualise and prioritise audiences, activities and long-term impacts (7+ years). The study entailed developing a national database of policy programmes, and a pre-survey and interviews with Virginia’s programme leaders. Data and materials are available at https://osf.io/uj2s4/.

Programme database development

To develop the database of US science policy programmes, we first identified data repositories such as those hosted by the National Conference of State Legislatures (NCSL, 2024) and the National Science Policy Network (NSPN, 2024; nd), along with inputs from professional networks (see detailed methodology, Supplementary Methodological Materials, Table 1, available at https://osf.io/uj2s4/). We reviewed programme websites for eligibility and collected additional details, including by contacting programme leaders to verify the data. The database includes programmes seeking to promote researchers’ engagement in policy processes, categorised based on activities and governance focus. Due to the dynamic nature of these programmes, the data reflects a specific period (spring–summer 2024).

Virginia’s science policy programmes

We utilised the national database to identify existing science policy programmes in Virginia and conducted pre-surveys and interviews with 12 of the 13 programme leaders between 7 March and 26 April 2024 (the leader of one student programme chose not to participate). The Virginia programme sample included six academic programmes, two fellowships and four student organisations. The study protocols were approved by George Mason University’s Institutional Review Board (IRB) [#2135839-2] and Virginia Tech’s IRB [#24-132]. Study participants were provided with a US $50 gift card and invited as co-authors in publishing the research findings.

Online pre-survey and interviews

We utilised a pre-survey to collect programme and demographic information from leaders, followed by in-depth interviews to gather detailed insights (see detailed methodology, Supplementary Materials, Tables 2–3). The data collected via the pre-survey were used to develop logic models for each programme, which were reviewed and discussed during the interviews. The instruments and data can also be found at https://osf.io/uj2s4/.

Analysis and data visualisation

Science policy programmes within the national database were coded for programme type: government placements and fellowships; government affairs placements; academic certificates, degrees and other trainings; student organisations; professional development and training; and networks. They were also coded by their focus on policy issues and government: at the national level, in specific US regions, at the state level, at lower levels of governance, or geographically agnostic. Examples of typical ‘national-level’ programmes include federal government fellowships hosted by scientific societies. Some programmes focus on specific US regions, such as the Virginia Scientist-Community Interface, which includes members from across the Southeast. The ecosystem of state-level programmes often includes student organisations within universities, fellowship placements in state government, and academic curricular programmes in higher education institutions.

To develop initial exploratory typologies of the short-term and long-term impacts, audiences, and activities that science policy programme leaders conceptualise and prioritise, we inductively coded the qualitative pre-survey and interview data from the respondents into both overarching categories and more detailed sub-categories (Saldaña, 2021) (see Supplementary Materials, Tables 4–6). The logic model format classified audiences as primary or secondary. Each audience was associated with specific activities and long-term impacts, both of which were assigned a significance rating (from 1 to 5 in increasing order of importance).

To illustrate the relationships between audiences, activities and impacts, and their importance ratings for each programme, we employed Sankey diagrams and an interactive dashboard to trace these flows (Riehmann et al, 2005). The percentages for each category in the figures – impacts, audiences, activities – reflect weighted scoring, with the numerator equalling the summed importance rating scores for all references to a specific impact, audience or activity code and the denominator equalling the sum of all rating scores for the respective category.
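To make the weighted scoring concrete, the minimal sketch below computes category percentages from importance-rated codes in the manner described above. It is illustrative only and is not the authors’ analysis code; the coded records and labels are invented for the example rather than drawn from the study data.

```python
# Illustrative sketch of the weighted scoring described above.
# The coded records below are hypothetical, not study data.
from collections import defaultdict

# Each record: (activity code, importance rating from 1 to 5)
coded_activities = [
    ("external engagement", 5),
    ("external engagement", 4),
    ("career support", 5),
    ("conveying information", 3),
    ("active learning", 4),
]

# Numerator: summed importance ratings for each specific code;
# denominator: sum of all rating scores within the category.
summed = defaultdict(int)
for code, rating in coded_activities:
    summed[code] += rating
denominator = sum(summed.values())

for code, score in sorted(summed.items(), key=lambda kv: -kv[1]):
    print(f"{code}: {100 * score / denominator:.1f}%")
```

Analogous sums over the impact and audience codes would yield the weighted percentages reported in the figures.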

Sample

Nine of the 12 Virginia programmes in this study are hosted at Carnegie-designated R1 ‘very high research activity’ institutions (American Council on Education, 2024). Our respondents – the dozen programme leads – included PhD graduates six years or more past their degree (50%), doctoral students (42%) and a lawyer (8%). Out of the six PhD graduates and five doctoral students, almost three-quarters were from the physical or life sciences and engineering (for example, biology, marine science, biomedical engineering, astronomy and biochemistry), with the remaining three from public policy and public administration. Half of the programme leaders described themselves as men, and the other half as women. The vast majority of respondents (92%) self-identified as white, and one as Asian.

Findings

US landscape of science policy programmes [RQ1]

As of September 2024, the database of science policy programmes in the United States included 174 unique entries (Figure 1) (data available at OSF, https://osf.io/uj2s4/). The majority of science policy programmes (n=99; 56.9%) are state-level rather than national, and hosted by universities, scientific societies, and NOAA National Sea Grant College Programs that often – though not exclusively – centre on local and state-level policy issues and governance. One of these includes a state and wider regional focus. Two others have a regional or multi-state focus. For state programmes, we found that 80.8% are hosted by universities. The states with the most programmes in our database are Virginia (13), North Carolina (9), New York (9) and California (8). High-level findings include:

  • Approximately half of the state programmes (50.5%) are student organisations.

  • Another 21.2% are government placement programmes, with 2.0% also doubling as government affairs opportunities.

  • Approximately one in four (23.2%) are academic minors, certificates and other trainings focused on engaging researchers in science policy.

  • Only one of the programmes (1.0%) serves as a state-level network.

Figure 1: As of September 2024, 28 states and the District of Columbia appeared to have at least one active science policy programme. Many host a range of programmes, including government placements; academic certificates, degrees and trainings; and/or student organisations. Darker colours on the map indicate a higher number of programmes; Virginia (13), North Carolina (9), New York (9) and California (8) have the most.

Virginia hosts the highest number of state-level science policy programmes with 13 programmes in total, including the Virginia Scientist-Community Interface, which also focuses on the Southeastern region of the United States. Virginia has the most academic programmes by state (n=6), followed by North Carolina, California and the District of Columbia with three each. Virginia also has more than one government placement/fellowship programme, as do Pennsylvania, Missouri and Colorado. New York and Virginia host the highest number of student organisations, each with five, followed by California (4).

Approximately four in ten of the US programmes (n=75) focus on national-level governance. Scientific societies lead 76.0% of these programmes across a breadth of disciplines, including the American Mathematical Society, Society for Neuroscience, American Geophysical Union, American Psychological Association and the Institute of Electrical and Electronics Engineers. In assessing the national programmes engaging academic researchers, we found that:

  • The majority are fellowships that place researchers in government (64.0%). Many of these represent American Association for the Advancement of Science partnerships with scientific societies.

  • In a parallel fellowship model (21.3%), scientists and engineers join government relations offices, national academies, or other entities interacting with the federal government. Two of the national-level government placement programmes also place fellows in government affairs positions; another 14 focus solely on this latter model.

  • Another subset of programmes (14.7%) provides professional development and training through workshops, bootcamps, and engagement in Capitol Hill days.

  • The remaining 2.7% are programmes serving as networks connecting organisations across the science policy ecosystem, such as the National Science Policy Network (NSPN) and Engaging Scientists & Engineers in Policy (ESEP) coalition.

Virginia’s science policy programmes [RQ2]

The 13 science policy programmes in Virginia fall into three categories: academic programmes, including degrees, certificates and training programmes (6); student organisations (5); and government placements, including fellowships (2) (Figure 2). None of these programmes pre-date 2016, with the majority (7) established between 2021 and 2023. As a result, the dozen current programme leads we interviewed started the programmes themselves (n=7) or were involved relatively early in programme formation (n=5), with earlier US historical models serving as inspiration. Student organisation leads (n=3) described their programmes as modelled after NSPN’s chapters or other student groups. Those running fellowships (n=2) named the National Sea Grant John A. Knauss Marine Policy Fellowship, AAAS Science & Technology Policy Fellowships and state-level programmes like CCST as models. Academic programme representatives (n=2) cited both university programmes such as those at UC Irvine and Carnegie Mellon, and science policy professional development opportunities from Sea Grant programmes and the AAAS Catalyzing Advocacy in Science and Engineering Workshop as models.

Figure 2: Virginia hosts 13 science policy programmes, with the majority affiliated with research-intensive public universities and some with institutes or academies.

Describing the origins of their programmes, the leads often pointed to inspiration from national-level programmes, like the AAAS and Knauss Fellowships, connections with growing networks, such as NSPN, or opportunities that arose within their institutions to build new collaborative transdisciplinary programmes. In addition, they voiced a desire to make more of these opportunities available at their institutions or in the state as a whole.

Three-quarters of the dozen programmes are hosted by three of Virginia’s largest public R1 universities: the University of Virginia (2 programmes), George Mason University (3) and Virginia Tech (4). A combination of other types of research-focused institutions – the Virginia Academy of Science, Engineering, and Medicine (VASEM), Virginia Institute of Marine Science (VIMS)/William & Mary (W&M) and Virginia Sea Grant – rounded out the list, each supporting one policy programme. Interviews revealed that most programmes operate with low levels of staffing (less than two full-time equivalent positions), posing challenges for desired long-term programme sustainability.

One challenge to the creation and maintenance of these programmes is funding. Programme leads of the Science, Technology, and Engineering in Policy (STEP) graduate certificate at Virginia Tech receive some funding through a National Science Foundation Research Traineeship (NRT). The two fellowship programmes also have external sources of funding. For example, the Commonwealth Coastal & Marine Policy Fellowship, sponsored by Virginia Sea Grant and the Virginia Environmental Endowment, receives funding from the NOAA/National Sea Grant Office and the Virginia Coastal Zone Management office. Each host office also contributes a portion of the fellow’s salary. For student organisations, funding is typically provided by their respective universities, although some also receive external funding through successful grant applications to networks like the NSPN, especially when their events align with the grantee’s requirements. Virginia Tech’s Presidential Postdoctoral Fellowship programme, including the Science Policy and Research Ethics track, is supported via internal funding from the Office of Research and Innovation.

Logic models and evidence use by Virginia’s science policy programmes

We found that the formal process of developing logic models linking aspired long-term impacts with prioritised audiences and activities was generally new for Virginia’s science policy programme leaders. Only one had developed an independent logic model at the time of the interviews. Most leaders (75.0%) were not aware of prior research studies or evaluations linking programme activities with desired impacts. Respondents who identified those types of evidence (n=3) cited studies recommending more cross-pollination of science and policy among their students, internal assessments conducted by organisations with similar goals, and policy change theories and models. Only one programme had a means for assessing long-term impacts: using data mining to track where individuals who had participated in programme activities ended up in their careers.

Desired programmatic impacts

At the start of each pre-survey, before delving into the details of the logic models, we asked the leads to describe the problem or set of problems that their programmes are designed to address. Each programme leader described conditions within academia and government that they perceived as contributing to a ‘gap’ between science and policy. Two respondents described these, respectively, as ‘deficits in understanding and capacity of STEM-H scholars¹ and practitioners to effectively engage in policy processes’ and ‘a lack of scientific knowledge in the policy-sphere’.

When we next asked about the desired long-term impacts of their programmes, respondents described programme goals for 7+ years out in four domains spanning academic and policy contexts: academic institutions, government policy processes, researchers’ engagement in policy, and workforce development and training (Supplementary Materials, Table 4). Each of the four impact domains was differentially cited and rated by importance across the three types of programmes (academic certificates, degrees and trainings; student organisations; and government placements and fellowships). These four domains can be further divided into 20 more specific desired programme impacts (Figures 3–5; Supplementary Materials, Table 4), with ‘more scientists in policy’ and ‘government employment of scientists’ the most frequently cited. The 73 coded impacts were rated by respondents for importance, ranging from somewhat unimportant (2) to very important (5), with an average score of 4.4.

Figure 3: [Sankey diagram – frequency × importance rating] Among the three types of science policy programmes, academic certificate, degree and other training programmes (n=6) weighted the importance of long-term impacts to their own institutions most strongly.

Figure 4: [Sankey diagram – frequency × importance rating] Government placement/fellowship programmes (n=2) are highly focused on external engagement, with greater importance placed on policy-maker audiences than in other types of programmes.

Figure 5: [Sankey diagram – frequency × importance rating] Student organisations (n=4) prioritise graduate student audiences through career support and external engagement activities.

When asked why they ranked certain impacts as very important (5) to their programmes, several respondents described training researchers to engage in policy as integral: ‘So [that] we have scientific and technical experts that are more effective participants within policy processes.’ But they also described larger, more systemic impacts on processes underlying policy and research practices, including at the state level:

[The programme] was really born out of trying to have more evidence-based policy making in the state of Virginia. So, if we want to do that, we need people who understand evidence. … So, by increasing the number of scientists and engineers in that space, hopefully there can be more evidence-based policy making occurring in Virginia.

Another respondent pointed to the importance of teaching researchers how to advocate for funding: ‘I think we do a disservice to academics and not preparing them to be able to advocate for research funding, especially … going up and being able to talk to staffers and policy makers.’

Audiences

The leaders of all three types of science policy programmes described a broad range of audiences. Across each of the types of programmes, 6–9 audiences were identified with differing levels of prioritisation (Figures 3–5). All but one of the programme leads named students as their highest priority audience. Programme leads most often considered graduate students their main focus, with undergraduate students and postdoctoral researchers also frequently cited. Other audiences included: academic faculty members, alumni, businesses, local communities, non-academic researchers, other science/policy programmes, policy makers, returning professionals, staff, and other universities (Supplementary Materials, Table 5).

Student programme participants were described as lacking awareness of science policy as a field and needing opportunities to engage. For example, one respondent stated, ‘there are students wanting to do something other than lab work or the traditional scientist role, thus [the need for] introducing them to this field where they understand that their scientific knowledge is not confined to the lab and they have many transferable skills in [science policy]’. Others stated that academia traditionally has not helped students in understanding ways in which science can be used in policy and governance, because ‘in graduate school you focus so much on the academic side of your topic, that understanding how the levers of change operate, who to talk to about what needs to be changed are unknowns for most’.

Activities

The same types of activities – active learning, career support, conveyance of information, external engagement, immersive experiences and administrative programmatic support (Figures 3–5) – were reported across all three types of science policy programmes. Of the six activity categories, external engagement – whether with host offices, sponsors, other organisations, mentors, networks, speakers, or for the purpose of advocacy – was most frequently cited across all types of programmes, followed by conveying information. These six categories of activities were further broken down into 33 more detailed descriptions, such as the type of information being conveyed, the focus of active learning activities, the nature of policy immersion experiences and varieties of career support (Supplementary Materials, Table 6). The 176 audience-specific activities described and rated for importance by programme leads ranged from somewhat unimportant (2) to very important (5), with an average score of 4.29.

Indeed, respondents rated many activities highly, saying that these play key roles in producing the long-term impacts they seek. ‘Each of these activities are needed for the short-term or long-term outcome, right? Like those outcomes wouldn’t happen without these activities. So, it’s interconnected. … If students don’t know how to write a policy memo or an op-ed, they wouldn’t be able to write it and meet a legislator’, said one interviewee. Others described important activities as occurring between individuals in the programme cohort itself. In speaking about multidisciplinary collaboration – an oft-cited factor in developing societally relevant research (Gibbons et al, 1994) – one programme leader described the need to establish common definitions spanning disciplinary perspectives and to facilitate participants’ ability to collaborate.

Next, we explore the aggregate models by type of programme within Virginia’s science policy training ecosystem: academic certificates, degrees and training programmes; government placements and fellowships; and student organisations. Links to interactive visualisations for each figure are provided to trace these relationships easily, with the findings more thoroughly explored in the following sections.

Academic certificates, degrees and training programmes

The six leaders of Virginia’s academic programmes emphasised varying aspects of researchers’ engagement in policy – more effective engagement, more scientists in policy and more policy-relevant science – as their most important impacts (36.8% of impact importance ratings; see Figure 3). They also highlighted the significance of science policy engagement for their own institutions (28.5%), emphasising the need for more knowledgeable research communities, increased internal and external collaborations, stronger institutional leadership, and the establishment of new academic curricula. By way of comparison, academic institutional impact received a much lower rating of programmatic importance from leaders of student organisations (8.6%) and government placement/fellowship programmes (5.8%) (Figures 4–5). While less emphasised, changes in government policy processes (17.4%) – better policy, more use of science in policy and policy processes better designed to use science – were also specified as desired academic programme impacts.

Of the nine audiences named by the programme leaders, students and faculty members were most likely to be prioritised, with the highest combined importance ratings for activities involving graduate students, 26.3%; students-general, 18.4%; and academic faculty, 15.8%. However, programme models differ substantially. For example, the Virginia Tech (VT) +Policy Network’s one-day Policy Camp focuses on faculty continuing education, with postdocs and graduate students as secondary audiences. The camp is targeted towards those interested in learning about policy engagement via collaborative activities while networking with colleagues from other disciplines and units. In contrast, the Science Policy and Research Ethics track of VT’s Presidential Postdoctoral Fellowship programme supports postdoctoral researchers. The programme provides experience in science policy and research ethics within both government and higher education with an opportunity to conduct original research. Similarly, the Science, Technology, and Engineering in Policy (STEP) programme at VT focuses primarily on graduate students, with its primary offering being a graduate certificate and accompanying courses.

The programme activities emphasised typify higher education: conveying information (35.4%) and active learning (25.9%). These two categories focus both on essential knowledge (that is, science communication, scientific process, basics of government) and skills development (that is, engagement, science policy, multidisciplinary collaboration, policy memos, data analysis). Science communication and policy engagement topped the list as both areas of necessary knowledge (7.2%) and skill (5.8%) among respondents. The George Mason University Science and Technology Policy undergraduate minor and Science Policy Graduate Certificate are illustrative. ‘We provide communication training and help them organise their complex policy memos into easy-to-understand action items’, said a Mason policy programme lead. Active learning also plays an important role. VT’s +Policy Camp and STEP graduate certificate programme both feature structured role-play simulation exercises that allow participants to engage with key concepts in low-stakes but vivid, illustrative and entertaining ways. These exercises help participants to engage with others as well as reflect on their own potential roles in policy conversations.

Programmes also feature opportunities to directly engage with policy makers and other stakeholders. A key part of the Mason programmes is student interaction with a member of Congress or their staff where they discuss a current issue. The University of Virginia’s (UVA) PhD Plus programme works closely with the university’s federal relations team to bring students to Capitol Hill each spring; students also meet with the Virginia delegation to share their graduate research and advocate for federal funding.

Notably, these academic programmes do not operate in silos separate from fellowships and student organisations. For example, UVA’s PhD Plus programmes are co-run with student organisations, including the Science Policy Initiative and Virginia Science-Community Interface, and they support student participation in the AAAS CASE Workshop and COVES Fellowship. The Science Policy and Research Ethics track within VT’s Presidential Postdoctoral Fellowship programme also partners with VASEM to provide its postdocs the ability to participate in the COVES Fellowship. Finally, VT’s STEP programme works very closely with the Science, Policy, Education and Advocacy (SPEAC) student organisation.

Government placements and fellowship programmes

The logic models of Virginia’s two government fellowship programmes more heavily emphasised the importance of long-term impacts on workforce development and training (39.1%) and government policy processes (27.5%), as compared to academic programmes (Figure 4). In particular, programme leaders ascribed high importance to government employment of scientists (31.9%), and also detailed the significance of parallel long-term desired impacts to policy from their programmes: policy change (14.5%), better policy (7.2%) and more use of science in policy (5.8%).

Indeed, this category of programmes focused most strongly on policy makers as an audience (27.2%), as compared to academic programmes and student organisations, which placed little to no emphasis on them. Correspondingly, the programme leaders described external engagement activities as particularly important (40.4%), including networking, mentorship and mentorship training, and supporting host offices and sponsors. Compared to academic programmes, conveying information (2.9%) and active learning (10.7%) were rated of lesser importance than immersive experiences and career development (15.4% each).

For example, a core aim of the Commonwealth Coastal & Marine Policy Fellowship, sponsored by Virginia Sea Grant and the Virginia Environmental Endowment, is to increase the capacity of understaffed state agencies by placing high quality, well-educated fellows in support of specific mission-related projects. Similarly, the Commonwealth of Virginia Engineering and Science (COVES) Fellowship administered by the Virginia Academy of Science, Engineering, and Medicine (VASEM) was established to strengthen ties between the scientific community and Virginia’s state government.

These fellowship programmes also foster participants’ professional development. The COVES programme provides training to equip STEM-H graduate students and postdocs with public service skills. In addition to hands-on policy-making experience, fellows participate in a science policy orientation to develop fundamental science policy and communication skills, and attend weekly professional development seminars. Similarly, Commonwealth Coastal & Marine Policy fellows participate in Virginia Sea Grant’s professional development programme where they engage in interdisciplinary team science training, interactive workshops on effective science communication, and shadowing opportunities with policy professionals working at various Virginia state agencies.

Student organisations

The leaders of Virginia’s student organisations identified workforce development and training (32.4%) and researchers’ engagement in policy (32.4%) as their most important areas of desired impact. In contrast to government placement programmes, which emphasised government employment, student organisations cited getting more scientists involved in policy (22.9%) and broadly increasing their awareness of and interest in the field (18.1%). But they also said their programmes seek to have long-term impacts on government policy processes (26.7%), whether through policy change or increased use of science in policy.

Graduate students serve as student organisations’ largest priority audience (52.7%), for whom programmes focus on career support (26.7%) and external engagement (24.1%). Developing career pathways in science policy for this audience includes activities such as providing career information (12.9%), writing policy-oriented resumes (5.5%), assisting with fellowship applications (5.5%) and other professional development activities (3.7%). External engagement activities for graduate students further help bridge the gap to the policy sphere through speakers (16.4%), collaboration with other organisations (4.8%) and participation in advocacy (2.9%).

Workshops and speaker series are common ways for student organisations to advance their objectives. For instance, the Science Policy Network at George Mason University organises SciPol 101 sessions for their members. The sessions cover the fundamentals of science policy, including distinctions between ‘policy for science’ and ‘science for policy’, and introduce science diplomacy, potential career paths and engagement opportunities. This is also the case for the Science Policy Initiative (SPI) at the University of Virginia, which organises informational panels on policy topics and career paths, providing greater depth on common themes of interest and laying the foundation for a network of science policy professionals. SPI’s annual three-day Science Policy Bootcamp unites students from across the Commonwealth to learn about science policy basics, participate in almost a dozen diverse sessions (for example, science policy at different levels of government, the role of non-profits and non-governmental organisations, how to advocate for research funding, passing legislation in a partisan era), and practice science policy skills.

Student organisations also facilitate member engagement in policy projects. For example, Virginia Tech’s SPEAC provides members with opportunities to collaborate with the Town of Blacksburg and other organisations on initiatives. Organisations are often savvy in developing partnerships. For example, the Science Policy Initiative at the Virginia Institute of Marine Science leverages expertise among staff and alumni to fulfil students’ training and professional development goals. The organisation hosts panels with recent participants from government placement and fellowship programmes to share their experiences. The organisation also develops workshops on navigating the federal hiring process, including crafting a government-focused resume and identifying relevant job postings.

Normative roles for scientists in policy among science policy programmes [RQ3]

Researchers can engage in policy in many different roles (Steel et al, 2000; 2004; Pielke, 2007). But for which roles are they being trained? Of the 12 science policy programme leads interviewed, 11 named ‘working closely with policy makers and others to closely integrate scientific results in policy decisions’ as most clearly aligned with their underlying programme philosophies (Figure 6). For a quarter of respondents, this was the only appropriate role. Another third identified ‘reporting and interpreting results’ and ‘actively advocating for specific policies’ – along with integrating scientific results – as important to their programmes, while the final third selected various other combinations including an integrator role.

Figure 6: Of five types of roles that scientists and engineers can play in informing policy processes, ‘integrate scientific results in policy decisions’ was most commonly selected by Virginia’s programme leads as aligning with their programme’s underlying philosophy (A), but the majority perceive a range of acceptable roles (B).

Evidence use [RQ4]

Just as scientists can choose to engage in policy in various ways, decision-makers can also use the scientific and technological information that scientists provide in differing ways. Virginia science policy programme leads said that they most strongly address instrumental and conceptual uses of research – that is, research evidence shaping the core of a decision or issue, or contributing to a general understanding of that issue. Two-thirds said that their programmes did so to a ‘great extent’. They were less likely to say that the strategic and symbolic use of information – to sway opinions or confirm ideas – is a core focus (Figure 7).

Figure 7: Of four ways in which research evidence can be used in policy processes – shaping a decision, justifying a position that has already been adopted, contributing towards a general understanding of an issue, and confirming a pre-existing understanding of the issue – programme leads (66.7%) say that their programmes most strongly address instrumental use (shaping a decision) and conceptual use (contributing towards general understanding).

Discussion

In our landscape mapping of US science policy programmes, we identify a substantial number (n=174) and broad diversity of programmes that have emerged to engage researchers in policy processes, with the majority operating at the state level (n=99, 56.9%). Approximately half of programmes at the state level are student organisations (50.5%), rounded out by government placements and fellowships (21.2%), academic certificates, degrees and trainings (23.2%), and a network serving several science policy groups (1.0%). In contrast, at the national level, the preponderance of science policy programmes are fellowships that place researchers in government (64.0%) or in government affairs offices (21.3%), followed by professional development and training opportunities (14.7%) and policy networks (2.7%). The growth in student organisations speaks to a rapid expansion of interest in science policy among early career researchers, driving the need for many other types of opportunities across the United States. However, it is important to acknowledge that the speed at which new programmes are arising and the turnover in their organisational leadership, particularly for student-led organisations, mean that the number and foci of programmes are unlikely to remain static.

The nationwide science policy programme database includes 13 from the Commonwealth of Virginia – academic certificates/degrees/training (6), student organisations (5) and government placements/fellowships (2) – the highest number of programmes in any state. Pre-surveys and interviews with programme leaders in Virginia suggest that, even though at first glance these programmes reflect diverse models for ways to engage researchers in policy, they have many of the same goals in terms of long-term impacts, similar conceptualisations of important audiences and activities, and shared philosophical orientations towards appropriate roles for scientists and engineers in policy and the nature of evidence use in policy. Furthermore, while they espouse evidence-based decision-making processes in governance, programme leaders find it challenging to identify tools that would allow them to implement long-lasting, effective approaches and to assess their impacts.

The finding that these programmes face barriers in adopting evidence-based approaches aligns with international findings (Oliver et al, 2022). Only 4% of activities identified by Oliver and colleagues as building researcher skills and practice were evaluated. The authors diagnose the issue as programmes lacking clear aims and an understanding of the problem they are trying to address. While we found that logic models and more formalised theories of change were not typical among Virginia’s science policy programmes, the leads were able to provide substantial detail in the surveys and interviews about how their initiatives specifically contribute to this evolving landscape.

The long-term desired impacts that Virginia’s science policy programmes are seeking, based on our interviews with programme leads, fall across four overarching domains: academic institutions, government policy processes, researchers’ engagement in policy, and workforce development and training. While the types of programmes may initially seem quite different based on audience, length and depth of engagement, and type of activities undertaken by participants, they have much in common. When subdivided into 20 desired programmatic impacts, having ‘more scientists in policy’ and facilitating ‘government employment of scientists’ were most frequently cited. And leaders of all programmatic types – academic, student-run, and immersive government placements – seek to influence policy change as a long-term programme impact.

These findings suggest possible opportunities to scale and disseminate best practices within this rapidly growing ecosystem to support evidence-based approaches with the potential to both maximise programmatic effectiveness and advance general knowledge around how changes occur at science-policy interfaces. The development of programme logic models may be a good place to start. Only by specifying desired long-term programmatic impacts for the state of Virginia and nationwide can potential forms of assessment be identified and implemented. New capacity could be marshalled by building a broader state-level network for shared learning between different types of programmes and through partnerships between programme leads and social scientists with expertise in the use of research evidence, policy processes, evaluation and education.

Moreover, by viewing the state-level ecosystem of programmes as a whole, valuable lessons can be drawn about how audiences – who may participate in a series of science policy engagement initiatives during their careers – experience them and, as a result, how programmes can better complement each other’s strengths to benefit these audiences. Lastly, the finding that science policy opportunities exist in Virginia primarily at the doctoral level and within highly research-intensive universities highlights barriers to developing these programmes and the lack of access for scientists and engineers from less research-intensive universities, including Historically Black Colleges and Universities and Minority-Serving Institutions. Making the design and implementation of these programmes more transparent and facilitating connections among experts and institutions may enable those with fewer resources to create similar programmes based on those with demonstrated state-level success.

Conclusion

The United States is currently experiencing an era of rapid growth in science policy programmes of diverse types, but the landscape is also fluid. While some new programmes are likely to survive over the long term, others may lose leadership and/or resources over time. Building capacity and additional support, including through networks for sharing best practices and partnerships with social scientists, may increase the odds of these programmes growing, multiplying and thriving, with implications for the research enterprise and policy at various levels of government. Significant opportunities exist to grow the breadth and depth of these programmes, including among institutions and communities currently underserved. Current and future programmes, and their leaders and participants, may also benefit from greater specialisation as well as better coordination, developing a sustainable foundation upon which to build a future for the field in which everyone is able to participate without barriers.

Note

1. Science, Technology, Engineering, Math and Health Sciences.

Funding

This study was funded by a 4-VA Collaborative Research Grant made to George Mason University and Virginia Tech principal investigators Akerlof and Schenk. Lee Solomon was funded by an NSF CAREER Award (BMAT 2041751).

Acknowledgements

Thank you to Virginia’s science policy programme leaders for their generous contributions of time in participating in the study. Thank you also to Ryan McIntyre, Matthew Diasio and Danielle DaCrema for assisting in testing early versions of the instruments.

Contributor statement

All authors commented on and approved drafts of the original manuscript. KLA developed the initial study conceptualisation, and led funding acquisition and project administration. KLA, TS, AB, KM, and AS developed the methodological approach, performed data collection and analysis, and wrote the original draft. LE, SH, NL, SJL, RBJO, JLR, MRS, AS, CS, LS, and ALKV contributed to data collection and drafts of the article.

Research ethics statement

The study protocol was approved by George Mason University’s Institutional Review Board [#2135839] and Virginia Tech’s IRB [#24-132].

Supplementary methodological materials

A Supplementary Methodological Materials file is located at https://osf.io/uj2s4/.

Conflict of interest

The authors declare that there is no conflict of interest.

References

• Akerlof, K.L. (2022) Beyond the sheltering academic silo: norms for scientists’ participation in policy, Progress in Molecular Biology and Translational Science, 188(1): 29–44. doi: 10.1016/bs.pmbts.2021.11.007

• Akerlof, K.L., Allegra, A., Nelson, S., Gonnella, C., Washbourne, C. and Tyler, C. (2022) Global perspectives on scientists’ roles in legislative policymaking, Policy Sciences, 55(2): 351–67. doi: 10.1007/s11077-022-09457-3

• Akerlof, K.L., Lemos, M.C., Cloyd, E.T., Heath, E., Nelson, S., Hathaway, J., et al (2024) Science communication in Congress: for what use?, Evidence & Policy, 20(3): 300–19. doi: 10.1332/17442648Y2023D000000013

• Alberts, B., Gold, B.D., Martin, L.L. and Maxon, M.E. (2018) Opinion: how to bring science and technology expertise to state governments, Proceedings of the National Academy of Sciences, 115(9): 1952–5. doi: 10.1073/pnas.1800543115

• American Council on Education (2024) Basic Classification, Carnegie Classification of Institutions of Higher Education, https://carnegieclassifications.acenet.edu/carnegie-classification/classification-methodology/basic-classification/.

• Banal-Estañol, A., Jofre-Bonet, M., Iori, G., Maynou, L., Tumminello, M. and Vassallo, P. (2023) Performance-based research funding: evidence from the largest natural experiment worldwide, Research Policy, 52(6): 104780. doi: 10.1016/j.respol.2023.104780

• Bednarek, A.T., Wyborn, C., Cvitanovic, C., Meyer, R., Colvin, R.M., Addison, P.F.E., et al (2018) Boundary spanning at the science–policy interface: the practitioners’ perspectives, Sustainability Science, 13(4): 1175–83. doi: 10.1007/s11625-018-0550-9

• Blair, P.D. (2016) The evolving role of the US national academies of sciences, engineering, and medicine in providing science and technology policy advice to the US government, Palgrave Communications, 2: 16030. doi: 10.1057/palcomms.2016.30

• Bogenschneider, K. and Corbett, T.J. (2010) Evidence-Based Policymaking: Insights from Policy-Minded Researchers and Research-Minded Policymakers, London: Taylor & Francis.

• Brooks, H. (1990) Lessons of history: successive challenges to science policy, in S.E. Cozzens, P. Healey, A. Rip and J. Ziman (eds) The Research System in Transition, Dordrecht: Springer Netherlands, pp 11–22. doi: 10.1007/978-94-009-2091-0_2

• Cairney, P. (2016) The Politics of Evidence-Based Policy Making, London: Springer Nature.

• Cairney, P. and Oliver, K. (2020) How should academics engage in policymaking to achieve impact?, Political Studies Review, 18(2): 228–44. doi: 10.1177/1478929918807714

• Collu, G., Lescak, E., Clements, T. and Bairzin, J. (nd) Genetics Society of America policy fellowship database, https://genetics-gsa.org/policy/policy-fellowship-database/.

• Crowley, D.M., Scott, J.T., Long, E.C., Green, L., Giray, C., Gay, B., et al (2021a) Cultivating researcher-policymaker partnerships: a randomized controlled trial of a model for training public psychologists, American Psychologist, 76(8): 1307–22. doi: 10.1037/amp0000880

• Crowley, D.M., Scott, J.T., Long, E.C., Green, L., Israel, A., Supplee, L., et al (2021b) Lawmakers’ use of scientific evidence can be improved, Proceedings of the National Academy of Sciences, 118(9): e2012955118. doi: 10.1073/pnas.2012955118

• Diasio, M.A., DaCrema, D.F., Dudek, R.B., Harris, C.R., Schmehl, M.N., Schuerger, C.L., et al (2020) Developing science and technology policy fellowships in state governments without full-time legislatures, Journal of Science Policy & Governance, 16(1).

• Fainberg, A. (1994) From the Lab to the Hill: Essays Celebrating 20 Years of Congressional Science and Engineering Fellows, Washington, DC: American Association for the Advancement of Science.

• Gibbons, M., Limoges, C., Nowotny, H., Schwartzman, S., Scott, P. and Trow, M. (1994) The New Production of Knowledge: The Dynamics of Science and Research in Contemporary Societies, London: SAGE.

• Golden, W.T. (1988) Science and Technology Advice to the President, Congress, and Judiciary, New York: Pergamon Books.

• Goldman, K.D. and Schmalz, K.J. (2006) Logic models: the picture worth ten thousand words, Health Promotion Practice, 7(1): 8–12. doi: 10.1177/1524839905283230

• Goodrich, K.A., Sjostrom, K.D., Vaughan, C., Nichols, L., Bednarek, A. and Lemos, M.C. (2020) Who are boundary spanners and how can we support them in making knowledge more actionable in sustainability fields?, Current Opinion in Environmental Sustainability, 42: 45–51. doi: 10.1016/j.cosust.2020.01.001

• Gual Soler, M., Robinson, C.R. and Wang, T.C. (2017) Connecting Scientists to Policy Around the World: Landscape Analysis of Mechanisms Around the World Engaging Scientists and Engineers in Policy, Washington, DC: American Association for the Advancement of Science.

• Guston, D. (2000) Between Politics and Science: Assuring the Integrity and Productivity of Research, Cambridge and New York: Cambridge University Press.

• Guston, D.H. (2001) Boundary organizations in environmental policy and science: an introduction, Science, Technology, & Human Values, 26(4): 399–408. doi: 10.1177/016224390102600401

• Guston, D.H. and Sarewitz, D. (eds) (2007) Shaping Science and Technology Policy: The Next Generation of Research, Madison, WI: University of Wisconsin Press.

• Ham, B. (2014) Graduate Students Dive into Science Advocacy with New Workshop, American Association for the Advancement of Science, https://www.aaas.org/news/graduate-students-dive-science-advocacy-new-workshop.

• Hicks, D. (2012) Performance-based university research funding systems, Research Policy, 41(2): 251–61. doi: 10.1016/j.respol.2011.09.007

• Lane, J.I., Husbands Fealing, K., Marburger, J.H. and Shipp, S.S. (2011) The Science of Science Policy: A Handbook, Stanford, CA: Stanford University Press. doi: 10.1515/9780804781602

• Lemos, M.C., Arnott, J.C., Ardoin, N.M., Baja, K., Bednarek, A.T., Dewulf, A., et al (2018) To co-produce or not to co-produce, Nature Sustainability, 1(12): 722–4. doi: 10.1038/s41893-018-0191-0

• Levine, A.S. (2020) Why do practitioners want to connect with researchers? Evidence from a field experiment, PS: Political Science & Politics, 53(4): 712–17. doi: 10.1017/s1049096520000840

• Mäkelä, P., Boaz, A. and Oliver, K. (2024) A modified action framework to develop and evaluate academic-policy engagement interventions, Implementation Science, 19(1): 31. doi: 10.1186/s13012-024-01359-7

• McLaughlin, J.A. and Jordan, G.B. (2015) Using logic models, in Handbook of Practical Program Evaluation, Hoboken, NJ: John Wiley & Sons, Ltd, pp 62–87. doi: 10.1002/9781119171386.ch3

• Muindi, F. and Luray, J. (2023) Visualizing the landscape of training initiatives for scientists in public engagement in the United States, Research!America, https://www.researchamerica.org/wp-content/uploads/2023/11/Visualizing-the-Landscape-of-Training-Initiatives-for-Scienitists-in-Public-Engagement-in-the-U.S._-Nov-2023.pdf.

• NASEM (2024) The Christine Mirzayan Science and Technology Policy Graduate Fellowship Program, National Academies of Sciences, Engineering, and Medicine, https://mirzayanfellow.nas.edu/.

• NCSL (2024) Integrating Evidence and Data in the Policymaking Process Through Science Policy Fellowships, National Conference of State Legislatures, https://www.ncsl.org/center-for-results-driven-governing/science-policy-fellowships.

• Neal, H.A. (2009) Science policy 101: taking science policy out of Washington and into the classroom, https://sciencepolicy.us/uploads/3/4/8/7/34871902/aaas_talk_14feb09.pdf.

• Neal, H.A., Smith, T.L. and McCormick, J.B. (2008) Beyond Sputnik: U.S. Science Policy in the Twenty-First Century, Ann Arbor, MI: University of Michigan Press.

• NOAA Sea Grant (2023) John A. Knauss marine policy fellowship program: fall 2023, https://seagrant.noaa.gov/wp-content/uploads/2023/09/FINAL-Knauss-Fellowship-Factsheet-Sept2023-Instructions.pdf.

• NSPN (nd) National science policy network resource list, https://airtable.com/appKv1wWwGkr0OhGY/shr8J7s6QDqzX4uGz/tblCpmGJYqqtCcjS8.

• NSPN (2024) National science policy network, https://www.scipolnetwork.org/.

• Nutley, S.M., Walter, I. and Davies, H.T.O. (2007) Using Evidence: How Research Can Inform Public Services, Bristol: Policy Press.

• Oliver, K., Hopkins, A., Boaz, A., Guillot-Wright, S. and Cairney, P. (2022) What works to promote research-policy engagement?, Evidence & Policy, 18(4): 691–713. doi: 10.1332/174426421x16420918447616

• Pain, E. (2014) How scientists can influence policy, Science. doi: 10.1126/science.caredit.a1400042

• Pearl, J. and Gareis, K. (2020) A retrospective evaluation of the STPF program, https://www.aaas.org/sites/default/files/2020-07/STPF%20Evaluation%20Presentation%20PDF.pdf.

• Pielke, R.A. (2007) The Honest Broker: Making Sense of Science in Policy and Politics, Cambridge and New York: Cambridge University Press.

• Pielke, R. and Klein, R. (2009) The rise and fall of the science advisor to the president of the United States, Minerva, 47(1): 7–29. doi: 10.1007/s11024-009-9117-3

• Riehmann, P., Hanfler, M. and Froehlich, B. (2005) Interactive Sankey diagrams, in IEEE Symposium on Information Visualization, London: Institute of Electrical and Electronics Engineers, pp 233–40.

• Rocha, C.J. (2000) Evaluating experiential teaching methods in a policy practice course, Journal of Social Work Education, 36(1): 53–63. doi: 10.1080/10437797.2000.10778989

• Sabatier, P.A. (1988) An advocacy coalition framework of policy change and the role of policy-oriented learning therein, Policy Sciences, 21(2): 129–68. doi: 10.1007/bf00136406

• Saldaña, J. (2021) The Coding Manual for Qualitative Researchers, 4th edn, New York: SAGE.

• Scott, J.T., Larson, J.C., Buckingham, S.L., Maton, K.I. and Crowley, D.M. (2019) Bridging the research–policy divide: pathways to engagement and skill development, American Journal of Orthopsychiatry, 89(4): 434–41. doi: 10.1037/ort0000389

• Sigma Xi (2023) Policy Pathways, Civico, https://gocivico.com/policy-pathways/.

• Singel, K. (2021) The list of #SciPol fellowships, https://docs.google.com/document/d/1-S407AQIu0cZI0HASAer1mBmZd57KXFMhKk79JnzBYk/edit.

• Singh, G.G., Tam, J., Sisk, T.D., Klain, S.C., Mach, M.E., Martone, R.G., et al (2014) A more social science: barriers and incentives for scientists engaging in policy, Frontiers in Ecology and the Environment, 12(3): 161–6. doi: 10.1890/130011

• Sivertsen, G. (2017) Unique, but still best practice? The Research Excellence Framework (REF) from an international perspective, Palgrave Communications, 3(1): 1–6. doi: 10.1057/palcomms.2017.78

• Smith, T.L. and McCormick, J.B. (2021) Science policy degree programs, in Beyond Sputnik: U.S. Science Policy in the 21st Century, http://sciencepolicy.us/science-policy-degree-programs.html.

• Steel, B., Lach, D., List, P., et al (2000) The role of scientists in the natural resource and environmental policy process: a comparison of Canadian and American publics, Journal of Environmental Systems, 28(2): 133–55.

• Steel, B., Lach, D., List, P. and Shindler, B. (2001) The role of scientists in the natural resource and environmental policy process: a comparison of Canadian and American publics, Journal of Environmental Systems, 28(2): 133–55.

• Steel, B., List, P., Lach, D. and Shindler, B. (2004) The role of scientists in the environmental policy process: a case study from the American west, Environmental Science & Policy, 7(1): 1–13. doi: 10.1016/j.envsci.2003.10.004

• Stine, J.K. (1995) Twenty Years of Science in the Public Interest: A History of the Congressional Science and Engineering Fellowship Program, Washington, DC: American Association for the Advancement of Science.

• Teich, A.H. and Lita, S.J. (2003) Expanding the role of the congressional science and engineering fellowship program, in M.G. Morgan and J.M. Peha (eds) Science and Technology Advice for Congress, Washington, DC: Resources for the Future, pp 134–56.

• Terämä, E., Smallman, M., Lock, S.J., Johnson, C. and Austwick, M.Z. (2016) Beyond academia – interrogating research impact in the research excellence framework, PloS One, 11(12). doi: 10.1371/journal.pone.0168533

• Torrance, H. (2020) The research excellence framework in the United Kingdom: processes, consequences, and incentives to engage, Qualitative Inquiry, 26(7): 771–9. doi: 10.1177/1077800419878748

• UCS (2024) Science Advocacy Training Series, Union of Concerned Scientists (UCS), https://www.ucsusa.org/resources/science-advocacy-training-series.

• Weiss, C.H. (1979) Many meanings of research utilization, Public Administration Review, 39(5): 426–31. doi: 10.2307/3109916