There is growing interest in and recognition of the need to use scientific evidence to inform policymaking. However, existing studies on the use of research evidence (URE) have been largely qualitative, and most existing quantitative measures are underdeveloped or have been tested only in regional or context-dependent settings. We are unaware of any quantitative measure of URE validated with national policymakers in the US.
This study explores how to measure URE quantitatively by validating a measure of congressional staff's attitudes and behaviors regarding URE, the Legislative Use of Research Survey (LURS), and by discussing the lessons learned from administering the survey.
A 68-item survey was administered to 80 congressional staff to measure their reported research use, value of research, interactions with researchers, general information sources, and research information sources. Confirmatory factor analyses were conducted on each of these five scales. We then trimmed items, based on a combination of poor factor loadings and theoretical rationale, and re-ran the analyses on the trimmed subscales.
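To illustrate this analytic step, the sketch below shows how a single subscale's confirmatory factor analysis might be specified in R with the lavaan package. The data frame (staff_survey), item names (ru1 to ru6), and factor label are hypothetical; the abstract does not report the actual LURS items or model specifications.

```r
# Minimal sketch of one subscale's CFA (hypothetical item names and data frame;
# the actual LURS items and model specifications are not given here).
library(lavaan)

# One-factor model for a hypothetical "reported research use" subscale
model <- '
  research_use =~ ru1 + ru2 + ru3 + ru4 + ru5 + ru6
'

fit <- cfa(model, data = staff_survey)

# Global fit indices, typically judged against conventional cutoffs
# (for example, CFI/TLI >= .95, RMSEA <= .06, SRMR <= .08)
fitMeasures(fit, c("cfi", "tli", "rmsea", "srmr"))

# Standardised loadings; items with weak loadings are candidates for trimming
standardizedSolution(fit)
```

In practice, each of the five scales would be fitted in this way, poorly loading items dropped on statistical and theoretical grounds, and the trimmed model re-estimated so its fit could be compared with the original.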
With the trimmed 35-item survey, model fit improved substantially for each scale relative to the original models, and all retained items had acceptable factor loadings. We also describe the unique set of challenges and lessons learned from surveying congressional staff.
This work contributes to the transdisciplinary field of URE by offering a tool for studying the mechanisms that can bridge research and policy, and by shedding light on best practices for measuring URE with national policymakers in the US.