Published in: Health Research Policy and Systems 1/2017

Open Access 01-12-2017 | Research

Policymakers’ experience of a capacity-building intervention designed to increase their use of research: a realist process evaluation

Authors: Abby Haynes, Sue Brennan, Sally Redman, Anna Williamson, Steve R. Makkar, Gisselle Gallego, Phyllis Butow

Abstract

Background

An intervention’s success depends on how participants interact with it in local settings. Process evaluation examines these interactions, indicating why an intervention was or was not effective and how it (and similar interventions) can be improved for better contextual fit. This is particularly important for innovative trials like Supporting Policy In health with Research: an Intervention Trial (SPIRIT), where causal mechanisms are poorly understood. SPIRIT tested a multi-component intervention designed to increase the capacity of health policymakers to use research.

Methods

Our mixed-methods process evaluation sought to explain variation in observed process effects across the six agencies that participated in SPIRIT. Data collection included observations of intervention workshops (n = 59), purposively sampled interviews (n = 76) and participant feedback forms (n = 553). Using a realist approach, two authors coded the data for context–mechanism–process effect configurations (retroductive analysis).

Results

Intervention workshops were very well received. Views varied more on other aspects of SPIRIT, such as data collection, communication and the intervention’s overall value. We identified nine inter-related mechanisms that were crucial for engaging participants in these policy settings: (1) Accepting the premise (agreeing with the study’s assumptions); (2) Self-determination (participative choice); (3) The Value Proposition (seeing potential gain); (4) ‘Getting good stuff’ (identifying useful ideas, resources or connections); (5) Self-efficacy (believing ‘we can do this!’); (6) Respect (feeling that SPIRIT understands and values one’s work); (7) Confidence (believing in the study’s integrity and validity); (8) Persuasive leadership (authentic and compelling advocacy from leaders); and (9) Strategic insider facilitation (local translation and mediation). These findings were used to develop tentative explanatory propositions and to revise the programme theory.

Conclusion

This paper describes how SPIRIT functioned in six policy agencies, including why strategies that worked well in one site were less effective in others. Findings indicate a complex interaction between participants’ perception of the intervention, shifting contextual factors, and the form that the intervention took in each site. Our propositions provide transferable lessons about contextualised areas of strength and weakness that may be useful in the development and implementation of similar studies.
Metadata
Publisher: BioMed Central
Electronic ISSN: 1478-4505
DOI: https://doi.org/10.1186/s12961-017-0234-4
