
Open Access 01-12-2015 | Methodology

Figuring out fidelity: a worked example of the methods used to identify, critique and revise the essential elements of a contextualised intervention in health policy agencies

Authors: Abby Haynes, Sue Brennan, Sally Redman, Anna Williamson, Gisselle Gallego, Phyllis Butow, The CIPHER team

Published in: Implementation Science | Issue 1/2015


Abstract

Background

In this paper, we identify and respond to the fidelity assessment challenges posed by novel contextualised interventions (i.e. interventions that are informed by composite social and psychological theories and which incorporate standardised and flexible components in order to maximise effectiveness in complex settings).
We (a) describe the difficulties of, and propose a method for, identifying the essential elements of a contextualised intervention; (b) provide a worked example of an approach for critiquing the validity of putative essential elements; and (c) demonstrate how essential elements can be refined during a trial without compromising the fidelity assessment.
Methods

We used an exploratory test-and-refine process, drawing on empirical evidence from the process evaluation of Supporting Policy In health with Research: an Intervention Trial (SPIRIT). Mixed methods data were triangulated to identify, critique and revise how the intervention’s essential elements should be articulated and scored.

Results

Over 50 provisional elements were refined to a final list of 20, and the scoring was rationalised. Six (often overlapping) challenges to the validity of the essential elements were identified: (1) redundant—the element was not essential; (2) poorly articulated—unclear, too specific or not specific enough; (3) infeasible—it was not possible to implement the essential element as intended; (4) ineffective—the element did not effectively deliver the change principles; (5) paradoxical—the element counteracted vital goals or change principles; or (6) absent or suboptimal—additional or more effective ways of operationalising the theory were identified. We also identified potentially valuable ‘prohibited’ elements that could be used to help reduce threats to validity.
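The six challenge categories amount to a small review taxonomy. As a purely illustrative aid (not part of the paper), the sketch below shows one way such an audit record could be encoded in Python; the example element description, the scoring scale and the `needs_revision` rule are hypothetical assumptions, not the authors' instrument.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class ValidityChallenge(Enum):
    """The six (often overlapping) challenges to element validity reported in the Results."""
    REDUNDANT = auto()             # the element was not essential
    POORLY_ARTICULATED = auto()    # unclear, too specific or not specific enough
    INFEASIBLE = auto()            # could not be implemented as intended
    INEFFECTIVE = auto()           # did not deliver the change principles
    PARADOXICAL = auto()           # counteracted vital goals or change principles
    ABSENT_OR_SUBOPTIMAL = auto()  # better ways of operationalising the theory existed


@dataclass
class EssentialElement:
    """One putative essential element under review (hypothetical structure)."""
    description: str
    score: int | None = None                       # fidelity score; the scale is an assumption
    challenges: set[ValidityChallenge] = field(default_factory=set)

    @property
    def needs_revision(self) -> bool:
        # Any flagged challenge marks the element as a candidate for rewording,
        # rescoring, merging or removal in the next review round (illustrative rule).
        return bool(self.challenges)


# Example audit of a provisional element (hypothetical content)
element = EssentialElement(description="Facilitator tailors examples to agency context")
element.challenges.add(ValidityChallenge.POORLY_ARTICULATED)
print(element.needs_revision)  # True -> revise articulation before the next fidelity round
```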

Conclusions

We devised a method for critiquing the construct validity of our intervention’s essential elements and modifying how they were articulated and measured, while simultaneously using them as fidelity indicators. This process could be used or adapted for other contextualised interventions, taking evaluators closer to making theoretically and contextually sensitive decisions upon which to base fidelity assessments.
Metadata
Publication date
01-12-2015
Publisher
BioMed Central
Published in
Implementation Science / Issue 1/2015
Electronic ISSN: 1748-5908
DOI
https://doi.org/10.1186/s13012-016-0378-6
