Published in: The Journal of Behavioral Health Services & Research 4/2012

01-10-2012

A Procedure for Assessing Intervention Fidelity in Experiments Testing Educational and Behavioral Interventions

Authors: Michael C. Nelson, BS, David S. Cordray, PhD, Chris S. Hulleman, PhD, Catherine L. Darrow, PhD, Evan C. Sommer, BS, BA


Abstract

An intervention's effectiveness is judged by whether it produces positive outcomes for participants, with the randomized experiment being the gold standard for determining intervention effects. However, the intervention-as-implemented in an experiment frequently differs from the intervention-as-designed, making it unclear whether unfavorable results are due to an ineffective intervention model or the failure to implement the model fully. It is therefore vital to accurately and systematically assess intervention fidelity and, where possible, incorporate fidelity data in the analysis of outcomes. This paper elaborates a five-step procedure for systematically assessing intervention fidelity in the context of randomized controlled trials (RCTs), describes the advantages of assessing fidelity with this approach, and uses examples to illustrate how this procedure can be applied.
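The abstract's point about incorporating fidelity data into outcome analysis can be made concrete with a small sketch. The following is an illustrative example, not the authors' five-step procedure: it scores each observed session as the proportion of intervention components delivered, then takes the treatment-control difference in mean fidelity (in the spirit of an "achieved relative strength" index, where nonzero control fidelity reflects business-as-usual practices that overlap with the intervention model). All data and function names are hypothetical.

```python
# Illustrative sketch: scoring fidelity from binary component checklists
# and computing the treatment-control difference in achieved fidelity.
# Data and function names are hypothetical, not from the article.

def session_fidelity(checklist):
    """Proportion of intervention components observed in one session."""
    return sum(checklist) / len(checklist)

def mean_fidelity(sessions):
    """Average fidelity across a group's observed sessions."""
    scores = [session_fidelity(s) for s in sessions]
    return sum(scores) / len(scores)

def achieved_relative_strength(treatment_sessions, control_sessions):
    """Difference between treatment and control mean fidelity.

    A nonzero control value reflects business-as-usual practices
    that overlap with the intervention-as-designed.
    """
    return mean_fidelity(treatment_sessions) - mean_fidelity(control_sessions)

# Hypothetical observations: 1 = component delivered, 0 = absent.
treatment = [[1, 1, 1, 0], [1, 1, 0, 1], [1, 1, 1, 1]]
control = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 0]]

print(achieved_relative_strength(treatment, control))
```

A difference-based index like this makes explicit that an experiment estimates the effect of the fidelity contrast actually achieved between conditions, not of the intervention model in the abstract.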
Footnotes
1
It is also possible that an intervention’s developer would specify as part of its change model one or more moderators, constructs thought to influence the nature (strength) of the causal relationship between two or more constructs. However, we have omitted discussion of moderators because they are exogenous to the intervention as designed.
 
2
Note that most models also involve assumptions (e.g., student characteristics) that may not be included in the graphic representation but that should be elaborated narratively.
 
3
While this example illustrates the problem in principle, it is unlikely to have inflated fidelity in this particular study given that the proportion of non-core items was relatively small and significant results were obtained.
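The inflation problem this footnote describes can be shown in a few lines. The sketch below uses hypothetical item scores: averaging easily delivered non-core items together with the core components yields a higher overall fidelity score than the core components alone warrant.

```python
# Illustrative of the footnote's point: averaging non-core items into a
# fidelity index can inflate it when those items are easier to deliver.
# All item scores here are hypothetical.

def fidelity(scores):
    """Mean of per-item fidelity scores."""
    return sum(scores) / len(scores)

core = [0.4, 0.5, 0.6]       # components that carry the causal model
non_core = [0.9, 0.95, 1.0]  # peripheral, easily delivered items

overall = fidelity(core + non_core)
core_only = fidelity(core)
print(overall, core_only)
```

Here the pooled index exceeds the core-only index, which is why the footnote notes that inflation matters less when non-core items make up a small share of the instrument.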
 
Metadata
Publisher
Springer US
Print ISSN: 1094-3412
Electronic ISSN: 2168-6793
DOI
https://doi.org/10.1007/s11414-012-9295-x
