Published in: Journal of Prevention 5/2014

01-10-2014 | Original Paper

Differences in Observers’ and Teachers’ Fidelity Assessments

Authors: William B. Hansen, Melinda M. Pankratz, Dana C. Bishop


Abstract

As evidence-based programs become disseminated, understanding the degree to which they are implemented with fidelity is crucial. This study tested the validity of fidelity ratings made by observers against those made by teachers. We hypothesized that teachers’ reports about fidelity would show a positivity bias compared to observers’ reports, and that correspondence between teachers’ and observers’ ratings would generally be low. We examined how teachers’ and observers’ ratings related to mediating variables targeted for change by the intervention. Finally, we examined the role that years of teaching experience played in achieving fidelity. Eighteen teachers and four research assistants participated in this project as raters. Teachers made video recordings of their implementation of All Stars and completed fidelity assessment forms. Trained observers independently completed parallel forms for 215 sampled classroom sessions. Both teachers and observers rated adherence, quality of delivery, attendance, and participant engagement. Teachers made more positive fidelity ratings than did observers. With the exception of ratings for attendance, teachers and observers failed to agree on fidelity ratings. Observers’ ratings were significantly related to students’ pretest assessments of targeted program mediators, suggesting that it is easier to teach well when students are predisposed to program success. Teachers’ ratings were infrequently related to mediators, and when they were, the relationship was counterintuitive. Experienced teachers taught with greater fidelity than novice teachers. Although teachers’ self-ratings may be inflated and inaccurate, requiring teachers to complete fidelity assessment forms may sensitize them to issues of fidelity. Assessing fidelity through observers’ ratings of video recordings has significant merit. As a long-term investment in improving prevention outcomes, policy makers should consider requiring both teacher and observer fidelity assessments as essential components of evaluation.
Metadata
Title
Differences in Observers’ and Teachers’ Fidelity Assessments
Authors
William B. Hansen
Melinda M. Pankratz
Dana C. Bishop
Publication date
01-10-2014
Publisher
Springer US
Published in
Journal of Prevention / Issue 5/2014
Print ISSN: 2731-5533
Electronic ISSN: 2731-5541
DOI
https://doi.org/10.1007/s10935-014-0351-6
