
Open Access 01-02-2018

Quality Matters: Implementation Moderates Student Outcomes in the PATHS Curriculum

Authors: Neil Humphrey, Alexandra Barlow, Ann Lendrum

Published in: Prevention Science | Issue 2/2018


Abstract

Analyses of the relationship between levels of implementation and outcomes of school-based social and emotional learning (SEL) interventions are relatively infrequent and are typically narrowly focused. Thus, our objective was to assess the relationship between variability in a range of implementation dimensions and intervention outcomes in the Promoting Alternative Thinking Strategies (PATHS) curriculum. Implementation of PATHS was examined in 69 classrooms across 23 schools in the first year of a major randomized controlled trial. Implementation data were generated via classroom-level structured observations. In addition to factual data on dosage and reach, exploratory factor analysis of observer ratings revealed two distinct implementation dimensions, namely, “quality and participant responsiveness” and “procedural fidelity.” Student social-emotional skills, pro-social behavior, internalizing symptoms, and externalizing problems were captured through child self-report and teacher informant-report surveys (N = 1721). Hierarchical linear modeling of study data revealed that higher implementation quality and participant responsiveness was associated with significantly lower ratings of students’ externalizing problems at 12-month follow-up. Conversely, and contrary to expectations, higher dosage was associated with significantly lower pro-social behavior and social-emotional skills at 12-month follow-up. No significant associations were found between variability in either procedural fidelity or reach and any intervention outcomes. The implications of these findings are discussed, and study limitations are noted.
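To make the analytic approach summarized above more concrete, the sketch below shows a simplified version of the kind of multilevel (hierarchical linear) model the abstract describes: students nested in classrooms, with classroom-level implementation dimensions entered as predictors of a 12-month outcome. It is illustrative only; the file name, column names, and two-level structure are assumptions (the trial itself modeled students within classrooms within schools), and the example is not the authors' actual model specification.

```python
# Minimal sketch, assuming a hypothetical long-format dataset with one row
# per student and classroom-level implementation scores merged in.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("paths_followup.csv")  # hypothetical input file

# Two-level random-intercept model: predict 12-month teacher-rated
# externalizing problems from classroom-level implementation dimensions,
# controlling for the baseline score.
model = smf.mixedlm(
    "externalizing_t2 ~ quality_responsiveness + procedural_fidelity"
    " + dosage + reach + externalizing_t1",
    data=df,
    groups=df["classroom_id"],  # random intercept for classroom
)
result = model.fit()
print(result.summary())
```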
Footnotes
1
The rubric was designed to orient observers to behaviors indicative of the implementation indicator in question. For example, when rating students’ engagement in core activities, the explanatory notes guided observers to assess the extent to which children in the class actively participated in lesson activities (e.g., joining in role plays and answering questions).
 
2
In order to meet minimum sample size requirements (e.g., N = 10 per item), the EFA was conducted using data from all 127 observations carried out across the two years of the trial.
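A minimal sketch of the kind of two-factor EFA this footnote refers to is given below. The data file, item names, and choice of an oblique (oblimin) rotation are assumptions made for illustration; they are not taken from the paper.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical: 127 observation records, one column per rated indicator.
items = pd.read_csv("observation_ratings.csv")

efa = FactorAnalyzer(n_factors=2, rotation="oblimin")
efa.fit(items)

loadings = pd.DataFrame(
    efa.loadings_,
    index=items.columns,
    columns=["quality_responsiveness", "procedural_fidelity"],
)
print(loadings.round(2))  # inspect for cross-loading items (cf. footnote 4)
```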
 
3
Historically, the SDQ has been scored according to a five-factor structure (emotional symptoms, conduct problems, inattention/hyperactivity, peer problems, and pro-social behavior). However, research has indicated that a three-factor structure (internalizing symptoms, externalizing problems, pro-social behavior) offers improved data fit (Goodman, Lamping & Ploubidis, 2010).
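For illustration, the three-factor scoring referred to here can be computed from the five standard SDQ subscales as sketched below: following Goodman, Lamping, and Ploubidis (2010), internalizing symptoms sum the emotional-symptoms and peer-problems subscales, externalizing problems sum the conduct-problems and hyperactivity/inattention subscales, and pro-social behavior is retained as its own scale. Column names are hypothetical.

```python
import pandas as pd

# Hypothetical: one row per student, with the five SDQ subscale scores.
sdq = pd.read_csv("sdq_subscales.csv")

sdq["internalizing"] = sdq["emotional_symptoms"] + sdq["peer_problems"]
sdq["externalizing"] = sdq["conduct_problems"] + sdq["hyperactivity"]
# "prosocial" is carried forward unchanged as the third factor.
```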
 
4
We note that one item initially designated as belonging to the fidelity dimension also loaded onto this factor; however, there was evidence of ambiguity, as this item cross-loaded onto both factors (loadings of 0.57 and 0.48, respectively).
 
5
Schonfeld et al. (2015) did not provide descriptive statistics for their implementation data, so a direct comparison of overall dosage levels is not possible.
 
Literature
Berry, V., Axford, N., Blower, S., Taylor, R. S., Edwards, R. T., Tobin, K., … Bywater, T. (2016). The effectiveness and micro-costing analysis of a universal, school-based, social–emotional learning programme in the UK: A cluster-randomised controlled trial. School Mental Health, 238–256. doi:10.1007/s12310-015-9160-1
Bierman, K. L., Nix, R. L., Heinrichs, B. S., Domitrovich, C. E., Gest, S. D., Welsh, J. A., & Gill, S. (2014). Effects of Head Start REDI on children's outcomes 1 year later in different kindergarten contexts. Child Development, 85, 140–159. doi:10.1111/cdev.12117
Carpenter, J. R., Goldstein, H., & Kenward, M. G. (2011). REALCOM-IMPUTE software for multilevel multiple imputation with mixed response types. Journal of Statistical Software, 45, 1–12. doi:10.18637/jss.v045.i05
Conduct Problems Prevention Research Group. (1999). Initial impact of the Fast Track prevention trial for conduct problems: II. Classroom effects. Journal of Consulting and Clinical Psychology, 67, 648–657.
Crean, H. F., & Johnson, D. B. (2013). Promoting Alternative Thinking Strategies (PATHS) and elementary school-aged children's aggression: Results from a cluster randomized trial. American Journal of Community Psychology, 52, 56–72. doi:10.1007/s10464-013-9576-4
Department for Education. (2012). Schools, pupils and their characteristics. London: Department for Education.
DiStefano, C., Zhu, M., & Mindrila, D. (2009). Understanding and using factor scores: Considerations for the applied researcher. Practical Assessment, Research and Evaluation, 14, 1–11.
Domitrovich, C. E., Cortes, R. C., & Greenberg, M. T. (2007). Improving young children's social and emotional competence: A randomized trial of the preschool "PATHS" curriculum. The Journal of Primary Prevention, 28, 67–91. doi:10.1007/s10935-007-0081-0
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327–350. doi:10.1007/s10464-008-9165-0
Faria, A. M., Kendziora, K., Brown, L., O'Brien, B., & Osher, D. (2013). PATHS implementation and outcome study in the Cleveland Metropolitan School District: Final report. Washington: American Institutes for Research.
Goodman, A., Lamping, D. L., & Ploubidis, G. B. (2010). When to use broader internalising and externalising subscales instead of the hypothesised five subscales on the Strengths and Difficulties Questionnaire (SDQ): Data from British parents, teachers and children. Journal of Abnormal Child Psychology, 38, 1179–1191. doi:10.1007/s10802-010-9434-x
Green, H., McGinnity, A., Meltzer, H., Ford, T., & Goodman, R. (2005). Mental health of children and young people in Great Britain. Newport: Office for National Statistics.
Gresham, F. M., & Elliott, S. N. (2008). Social Skills Improvement System: Rating scales manual. Minneapolis, MN: Pearson.
Hallgren, K. A. (2012). Computing inter-rater reliability for observational data: An overview and tutorial. Tutorials in Quantitative Methods for Psychology, 8, 23–34.
Hansen, W. (2014). Measuring fidelity. In Z. Sloboda & H. Petras (Eds.), Defining prevention science (pp. 335–359). New York: Springer.
Hansen, W. B., Pankrantz, M. M., Dusenbury, L., Giles, S. M., Bishop, D., Albritton, J., et al. (2013). Styles of adaptation: The impact of frequency and valence of adaptation on preventing substance abuse. Health Education, 113, 345–363.
Heitjan, D. F., & Basu, S. (1996). Distinguishing "missing at random" and "missing completely at random". The American Statistician, 50, 207–213.
Humphrey, N., Kalambouka, A., Wigelsworth, M., Lendrum, A., Deighton, J., & Wolpert, M. (2011). Measures of social and emotional skills for children and young people: A systematic review. Educational and Psychological Measurement, 71, 617–637. doi:10.1177/0013164410382896
Humphrey, N., Barlow, A., Wigelsworth, M., Lendrum, A., Pert, K., Joyce, C., et al. (2015). Promoting Alternative Thinking Strategies (PATHS): Evaluation report. London: Education Endowment Foundation.
Humphrey, N., Lendrum, A., Ashworth, E., Frearson, K., Buck, R., & Kerr, K. (2016b). Implementation and process evaluation (IPE) for interventions in educational settings: A synthesis of the literature. London: Education Endowment Foundation.
Kam, C. M., Greenberg, M. T., & Walls, C. T. (2003). Examining the role of implementation quality in school-based prevention using the PATHS curriculum. Prevention Science, 4, 55–63. doi:10.1023/A:1021786811186
Lendrum, A., Humphrey, N., & Greenberg, M. T. (2016). Implementing for success in school-based mental health promotion: The role of quality in resolving the tension between fidelity and adaptation. In R. Shute & P. Slee (Eds.), Mental health and wellbeing through schools: The way forward (pp. 53–63). London: Taylor and Francis.
Liu, J. (2010). Minimum effective dose. In Encyclopedia of biopharmaceutical statistics (pp. 799–800). London: Informa.
Nelson, M. C., Cordray, D. S., Hulleman, C. S., Darrow, C. L., & Sommer, E. C. (2012). A procedure for assessing intervention fidelity in experiments testing educational and behavioral interventions. Journal of Behavioral Health Services and Research, 39, 374–396. doi:10.1007/s11414-012-9295-x
O'Donnell, C. L. (2008). Defining, conceptualising, and measuring fidelity of implementation and its relationship to outcomes in K-12 curriculum intervention research. Review of Educational Research, 78, 33–84.
Ogden, T., & Fixsen, D. L. (2014). Implementation science: A brief overview and a look ahead. Zeitschrift für Psychologie, 222, 4–11.
Pettigrew, J., Graham, J. W., Miller-Day, M., Hecht, M. L., Krieger, J. L., & Shin, Y. J. (2015). Adherence and delivery: Implementation quality and program outcomes for the seventh-grade keepin' it REAL program. Prevention Science, 16, 90–99. doi:10.1007/s11121-014-0459-1
Schonfeld, D. J., Adams, R. E., Fredstrom, B. K., Weissberg, R. P., Gilman, R., Voyce, C., et al. (2015). Cluster-randomized trial demonstrating impact on academic achievement of elementary social-emotional learning. School Psychology Quarterly, 30, 406–420. doi:10.1037/spq0000099
Sklad, M., Diekstra, R., De Ritter, M., Ben, J., & Gravesteijn, C. (2012). Effectiveness of school-based universal social, emotional, and behavioral programs: Do they enhance students' development in the area of skills, behavior and adjustment? Psychology in the Schools, 49, 892–909. doi:10.1002/pits
Snijders, T. A. B. (2005). Power and sample size in multilevel modeling. In B. S. Everitt & D. C. Howell (Eds.), Encyclopedia of statistics in behavioral science (pp. 1570–1573). Chichester: Wiley.
Snijders, T. A. B., & Bosker, R. J. (2012). Multilevel analysis. London: SAGE Publications.
Social and Character Development Research Consortium. (2010). Efficacy of school-wide programs to promote social and character development and reduce problem behavior in elementary school children. Washington: Institute of Education Sciences.
Weissberg, R. P., Durlak, J. A., Domitrovich, C. E., & Gullotta, T. P. (2015). Social and emotional learning: Past, present and future. In J. A. Durlak, C. E. Domitrovich, R. P. Weissberg, & T. P. Gullotta (Eds.), Handbook of social and emotional learning (pp. 3–19). New York: Guilford Press.
Wigelsworth, M., Lendrum, A., Oldfield, J., Scott, A., Ten-Bokkel, I., Tate, K., & Emery, C. (2016). The influence of trial stage, developer involvement and international transferability on the outcomes of universal social and emotional learning outcomes: A meta-analysis. Cambridge Journal of Education, 46, 347–376. doi:10.1080/0305764X.2016.1195791
Metadata
Publisher
Springer US
Print ISSN: 1389-4986
Electronic ISSN: 1573-6695
DOI
https://doi.org/10.1007/s11121-017-0802-4