
Open Access 01-10-2015

Standards of Evidence for Efficacy, Effectiveness, and Scale-up Research in Prevention Science: Next Generation

Authors: Denise C. Gottfredson, Thomas D. Cook, Frances E. M. Gardner, Deborah Gorman-Smith, George W. Howe, Irwin N. Sandler, Kathryn M. Zafft

Published in: Prevention Science | Issue 7/2015


Abstract

A decade ago, the Society for Prevention Research (SPR) endorsed a set of standards for evidence related to research on prevention interventions. These standards (Flay et al., Prevention Science 6:151–175, 2005) were intended in part to increase consistency in reviews of prevention research, which often generated disparate lists of effective interventions because different standards were applied for what was considered necessary to demonstrate effectiveness. In 2013, SPR’s Board of Directors decided that the field had progressed sufficiently to warrant a review and, if necessary, publication of “the next generation” of standards of evidence. The Board convened a committee to review and update the standards. This article reports on the results of this committee’s deliberations, summarizing changes made to the earlier standards and explaining the rationale for each change. The SPR Board of Directors endorses “The Standards of Evidence for Efficacy, Effectiveness, and Scale-up Research in Prevention Science: Next Generation.”
Footnotes
1. We are aware of the challenges related to identifying core components of an EBI and the fact that the rigorous research necessary to adequately test differential effects of different components of an EBI is rare in Prevention Science (Elliott and Mihalic 2004). We suggest that the identification of core components is provisional and based on the developers’ theory of the intervention, but at the same time encourage an increase in empirical testing of these components. See Efficacy Standard 4.
 
2. See Effectiveness Standard 6.a for a discussion of the meaning of “public health impact.”
 
3. This is not true of the ABA or ABAB design, a similar design often used to study smaller-scale interventions, primarily in behavioral analysis. That design resembles the time series design used for larger units, except that the timing of the intervention is controlled by the researcher and is therefore not confounded with other events.
 
4. They calculate the rate of erroneous published positive findings in the field of psychology as follows: Assume the null is true 90% of the time. Using alpha = .05 and power = .8, Type I errors will occur in 4.5% of studies (90% × .05) and correct rejections of the null will occur in 8% of studies (10% × .8). Therefore, the proportion of published positive findings that are erroneous is 36% (4.5% / (4.5% + 8%)). The calculation is highly dependent upon the assumption about the percentage of tests conducted for which the null hypothesis is true. The actual rate of correct nulls in Prevention Science studies is not known.
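For readers who want to check the arithmetic, the short Python sketch below reproduces the calculation. It is not part of the original article; the function name is hypothetical and the default parameter values are simply the footnote's stated assumptions.

# Illustrative sketch (assumed helper, not from the article): proportion of
# statistically significant findings expected to be Type I errors, given an
# assumed rate of true nulls, an alpha level, and statistical power.
def erroneous_positive_rate(p_null_true=0.90, alpha=0.05, power=0.80):
    false_positives = p_null_true * alpha         # 0.90 * 0.05 = 0.045
    true_positives = (1 - p_null_true) * power    # 0.10 * 0.80 = 0.080
    return false_positives / (false_positives + true_positives)

print(round(erroneous_positive_rate(), 2))        # 0.36, matching the footnote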
 
5. Establishing generalizability across time (e.g., do observed effects of Head Start programs tested in the 1980s generalize to Head Start programs operating decades later?) and settings (e.g., do effects of Life Skills Training observed in public schools generalize to alternative schools?) may also be important. If the statement of “for whom” and “under what conditions” the intervention is expected to be effective (see Efficacy Standard 2.c.) includes specific times and settings, generalizability across these dimensions should also be tested.
 
6. Among the interventions for which at least one study conducted by an independent evaluator has failed to replicate findings reported by the developer are: Project ALERT (Ringwalt et al. 2010); Multi-Systemic Therapy (Centre for Children and Families in the Justice System 2006; Löfholm et al. 2009); PATHS (Social and Character Development Research Consortium 2010); Quantum Opportunities (Schirm et al. 2006); Reconnecting Youth (Hallfors et al. 2006); Strengthening Families Program (Gottfredson et al. 2006; Gutman et al. 2004); Towards No Drug Abuse (Rohrbach et al. 2010); and Triple P (Malti et al. 2011). Note that for some of these interventions, independent evaluations have also found positive effects.
 
7. These practices include, “(a) leveraging chance by running many low-powered studies, rather than a few high-powered ones; (b) uncritically dismissing “failed” studies as pilot tests or because of methodological flaws but uncritically accepting “successful” studies as methodologically sound; (c) selectively reporting studies with positive results and not studies with negative results or selectively reporting “clean” results; (d) stopping data collection as soon as a reliable effect is obtained; (e) continuing data collection until a reliable effect is obtained; (f) including multiple independent or dependent variables and reporting the subset that “worked;” (g) maintaining flexibility in design and analytic models, including the attempt of a variety of data exclusion or transformation methods, and reporting a subset; (h) reporting a discovery as if it had been the result of a confirmatory test, and; (i) once a reliable effect is obtained, not doing a direct replication” (Nosek et al. 2012, p. 618).
 
Literature
Aarons, G. A., Horowitz, J. D., Dlugosz, L. R., & Ehrhart, M. G. (2012). The role of organizational processes in dissemination and implementation research. In R. C. Brownson, G. A. Colditz, & E. K. Proctor (Eds.), Dissemination and implementation research in health: Translating science to practice (pp. 128–153). New York: Oxford University Press.
Allen, J. D., Linnan, L. A., & Emmons, K. M. (2012). Fidelity and its relationship to implementation effectiveness, adaptation, and dissemination. In R. C. Brownson, G. A. Colditz, & E. K. Proctor (Eds.), Dissemination and implementation research in health: Translating science to practice (pp. 281–304). New York: Oxford University Press.
Bloom, H. S., Michalopoulos, C., & Hill, C. J. (2005). Using experiments to assess nonexperimental comparison-group methods for measuring program effects. In H. S. Bloom (Ed.), Learning more from social experiments (pp. 173–235). New York: Russell Sage Foundation.
Bloomquist, M. L., August, G. J., Lee, S. S., Lee, C. S., Realmuto, G. M., & Klimes-Dougan, B. (2013). Going-to-scale with the Early Risers conduct problems prevention program: Use of a comprehensive implementation support (CIS) system to optimize fidelity, participation and child outcomes. Evaluation and Program Planning, 38, 19–27.
Boruch, R. F. (Ed.). (2005). Place randomized trials: Special issue. Annals of the American Academy of Political and Social Science, 599, whole issue.
Brown, C. H. (1993). Statistical methods for prevention trials in mental health. Statistics in Medicine, 12, 289–300.
Brown, C. H., Wang, W., Kellam, S. G., Muthén, B. O., Petras, H., Toyinbo, P., & The Prevention Science and Methodology Group. (2008). Methods for testing theory and evaluating impact in randomized field trials: Intent-to-treat analyses for integrating the perspectives of person, place, and time. Drug and Alcohol Dependence, 95, S74–S104.
Brownson, R. C., Colditz, G. A., & Proctor, E. K. (Eds.). (2012). Dissemination and implementation research in health: Translating science to practice. New York: Oxford University Press.
Bryk, A. S., & Raudenbush, S. W. (1992). Hierarchical linear models: Applications and data analysis methods. Newbury Park: Sage.
Campbell, D. T. (1968). The experimenting society. In W. N. Dunn (Ed.), The experimenting society: Essays in honor of Donald T. Campbell (pp. 35–68). New Brunswick: Transaction Publishers.
Campbell, M. K., Piaggio, G., Elbourne, D. R., Altman, D. G., & for the CONSORT Group. (2012). CONSORT 2010 statement: Extension to cluster randomised trials. BMJ, 345, 1–21.
Chamberlain, P., Brown, C. H., Saldana, L., Reid, J., Wang, W., Marsenich, L., & Bouwman, G. (2008). Engaging and recruiting counties in an experiment on implementing evidence-based practice in California. Administration & Policy in Mental Health, 35, 250–260.
Chen, H. T. (1990). Theory-driven evaluations. Newbury Park: Sage.
Cook, T. D., & Campbell, D. T. (1979). Quasi-experimentation: Design & analysis issues for field settings. Boston: Houghton-Mifflin.
Cook, T. D., & Payne, M. R. (2002). Objecting to the objections to using random assignment in educational studies. In F. Mosteller & R. Boruch (Eds.), Evidence matters: Randomized trials in education research (pp. 150–178). Washington: Brookings Institution Press.
Cook, T. D., Shadish, W. R., & Wong, V. C. (2008). Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons. Journal of Policy Analysis and Management, 27, 724–750.
Cook, T. D., Steiner, P. M., & Pohl, S. (2009). How bias reduction is affected by covariate choice, unreliability, and mode of data analysis: Results from two types of within-study comparisons. Multivariate Behavioral Research, 44, 828–847.
Curran, G. M., Bauer, M., Mitman, B., Pyne, J. M., & Stetler, C. (2013). Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Medical Care, 50, 217–226.
Diaz, J. J., & Handa, S. (2006). An assessment of propensity score matching as a nonexperimental impact estimator: Evidence from Mexico’s PROGRESA program. Journal of Human Resources, 41, 319–345.
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327–350.
Elliott, D. S., & Mihalic, S. (2004). Issues in disseminating and replicating effective prevention programs. Prevention Science, 5, 47–53.
Enders, C. (2011). Missing not at random models for latent growth curve analyses. Psychological Methods, 16, 1–16.
Ennett, S. T., Ringwalt, C. L., Thorne, J., Rohrbach, L. A., Vincus, A., Simons-Randolf, A., & Jones, S. (2003). A comparison of current practice in school-based substance use prevention programs with meta-analysis findings. Prevention Science, 4, 1–14.
Fagan, A. A., Hanson, K., Hawkins, J. D., & Arthur, M. W. (2009). Translation research in action: Implementation of the Communities That Care prevention system in 12 communities. Journal of Community Psychology, 37, 809–829.
Fisher, C. B., Hoagwood, K., Boyce, C., Duster, T., Frank, D. A., Grisso, T., Levine, R. J., Macklin, R., Spencer, M. B., Takanishi, R., Trimble, J. E., & Zayas, L. H. (2002). Research ethics for mental health science involving ethnic minority children and youths. American Psychologist, 57, 1024–1040.
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network.
Flay, B. R. (1986). Efficacy and effectiveness trials (and other phases of research) in the development of health promotion programs. Preventive Medicine, 15, 451–474.
Flay, B. R., Biglan, A., Boruch, R. F., Gonzalez Castro, F., Gottfredson, D., Kellam, S., Moscicki, E. K., Schinke, S., Valentine, J. C., & Ji, P. (2005). Standards of evidence: Criteria for efficacy, effectiveness and dissemination. Prevention Science, 6, 151–175.
Forgatch, M. S., & DeGarmo, D. S. (2011). Sustaining fidelity following the nationwide PMTO implementation in Norway. Prevention Science, 12, 235–246.
Foster, M., Porter, M., Ayers, T., Kaplan, D., & Sandler, I. (2007). Estimating costs of preventive interventions. Evaluation Review, 31, 261–286.
Gardner, F., Burton, J., & Klimes, I. (2006). Randomised controlled trial of a parenting intervention in the voluntary sector for reducing child conduct problems: Outcomes and mechanisms of change. Journal of Child Psychology and Psychiatry, 47, 1123–1132.
Gardner, F., Mayo-Wilson, E., Montgomery, P., Hopewell, S., Macdonald, G., Moher, D., & Grant, S. (2013). Editorial perspective: The need for new guidelines to improve the reporting of trials in child and adolescent mental health. Journal of Child Psychology and Psychiatry, 54, 810–812.
Gerber, A. S., Green, D. P., & Carnegie, A. J. (2013). Evaluating public health law using randomized experiments. In A. C. Wagenaar & S. C. Burris (Eds.), Public health law research: Theory and methods (pp. 283–305). Somerset: Wiley.
Glasgow, R. E., & Steiner, J. F. (2012). Comparative effectiveness research to accelerate translation: Recommendations for an emerging field of science. In R. C. Brownson, G. A. Colditz, & E. K. Proctor (Eds.), Dissemination and implementation research in health: Translating science to practice (pp. 72–93). New York: Oxford University Press.
Glazerman, S., Levy, D. M., & Myers, D. (2003). Nonexperimental versus experimental estimates of earnings impacts. The Annals of the American Academy of Political and Social Science, 589, 63–93.
Glisson, C., Schoenwald, S. K., Hemmelgarn, A., Green, P., Dukes, D., Armstrong, K. S., & Chapman, J. E. (2010). Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. Journal of Consulting and Clinical Psychology, 78, 537–550.
Gottfredson, D. C., & Gottfredson, G. D. (2002). Quality of school-based prevention programs: Results from a national survey. Journal of Research in Crime and Delinquency, 39, 3–35.
Gottfredson, D. C., Kumpfer, K., Polizzi-Fox, D., Wilson, D., Puryear, V., Beatty, P., & Vilmenay, M. (2006). The Strengthening Washington D.C. Families Project: A randomized effectiveness trial of family-based prevention. Prevention Science, 7, 57–76.
Greene, W. H. (1993). Econometric analysis. New York: MacMillan.
Griffin, K. W., Botvin, G. J., & Nichols, T. R. (2004). Long-term follow-up effects of a school-based prevention program on adolescent risky driving. Prevention Science, 5, 207–212.
Gutman, M. A., Foltz, C., Mittal, R., & Kaltenbach, K. (2004). Outcomes of a family-based prevention model with women in substance abuse treatment and their children: The Philadelphia Strengthening Families Project. Unpublished manuscript. Philadelphia: Treatment Research Institute.
Hallfors, D., & Godette, D. (2002). Will the ‘principles of effectiveness’ improve prevention practice? Early findings from a diffusion study. Health Education Research, 17, 461–470.
Hallfors, D., Cho, H., Sanchez, V., Khatapoush, S., Kim, H., & Bauer, D. (2006). Efficacy vs effectiveness trial results of an indicated “model” substance abuse program: Implications for public health. American Journal of Public Health, 96, 2254–2259.
Hedeker, D., Gibbons, R. D., & Flay, B. R. (1994). Random-effects regression models for clustered data: With an example from smoking prevention research. Journal of Consulting and Clinical Psychology, 62, 757–765.
Hunter, J. E. (2001). The desperate need for replications. Journal of Consumer Research, 28, 149–158.
Hutchings, J., Bywater, T., Daley, D., Gardner, F., Jones, K., Eames, C., & Edwards, R. T. (2007). Pragmatic randomised controlled trial of a parenting intervention in ‘Sure Start’ services for children at risk of developing conduct disorder. British Medical Journal, 334, 678–686.
Imai, K. (2009). Statistical analysis of randomized experiments with non-ignorable missing binary outcomes: An application to a voting experiment. Journal of the Royal Statistical Society: Series C (Applied Statistics), 58, 83–104.
Imai, K., Tingley, D., & Yamamoto, T. (2012). Experimental designs for identifying causal mechanisms. Journal of the Royal Statistical Society A, 1–27.
Ioannidis, J. P. A. (2012). Why science is not necessarily self-correcting. Perspectives on Psychological Science, 7, 645–654.
Kenny, D. A., & Judd, C. M. (1986). Consequences of violating the independence assumption in analysis of variance. Psychological Bulletin, 99, 422–431.
Löfholm, C. A., Olsson, T., Sundell, K., & Hansson, K. (2009). Multisystemic therapy with conduct disordered young people: Stability of treatment outcomes two years after intake. Evidence & Policy, 5, 373–397.
MacKinnon, D. T. (2008). Introduction to statistical mediation analysis. New York: Taylor & Francis.
Makel, M., Plucker, J., & Hegarty, B. (2012). Replications in psychology research: How often do they really occur? Perspectives on Psychological Science, 7, 537–542.
Malti, T., Ribeaud, D., & Eisner, M. (2011). The effectiveness of two universal preventive interventions in reducing children’s externalizing behavior: A cluster randomized controlled trial. Journal of Clinical Child & Adolescent Psychology, 40, 677–692.
Menting, A. T., de Castro, B. O., & Matthys, W. (2013). Effectiveness of the Incredible Years parent training to modify disruptive and prosocial child behavior: A meta-analytic review. Clinical Psychology Review, 33, 901–913.
Montgomery, P., Underhill, K., Gardner, F., Operario, D., & Mayo-Wilson, E. (2013b). The Oxford implementation index: A new tool for incorporating implementation data into systematic reviews and meta-analyses. Journal of Clinical Epidemiology, 66, 874–882.
Mrazek, P. G., & Haggerty, R. J. (Eds.). (1994). Reducing risks for mental disorders: Frontiers for preventive intervention research. Washington: National Academy Press.
Murray, D. M. (1998). Design and analysis of group-randomized trials. New York: Oxford University Press.
Muthen, B., Asparouhov, T., Hunter, A. M., & Leichter, A. F. (2011). Growth modeling with nonignorable dropout: Alternative analyses of the STAR*D Antidepressant Trial. Psychological Methods, 16, 17–33.
Nerlove, M., & Diebold, F. (1990). Unit roots in economic time-series: A selective survey. In T. Bewley (Ed.), Advances in econometrics (Vol. 8). New York: JAI.
Olds, D. L., Robinson, J., Pettitt, L., Luckey, D. W., Holmberg, J., Ng, R. K., Isacks, K., Sheff, K., & Henderson, C. R. (2004). Effects of home visits by paraprofessionals and by nurses: Age 4 follow-up results of a randomized trial. Pediatrics, 114, 1560–1568.
Pashler, H., & Harris, C. R. (2012). Is the replicability crisis overblown? Three arguments examined. Perspectives on Psychological Science, 7, 531–536.
Perrino, T., Howe, G., Sperling, A., Beardslee, W., Sandler, I., Shern, D., Pantin, H., Kaupert, S., Cano, N., Cruden, G., Bandiera, F., & Brown, C. H. (2013). Advancing science through collaborative data sharing and synthesis. Perspectives on Psychological Science, published online. doi: 10.1177/1745691613491579.
Petrosino, A., & Soydan, H. (2005). The impact of program developers as evaluators on criminal recidivism: Results from meta-analyses of experimental and quasi-experimental research. Journal of Experimental Criminology, 1, 435–450.
Puma, M., Bell, S., Cook, R., Heid, C., Broene, P., Jenkins, F., Mashburn, A., & Downer, J. (2012). Third grade follow-up to the Head Start Impact Study final report (OPRE Report # 2012-45). Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.
Rabin, B. A., & Brownson, R. C. (2012). Developing the terminology for dissemination and implementation research. In R. C. Brownson, G. A. Colditz, & E. K. Proctor (Eds.), Dissemination and implementation research in health: Translating science to practice (pp. 23–51). New York: Oxford University Press.
Ringwalt, C. L., Clark, H. K., Hanley, S., Shamblen, S. R., & Flewelling, R. L. (2010). The effects of Project ALERT one year past curriculum completion. Prevention Science, 11, 172–184.
Rohrbach, L. A., Grana, R., Sussman, S., & Valente, T. W. (2006). Type II translation: Transporting prevention interventions from research to real-world settings. Evaluation & the Health Professions, 29, 302–333. doi:10.1177/01632706290408.
Rohrbach, L. A., Sun, P., & Sussman, S. (2010). One-year follow-up evaluation of the Project Towards No Drug Abuse (TND) dissemination trial. Preventive Medicine, 51, 313–319.
Scariano, S. M., & Davenport, J. M. (1987). The effects of violations of the independence assumption in the one-way ANOVA. The American Statistician, 41, 123–128.
Schafer, J. L., & Graham, J. W. (2002). Missing data: Our view of the state of the art. Psychological Methods, 7, 147–177.
Schirm, A., Stuart, E., & McKie, A. (2006). The Quantum Opportunity Program demonstration: Final impacts. Princeton: Mathematica Policy Research, Inc.
Schochet, P. Z. (2007). Guidelines for multiple testing in experimental evaluations of educational interventions. Princeton: Mathematica Policy Research, Inc.
Schroeder, B. A., Messina, A., Schroeder, D., Good, K., Barto, S., Saylor, J., & Masiello, M. (2011). The implementation of a statewide bullying prevention program: Preliminary findings from the field and the importance of coalitions. Health Promotion Practice, advance online publication. doi: 10.1177/1524839910386887.
Schulz, K. F., Altman, D. G., Moher, D., & for the CONSORT Group. (2010). CONSORT 2010 statement: Updated guidelines for reporting parallel group randomised trials. The British Medical Journal, 340, 698–702.
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin Company.
Shadish, W. R., Clark, M. H., & Steiner, P. M. (2008). Can nonrandomized experiments yield accurate answers? A randomized experiment comparing random and nonrandom assignments. Journal of the American Statistical Association, 103, 1334–1356.
Shumaker, S. A., Legault, C., Rapp, S. R., Thal, L., Wallace, R. B., Ockene, J. K., Hendrix, S. L., Jones, B. N., Assaf, A. R., Jackson, R. D., Kotchen, J. M., Wassertheil-Smoller, S., & Wactawski-Wende, J. (2003). Estrogen plus progestin and the incidence of dementia and mild cognitive impairment in post-menopausal women: The Women’s Health Initiative Memory Study: A randomized controlled trial. Journal of the American Medical Association, 289, 2651–2662.
Social and Character Development Research Consortium. (2010). Efficacy of schoolwide programs to promote social and character development and reduce problem behavior in elementary school children (NCER 2011-2001). Washington: National Center for Education Research, Institute of Education Sciences, U.S. Department of Education.
Spoth, R. L., & Greenberg, M. T. (2011). Impact challenges in community science-with-practice: Lessons from PROSPER on transformative practitioner–scientist partnerships and prevention infrastructure development. American Journal of Community Psychology, 40, 1178–1191.
Spoth, R. L., Guyll, M., Redmond, C., Greenberg, M. T., & Feinberg, M. E. (2011). Six-year sustainability of evidence-based intervention implementation quality by community–university partnerships: The PROSPER study. American Journal of Community Psychology, 48, 412–425.
Spoth, R., Rohrbach, L. A., Greenberg, M., Leaf, P., Brown, C. H., Fagan, A., Catalano, R. F., Pentz, M. A., Sloboda, Z., Hawkins, J. D., & Society for Prevention Research Type 2 Translational Task Force Members and Contributing Authors. (2013). Addressing core challenges for the next generation of type 2 translation research and systems: The translation science to population impact (TSci Impact) framework. Prevention Science, published online (Open Access). doi: 10.1007/s11121-012-0362-6
St. Clair, T., Cook, T. D., & Hallberg, K. (2014). Examining the internal validity and statistical precision of the comparative interrupted time series design by comparison with a randomized experiment. American Journal of Evaluation, 35, 311–327.
Supplee, L. H., Kelly, B. C., MacKinnon, D. M., & Yoches Barofsky, M. (2013). Introduction to the special issue: Subgroup analysis in prevention and intervention research. Prevention Science, 14, 107–110.
Trochim, W. M. K. (1984). Research design for program evaluation: The regression-discontinuity approach. Newbury Park: Sage.
Trochim, W. (2000). The research methods knowledge base (2nd ed.). Cincinnati: Atomic Dog Publishing.
Valentine, J. C., Biglan, A., Boruch, R. F., González Castro, F., Collins, L. M., Flay, B. R., Kellam, S., Mościcki, E. K., & Schinke, S. P. (2011). Replication in prevention science. Prevention Science, 12, 103–117.
Wagenaar, A. C., & Komro, K. A. (2013). Natural experiments: Research design elements for optimal causal inference without randomization. In A. C. Wagenaar & S. C. Burris (Eds.), Public health law research: Theory and methods (pp. 307–324). Somerset: Wiley.
Wagenaar, A. C., & Webster, D. W. (1986). Preventing injuries to children through compulsory automobile safety seat use [erratum appears in Pediatrics, 79(6), 863]. Pediatrics, 78, 662–672.
Wing, C., & Cook, T. D. (2013). Strengthening the regression discontinuity design using additional design elements: A within-study comparison. Journal of Policy Analysis and Management, 32, 853–877.
Winokur Early, K., Hand, G., Blankenship, J., & Chapman, S. (2012). Redirection continues to save money and reduce recidivism. Tallahassee: Justice Research Center.
Wolchik, S. A., Sandler, I. N., Millsap, R. E., Plummer, B. A., Greene, S. M., Anderson, E. R., et al. (2002). Six-year follow-up of a randomized, controlled trial of preventive interventions for children of divorce. Journal of the American Medical Association, 288, 1–8.
Zeger, S. L., Liang, K. Y., & Albert, P. S. (1988). Models for longitudinal data: A generalized estimating equation approach. Biometrics, 44, 1049–1060.
Metadata
Title
Standards of Evidence for Efficacy, Effectiveness, and Scale-up Research in Prevention Science: Next Generation
Authors
Denise C. Gottfredson
Thomas D. Cook
Frances E. M. Gardner
Deborah Gorman-Smith
George W. Howe
Irwin N. Sandler
Kathryn M. Zafft
Publication date
01-10-2015
Publisher
Springer US
Published in
Prevention Science / Issue 7/2015
Print ISSN: 1389-4986
Electronic ISSN: 1573-6695
DOI
https://doi.org/10.1007/s11121-015-0555-x
