
Open Access 01-12-2017 | Research

Psychometric assessment of three newly developed implementation outcome measures

Authors: Bryan J. Weiner, Cara C. Lewis, Cameo Stanick, Byron J. Powell, Caitlin N. Dorsey, Alecia S. Clary, Marcella H. Boynton, Heather Halko

Published in: Implementation Science | Issue 1/2017


Abstract

Background

Implementation outcome measures are essential for monitoring and evaluating the success of implementation efforts. Yet, currently available measures lack conceptual clarity and have largely unknown reliability and validity. This study developed and psychometrically assessed three new measures: the Acceptability of Intervention Measure (AIM), Intervention Appropriateness Measure (IAM), and Feasibility of Intervention Measure (FIM).

Methods

Thirty-six implementation scientists and 27 mental health professionals assigned 31 items to the three constructs and rated their confidence in their assignments. The Wilcoxon one-sample signed-rank test was used to assess substantive and discriminant content validity. Exploratory and confirmatory factor analysis (EFA and CFA) and Cronbach alphas were used to assess the validity of the conceptual model. Three hundred twenty-six mental health counselors read one of six randomly assigned vignettes depicting a therapist contemplating adopting an evidence-based practice (EBP). Participants used 15 items to rate the therapist’s perceptions of the acceptability, appropriateness, and feasibility of adopting the EBP. CFA and Cronbach alphas were used to refine the scales, assess structural validity, and assess reliability. Analysis of variance (ANOVA) was used to assess known-groups validity. Finally, half of the counselors were randomly assigned to receive the same vignette again and the other half the opposite vignette, and all were asked to re-rate acceptability, appropriateness, and feasibility. Pearson correlation coefficients were used to assess test-retest reliability and linear regression to assess sensitivity to change.
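For readers who want a concrete picture of the reliability statistics described above, the following is a minimal sketch (not the authors' analysis code) of how Cronbach's alpha and a Pearson test-retest correlation can be computed from a respondents-by-items matrix of ratings; the array names and simulated data are purely illustrative.

```python
# Minimal sketch: Cronbach's alpha and a test-retest Pearson correlation
# computed from a respondents-by-items NumPy matrix of Likert-style ratings.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of ratings."""
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def test_retest_r(score_t1: np.ndarray, score_t2: np.ndarray) -> float:
    """Pearson correlation between scale scores from two administrations."""
    return float(np.corrcoef(score_t1, score_t2)[0, 1])

# Purely illustrative data: 326 respondents rating a 4-item scale on a 1-5 range.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(326, 4)).astype(float)
retest = ratings.sum(axis=1) + rng.normal(0, 1, size=326)

print("alpha:", round(cronbach_alpha(ratings), 2))
print("test-retest r:", round(test_retest_r(ratings.sum(axis=1), retest), 2))
```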

Results

All but five items exhibited substantive and discriminant content validity. A trimmed CFA with five items per construct exhibited acceptable model fit (CFI = 0.98, RMSEA = 0.08) and high factor loadings (0.79 to 0.94). The alphas for the 5-item scales ranged from 0.87 to 0.89. Scale refinement based on measure-specific CFAs and Cronbach alphas using vignette data produced 4-item scales (α’s from 0.85 to 0.91). A three-factor CFA exhibited acceptable fit (CFI = 0.96, RMSEA = 0.08) and high factor loadings (0.75 to 0.89), indicating structural validity. ANOVA showed significant main effects, indicating known-groups validity. Test-retest reliability coefficients ranged from 0.73 to 0.88. Regression analysis indicated that each measure was sensitive to change in both directions.
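As a point of reference for the fit indices quoted above, the sketch below implements the standard CFI and RMSEA formulas from model and baseline chi-square statistics; the chi-square values in the example are placeholders, not numbers taken from the study (the degrees of freedom correspond to a 12-indicator, three-factor CFA).

```python
# Standard formulas for the two fit indices reported above (CFI, RMSEA),
# computed from model and baseline (independence-model) chi-square statistics.
import math

def cfi(chi2_model: float, df_model: int, chi2_null: float, df_null: int) -> float:
    """Comparative Fit Index: 1 minus the model's noncentrality relative to the
    baseline (independence) model's noncentrality."""
    d_model = max(chi2_model - df_model, 0.0)
    d_null = max(chi2_null - df_null, d_model)
    return 1.0 if d_null == 0 else 1.0 - d_model / d_null

def rmsea(chi2_model: float, df_model: int, n: int) -> float:
    """Root Mean Square Error of Approximation (this version uses N - 1 in the
    denominator; some software uses N)."""
    return math.sqrt(max(chi2_model - df_model, 0.0) / (df_model * (n - 1)))

# Placeholder chi-square values; df = 51 and df_null = 66 match a 12-indicator,
# three-factor CFA, and n = 326 matches the vignette sample size.
print(round(cfi(155.0, 51, 2600.0, 66), 2), round(rmsea(155.0, 51, 326), 2))
```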

Conclusions

The AIM, IAM, and FIM demonstrate promising psychometric properties. Predictive validity assessment is planned.
Metadata
Title
Psychometric assessment of three newly developed implementation outcome measures
Authors
Bryan J. Weiner
Cara C. Lewis
Cameo Stanick
Byron J. Powell
Caitlin N. Dorsey
Alecia S. Clary
Marcella H. Boynton
Heather Halko
Publication date
01-12-2017
Publisher
BioMed Central
Published in
Implementation Science / Issue 1/2017
Electronic ISSN: 1748-5908
DOI
https://doi.org/10.1186/s13012-017-0635-3
