
Mixing methods in randomized controlled trials (RCTs): Validation, contextualization, triangulation, and control


Abstract

In this paper we describe how we mixed research approaches in a randomized controlled trial (RCT) of a school principal professional development program. Using examples from our study, we illustrate how combining qualitative and quantitative data can address key challenges, from validating instruments and measures of mediator variables to examining how contextual factors interact with the treatment. Describing how we transformed our qualitative and quantitative data, we consider how mixing methods enabled us to deal with two critical RCT challenges: random assignment and treatment control. Our account offers insights into ways of maximizing the potential of mixing research methods in RCTs.


Notes

  1. Cloverville is a pseudonym.


Author information

Corresponding author

Correspondence to James P. Spillane.

Appendix

Effective teaching and learning scenario coding rubric

Dimensions of teaching and learning referred to in the scale below include but are NOT limited to:

  • student and/or teacher effort produces achievement,

  • student learning is about making connections,

  • students learn with and through others,

  • student learning takes time,

  • student and teacher motivation is important to effective teaching and student learning,

  • focused teaching promotes accelerated learning,

  • clear expectations and continuous feedback to students and/or teachers activate student learning (this does not include the process of monitoring instruction in classrooms),

  • good teaching builds on students’ strengths and respects individual differences,

  • good teaching involves modeling what students should learn,

  • general references to teachers’ use of effective teaching and learning practices (this includes discussions of teachers’ use of best practices)

Other dimensions might include but are not limited to:

  • cognitively or developmentally appropriate or challenging curriculum for students

  • applied learning theory

  • individualized instruction

  • reciprocal teaching

  • inquiry teaching or direct instruction

  1. A Little

    Mere mention of one or two aspects of effective teaching and/or learning with no development of the aspect(s). NOTE: mentioning the same thing 10 times with no development is still a mere mention.

  2. Some

    Mentions at least three or more different aspects of effective teaching and learning but does not develop any of the aspects.

  3. Sufficient

    Mentions at least one aspect of effective teaching and learning and develops at least one aspect; that is, the response goes beyond mention of an aspect to develop it, suggesting a deeper understanding. (For example, the respondent might mention effective instructional strategies in reading and say teachers need to use “writing workshop” or “balanced literacy.” Or, the respondent might mention evidence-based teaching or assessment and go on to note trying to figure out the strategies used by teachers who have high-performing students.)

    Specific example of single aspect (individualized instruction) that is developed:

    “Students must have pre assessment in the critical areas of reading such as vocabulary, phonics, fluency, comprehension, etc. Teachers must know the basic reading levels of their students. Instruction must be tailored to meet these specific needs.”

  4. Quite a Bit

    Mentions at least two aspects of effective teaching and learning and develops two or more; that is, the response goes beyond mentioning the aspects to developing them with more discussion that suggests a deeper understanding of the aspects.

  5. A Great Deal

    Mentions at least two aspects of effective teaching and learning, develops two or more, AND makes connections between at least two of the aspects mentioned; that is, the response goes beyond mentioning and developing two or more aspects of effective teaching and learning to making a link or connection between at least two aspects. For example, the respondent might mention and develop how student motivation is critical and then link it to the idea that student effort, rather than IQ alone, produces achievement. A second example could be that a principal develops (1) how to determine if teachers are using best practices in their teaching and (2) the importance of using individualized instruction, and then connects them by discussing how individualized instruction should be included as part of best practices.
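The rubric reduces to a decision rule over three tallies per scenario response: how many distinct aspects of effective teaching and learning are mentioned, how many of those are developed, and whether at least two developed aspects are connected. The Python sketch below makes that rule explicit; it is an illustration under our reading of the rubric, with a hypothetical function name and parameters, not part of the published study materials.

    def score_scenario(mentioned: int, developed: int, connected: bool) -> int:
        """Map a coder's tallies for one scenario response to a rubric level (1-5).

        mentioned -- distinct aspects of effective teaching and learning mentioned
                     (repeating one aspect ten times still counts once)
        developed -- how many of those aspects the response develops in depth
        connected -- True if the response links at least two developed aspects
        """
        if developed >= 2 and connected:
            return 5  # A Great Deal: develops two or more aspects and connects them
        if developed >= 2:
            return 4  # Quite a Bit: develops two or more aspects, no connections
        if developed >= 1:
            return 3  # Sufficient: develops at least one aspect
        if mentioned >= 3:
            return 2  # Some: three or more aspects mentioned, none developed
        return 1      # A Little: one or two aspects merely mentioned

    # Example: the level-3 sample response develops one aspect (individualized
    # instruction) without connecting it to a second developed aspect.
    assert score_scenario(mentioned=1, developed=1, connected=False) == 3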


About this article

Cite this article

Spillane, J.P., Pareja, A.S., Dorner, L. et al. Mixing methods in randomized controlled trials (RCTs): Validation, contextualization, triangulation, and control. Educ Asse Eval Acc 22, 5–28 (2010). https://doi.org/10.1007/s11092-009-9089-8
