Published in: Implementation Science 1/2012

Open Access 01-12-2012 | Methodology

Modeling technology innovation: How science, engineering, and industry methods can combine to generate beneficial socioeconomic impacts

Authors: Vathsala I Stone, Joseph P Lane


Abstract

Background

Government-sponsored science, technology, and innovation (STI) programs support the socioeconomic aspects of public policies, in addition to expanding the knowledge base. For example, beneficial healthcare services and devices are expected to result from investments in research and development (R&D) programs, which assume a causal link to commercial innovation. Such programs are increasingly held accountable for evidence of impact—that is, innovative goods and services resulting from R&D activity. However, the absence of comprehensive models and metrics skews evidence gathering toward bibliometrics about research outputs (published discoveries), with less focus on transfer metrics about development outputs (patented prototypes) and almost none on econometrics related to production outputs (commercial innovations). This disparity is particularly problematic for the expressed intent of such programs, as most measurable socioeconomic benefits result from the last category of outputs.

Methods

This paper proposes a conceptual framework integrating all three knowledge-generating methods into a logic model, useful for planning, obtaining, and measuring the intended beneficial impacts through the implementation of knowledge in practice. Additionally, the integration of the Context-Input-Process-Product (CIPP) model of evaluation proactively builds relevance into STI policies and programs while sustaining rigor.

Results

The resulting logic model framework explicitly traces the progress of knowledge from inputs, following it through the three knowledge-generating processes and their respective knowledge outputs (discovery, invention, innovation), as it generates the intended beneficial socioeconomic impacts. It is a hybrid model for generating technology-based innovations, in which best practices in new product development merge with a widely accepted knowledge-translation approach. Given the emphasis on evidence-based practice in the medical and health fields and "bench to bedside" expectations for knowledge transfer, sponsors and grantees alike should find the model useful for planning, implementing, and evaluating innovation processes.
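The staged progression the framework describes — three knowledge-generating methods, each with its own output type and its own metric class — can be caricatured as a small data structure. This is an illustrative sketch only; the stage names, metric labels, and the `trace` function are assumptions for exposition, not artifacts of the paper:

```python
# Each stage pairs a knowledge-generating method with its output
# and the metric class typically used to gauge that output.
STAGES = [
    {"method": "scientific research",      "output": "discovery",  "metrics": "bibliometrics"},
    {"method": "engineering development",  "output": "invention",  "metrics": "transfer metrics"},
    {"method": "industrial production",    "output": "innovation", "metrics": "econometrics"},
]

def trace(inputs):
    """Trace knowledge inputs through all three processes in order,
    collecting each stage's output; socioeconomic benefit is only
    reached once the final (innovation) output exists."""
    state = {"inputs": inputs, "outputs": []}
    for stage in STAGES:
        state["outputs"].append(stage["output"])
    state["impact"] = ("socioeconomic benefit"
                       if "innovation" in state["outputs"] else None)
    return state
```

The point the sketch makes is the one the abstract argues: evidence gathering that stops at the first stage (bibliometrics about discoveries) never reaches the stage where the intended impacts are actually produced.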

Conclusions

High-cost/high-risk industries like healthcare require the market deployment of technology-based innovations to improve domestic society in a global economy. An appropriate balance of relevance and rigor in research, development, and production is crucial to optimize the return on public investment in such programs. The technology-innovation process needs a comprehensive operational model to effectively allocate public funds and thereby deliberately and systematically accomplish socioeconomic benefits.
Metadata
Publication date: 01-12-2012
Publisher: BioMed Central
Published in: Implementation Science, Issue 1/2012
Electronic ISSN: 1748-5908
DOI: https://doi.org/10.1186/1748-5908-7-44
