25-04-2024 | Practical Application

An Overview of Data Collection in Health Preference Research

Authors: Semra Ozdemir, Matthew Quaife, Ateesha F. Mohamed, Richard Norman

Published in: The Patient - Patient-Centered Outcomes Research

Abstract

This paper focuses on survey administration and data collection methods for stated-preference studies in health applications. First, it describes the main survey administration methods: web-based, face-to-face (in-person), and mail surveys. Second, it introduces the concept of sampling frames, clarifying the distinction between the target population and the survey frame population. The discussion then turns to probability and non-probability sampling methods, along with an evaluation of the potential issues each raises in health preference research. Third, the paper describes recruitment methods, including web surveys, patient groups, and in-clinic recruitment. Fourth, it addresses the calculation of response rates, with insights into what constitutes an adequate response rate and strategies for improving response rates in stated-preference surveys. Lastly, the paper discusses data management plans and suggests directions for future research. In summary, this paper examines the nuanced aspects of survey administration and data collection in stated-preference studies, offering practical guidance for researchers and practitioners in the health domain.
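As a minimal illustration of the response-rate calculation the abstract mentions, the sketch below computes a conservative response rate in the spirit of AAPOR's RR1 (completed interviews divided by all potentially eligible sampled cases). The disposition counts are hypothetical, not taken from the paper.

```python
def response_rate(completes: int, partials: int, refusals: int,
                  non_contacts: int, unknown_eligibility: int) -> float:
    """Minimum (conservative) response rate: completed interviews divided by
    all sampled cases that are eligible or of unknown eligibility."""
    denominator = (completes + partials + refusals
                   + non_contacts + unknown_eligibility)
    if denominator == 0:
        raise ValueError("no sampled cases")
    return completes / denominator


# Hypothetical disposition counts for a stated-preference survey.
rate = response_rate(completes=250, partials=30, refusals=120,
                     non_contacts=80, unknown_eligibility=20)
print(f"{rate:.1%}")  # 250 / 500 = 50.0%
```

Less conservative variants (e.g., counting partials as responses, or discounting cases of unknown eligibility by an estimated eligibility rate) yield higher rates; which definition is reported should always be stated explicitly.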
101.
102.
go back to reference Finkelstein EA, et al. Understanding factors that influence the demand for dialysis among elderly in a multi-ethnic Asian society. Health Policy. 2018;122(8):915–21.PubMedCrossRef Finkelstein EA, et al. Understanding factors that influence the demand for dialysis among elderly in a multi-ethnic Asian society. Health Policy. 2018;122(8):915–21.PubMedCrossRef
103.
go back to reference Kanninen BJ. Optimal design for multinomial choice experiments. J Mark Res. 2002;39(2):214–27.CrossRef Kanninen BJ. Optimal design for multinomial choice experiments. J Mark Res. 2002;39(2):214–27.CrossRef
104.
go back to reference Coggon D, Barker D, Rose G. Epidemiology for the uninitiated. John Wiley & Sons; 2009. Coggon D, Barker D, Rose G. Epidemiology for the uninitiated. John Wiley & Sons; 2009.
105.
go back to reference Johnson FR, Yang J-C, Reed SD. The internal validity of discrete choice experiment data: a testing tool for quantitative assessments. Value Health. 2019;22(2):157–60.PubMedCrossRef Johnson FR, Yang J-C, Reed SD. The internal validity of discrete choice experiment data: a testing tool for quantitative assessments. Value Health. 2019;22(2):157–60.PubMedCrossRef
106.
go back to reference Coast J, et al. Using qualitative methods for attribute development for discrete choice experiments: issues and recommendations. Health Econ. 2012;21(6):730–41.PubMedCrossRef Coast J, et al. Using qualitative methods for attribute development for discrete choice experiments: issues and recommendations. Health Econ. 2012;21(6):730–41.PubMedCrossRef
107.
go back to reference Vass C, Rigby D, Payne K. The role of qualitative research methods in discrete choice experiments: a systematic review and survey of authors. Med Decis Making. 2017;37(3):298–313.PubMedPubMedCentralCrossRef Vass C, Rigby D, Payne K. The role of qualitative research methods in discrete choice experiments: a systematic review and survey of authors. Med Decis Making. 2017;37(3):298–313.PubMedPubMedCentralCrossRef
108.
go back to reference Kløjgaard ME, Bech M, Søgaard R. Designing a stated choice experiment: the value of a qualitative process. J Choice Modell. 2012;5(2):1–18.CrossRef Kløjgaard ME, Bech M, Søgaard R. Designing a stated choice experiment: the value of a qualitative process. J Choice Modell. 2012;5(2):1–18.CrossRef
109.
go back to reference Ostermann J, et al. Heterogeneous HIV testing preferences in an urban setting in Tanzania: results from a discrete choice experiment. PLoS One. 2014;9(3): e92100.PubMedPubMedCentralCrossRef Ostermann J, et al. Heterogeneous HIV testing preferences in an urban setting in Tanzania: results from a discrete choice experiment. PLoS One. 2014;9(3): e92100.PubMedPubMedCentralCrossRef
110.
go back to reference Veldwijk J, et al. Words or graphics to present a discrete choice experiment: does it matter? Patient Educ Couns. 2015;98(11):1376–84.PubMedCrossRef Veldwijk J, et al. Words or graphics to present a discrete choice experiment: does it matter? Patient Educ Couns. 2015;98(11):1376–84.PubMedCrossRef
111.
go back to reference Mühlbacher AC et al. How to present a decision object in health preference research: attributes and levels, the decision model, and the descriptive framework. The Patient-Patient-Centered Outcomes Research, 2024: p. 1–12. Mühlbacher AC et al. How to present a decision object in health preference research: attributes and levels, the decision model, and the descriptive framework. The Patient-Patient-Centered Outcomes Research, 2024: p. 1–12.
112.
go back to reference Marshall DA et al. Stated-preference survey design and testing in health applications. The Patient-Patient-Centered Outcomes Research, 2024: p. 1–11. Marshall DA et al. Stated-preference survey design and testing in health applications. The Patient-Patient-Centered Outcomes Research, 2024: p. 1–11.
113.
go back to reference Veldwijk J, et al. Mimicking real-life decision making in health: allowing respondents time to think in a discrete choice experiment. Value Health. 2020;23(7):945–52.PubMedCrossRef Veldwijk J, et al. Mimicking real-life decision making in health: allowing respondents time to think in a discrete choice experiment. Value Health. 2020;23(7):945–52.PubMedCrossRef
114.
go back to reference Ozdemir S. Improving the validity of stated-preference data in health research: the potential of the time-to-think approach. The Patient. 2015;8:247–55.PubMedCrossRef Ozdemir S. Improving the validity of stated-preference data in health research: the potential of the time-to-think approach. The Patient. 2015;8:247–55.PubMedCrossRef
115.
go back to reference Özdemir S, Johnson FR, Hauber AB. Hypothetical bias, cheap talk, and stated willingness to pay for health care. J Health Econ. 2009;28(4):894–901.PubMedCrossRef Özdemir S, Johnson FR, Hauber AB. Hypothetical bias, cheap talk, and stated willingness to pay for health care. J Health Econ. 2009;28(4):894–901.PubMedCrossRef
116.
go back to reference Regier DA, et al. Demand for precision medicine: a discrete-choice experiment and external validation study. Pharmacoeconomics. 2020;38:57–68.PubMedCrossRef Regier DA, et al. Demand for precision medicine: a discrete-choice experiment and external validation study. Pharmacoeconomics. 2020;38:57–68.PubMedCrossRef
117.
go back to reference Aguiar M, et al. Designing discrete choice experiments using a patient-oriented approach. The Patient. 2021;14(4):389–97.PubMedCrossRef Aguiar M, et al. Designing discrete choice experiments using a patient-oriented approach. The Patient. 2021;14(4):389–97.PubMedCrossRef
118.
go back to reference Watson V, Becker F, de Bekker-Grob E. Discrete choice experiment response rates: a meta-analysis. Health Econ. 2017;26(6):810–7.PubMedCrossRef Watson V, Becker F, de Bekker-Grob E. Discrete choice experiment response rates: a meta-analysis. Health Econ. 2017;26(6):810–7.PubMedCrossRef
119.
go back to reference Groves RM, Presser S, Dipko S. The role of topic interest in survey participation decisions. Public Opin Q. 2004;68(1):2–31.CrossRef Groves RM, Presser S, Dipko S. The role of topic interest in survey participation decisions. Public Opin Q. 2004;68(1):2–31.CrossRef
120.
go back to reference Tolonen H, et al. Effect on trend estimates of the difference between survey respondents and non-respondents: results from 27 populations in the WHO MONICA Project. Eur J Epidemiol. 2005;20:887–98.PubMedCrossRef Tolonen H, et al. Effect on trend estimates of the difference between survey respondents and non-respondents: results from 27 populations in the WHO MONICA Project. Eur J Epidemiol. 2005;20:887–98.PubMedCrossRef
121.
go back to reference Rockwood K, et al. Response bias in a health status survey of elderly people. Age Ageing. 1989;18(3):177–82.PubMedCrossRef Rockwood K, et al. Response bias in a health status survey of elderly people. Age Ageing. 1989;18(3):177–82.PubMedCrossRef
Metadata
Title: An Overview of Data Collection in Health Preference Research
Authors: Semra Ozdemir, Matthew Quaife, Ateesha F. Mohamed, Richard Norman
Publication date: 25-04-2024
Publisher: Springer International Publishing
Published in: The Patient - Patient-Centered Outcomes Research
Print ISSN: 1178-1653 | Electronic ISSN: 1178-1661
DOI: https://doi.org/10.1007/s40271-024-00695-6