Abstract
Background
Literature reviews show that stated-preference studies, used to understand the values individuals place on health and health care, are increasingly administered online, potentially maximising respondent access and enhancing response quality. Online respondents may often choose whether to complete the survey on a desktop or laptop personal computer (PC), tablet or smartphone, each with a different screen size and mode of data entry. To avoid differences in measurement error, respondents are frequently asked to complete surveys on a PC, despite evidence that handheld devices are increasingly used for internet browsing. As yet, it is unknown whether or how the device used to access a survey affects responses and/or the valuations subsequently derived.
Method
This study uses data from a discrete choice experiment (DCE) administered online to elicit the preferences of a general population sample of females for a national breast screening programme. The analysis explores differences in key outcomes, such as completion rates, engagement with the survey materials, respondent characteristics, response time, failure of an internal validity test and health care preferences, between (1) handheld device users and (2) PC users. Preferences were analysed using a fully correlated random parameter logit (RPL) model to allow for unexplained scale and preference heterogeneity.
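The fully correlated RPL model can be illustrated with a short simulated-likelihood sketch. Everything below is a hypothetical, self-contained example on synthetic data, not the estimation code used in the study: the dimensions, parameter values and the `simulated_loglik` helper are assumptions for illustration only. The key idea is that each respondent's coefficients are drawn as β = b + Lz, with L the Cholesky factor of the coefficient covariance, and the respondent-level likelihood averages the product of logit choice probabilities over simulated draws.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 50 respondents, 4 tasks each, 3 alternatives, 2 attributes.
N, T, J, K, R = 50, 4, 3, 2, 200

# Synthetic attribute levels (purely illustrative).
X = rng.normal(size=(N, T, J, K))

# "True" mean coefficients and Cholesky factor of their covariance.
b_true = np.array([1.0, -0.5])
L_true = np.array([[0.8, 0.0], [0.3, 0.5]])

# Simulate choices from the correlated RPL data-generating process.
beta = b_true + rng.normal(size=(N, K)) @ L_true.T        # respondent-level betas
V = np.einsum('ntjk,nk->ntj', X, beta)                    # systematic utilities
P = np.exp(V) / np.exp(V).sum(axis=2, keepdims=True)      # logit probabilities
y = np.array([[rng.choice(J, p=P[n, t]) for t in range(T)] for n in range(N)])

def simulated_loglik(b, L, X, y, draws):
    """Simulated log-likelihood of a fully correlated RPL model.

    Coefficients are beta_r = b + L @ z_r with z_r ~ N(0, I), so
    cov(beta) = L L'. Each respondent's likelihood is the product of
    logit choice probabilities over tasks, averaged over the draws.
    """
    betas = b + draws @ L.T                               # (R, K)
    V = np.einsum('ntjk,rk->rntj', X, betas)              # (R, N, T, J)
    V -= V.max(axis=3, keepdims=True)                     # numerical stability
    P = np.exp(V) / np.exp(V).sum(axis=3, keepdims=True)
    n_idx = np.arange(X.shape[0])[:, None]
    t_idx = np.arange(X.shape[1])[None, :]
    chosen = P[:, n_idx, t_idx, y]                        # prob of chosen option
    return np.log(chosen.prod(axis=2).mean(axis=0)).sum()

draws = rng.normal(size=(R, K))
ll_true = simulated_loglik(b_true, L_true, X, y, draws)
ll_null = simulated_loglik(np.zeros(K), np.eye(K) * 0.1, X, y, draws)
```

In estimation, `b` and the free elements of `L` would be chosen to maximise this simulated log-likelihood; parameterising the covariance through its Cholesky factor keeps it positive semi-definite throughout the search.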
Results
One thousand respondents completed the survey in its entirety. The most popular access devices were PCs (n = 785), including Windows machines (n = 705) and MacBooks (n = 69). Two hundred and fifteen respondents accessed the survey on a handheld device. Most outcomes related to survey behaviour, including failure of a dominance check, ‘flat-lining’, self-reported attribute non-attendance (ANA) and respondent-rated task difficulty, did not differ by device type (p > 0.100). Respondents accessing the survey on a PC were generally quicker (median time to completion 14.5 min compared with 16 min for those using handheld devices) and were significantly less likely to speed through a webpage. Although there was evidence of heterogeneity in preference intensity (taste) and variability (scale) across respondents in the sample, it was not driven by the access device.
Conclusion
Overall, we find that neither preferences nor choice behaviour is associated with the type of access device, provided respondents are presented with question formats that are easy to use on small touchscreens. Health preference researchers should optimise preference instruments for a range of devices and encourage respondents to complete surveys on their preferred device. However, we suggest that access-device characteristics should be gathered and included when reporting results.
Acknowledgements
We are grateful to Professors Katherine Payne, Dan Rigby, and Stephen Campbell of the University of Manchester for their feedback during the development of the stated-preference survey.
Ethics declarations
Funding
Caroline Vass was in receipt of a National Institute for Health Research (NIHR) School for Primary Care Research (SPCR) PhD studentship between October 2011 and 2014. The views expressed are those of the author(s) and not necessarily those of the NIHR or the Department of Health and Social Care.
Conflict of interest
CV and MB have no conflicts of interest.
Ethics approval
This paper was prepared in compliance with ethical standards. Ethical approval for the study was granted by The University of Manchester’s Research Ethics Committee.
Consent to participate
Informed consent was obtained from all individual participants included in the study.
Consent for publication
Before consenting, participants were informed that their data would be analysed for a PhD thesis and may be published in a journal article.
Data availability
The datasets generated and/or analysed during the current study are still undergoing further analyses and are unavailable.
Code availability
Example code is available from the authors upon request.
Author contributions
CV collected the data, conceived the research question, and conducted the analysis. MB conceived the research question, designed the analysis, and interpreted the results. Both authors contributed to drafting, revising, and finalising the manuscript.
About this article
Cite this article
Vass, C.M., Boeri, M. Mobilising the Next Generation of Stated-Preference Studies: the Association of Access Device with Choice Behaviour and Data Quality. Patient 14, 55–63 (2021). https://doi.org/10.1007/s40271-020-00484-x