ABSTRACT
Two alternative user interface designs were subjected to user testing to measure user performance on a database query task. User performance was also estimated heuristically in three different ways and with formal GOMS modelling. The estimated values for absolute user performance varied widely, but estimates of the relative advantage of the faster interface were less variable. Choosing the faster of the two designs would have a net present value more than 1,000 times the cost of obtaining the estimates. In our case study, a software manager would have made the correct choice every time if decisions were based on at least three independent estimates. User testing was 4.9 times as expensive as the cheapest heuristic method but provided better performance estimates.
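The decision rule behind the abstract's "at least three independent estimates" claim can be sketched as follows. This is a minimal illustration, not the paper's method: it assumes estimates are pooled with a geometric mean before comparing designs, and all numbers are hypothetical placeholders rather than data from the study.

```python
import statistics

def choose_design(estimates_a, estimates_b):
    """Pick the design whose pooled task-time estimate is lower.

    estimates_a / estimates_b: independent task-time estimates (seconds)
    for two interface designs, e.g. from heuristic estimation, GOMS
    modelling, or user testing. Pooling via geometric mean is an
    assumption of this sketch, chosen because time estimates tend to
    vary multiplicatively rather than additively.
    """
    pooled_a = statistics.geometric_mean(estimates_a)
    pooled_b = statistics.geometric_mean(estimates_b)
    winner = "A" if pooled_a < pooled_b else "B"
    return winner, pooled_a, pooled_b

# Hypothetical estimates (illustrative only, not the study's data):
a = [12.0, 15.5, 10.8]   # three independent estimates for design A
b = [16.2, 14.9, 19.0]   # three independent estimates for design B
winner, pa, pb = choose_design(a, b)
```

With three or more independent estimates per design, an outlier from any single estimation method is dampened by the pooling step, which is the intuition behind basing the choice on multiple estimates rather than one.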